my upcoming work setup will be a MacBook Pro plus an iPad Pro next to it as a second screen. When I press the fn key, I'd like a BTT floating window to open on the iPad Pro so I can touch (!) any fields with key commands for the currently active application within that menu. When I'm done, BTT disappears and the iPad Pro works as a second screen again.
Questions:
right now BTT floating menus do not support finger touch, right? So my future setup isn't possible the way I dream of having it?
this is where BTT Remote comes in: it enables touch, right? But that means I'd have to keep BTT Remote open on my iPad Pro all the time, so I'd lose my second screen for work, right?
Thx! Good to read. And can I set up this "check if 2nd screen is connected" condition in the general BTT preferences? Otherwise I would have to copy & paste the condition into every fn+key command.
I just have to wrap my mind around why there's a need for the new BTT Remote app, then?
it works across devices (iPhone, iPad, Vision Pro) and is in some cases nicer than having to use the iPad as an external screen. But if you are already using the iPad as a screen, you don't need it. In that case using the floating menus directly makes more sense.
It also makes integration with Apple Shortcuts and Siri on iOS possible.
You can create a conditional activation group that checks the "active screen resolutions" variable. That one will differ depending on your screen setup.
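A sketch of what such a condition might look like, assuming BTT's advanced condition syntax and a hypothetical iPad Pro resolution string (check the actual variable name and value in BTT's condition editor for your setup):

```
active_screen_resolutions CONTAINS "2388x1668"
```

If you put the fn+key triggers inside that group, they only fire while the iPad is connected as a second screen, so you don't have to copy the condition into every individual trigger.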
as far as I understand it:
the new BTT Remote 2 is an app running on the iPad/iPhone, and only in that situation will you have "touch", i.e. be able to use your fingers on the iPad for your floating menu.
uuh, I now know what I got "wrong". (I did not use the pen before...)
When I wrote "touch" I meant a touch/click with a finger on the button (the floating menu is on the iPad).
Over here it works... but only when I click the button using the Apple Pencil (or with the Mac trackpad / mouse cursor), not with a finger (on the iPad).
When using the new BTT Remote 2 app, will the menu work with a finger touch or not?
Could you publish a sort of preview video to show the potential... and remove the confusion?
you are right, sorry, I was confused because I usually use Duet Display instead of Sidecar, and there touch is supported. Sidecar really doesn't support touch input for whatever reason (unfortunately Duet Display's subscription for touch input & Pencil support is pretty expensive).
Hi, I just read that at least the Pencil works with Sidecar. This is also from the Apple page, but I can't verify it:
"Use gestures on the iPad: You can use basic gestures such as tapping, swiping, scrolling, and zooming, as well as gestures for entering and editing text."
One of the WWDC 2024 subjects was "Bring your app to Siri" and I thought it would be nice if we could "talk to Floating menu buttons"...
It is already possible to set up Shortcuts/Siri to run BTT triggers.
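One hedged way to do this today: BTT exposes a `trigger_named` URL scheme, so a Shortcut (or anything that can open a URL on the Mac) can run a named trigger. The Python sketch below just builds and opens such a URL; the trigger name "Play Track" is a hypothetical example, and the exact scheme is worth double-checking against BTT's documentation:

```python
import subprocess
from urllib.parse import quote

def btt_trigger_url(trigger_name: str) -> str:
    # BTT's URL scheme for running a named trigger; the name must be
    # URL-encoded (spaces, umlauts, etc. would otherwise break the URL).
    return "btt://trigger_named/?trigger_name=" + quote(trigger_name)

def run_btt_trigger(trigger_name: str) -> None:
    # `open` hands the URL off to BetterTouchTool on macOS.
    subprocess.run(["open", btt_trigger_url(trigger_name)], check=True)

# Hypothetical named trigger; uncomment on a Mac with BTT running:
# run_btt_trigger("Play Track")
print(btt_trigger_url("Play Track"))
```

A Shortcut would use the same URL in an "Open URL" action, which is also how Siri can end up firing a BTT trigger.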
In my floating menu I have tons of buttons/menu items and I cannot possibly create shortcuts for all of them; it might be possible (for Siri) to use the menu item name/identifier to trigger it by voice...
With the new BTT Remote mobile app (iOS/iPadOS), will it be possible to use Siri (I mean Siri on the iPad, as my MacBook's Siri is disabled) to trigger/press a floating menu button just by talking to it?
Christian, how would you trigger Siri in a studio environment? If your DAW is set to an external interface, you might have to plug in a mic, add phantom power...
But I like the idea, as I've wondered how to set up key commands within floating menus to reach row 1, row 2, row 3 without touching the mouse.
The idea was to use Siri on the iPad/iPhone, not Siri on the Mac/DAW machine (on the Mac/DAW, Siri can stay disabled). No extra setup/mic needed...
The floating menu will "run" in the upcoming new BTT Remote mobile app on an iPad/iPhone and will use the iPad/iPhone's Siri.
you would just use voice/Siri to 'click/touch' a menu button.
one way could be as simple as using the menu item name (here: play track) as the voice command for Siri... but that depends on how Andreas implements "the magic"...
In this example you would just say "Siri, play track" on the iPad/iPhone, and the Siri voice command would execute all actions of that menu item/button.