yes it is.
I think that won't help in my case, as the variable is set by a "Named Trigger" with the actions below (it detects whether the play image is on screen or not). That trigger needs to run before the menu shows, or only once right after, to be sure the play button shows the correct state.
It will also be run (after the menu loads) on user action, for example when the space bar is pressed to start or stop audio playback.
I could run it automatically every 1-2 seconds, but I want to avoid that for performance reasons...
It's hard to run anything before the menu is loaded because that would affect performance a lot (once loaded, menus are cached until they change, and opening them doesn't require network requests). This would only work with dynamically generated menus (Simple JSON Format · GitBook). However, you will be able to update items after the menu is shown once "Actions Executed On Appear" are supported in BTT Mobile (not yet).
I want to define a simple piano keyboard with floating menu (BTT Mobile) buttons. A button should send a "MIDI Note ON" when it is pressed down and a "MIDI Note OFF" when it is released. Pretty simple...
When I define a "MIDI Note ON" action on the "On Trigger" event, it is only fired once the button is released.
How could I implement the above piano keyboard? Any hint would be welcome!
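For context, the raw MIDI messages a key would need to send are tiny. A sketch in plain JavaScript (this is not a BTT API; the function names and the default velocity are my own, just to illustrate the Note ON / Note OFF pair):

```javascript
// MIDI channel voice messages: status 0x90 = Note On, 0x80 = Note Off.
// The low nibble of the status byte carries the channel (0-15).
function noteOn(note, velocity = 100, channel = 0) {
  return [0x90 | channel, note, velocity];
}

function noteOff(note, channel = 0) {
  return [0x80 | channel, note, 0]; // release velocity 0 is the common default
}

// Middle C (MIDI note 60): send noteOn on touch-down, noteOff on release.
console.log(noteOn(60));  // [144, 60, 100]
console.log(noteOff(60)); // [128, 60, 0]
```

So the missing piece really is just separate press and release events that can each fire one of these messages.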
Ha, I thought there were already separate "on press" and "on release / trigger" action categories, but apparently I forgot to add them. Will add them tomorrow.
New iOS builds should also be available tomorrow or Wednesday.
you might remember the above.
I have played a bit with the script below (not to dynamically generate menus!) but to set the (init) variables I needed, and it appears to "work", at least for that simple case...
The "Actions Executed On Appear" would be much nicer anyway.
Is it easily possible for you to add the velocity data (to a BTT variable or so) of a button pressed via BTT Mobile touch?
That would be handy for my "experimental" MIDI drum pad. It is only meant for auditioning sounds/samples, as MIDI sent by BTT via BTT Mobile touch / WiFi is much too slow to use 'live'.
That velocity data could be used to play notes harder or softer. Currently velocity is fixed and only adjustable for all pads at once via the slider...
I think velocity would only be possible with the Force Touch touchscreens Apple used until the iPhone X. The newer touchscreens don't recognize the impact/force anymore ;-( (I was so sad when they removed that technology)
(The Apple Pencil could provide such info, but I doubt you are using a pencil for this ;-(
"AI" found an interesting approach...
... by measure how quickly the user touches and releases the button,
*the duration between onTouchDown and onTouchUp/Release – * the shorter the time, the stronger the velocity could be interpreted.
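That duration-based idea could be sketched like this in plain JavaScript (the function and parameter names are hypothetical, and the 20-300 ms range is just an assumed calibration, not anything BTT provides):

```javascript
// Map the time between touch-down and touch-up to a MIDI velocity (1-127):
// a very fast tap yields the maximum velocity, a slow press the minimum.
function velocityFromDuration(durationMs, minMs = 20, maxMs = 300) {
  // Clamp the measured duration into the calibrated range...
  const clamped = Math.min(Math.max(durationMs, minMs), maxMs);
  // ...then invert it linearly: 0 = fastest tap, 1 = slowest press.
  const t = (clamped - minMs) / (maxMs - minMs);
  return Math.round(127 - t * 126); // 127 down to 1
}

console.log(velocityFromDuration(20));  // 127 (very fast tap)
console.log(velocityFromDuration(160)); // 64  (medium)
console.log(velocityFromDuration(300)); // 1   (slow press)
```

The range would likely need per-user tuning, and a non-linear curve might feel more natural, but the core mapping is this simple.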