I love the idea of an audio bar visualizer on the Touch Bar with BTT. I've been struggling to figure out where to start with this project though. Can anybody recommend the best way to go about it? I'm comfortable with Python, though I don't believe it can accomplish this for BTT purposes - is there a better language to use with BTT?
Before deciding on a language, you have to figure out what data should be sent to BTT. I think there are only two options: an image or a string.
If the data is an image, you need to calculate the FFT spectrum while the audio is playing, then either:
- save it as a local file and tell BTT the image's location via a JSON string, or
- convert the image to a base64 string and send the JSON with the base64 image to BTT.
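The base64 option can be sketched in a few lines of Python. This assumes BTT script widgets accept a JSON reply with an `icon_data` key for inline base64 image data (check the BTT docs for the exact key names your version supports):

```python
import base64
import json

def btt_image_json(png_path):
    """Read a rendered spectrum image and wrap it in a JSON string for BTT.

    The "icon_data" key is an assumption about BTT's script-widget JSON
    format; verify the key name against the BTT documentation.
    """
    with open(png_path, "rb") as f:
        data = base64.b64encode(f.read()).decode("ascii")
    return json.dumps({"text": "", "icon_data": data})
```

Your script would regenerate the spectrum image each tick and print this JSON for BTT to pick up.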
You also have to figure out how to capture the audio source that is currently playing. Alternatively, you could generate all the images before playback starts, then index into them by the current playtime.
Note that the FFT calculation, file writing, and BTT rendering all take time, so there will be delays and performance will be inefficient.
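The string option mentioned above sidesteps the image pipeline entirely: map the FFT bins to Unicode block characters and send them as the widget's text. Here is a minimal pure-Python sketch, assuming BTT script widgets accept JSON with a `text` key (the naive DFT is for illustration only; a real version would use `numpy.fft.rfft` on captured audio frames):

```python
import cmath
import json
import math

def dft_magnitudes(samples):
    # Naive DFT of a real signal; returns magnitudes for the first n/2 bins.
    # Slow on purpose - swap in numpy.fft.rfft for real-time use.
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

BARS = "▁▂▃▄▅▆▇█"  # eight levels, lowest to highest

def spectrum_to_bars(samples, n_bars=8):
    """Collapse the spectrum into n_bars levels and render them as block chars."""
    mags = dft_magnitudes(samples)
    chunk = max(1, len(mags) // n_bars)
    levels = [sum(mags[i * chunk:(i + 1) * chunk]) for i in range(n_bars)]
    peak = max(levels) or 1.0
    return "".join(BARS[min(7, int(7 * v / peak))] for v in levels)

# Fake audio frame: a pure sine wave, 4 cycles over 64 samples.
frame = [math.sin(2 * math.pi * 4 * t / 64) for t in range(64)]
# "text" is an assumption about BTT's script-widget JSON format.
print(json.dumps({"text": spectrum_to_bars(frame)}, ensure_ascii=False))
```

Since the sine energy sits in bin 4, the second bar comes out tall and the rest stay flat. Text updates should also render noticeably faster in BTT than pushing a fresh image every tick.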
Thanks for the info! Sounds like this project might not be the best idea due to the potential issues with lag and inefficiency. But I will still give it a shot and see what happens!
This kind of stuff should really be done natively, otherwise it will have pretty bad performance.
BetterTouchTool does allow Swift/Objective-C plugins with custom views, but I'm not sure if you want to go that far.
did u ever try this?
That is really really cool!!!!
when do you think it will be done?
will you create a new app for it, or make it downloadable for BTT users on the forum?
Initially it will be a standalone app, but we do have plans to integrate it with BTT.
Tough to give a time frame, depends on how busy my partner and I are over the next few months :).
awesome! can't wait to have it on my own Touch Bar :DDD
Wow this is such a great idea.
Would love to have this.
Even if the final version will be paid, I would open it up to beta users to build some hype around it.