For the Ask AI, could Codex CLI be supported? It's the equivalent of the Claude CLI. People with an OpenAI plan can use the CLI to generate responses, and the auth part shouldn't be an issue.
I use Codex CLI daily, and I'm so looking forward to using that feature!
Initial Codex CLI support has been added in 6.256; however, I need to do more testing with it. (This version also contains various other refactorings related to h@llo.ai, so things might be a bit broken.)
Whoa, that's fast, and it works too! One thing I'd suggest: allow selecting a different model somehow, in case the CLI defaults to a heavy reasoning model, as that could slow down responses and overthink.
Annoyingly there are soooo many CLI tools now. I have a Codex CLI account, but as I already have the Claude (not using Anthropic servers though) and Copilot CLIs, I didn't install Codex CLI. Do you support the opencode CLI? It seems to be a well-respected open-source alternative that supports many auth methods/providers.
I have configured the opencode CLI in BTT, but the AI assistant is not responding at all. The opencode CLI configuration on my computer works properly. Where can I check the logs? I don't know what error has occurred.
Could you check where your opencode is installed? (Terminal command `which opencode`.) Maybe BTT is looking at the wrong location for your installation type.
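In case it helps others debug the same thing, here's a small shell sketch that checks PATH first and then a few common install locations. The candidate directories are assumptions (typical Homebrew and user-install prefixes), not a list of everywhere BTT actually looks:

```shell
# Hypothetical helper: report where a CLI binary lives.
# Checks PATH first, then a few common install prefixes (assumed, not exhaustive).
find_cli() {
  command -v "$1" && return 0
  for p in /opt/homebrew/bin /usr/local/bin "$HOME/.local/bin"; do
    [ -x "$p/$1" ] && { echo "$p/$1"; return 0; }
  done
  echo "$1 not found" >&2
  return 1
}
```

Running `find_cli opencode` prints the resolved path on success, which you can compare against whatever location BTT expects.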
For the crash log, it would be great if you could post the full one from the macOS Console app's crash report section.
I installed opencode using brew, so the path is `/opt/homebrew/bin/opencode` instead of using `curl -fsSL https://opencode.ai/install | bash`. I'm not sure if this is the reason, but I haven't experienced another crash after restarting.
Could you maybe send me one of the logs in `~/Library/Application Support/BetterTouchTool/AI/Conversations`? (andeas@folivora.ai) Maybe I can see some error.
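For anyone looking for those logs themselves, a quick sketch for grabbing the most recent file from that folder. Nothing is assumed about the filenames beyond sorting by modification time:

```shell
# Sketch: show the tail of the newest file in BTT's AI conversation folder
# (path taken from the post above; filenames are whatever BTT writes).
LOG_DIR="$HOME/Library/Application Support/BetterTouchTool/AI/Conversations"
latest=$(ls -t "$LOG_DIR" 2>/dev/null | head -n 1)
if [ -n "$latest" ]; then
  tail -n 50 "$LOG_DIR/$latest"
else
  echo "no conversation logs found in $LOG_DIR"
fi
```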
I think I found an issue with opencode when a model has been specified manually. With 6.274 it gives you a list of the available models in opencode and automatically uses the correct format. In general, however, it will be faster to use BTT's integrated OpenAI- or Anthropic-compatible API support, because that doesn't need to go through an extra CLI tool.