Feature request: allow modifying the temperature of the LLM

Hi,

Would it be possible to add a feature that allows modifying the temperature of the LLM? This would make it possible to adjust the trade-off between creativity and determinism of the responses according to specific needs.

Thank you in advance for your response.

Best regards,

+1 for temperature, as well as the ability to set Top P.

PS: for context, I'm using the ChatGPT feature with a local Ollama instance.


In 5.575 you can now specify extra API parameters as a JSON string. For example:

{"reasoning_effort": "low", "verbosity": "low"}

This should also work for temperature.
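Assuming those extra parameters are passed through to the backend's request body unchanged, temperature and Top P could be set the same way. The keys `temperature` and `top_p` below follow the OpenAI-style chat API that Ollama also exposes; whether this particular app forwards them is an assumption worth testing:

```json
{"temperature": 0.2, "top_p": 0.9}
```

Lower temperature values (closer to 0) make responses more deterministic; higher values increase randomness.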
