Description
Currently the inference settings are linked to the model instead of the chat.
Use Case
Let's say I want to use the same model for Question Answering and Story Generation. I'd probably want different system prompts as well as different settings (less randomness for QA, more for Story Generation).
It would be useful to modify the inference settings for each individual chat regardless of the model.
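As a rough illustration of the idea (a hypothetical sketch, not the project's actual data model): each chat could carry optional overrides that are merged over the model's default inference settings. All names below (`InferenceSettings`, `Chat`, `resolveSettings`) are made up for this example.

```typescript
// Hypothetical sketch: per-chat inference settings that override model defaults.

interface InferenceSettings {
  systemPrompt: string;
  temperature: number; // lower = less randomness
  topP: number;
}

interface Chat {
  id: string;
  modelId: string;
  // Optional per-chat overrides; any field left undefined falls back to the model default.
  settingsOverride?: Partial<InferenceSettings>;
}

// Effective settings for a chat = model defaults merged with that chat's overrides.
function resolveSettings(modelDefaults: InferenceSettings, chat: Chat): InferenceSettings {
  return { ...modelDefaults, ...chat.settingsOverride };
}

// Same model, two chats with different behavior.
const defaults: InferenceSettings = {
  systemPrompt: "You are a helpful assistant.",
  temperature: 0.7,
  topP: 0.95,
};

const qaChat: Chat = {
  id: "qa",
  modelId: "my-model",
  settingsOverride: { systemPrompt: "Answer factually and concisely.", temperature: 0.2 },
};

const storyChat: Chat = {
  id: "story",
  modelId: "my-model",
  settingsOverride: { systemPrompt: "Write creative fiction.", temperature: 1.1 },
};

console.log(resolveSettings(defaults, qaChat).temperature);    // 0.2
console.log(resolveSettings(defaults, storyChat).temperature); // 1.1
```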