Replies: 3 comments
-
You can click Connect to external endpoint, and then connect to it as a custom OpenAI API. But KoboldCpp should provide superior results. Why do you need Ollama?
-
Ollama now supports downloading models directly from Hugging Face (e.g. `ollama run hf.co/<user>/<repo>`). There is no need to download them separately in GGUF format for other apps.
-
Yes, changing the AI provider to OpenAI Compatible API did work. Or not quite: it's asking for an API key, but Ollama doesn't use an API key.
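For what it's worth, Ollama's OpenAI-compatible endpoint doesn't validate the key's value, so a placeholder string usually satisfies clients that insist on one. A minimal sketch of what such a request looks like (the model name `llama3`, the default localhost port, and the placeholder key `ollama` are assumptions):

```python
import json
import urllib.request

# Assumption: a default local Ollama install serving its
# OpenAI-compatible API at this base URL.
BASE_URL = "http://localhost:11434/v1"
API_KEY = "ollama"  # placeholder; Ollama ignores the value, but clients require the field

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at Ollama."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",  # any non-empty token works
        },
    )

# Sending it would be: urllib.request.urlopen(build_chat_request("llama3", "Hello"))
```

So in the app's settings, entering any dummy string in the API key field should be enough.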
-
I would like to use an LLM through Ollama when using --nomodel with KoboldCpp.