Expected Behavior
Ability to switch to models that were installed into Ollama outside of LOLLMS.
Current Behavior
I have installed models into Ollama from other sources: some directly within Ollama, some from another AI chatbot app. LOLLMS sees that these were installed, but labels them "custom model" and does not let me select any of them to switch to.
When I click any of these "custom model"s, the UI has a faded overlay (as if it is trying to load). When I look at the CMD window, it says this:
INFO: ::1:6408 - "POST /update_setting HTTP/1.1" 400 Bad Request
I can't tell if this is one or two separate issues.
Steps to Reproduce
I don't know how to explain how to reproduce it, other than: I did a fresh install, then installed some models outside of LOLLMS. Try installing any specific model directly within Ollama, such as a "Llama 3 7B" variant. Alternatively, try the "Msty" app (available here on GitHub), a Windows chat manager similar to LOLLMS; installing models with it gave the same result as installing directly within Ollama.
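For anyone triaging this: Ollama's HTTP API exposes a `/api/tags` endpoint that lists every installed model, which could help confirm whether the models LOLLMS flags as "custom model" are really registered in Ollama. A minimal sketch of parsing that response is below; the `sample` payload is an illustrative example of the response shape, not a capture from my machine.

```python
import json

# Illustrative example of an Ollama GET /api/tags response body
# (the real response has more fields per model; only "name" is used here):
sample = json.dumps({
    "models": [
        {"name": "mistral:latest"},
        {"name": "llama3:8b"},
    ]
})

def model_names(tags_json: str) -> list[str]:
    """Extract the model names from an Ollama /api/tags response body."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]

print(model_names(sample))  # → ['mistral:latest', 'llama3:8b']
```

If every externally installed model shows up here but still triggers the 400 on `/update_setting`, that would point at LOLLMS's validation of the model name rather than at Ollama itself.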
Possible Solution
I don't know what to suggest, other than perhaps provide some more context in the error message.
Context
This is a fresh install of LOLLMS; I configured nothing. The one model (mistral) that the UI shows as properly installed works fine when I start a chat with it. I installed mistral outside of LOLLMS; none of the models were installed from within LOLLMS.
Screenshots
Hey, did you try refreshing the page so the cache gets updated?
lollms uses hard caching; each time you update it, you need to refresh the webui.