
Unable to switch to any models installed outside of LOLLMS #529

Open
profucius opened this issue Apr 29, 2024 · 4 comments
@profucius commented Apr 29, 2024

Expected Behavior

Ability to change models that were installed to Ollama outside of LOLLMS.

Current Behavior

I have installed models into my Ollama instance from other sources: some directly through Ollama, some via another AI chatbot app. LOLLMS sees that these are installed, but labels them as "custom model" and does not let me select any of them to switch to.

When I click any of these "custom model" entries, the UI shows a faded overlay (as if it is trying to load). The CMD window then prints:

INFO: ::1:6408 - "POST /update_setting HTTP/1.1" 400 Bad Request

I can't tell if this is one or two separate issues.

Steps to Reproduce

I don't know how to explain how to reproduce it beyond this: I did a fresh install, then installed some models outside of LOLLMS. Try installing any specific model directly within Ollama, such as the "Llama 3 7B" variant. You could also try the "Msty" app, which is here on GitHub; it's a Windows chat manager similar to LOLLMS. I installed some models using that app, with the same result as installing directly within Ollama.

Possible Solution

I don't know what to suggest, other than perhaps provide some more context in the error message.

Context

This is a fresh install of LOLLMS; I configured nothing. The one model (mistral) that the UI shows as properly installed works fine when I start a chat with it. I installed mistral outside of LOLLMS. None of the models were installed from within LOLLMS.

Screenshots

[screenshot]

[screenshot]

@ParisNeo (Owner) commented

This should have been fixed by now. It was a problem with my sanitization function, which forbade `:` in the model name.
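Neither comment shows the function itself, but the failure mode is easy to sketch: Ollama identifies models as `name:tag` (e.g. `llama3:latest`), so a validator that rejects `:` will refuse every tagged Ollama model while accepting untagged ones like `mistral`. A minimal illustration with hypothetical function names, not the actual LOLLMS code:

```python
import re

def sanitize_strict(model_name: str) -> str:
    # Overly strict validation: no ':' allowed, so Ollama-style
    # "name:tag" identifiers such as "llama3:latest" are rejected.
    if not re.fullmatch(r"[A-Za-z0-9._\-]+", model_name):
        raise ValueError(f"Invalid model name: {model_name}")
    return model_name

def sanitize_permissive(model_name: str) -> str:
    # Also allows ':' (and '/' for namespaced models), while still
    # blocking path traversal and shell metacharacters.
    if ".." in model_name or not re.fullmatch(r"[A-Za-z0-9._\-/:]+", model_name):
        raise ValueError(f"Invalid model name: {model_name}")
    return model_name

# "mistral" passes both checks, which matches the report: the one
# untagged model works, while tagged models fail with a 400.
```

This would explain why `/update_setting` returned 400 Bad Request only for the externally installed, tagged models.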

@profucius (Author) commented

> This should have been fixed by now. It was a problem with my sanitization function, which forbade `:` in the model name.

Interesting. Should I try reinstalling? I'll try what you suggest. I'm capable of debugging if needed.

@hostolis commented Jun 6, 2024

Getting similar problems.
Fresh install, nothing configured manually.
[screenshot]

@ParisNeo (Owner) commented Jun 7, 2024

Hey, did you try refreshing your page to let the cache update? lollms uses hard caching, so each time you update it you need to refresh the web UI.
