LocalAI not working - OPENAI_API_KEY required error #316
Maybe this was a typo:
but this should have the environment variable used to find the OpenAI API key if
You are right. I've now set it to
I can somewhat confirm this behaviour. Even when removing all API configurations other than the one for LocalAI, I get asked for an OpenAI API key (which I do not have). I guess that's because OpenAI is the default API configuration. I think the environment variable does not need to hold your actual OpenAI API key; any value should do to pass this check. I got past this point by setting OPENAI_API_KEY to a random value, and it looks like mods is making a request to the configured API endpoint now. There's no answer displayed after the "Generating" message finishes, but I am not sure if that is a problem with mods or with my setup of LocalAI. The link they give for setup (https://github.com/mudler/LocalAI#example-use-gpt4all-j-model) is dead, so I'm not sure if my LocalAI is set up correctly.
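For reference, the workaround described above amounts to roughly the following; the key value is an arbitrary placeholder, and the --api/--model flags and the model name are assumptions about the local setup rather than details taken from this issue:

```shell
# Any non-empty value seems to satisfy the key check; LocalAI never validates it.
export OPENAI_API_KEY="sk-not-a-real-key"

# Point mods at the LocalAI API entry from the settings file (model name is a placeholder).
mods --api localai --model ggml-gpt4all-j "say hello"
```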
Reporting the same issue: I can't use the LocalAI API. Even after deleting all OpenAI API entries in the config, it always asks for an OpenAI API key!
Yeah, I'm also getting the same error in 1.6 as you, @securegh. Tried both without OpenAI keys and with valid ones.
Same issue here ... |
Describe the bug
Using the latest version, for some reason I cannot use my LocalAI endpoints at all.
I first carried over a configuration from an older version and then completely reset the settings and added only my LocalAI endpoint (both keeping and deleting the configurations for other APIs). Whatever I do, I keep getting:
I have tried:
- setting OPENAI_API_KEY via config/settings (it should not be required if using LocalAI)
- setting OPENAI_API_KEY as an environment variable
The behavior is the same regardless of command. If I go with mods -M, I am able to select my model and type a prompt, and am later presented with that error again (see attached GIF).
Setup
To Reproduce
Steps to reproduce the behavior:
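(The concrete steps were lost from this copy of the issue. Based on the description above, the reproduction amounts to roughly the following; the settings flag, API name, and model name are assumptions, not the reporter's exact values.)

```shell
# 1. Keep only a LocalAI entry in the mods settings file (opened with `mods --settings`).
# 2. Make sure no OpenAI key is exported.
unset OPENAI_API_KEY

# 3. Ask mods for a completion from the LocalAI model; the OPENAI_API_KEY error appears anyway.
mods --api localai --model ggml-gpt4all-j "say hello"

# The same happens interactively, selecting the model and typing the prompt as in the GIF:
mods -M
```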
Source Code
Config file:
Alternatively:
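(Neither config file survived in this copy of the issue. Purely for illustration, a LocalAI-only mods settings file would look roughly like the one below; the field names follow mods' example configuration, and the endpoint and model name are placeholders rather than the reporter's actual values.)

```yaml
# Hypothetical LocalAI-only settings. Note there is no api-key-env entry here,
# so no OPENAI_API_KEY lookup should be necessary.
default-model: ggml-gpt4all-j
apis:
  localai:
    base-url: http://localhost:8080
    models:
      ggml-gpt4all-j:
        aliases: ["local"]
        max-input-chars: 12250
```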
Expected behavior
OPENAI_API_KEY should be ignored, whether it is set or not, when using LocalAI as the API.
Screenshots
Behavior as per the second config file:
Additional context
I'm not sure if I am missing something, but even having generated a fresh config and having looked at the code, I see two issues: