Updated embedding to work with OpenAI's Python v1.3+ and with Azure O… #569
Hi!
This is meant to correct problems with embedding in the latest versions of the OpenAI Python Library and GPTCache.
The latest versions of the OpenAI Python API no longer include the `.api_base` setting; even if you manually override it, this leads to other issues. Moreover, the best practice now is to instantiate a client and use it instead of the global context. The Azure OpenAI part has further complications related to "Azure Deployments" and the different `os.environ` variables involved.
Rather than keep track of these within GPTCache, my suggestion is that it might make sense to have the user pass in an instance of the OpenAI client object directly.
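To make the idea concrete, here is a minimal sketch of what "pass in the client" could look like. The `OpenAIEmbedding` class name and its constructor signature are hypothetical, not the actual GPTCache API; the only assumption about `client` is that it exposes the v1.x-style `embeddings.create` method, so both `openai.OpenAI` and `openai.AzureOpenAI` instances would work.

```python
class OpenAIEmbedding:
    """Hypothetical sketch: embedding wrapper that accepts a caller-supplied client."""

    def __init__(self, client, model="text-embedding-ada-002"):
        # `client` is expected to be an openai.OpenAI or openai.AzureOpenAI
        # instance (v1.x style); any object with a compatible
        # `embeddings.create(model=..., input=...)` method also works.
        self.client = client
        self.model = model

    def to_embeddings(self, text):
        # v1.x returns a typed response object; the vector lives at
        # response.data[0].embedding.
        response = self.client.embeddings.create(model=self.model, input=text)
        return response.data[0].embedding
```

A caller on plain OpenAI would then do something like `OpenAIEmbedding(OpenAI(api_key="..."))`, while an Azure user constructs `AzureOpenAI(...)` with their endpoint, API version, and deployment, and GPTCache never has to know which environment variables were involved.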
I have access to both OpenAI and Azure OpenAI so I can offer additional help with that if necessary.
One of the issues I am running into is that the OpenAI library also provides async versions of the client objects; obviously we can't use those from synchronous code. Is there a method you have in mind to deal with this, or would you prefer to work only with the synchronous clients?
I am in the process of testing this, so if you are interested but would like to make changes, please don't hesitate to let me know.