Hi. I can get PandasQueryEngine working fine with OpenAI, but I can't figure out how to get it to use a different LLM. I'd like to use Ollama and have already set it up as per https://docs.llamaindex.ai/en/stable/examples/llm/ollama.html, but I can't seem to get PandasQueryEngine to use it; it keeps asking me for the OpenAI API key. Can someone help point me in the right direction, please?
🤖 Hello @edwin-nz! 👋 I'm Dosu, a friendly bot here to help you with LlamaIndex while the human maintainers are away. I'm equipped to assist you with bugs, answer your queries, and guide you on becoming a contributor. Just a reminder: I'm not a human, but I'll do my best to assist you. Let's get started, shall we? 😊

Based on similar issues in the LlamaIndex repository, you need to create an instance of the Ollama LLM and pass it to the service context when creating the query engine. Here's an example of how you might do this:

```python
from llama_index import ServiceContext
from llama_index.llms import Ollama
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine

# Create an instance of the Ollama LLM
ollama_llm = Ollama(model="ollama-model-name")  # Replace "ollama-model-name" with the actual model name

# Create a service context that uses the Ollama LLM and a local embedding model
service_context = ServiceContext.from_defaults(llm=ollama_llm, embed_model="local:BAAI/bge-base-en-v1.5")

# Create the query engine with that service context
query_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["users"], service_context=service_context)
```

Please note that this is a hypothetical example and the actual code may vary depending on the implementation of the Ollama LLM. You can refer to these issues for more context: "[Bug]: cannot select model other than openai models in openai llm integration" and "Impossible to use any other LLM than openai for querying SQL".

If you're still having trouble, could you please provide more details about the error message you're seeing and the steps you're taking to set up the Ollama LLM? That will help me better understand the issue and provide a more accurate solution.
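One further option, sketched here under the assumption that you're on the legacy (v0.9-era) `llama_index` package: register the Ollama-backed service context globally with `set_global_service_context`, so any query engine created without an explicit `service_context` falls back to Ollama rather than OpenAI.

```python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Ollama

# Build the Ollama-backed service context once ("ollama-model-name" is a placeholder)
ollama_llm = Ollama(model="ollama-model-name")
service_context = ServiceContext.from_defaults(llm=ollama_llm, embed_model="local")

# Register it as the global default for all subsequently created engines
set_global_service_context(service_context)
```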
Thanks. I've set the following for PandasQueryEngine and it works now:

```python
from llama_index import ServiceContext
from llama_index.query_engine import PandasQueryEngine

# `ollama` is the Ollama LLM instance created earlier, per the Ollama docs linked above
service_context = ServiceContext.from_defaults(llm=ollama, embed_model="local")
query_engine = PandasQueryEngine(df=df, verbose=True, service_context=service_context)
```
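For completeness, here is a minimal end-to-end sketch of the working setup, assuming the legacy (v0.9-era) API, a locally running Ollama server, and a hypothetical model name and DataFrame:

```python
import pandas as pd

from llama_index import ServiceContext
from llama_index.llms import Ollama
from llama_index.query_engine import PandasQueryEngine

# Connect to a locally running Ollama server; "mistral" is an example model name
ollama = Ollama(model="mistral", request_timeout=120.0)

# A small illustrative DataFrame
df = pd.DataFrame(
    {"city": ["Auckland", "Wellington", "Christchurch"], "population": [1_657_000, 212_700, 381_500]}
)

# Route LLM calls to Ollama and embeddings to a local model
service_context = ServiceContext.from_defaults(llm=ollama, embed_model="local")
query_engine = PandasQueryEngine(df=df, verbose=True, service_context=service_context)

print(query_engine.query("Which city has the largest population?"))
```

With both the LLM and the embedding model resolving locally, PandasQueryEngine has no reason to ask for an OpenAI key.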