
local-ai-packaged: Chat workflow does not use Qdrant data and Python error #28

Open
dan-burt opened this issue Dec 12, 2024 · 1 comment

Comments

@dan-burt

I followed the guide for the project using the Apple Silicon M1 pathway. I have Ollama installed manually, with the two models referenced in the default config, as well as a few others.

I created the three credentials as instructed, but changed the workflow to watch a folder local to the n8n container. I can see this data in the Qdrant UI.
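For reference, a minimal sketch of verifying the ingestion outside the UI, assuming Qdrant on its default local port and a hypothetical collection name `documents` (substitute whatever name the workflow's Qdrant node was configured with):

```python
# Verification sketch: assumes Qdrant on its default port (6333) and a
# hypothetical collection name "documents" -- substitute the name used
# in the workflow's Qdrant Vector Store node.
from qdrant_client import QdrantClient

client = QdrantClient(url="http://localhost:6333")

# List the collections the ingestion workflow has created.
print(client.get_collections())

# Count the points (embedded chunks) stored in the target collection.
print(client.count(collection_name="documents"))
```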

Below are two screenshots of the workflow from when I chat with the model, showing that it is reading in the correct data.

The first screenshot shows it summarising a list of events in the text file. An error is also returned:

`<python_tag>tool call: get_christmas_plans()`

[Screenshot: SCR-20241212-sftj]

The second screenshot shows the output of the Chat workflow's Ollama Model node, in case it reveals any more detail.

[Screenshot: SCR-20241212-sgao]
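To help isolate whether the leaked tool call comes from the model itself or from the n8n AI Agent node, Ollama's `/api/chat` endpoint can be called directly with a tool definition. This is only a sketch: the model name and the tool schema are assumptions, with the tool name taken from the leaked output above.

```python
# Sketch of querying Ollama directly, bypassing n8n, to see whether the
# model emits a structured tool_calls entry or leaks the call as text.
# Assumptions: Ollama on its default port, model "llama3.1", and an
# illustrative schema for the tool name seen in the leaked output.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",  # assumption: swap in the model the workflow uses
        "stream": False,
        "messages": [{"role": "user", "content": "What are my Christmas plans?"}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_christmas_plans",  # name from the leaked output
                "description": "Look up the user's Christmas plans",
                "parameters": {"type": "object", "properties": {}},
            },
        }],
    },
    timeout=120,
)
message = resp.json()["message"]
print(message.get("tool_calls"))  # a well-formed call should land here...
print(message.get("content"))     # ...not as <python_tag> text in the content
```

If the tool call already shows up as text in `content` here, the leak would appear to come from the model or its chat template rather than from the n8n workflow.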

@mp3pintyo

Unfortunately, the AI Agent, Qdrant, and Ollama do not work well together.
https://community.n8n.io/t/ai-agent-vector-store-tool-generates-strange-responses-and-seems-to-miss-the-tool-information/54952/7
