I followed the guide for the project using the Apple Silicon M1 pathway. I have Ollama installed manually, with the two models from the default config pulled, as well as a few others.
I created the 3 credentials as instructed, but changed the workflow to watch a folder local to the n8n container. I can see the ingested data in the Qdrant UI.
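For reference, the folder watch was wired up by mounting a host directory into the n8n container. A minimal sketch of the kind of docker-compose override this involves — the service name `n8n` and the host folder `./shared` here are assumptions, not necessarily what the guide uses:

```yaml
# docker-compose.override.yml (sketch; service name and paths are assumptions)
services:
  n8n:
    volumes:
      # Expose a host folder inside the container so the
      # file-trigger node can watch it for new documents.
      - ./shared:/data/shared
```

The workflow's trigger node then points at `/data/shared` inside the container.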
When I chat with the model, it does read in the correct data — the two screenshots of the workflow below show this.
The 1st image shows it summarising the list of events in the text file, but an error is also returned:

```
<python_tag>tool call: get_christmas_plans()
```
The 2nd image shows the output of the Chat workflow's Ollama Model node, in case it reveals any more detail.