Replies: 1 comment

Similar to this issue
-
Hello,
I'm having difficulty making sense of something I see happening and was hoping to gain some clarity here.
A bit of context
Python version: 3.11.4
llama-index version: 0.8.50
openai version: 0.27.9
system: Windows 10
I use the GPT-3.5 model for a chat engine with an index. I have modified the text_qa and refine prompts slightly, mostly so that they work better in my native language. The index I chat with is a combination of multiple smaller indexes, each 1-5 pages long. There are about 20 of these smaller indexes in the combined index, so it spans roughly 60 pages.
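For reference, my setup looks roughly like the sketch below; the prompt strings, directory layout, and variable names are simplified placeholders rather than my exact code:

```python
import os

from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.prompts import PromptTemplate

# Placeholder prompts: my real templates are the defaults,
# lightly adapted to my native language.
text_qa_template = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using only the context above, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)
refine_template = PromptTemplate(
    "The original query is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Refine it with the new context below if needed.\n"
    "{context_msg}\n"
    "Refined answer: "
)

# The ~20 smaller document sets (1-5 pages each) are loaded into one
# combined index; 'data/<subfolder>' is a hypothetical layout.
documents = []
for folder in os.listdir("data"):
    documents.extend(
        SimpleDirectoryReader(os.path.join("data", folder)).load_data()
    )
index = VectorStoreIndex.from_documents(documents)
```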
Observation
When I chat with the index, simply using `as_chat_engine` and then `chat`, I find that sometimes there is no context in my source nodes, even though I have `similarity_top_k=3`. Other times, when asking a question very similar to one where no source nodes seemed to be retrieved, the response does include 3 source nodes. I have logged my queries to check whether the context strings really were missing from my prompts, or whether it was just a quirk in the response: indeed, no context strings are sent through the OpenAI API. I logged using the following method, explained in the docs here:

```python
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
```
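The chat engine itself is created and queried roughly like this; I'm not certain all of these kwargs are forwarded exactly as shown (that is my assumption about `as_chat_engine`), and the question text is just an example:

```python
# Build the chat engine; I pass the custom templates and top-k through
# as_chat_engine, assuming they reach the underlying query engine.
chat_engine = index.as_chat_engine(
    similarity_top_k=3,
    text_qa_template=text_qa_template,
    refine_template=refine_template,
)

response = chat_engine.chat("What do the documents say about topic X?")
print(response.response)

# This is where I see the inconsistency: sometimes this list is empty,
# sometimes it holds the expected 3 nodes.
print(len(response.source_nodes))
for node_with_score in response.source_nodes:
    print(node_with_score.score, node_with_score.node.get_content()[:200])
```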
Question
What is the reason for this happening? I would prefer that some context is always retrieved, and that when the context is empty the engine doesn't automatically jump to an answer from the LLM's prior knowledge, but instead simply says "I don't know, it is not in the context." I looked for other discussions but could not find any; I saw this post and wondered whether this is a feature that was already implemented and, if it is, how I could turn it off.
Hopefully I have provided enough information; if not, please let me know. And thank you in advance!