-
There is no text QA prompt for a context chat engine. It just calls the retriever for your index and inserts the retrieved text into the system prompt (along with the system prompt you provided). You can't pass in both memory and chat history. Add that chat history to the memory, and then pass that in.
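A minimal sketch of that suggestion, assuming an existing `index` (e.g. a `VectorStoreIndex`) and llama_index's `ChatMemoryBuffer`; the conversation contents here are made up for illustration:

```python
from llama_index.llms import ChatMessage, MessageRole
from llama_index.memory import ChatMemoryBuffer

# Hypothetical earlier conversation you want the engine to remember
prior_history = [
    ChatMessage(role=MessageRole.USER, content="What did we discuss about indexing?"),
    ChatMessage(role=MessageRole.ASSISTANT, content="We covered vector store indexes."),
]

# Seed the memory with that history instead of passing chat_history separately
memory = ChatMemoryBuffer.from_defaults(chat_history=prior_history, token_limit=3000)

chat_engine = index.as_chat_engine(
    chat_mode="context",
    memory=memory,
    system_prompt="You are a helpful assistant answering from the indexed docs.",
)
```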
-
🤖 Hello,

To customize the system prompt, CHAT_TEXT_QA_PROMPT, and CHAT_REFINE_PROMPT in the Context Chat Engine, you can build a `ChatPromptTemplate` from a list of `ChatMessage` objects:

```python
from llama_index.llms import ChatMessage, MessageRole
from llama_index.prompts import ChatPromptTemplate

# Text QA Prompt
chat_text_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "You are an expert Q&A system that is trusted around the world.\n"
            "Always answer the query using the provided context information, "
            "and not prior knowledge.\n"
            "Some rules to follow:\n"
            "1. Never directly reference the given context in your answer.\n"
            "2. Avoid statements like 'Based on the context, ...' or "
            "'The context information ...' or anything along "
            "those lines."
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "Context information is below.\n"
            "---------------------\n"
            "{context_str}\n"
            "---------------------\n"
            "Given the context information and not prior knowledge, "
            "answer the query.\n"
            "Query: {query_str}\n"
            "Answer: "
        ),
    ),
]
text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)

# Refine Prompt
chat_refine_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "You are an expert Q&A system that strictly operates in two modes "
            "when refining existing answers:\n"
            "1. **Rewrite** an original answer using the new context.\n"
            "2. **Repeat** the original answer if the new context isn't useful.\n"
            "Never reference the original answer or context directly in your answer.\n"
            "When in doubt, just repeat the original answer."
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "New Context: {context_msg}\n"
            "Query: {query_str}\n"
            "Original Answer: {existing_answer}\n"
            "New Answer: "
        ),
    ),
]
refine_template = ChatPromptTemplate(chat_refine_msgs)
```

You can pass these custom prompts into the pipeline when constructing the index or executing the query engine. For example:

```python
print(
    index.as_query_engine(
        text_qa_template=text_qa_template, refine_template=refine_template
    ).query("Who is Joe Biden?")
)
```

As for the chat history: `history = engine.chat_history` will return a list of `ChatMessage` objects.
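As a small usage sketch (assuming `engine` here is the chat engine instance from above):

```python
# Each entry in chat_history is a ChatMessage with .role and .content attributes
for message in engine.chat_history:
    print(f"{message.role}: {message.content}")
```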
I hope this helps! Let me know if you have any other questions.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
Question
I am currently using the high-level APIs below for the Context Chat Engine:

```python
chat_engine = index.as_chat_engine(
    chat_mode='context',
    llm=my_llm,
    memory=chat_memory,
    similarity_top_k=top_k,
    system_prompt=system_prompt,
    verbose=True,
)
```

I would like to supply my own prompt templates for the system prompt, CHAT_TEXT_QA_PROMPT, and CHAT_REFINE_PROMPT, as well as a context template. Requesting help on that.

When I attempted this with `ContextChatEngine`, I was unable to provide CHAT_TEXT_QA_PROMPT. Moreover, I need to explicitly pass chat_history, as memory.chat_history is always empty.