Replies: 1 comment
🤖 Hello @semoal! I'm Dosu, a friendly bot here to assist you with your LlamaIndex queries or issues while we're waiting for a human maintainer. I'm here to help you navigate through bug fixes, answer your questions, and even guide you on becoming a contributor. Let's get started!

Based on the information you've provided, it seems like you're trying to change the `completion_to_prompt` attribute of your LLM dynamically. You can do that by setting the attribute directly to the new value:

```python
llm._completion_to_prompt = new_function
```

where `new_function` is the function you want to use instead. However, please note that `_completion_to_prompt` is a private attribute, and modifying it from outside the class is generally discouraged. A cleaner approach is to add a setter method to the class:

```python
class Replicate:
    ...
    def set_completion_to_prompt(self, new_function):
        self._completion_to_prompt = new_function
    ...
```

Then you can use this method to change the attribute:

```python
llm.set_completion_to_prompt(new_function)
```

This is a safer and more Pythonic way to change the attribute, but it would require modifying the `Replicate` class itself.

If you're still having issues after trying these solutions, please provide more details about your use case and the exact error message you're getting, and I'll be happy to help further.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
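For illustration, here is a minimal, self-contained sketch of the direct-override approach suggested above. The import path, the model id, and the prompt template inside `my_completion_to_prompt` are assumptions for this example, not values taken from the thread:

```python
from llama_index.llms import Replicate  # assumed import path for the LlamaIndex version in use


def my_completion_to_prompt(completion: str) -> str:
    # Hypothetical prompt template; swap in whatever format your model expects.
    return f"### Instruction:\n{completion}\n\n### Response:\n"


# Illustrative model id; use the Replicate model you actually run.
llm = Replicate(model="meta/llama-2-70b-chat")

# Direct attribute override, as described above. Depending on the LlamaIndex
# version, the attribute may be exposed as the public `completion_to_prompt`
# field rather than the private `_completion_to_prompt` name.
llm._completion_to_prompt = my_completion_to_prompt
```

If your version of the class already accepts `completion_to_prompt` as a constructor argument or public field, passing the function at construction time avoids touching private state at all.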
---
We have a use case where we initialize the LLM class and it runs fine, but then we want to change it dynamically based on input that arrives over a websocket. We have tried to re-initialize the LLM class, call `ServiceContext.from_defaults()`, and run `set_global_service_context` again, but when we run the engine it still uses the old prompt. We run the application with `uvicorn`, in case that is useful for diagnosing what happens, and we query through a chat engine created with `as_chat_engine`.

I would expect it to be possible to pass `completion_to_prompt` in the `as_chat_engine` function call, but that is not possible. Should it be? Is there something we can do to handle dynamic prompts easily?
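For concreteness, here is a minimal sketch of the kind of re-initialization described above, assuming a websocket handler running in the same uvicorn process and the `ServiceContext` / `set_global_service_context` API mentioned in the post. The handler signature, `build_prompt_fn`, the `index` argument, the model id, and the `completion_to_prompt` constructor argument are all illustrative assumptions, not code from the original application:

```python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Replicate  # assumed import path for this LlamaIndex version


def build_prompt_fn(template: str):
    # Hypothetical helper: turn a template received over the websocket into a
    # completion_to_prompt callable.
    def completion_to_prompt(completion: str) -> str:
        return template.replace("{completion}", completion)

    return completion_to_prompt


async def on_websocket_message(template: str, question: str, index) -> str:
    # Rebuild the LLM and the service context with the new prompt function.
    llm = Replicate(
        model="meta/llama-2-70b-chat",                    # illustrative model id
        completion_to_prompt=build_prompt_fn(template),   # assumes the constructor accepts this argument
    )
    service_context = ServiceContext.from_defaults(llm=llm)
    set_global_service_context(service_context)

    # Recreate the chat engine *after* the context change: an engine (or index)
    # built earlier keeps the service context it was created with, which may be
    # why the old prompt still shows up when only the global context is swapped.
    chat_engine = index.as_chat_engine()
    return str(chat_engine.chat(question))
```

If the chat engine is created once at startup and then reused, rebuilding it (or the index) after swapping the service context is the step most likely to matter here.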