Replies: 4 comments
-
🤖 Hi,

Based on your question, it seems you want to merge multiple indices into a single one. You can achieve this using the `GPTListIndex` class:

```python
from llama_index import GPTListIndex

index = GPTListIndex([index1, index2])
```

After merging the indices, you can build a graph and save it to disk:

```python
from llama_index import ComposableGraph, GPTTreeIndex

all_indices = [index1, index2, index3, index4]
index_summaries = [index1_summary, index2_summary, index3_summary, index4_summary]

graph = ComposableGraph.from_indices(GPTTreeIndex, all_indices, index_summaries=index_summaries)
graph.save_to_disk("graph.json")
```

To load the saved graph, use the following code:

```python
index = ComposableGraph.load_from_disk("graph.json")
```

This solution was suggested in a similar issue: "How can merge multi index file".

Regarding your issue with ChromaDB, LlamaIndex handles the storage and retrieval of vectors in ChromaDB through the `ChromaVectorStore` class. If you could provide more details about the error you're encountering (any error messages, the data you're trying to store, etc.), it would be easier to diagnose the problem.

I hope this helps! Let me know if you have any other questions.
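As a follow-up on the original goal of querying the combined index once it is loaded, here is a minimal sketch, assuming a llama_index version where `ComposableGraph` exposes `as_query_engine()` (older releases used `graph.query(...)` directly):

```python
# query the composed graph loaded above
query_engine = index.as_query_engine()
response = query_engine.query("Summarize what these documents cover.")
print(response)
```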
-
@dosu-bot
-
🤖 Hello,

The error you're encountering is due to the fact that the `ComposableGraph` class does not have a `save_to_disk` method. There are a couple of ways to resolve this issue.

One option is to add a `save_to_disk` method to the class yourself:

```python
import pickle


class ComposableGraph:
    # existing code...

    def save_to_disk(self, file_path):
        """Save the graph to disk."""
        with open(file_path, 'wb') as f:
            pickle.dump(self, f)
```

In this example, the `save_to_disk` method serializes the entire graph object with Python's `pickle` module.
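For completeness, a matching loader for this approach might look like the following. This is a sketch: `load_from_disk` here is a custom helper on the patched class, not a built-in library method.

```python
import pickle


class ComposableGraph:
    # existing code...

    @classmethod
    def load_from_disk(cls, file_path):
        """Load a graph previously saved with save_to_disk."""
        with open(file_path, 'rb') as f:
            return pickle.load(f)
```

Alternatively, you can persist and reload the graph through its storage context, as the LlamaIndex tests do: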
```python
from llama_index import StorageContext, load_graph_from_storage

# persist storage to disk
storage_context.persist(str(tmp_path))

# load storage context
new_storage_context = StorageContext.from_defaults(persist_dir=str(tmp_path))

# load index
new_graph = load_graph_from_storage(
    new_storage_context, root_id=graph.root_id, service_context=mock_service_context
)
```

You can find this code in the `tests/indices/test_loading_graph.py` file in the LlamaIndex repository.

I hope this helps! Let me know if you have any other questions.
-
@byronz3d You can persist the graph using the code below.
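A minimal sketch of that flow, assuming the legacy `llama_index` API described on the linked page (`./storage` and `<root_id>` are placeholders):

```python
from llama_index import StorageContext, load_graph_from_storage

# persist the composed graph's storage to disk
graph.root_index.storage_context.persist(persist_dir="./storage")

# later: rebuild the storage context and reload the graph by its root id
storage_context = StorageContext.from_defaults(persist_dir="./storage")
graph = load_graph_from_storage(storage_context, root_id="<root_id>")
```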
You can find the code snippet in the core modules documentation: https://docs.llamaindex.ai/en/stable/core_modules/data_modules/index/composability.html#optional-persisting-the-graph. Yes, it should have been easier to find in the documentation; I myself only found it today. Hope it helps.
-
Question
Hi,
I want to create several collections in my Zep server, and then when a prompt/query arrives, I want to combine several collections into one index and be able to query the combined index.
Is this even possible?
What I tried to do, unsuccessfully, is something like:
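Roughly, with placeholder index names, the idea was to pick the per-collection indices a user's level allows and compose them into one queryable graph (this sketch uses llama_index's `ComposableGraph.from_indices`; `user_level` and the index names are made up):

```python
from llama_index import ComposableGraph, GPTListIndex

# one index per Zep collection, built elsewhere (placeholder names)
indices_by_level = {
    "basic": [faq_index],
    "premium": [faq_index, docs_index, reports_index],
}
summaries_by_level = {
    "basic": ["FAQ collection"],
    "premium": ["FAQ collection", "Docs collection", "Reports collection"],
}

# compose only the indices this user may see into one queryable graph
graph = ComposableGraph.from_indices(
    GPTListIndex,
    indices_by_level[user_level],
    index_summaries=summaries_by_level[user_level],
)
response = graph.as_query_engine().query("...")
```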
The logic behind this is that I have various user levels, and based on the user I want to give them a different index. Higher-level users get more indices, while a low-level user gets a single index to query. I don't want to build the same index over and over if I can combine existing ones somehow.
Thanks in advance!