llama-index tools just for open LLMs? #9356
Replies: 2 comments
-
🤖 Hello @silvacarl2! I'm Dosu, a bot here to assist you with any bugs, questions, and contributions to LlamaIndex while we wait for a human maintainer. Don't hesitate to reach out if you need anything. Yes, the LlamaIndex tools are compatible with open source LLMs. The LlamaIndex framework integrates with a range of LLM providers, including Konko AI (which hosts open-source models such as Llama 2) as well as OpenAI. The code includes functions for handling API calls, retries, and model selection, so it can be used in regulated sectors such as healthcare. Here are some relevant parts of the code:
```python
LLAMA_MODELS = {
    "meta-llama/Llama-2-13b-chat-hf": 4096,
    "meta-llama/Llama-2-70b-chat-hf": 4096,
}

OPEN_AI_MODELS = {
    "gpt-4": 8192,
    "gpt-4-0314": 8192,
    "gpt-3.5-turbo-0613": 4097,
    "gpt-4-0613": 8192,
    "gpt-3.5-turbo-0301": 4097,
    "gpt-3.5-turbo": 4097,
    "gpt-3.5-turbo-16k": 16385,
    "gpt-3.5-turbo-16k-0613": 16385,
}

def konko_modelname_to_contextsize(modelname: str) -> int:
    ...
    context_size = ALL_AVAILABLE_MODELS.get(modelname, None)
    ...
    return context_size

def completion_with_retry(is_chat_model: bool, max_retries: int, **kwargs: Any) -> Any:
    ...
    return _completion_with_retry(**kwargs)

async def acompletion_with_retry(
    is_chat_model: bool, max_retries: int, **kwargs: Any
) -> Any:
    ...
    return await _completion_with_retry(**kwargs)
```

You can find these in the `llama_index/llms/konko_utils.py` file. A number of open source LLMs are compatible with the LlamaIndex tools.
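As a self-contained illustration of the context-size lookup, here is a minimal runnable sketch. The model dictionaries are copied (abridged) from the snippet; how `ALL_AVAILABLE_MODELS` is assembled and how unknown model names are handled are assumptions, not necessarily the library's exact behavior:

```python
# Illustrative, self-contained sketch of konko_modelname_to_contextsize.
# The ValueError branch for unknown models is an assumption.
LLAMA_MODELS = {
    "meta-llama/Llama-2-13b-chat-hf": 4096,
    "meta-llama/Llama-2-70b-chat-hf": 4096,
}
OPEN_AI_MODELS = {
    "gpt-4": 8192,
    "gpt-3.5-turbo": 4097,
    "gpt-3.5-turbo-16k": 16385,
}
# Merge the per-provider tables into one lookup table.
ALL_AVAILABLE_MODELS = {**LLAMA_MODELS, **OPEN_AI_MODELS}

def konko_modelname_to_contextsize(modelname: str) -> int:
    context_size = ALL_AVAILABLE_MODELS.get(modelname, None)
    if context_size is None:
        raise ValueError(f"Unknown model: {modelname!r}")
    return context_size

print(konko_modelname_to_contextsize("meta-llama/Llama-2-70b-chat-hf"))  # 4096
```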
This information is derived from the import statements in the LlamaIndex codebase. For more detailed information on how to use a custom LLM model with LlamaIndex, you can refer to the documentation at https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-using-a-custom-llm-model. I hope this helps! If you have any more questions, feel free to ask.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
🤖 Hello @silvacarl2! I'm Dosu, a friendly bot here to assist with your LlamaIndex queries, bug reports, and contributions. Don't worry, a human maintainer will join us soon. In the meantime, I'm here to assist you. Let's tackle this together!

Yes, LlamaIndex tools can work with open source LLMs. There have been similar queries in the past, and solutions have been provided. For instance, in the issue "How can I search for local documents without using OpenAI?", the solution suggested setting up both an embed model and an LLM. The documentation for setting up a custom LLM model can be found here, and for setting up a local embedding model here. In another issue, "run llama_index with locally llm that have compatible OpenAI's API", the user was directed to the documentation at https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-using-a-custom-llm-model for using a custom LLM model with LlamaIndex.

As for compatibility, the LlamaIndex framework integrates with the Konko AI and OpenAI APIs, and Konko provides hosted open-source models. The code includes functions for handling API calls, retries, and model selection; you can refer to `llama_index/llms/konko_utils.py` for more details.

Moreover, the LlamaIndex tools work with a wide range of LLM integrations, many of them open source or locally hosted, including: AI21, Anthropic, Anyscale, AzureOpenAI, Bedrock, Clarifai, Cohere, CustomLLM, EverlyAI, GradientBaseModelLLM, GradientModelAdapterLLM, HuggingFaceInferenceAPI, HuggingFaceLLM, Konko, LangChainLLM, LiteLLM, LlamaCPP, LocalAI, MockLLM, MonsterLLM, Ollama, OpenAI, OpenAILike, OpenLLM, OpenLLMAPI, PaLM, Perplexity, Portkey, PredibaseLLM, Replicate, WatsonX, Xinference, Vllm, Vertex. This information is derived from the import statements in the LlamaIndex codebase.

I hope this information helps. If you have any more questions or need further clarification, feel free to ask.
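The retry helpers mentioned above (`completion_with_retry` and its async variant) can be sketched in a library-agnostic way. This is a minimal sketch, not the actual `konko_utils` implementation: the function name is borrowed from the snippet, but the signature, exponential-backoff strategy, and catch-all exception handling are assumptions.

```python
import time

def completion_with_retry(fn, max_retries: int = 3, base_delay: float = 0.0):
    """Call fn(), retrying on exceptions with exponential backoff.

    Illustrative only: the real LlamaIndex helper wraps the provider's
    completion API call; names and behavior here are assumptions.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Usage: a flaky call that fails twice, then succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

print(completion_with_retry(flaky))  # ok
```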
Best,
-
this may seem like a pretty basic question, but we need to find llama-index tools that work with open source LLMs. we are in the healthcare space and we cannot use openai, etc.