Update API keys to use GROQ instead of OpenAI #46

Open · wants to merge 1 commit into base: main
8 changes: 4 additions & 4 deletions README.md
@@ -59,9 +59,9 @@ $ export API_ENV_VAR="your-api-key-here"
PS> $env:API_ENV_VAR = "your-api-key-here"
```

### Set OpenAI API key
* If you don't have an OpenAI API key, you can sign up [here](https://openai.com/index/openai-api/).
* Set `OPENAI_API_KEY` in your environment
### Set GROQ API key
* If you don't have a GROQ API key, you can sign up [here](https://console.groq.com/keys).
* Set `GROQ_API_KEY` in your environment

### Sign up and Set LangSmith API
* Sign up for LangSmith [here](https://smith.langchain.com/), find out more about LangSmith
@@ -91,7 +91,7 @@ Graphs for LangGraph Studio are in the `module-x/studio/` folders.
```
$ for i in {1..4}; do
cp module-$i/studio/.env.example module-$i/studio/.env
echo "OPENAI_API_KEY=\"$OPENAI_API_KEY\"" > module-$i/studio/.env
  echo "GROQ_API_KEY=\"$GROQ_API_KEY\"" >> module-$i/studio/.env
done
$ echo "TAVILY_API_KEY=\"$TAVILY_API_KEY\"" >> module-4/studio/.env
```
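The same setup can also be scripted in Python; a minimal sketch, assuming the `module-1` through `module-4` folders each contain a `studio/.env.example` file (it appends the key so the template's other entries survive):

```python
# Sketch: replicate the shell loop above in Python. Assumes module-1..module-4
# each contain a studio/.env.example template.
import os
import shutil


def write_env(groq_key: str, root: str = ".") -> None:
    for i in range(1, 5):
        studio = os.path.join(root, f"module-{i}", "studio")
        example = os.path.join(studio, ".env.example")
        env_file = os.path.join(studio, ".env")
        shutil.copyfile(example, env_file)   # start from the template
        with open(env_file, "a") as f:       # append, keeping template entries
            f.write(f'GROQ_API_KEY="{groq_key}"\n')
```

The function name `write_env` is just for illustration; run it once from the repo root before opening LangGraph Studio.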
83 changes: 30 additions & 53 deletions module-0/basics.ipynb
@@ -14,7 +14,6 @@
"metadata": {},
"source": [
"# LangChain Academy\n",
"\n",
"Welcome to LangChain Academy! \n",
"\n",
"## Context\n",
@@ -35,25 +34,25 @@
"\n",
"## Chat models\n",
"\n",
"In this course, we'll be using [Chat Models](https://python.langchain.com/v0.2/docs/concepts/#chat-models), which do a few things take a sequence of messages as inputs and return chat messages as outputs. LangChain does not host any Chat Models, rather we rely on third party integrations. [Here](https://python.langchain.com/v0.2/docs/integrations/chat/) is a list of 3rd party chat model integrations within LangChain! By default, the course will use [ChatOpenAI](https://python.langchain.com/v0.2/docs/integrations/chat/openai/) because it is both popular and performant. As noted, please ensure that you have an `OPENAI_API_KEY`.\n",
    "In this course, we'll be using [Chat Models](https://python.langchain.com/v0.2/docs/concepts/#chat-models), which do a few things: they take a sequence of messages as input and return chat messages as output. LangChain does not host any Chat Models; rather, we rely on third-party integrations. [Here](https://python.langchain.com/docs/integrations/chat/) is a list of 3rd party chat model integrations within LangChain! By default, the course will use [ChatGroq](https://python.langchain.com/docs/integrations/chat/groq/) because it is both popular and performant, and Groq provides a free tier for new users.\n",
"\n",
"Let's check that your `OPENAI_API_KEY` is set and, if not, you will be asked to enter it."
    "Let's check that your `GROQ_API_KEY` is set and, if not, you will be asked to enter it."
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 1,
"id": "0f9a52c8",
"metadata": {},
"outputs": [],
"source": [
"%%capture --no-stderr\n",
"%pip install --quiet -U langchain_openai langchain_core langchain_community tavily-python"
"%pip install --quiet -U langchain_groq langchain_core langchain_community tavily-python"
]
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": 19,
"id": "c2a15227",
"metadata": {},
"outputs": [],
@@ -64,15 +63,15 @@
" if not os.environ.get(var):\n",
" os.environ[var] = getpass.getpass(f\"{var}: \")\n",
"\n",
"_set_env(\"OPENAI_API_KEY\")"
"_set_env(\"GROQ_API_KEY\")"
]
},
{
"cell_type": "markdown",
"id": "a326f35b",
"metadata": {},
"source": [
"[Here](https://python.langchain.com/v0.2/docs/how_to/#chat-models) is a useful how-do for all the things that you can do with chat models, but we'll show a few highlights below. If you've run `pip install -r requirements.txt` as noted in the README, then you've installed the `langchain-openai` package. With this, we can instantiate our `ChatOpenAI` model object. If you are signing up for the API for the first time, you should receive [free credits](https://community.openai.com/t/understanding-api-limits-and-free-tier/498517) that can be applied to any of the models. You can see pricing for various models [here](https://openai.com/api/pricing/). The notebooks will default to `gpt-4o` because it's a good balance of quality, price, and speed [see more here](https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4-gpt-4-turbo-gpt-4o-and-gpt-4o-mini), but you can also opt for the lower priced `gpt-3.5` series models. \n",
    "[Here](https://python.langchain.com/v0.2/docs/how_to/#chat-models) is a useful how-to for all the things that you can do with chat models, but we'll show a few highlights below. If you've run `pip install -r requirements.txt` as noted in the README, then you've installed the `langchain-groq` package. With this, we can instantiate our `ChatGroq` model object. If you are signing up for the API for the first time, you should receive [free credits](https://console.groq.com/) that can be applied to any of the models. You can see pricing for various models [here](https://groq.com/pricing/). The notebooks will default to `llama3-8b-8192` because it's a good balance of quality, price, and speed.\n",
"\n",
"There are [a few standard parameters](https://python.langchain.com/v0.2/docs/concepts/#chat-models) that we can set with chat models. Two of the most common are:\n",
"\n",
@@ -84,14 +83,13 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 31,
"id": "e19a54d3",
"metadata": {},
"outputs": [],
"source": [
"from langchain_openai import ChatOpenAI\n",
"gpt4o_chat = ChatOpenAI(model=\"gpt-4o\", temperature=0)\n",
"gpt35_chat = ChatOpenAI(model=\"gpt-3.5-turbo-0125\", temperature=0)"
"from langchain_groq import ChatGroq\n",
"llama_chat = ChatGroq(model=\"llama3-8b-8192\", temperature=0)"
]
},
{
@@ -109,17 +107,17 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 32,
"id": "b1280e1b",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Hello! How can I assist you today?', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 9, 'prompt_tokens': 11, 'total_tokens': 20}, 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_157b3831f5', 'finish_reason': 'stop', 'logprobs': None}, id='run-d3c4bc85-ef14-49f6-ba7e-91bf455cffee-0', usage_metadata={'input_tokens': 11, 'output_tokens': 9, 'total_tokens': 20})"
"AIMessage(content=\"Hello world! It's nice to meet you. Is there something I can help you with, or would you like to chat?\", additional_kwargs={}, response_metadata={'token_usage': {'completion_tokens': 27, 'prompt_tokens': 12, 'total_tokens': 39, 'completion_time': 0.0225, 'prompt_time': 0.001680411, 'queue_time': 0.011593389, 'total_time': 0.024180411}, 'model_name': 'llama3-8b-8192', 'system_fingerprint': 'fp_6a6771ae9c', 'finish_reason': 'stop', 'logprobs': None}, id='run-f28ddcdf-72fc-4a47-a418-c4f555b48380-0', usage_metadata={'input_tokens': 12, 'output_tokens': 27, 'total_tokens': 39})"
]
},
"execution_count": 3,
"execution_count": 32,
"metadata": {},
"output_type": "execute_result"
}
@@ -134,7 +132,7 @@
"messages = [msg]\n",
"\n",
"# Invoke the model with a list of messages \n",
"gpt4o_chat.invoke(messages)"
"llama_chat.invoke(messages)"
]
},
{
@@ -147,44 +145,23 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 22,
"id": "f27c6c9a",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Hello! How can I assist you today?', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 9, 'prompt_tokens': 9, 'total_tokens': 18}, 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_157b3831f5', 'finish_reason': 'stop', 'logprobs': None}, id='run-d6f6b682-e29a-44de-b45e-79fad1e405e5-0', usage_metadata={'input_tokens': 9, 'output_tokens': 9, 'total_tokens': 18})"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"gpt4o_chat.invoke(\"hello world\")"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "fdc2f0ca",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Hello! How can I assist you today?', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 9, 'prompt_tokens': 9, 'total_tokens': 18}, 'model_name': 'gpt-3.5-turbo-0125', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-c75d3f0f-2d71-47be-b14c-42b8dd2b4b08-0', usage_metadata={'input_tokens': 9, 'output_tokens': 9, 'total_tokens': 18})"
"AIMessage(content=\"Hello world! It's nice to meet you. Is there something I can help you with, or would you like to chat?\", additional_kwargs={}, response_metadata={'token_usage': {'completion_tokens': 27, 'prompt_tokens': 12, 'total_tokens': 39, 'completion_time': 0.0225, 'prompt_time': 0.001662041, 'queue_time': 0.011706418, 'total_time': 0.024162041}, 'model_name': 'llama3-8b-8192', 'system_fingerprint': 'fp_a97cfe35ae', 'finish_reason': 'stop', 'logprobs': None}, id='run-fad3d1ad-5eaf-4126-b1e2-033f8ce4c70e-0', usage_metadata={'input_tokens': 12, 'output_tokens': 27, 'total_tokens': 39})"
]
},
"execution_count": 5,
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"gpt35_chat.invoke(\"hello world\")"
"llama_chat.invoke(\"hello world\")"
]
},
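As the outputs above show, each `AIMessage` carries token accounting in its `usage_metadata` field. Below is a minimal sketch of summing usage across calls, using plain dicts shaped like that field (real responses expose the same keys via `response.usage_metadata`; the helper name is hypothetical):

```python
# Sketch: aggregate token usage across several chat responses. The dicts
# mirror the usage_metadata field shown in the outputs above.
def total_usage(usages: list[dict]) -> dict:
    totals = {"input_tokens": 0, "output_tokens": 0, "total_tokens": 0}
    for usage in usages:
        for key in totals:
            totals[key] += usage[key]
    return totals


calls = [
    {"input_tokens": 12, "output_tokens": 27, "total_tokens": 39},
    {"input_tokens": 12, "output_tokens": 27, "total_tokens": 39},
]
print(total_usage(calls))  # {'input_tokens': 24, 'output_tokens': 54, 'total_tokens': 78}
```

Tracking totals like this is handy on a free tier, where per-minute token limits apply.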
{
@@ -209,7 +186,7 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 23,
"id": "091dff13",
"metadata": {},
"outputs": [],
@@ -219,7 +196,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 24,
"id": "52d69da9",
"metadata": {},
"outputs": [],
@@ -231,22 +208,22 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 25,
"id": "d06f87e6",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[{'url': 'https://www.datacamp.com/tutorial/langgraph-tutorial',\n",
" 'content': 'LangGraph is a library within the LangChain ecosystem designed to tackle these challenges head-on. LangGraph provides a framework for defining, coordinating, and executing multiple LLM agents (or chains) in a structured manner.'},\n",
" {'url': 'https://langchain-ai.github.io/langgraph/',\n",
" 'content': 'Overview LangGraph is a library for building stateful, multi-actor applications with LLMs, used to create agent and multi-agent workflows. Compared to other LLM frameworks, it offers these core benefits: cycles, controllability, and persistence. LangGraph allows you to define flows that involve cycles, essential for most agentic architectures, differentiating it from DAG-based solutions. As a ...'},\n",
" {'url': 'https://www.youtube.com/watch?v=nmDFSVRnr4Q',\n",
" 'content': 'LangGraph is an extension of LangChain enabling Multi-Agent conversation and cyclic chains. This video explains the basics of LangGraph and codesLangChain in...'}]"
"[{'url': 'https://www.langchain.com/langgraph',\n",
" 'content': 'LangGraph is a stateful, orchestration framework that brings added control to agent workflows. LangGraph Cloud is a service for deploying and scaling LangGraph applications, with a built-in Studio for prototyping, debugging, and sharing LangGraph applications.'},\n",
" {'url': 'https://medium.com/@cplog/introduction-to-langgraph-a-beginners-guide-14f9be027141',\n",
" 'content': 'LangGraph is a library built on top of LangChain, designed to add cyclic computational capabilities to your LLM applications.'},\n",
" {'url': 'https://towardsdatascience.com/from-basics-to-advanced-exploring-langgraph-e8c1cf4db787',\n",
" 'content': 'LangGraph (as you might guess from the name) models all interactions as cyclical graphs. These graphs enable the development of advanced workflows and interactions with multiple loops and if-statements, making it a handy tool for creating both agent and multi-agent workflows.'}]"
]
},
"execution_count": 7,
"execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
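Each Tavily result above is a plain dict with `url` and `content` keys. As a small illustration (the helper name is hypothetical, and the data is shaped like the output above), you can post-filter results by domain before passing them to a model:

```python
# Sketch: keep only search results whose host matches a preferred domain.
# Result dicts mirror the Tavily output shown above.
from urllib.parse import urlparse


def from_domain(results: list[dict], domain: str) -> list[str]:
    return [r["url"] for r in results
            if domain in urlparse(r["url"]).netloc]


results = [
    {"url": "https://www.langchain.com/langgraph", "content": "..."},
    {"url": "https://medium.com/@cplog/introduction-to-langgraph", "content": "..."},
]
print(from_domain(results, "langchain.com"))  # ['https://www.langchain.com/langgraph']
```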
@@ -266,7 +243,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "base",
"language": "python",
"name": "python3"
},
@@ -280,7 +257,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.1"
"version": "3.11.5"
}
},
"nbformat": 4,