This image was created with DALL-E 2. You can use it as your Slack bot's profile image.
This repository connects LLM APIs to Slack. It currently supports OpenAI's ChatGPT, Google's Gemini, and Anthropic's Claude models. The basic structure is straightforward: when a message arrives through Slack, a response is generated using the LLM's API. Multimodal capabilities are supported, so the bot can also process and analyze images.
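The message flow above can be sketched roughly as follows. This is a simplified illustration, not the repository's actual code: `generate_reply` is a hypothetical stand-in for the real OpenAI/Gemini/Claude call, and the payload handling is reduced to the two cases Slack's Events API sends (the one-time URL-verification handshake and ordinary event callbacks).

```python
def generate_reply(text: str) -> str:
    """Placeholder for the LLM API call (OpenAI, Gemini, or Claude)."""
    return f"echo: {text}"


def handle_slack_event(payload: dict) -> dict:
    # Slack sends a "url_verification" request once when the webhook URL is
    # registered; the challenge value must be echoed back verbatim.
    if payload.get("type") == "url_verification":
        return {"challenge": payload["challenge"]}

    # A normal event callback: extract the user's message text and reply.
    event = payload.get("event", {})
    if event.get("type") == "message" and "text" in event:
        return {"text": generate_reply(event["text"])}

    # Ignore anything else (edits, bot messages, etc. in the real app).
    return {}
```

In the real application this handler sits behind a FastAPI route and the reply is posted back to Slack with the bot token; the sketch only shows the dispatch logic.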
All settings are configured via environment variables, listed in the table below.
environment | description | default |
---|---|---|
slack_token | A Slack token that begins with `xoxb` | required |
openai_token | An OpenAI token that begins with `sk` | required |
number_of_messages_to_keep | How many messages of conversation history to keep | 5 |
max_token | The maximum number of tokens in a response | 2048 |
system_content | The system prompt for ChatGPT | N/A |
gpt_model | GPT model name | gpt-3.5-turbo |
gemini_model | Gemini model name | gemini-1.5-pro-001 |
claude_model | Claude model name | claude-3-5-sonnet@20240620 |
- OpenAI GPT
- Google Gemini
- Anthropic Claude
- Docker
Before running the application, make sure that Docker is installed and running on your system.
Important: all environment variables are defined and consumed in `app/config/constants.py`.
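As a rough illustration (the actual contents of `app/config/constants.py` may differ), the variables from the table above could be read like this, with defaults mirroring that table:

```python
# Hypothetical sketch of app/config/constants.py: read each setting from the
# environment, falling back to the defaults listed in the configuration table.
import os

slack_token = os.environ.get("slack_token")        # required, begins with xoxb
openai_token = os.environ.get("openai_token")      # required, begins with sk
number_of_messages_to_keep = int(os.environ.get("number_of_messages_to_keep", "5"))
max_token = int(os.environ.get("max_token", "2048"))
system_content = os.environ.get("system_content")  # optional system prompt
gpt_model = os.environ.get("gpt_model", "gpt-3.5-turbo")
gemini_model = os.environ.get("gemini_model", "gemini-1.5-pro-001")
claude_model = os.environ.get("claude_model", "claude-3-5-sonnet@20240620")
```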
- First, to run the application in your local environment, execute the following command to install the required libraries:

  ```bash
  pip install -r requirements.txt
  ```
- Once the necessary libraries have been installed, execute the following command to run the application:

  ```bash
  uvicorn app.main:app --reload
  ```
This command runs the application based on the `app` object in the `main` module of the `app` package. The `--reload` option automatically reloads the application when file changes are detected.
- Clone the repository:

  ```bash
  git clone https://github.com/jybaek/llm-with-slack.git
  cd llm-with-slack
  ```
- Build the Docker image:

  ```bash
  docker build -t llm-api .
  ```
- Run the Docker container:

  ```bash
  docker run --rm -it -p 8000:8000 llm-api
  ```
- Open your web browser and go to http://localhost:8000/docs to access the Swagger UI and test the API.
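Note that the container needs the tokens from the configuration table to actually talk to Slack and the LLM APIs. A hedged example of passing them in with `-e` (the token values below are placeholders, not real credentials):

```shell
# Pass the required environment variables into the container at run time.
docker run --rm -it -p 8000:8000 \
  -e slack_token="xoxb-your-slack-token" \
  -e openai_token="sk-your-openai-token" \
  llm-api
```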
_Sample responses from Gemini and GPT (screenshots)._
The API documentation can be found at http://localhost:8000/docs
once the Docker container is running.
This project is licensed under the terms of the Apache license. See LICENSE for more information.