Commit b762a31

update to v0.3 - add anthropic, rework params, rename class

BenderV committed Aug 28, 2024
1 parent effdb0b
Showing 9 changed files with 540 additions and 465 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -4,3 +4,4 @@ autochat/__pycache__/
 build/
 dist/
 .DS_Store
+draft
78 changes: 44 additions & 34 deletions README.md
@@ -1,8 +1,8 @@
# AutoChat

-AutoChat is an assistant interface to OpenAI and alternative providers, to simplify the process of creating interactive agents.
+AutoChat is an assistant library that supports OpenAI and Anthropic, simplifying the process of creating interactive agents.

-- **ChatGPT Class**: Conversation wrapper to store instruction, context and messages histories.
+- **Autochat Class**: Conversation wrapper to store instruction, context, and message histories.
- **Message Class**: Message wrapper to handle format/parsing automatically.
- **Function Calls**: Capability to handle function calls within the conversation, allowing complex interactions and responses.
- **Template System**: A straightforward text-based template system for defining the behavior of the chatbot, making it easy to customize its responses and actions.
@@ -20,14 +20,45 @@ Please note that this package requires Python 3.6 or later.
## Simple Example

```python
-> from autochat import ChatGPT
-> chat = ChatGPT(instruction="You are a parot")
+> from autochat import Autochat
+> chat = Autochat(instruction="You are a parrot")
> chat.ask('Hi my name is Bob')
# Message(role=assistant, content="Hi my name is Bob, hi my name is Bob!")
> chat.ask('Can you tell me my name?')
# Message(role=assistant, content="Your name is Bob, your name is Bob!")
```

## Function Calls Handling

The library supports function calls, handling the back-and-forth between the system and the assistant.

```python
from autochat import Autochat, Message
import json

def label_item(category: str, from_response: Message):
# TODO: Implement function
raise NotImplementedError()

with open("./examples/function_label.json") as f:
FUNCTION_LABEL_ITEM = json.load(f)

classifierGPT = Autochat.from_template("./examples/classify_template.txt")
classifierGPT.add_function(label_item, FUNCTION_LABEL_ITEM)

text = "The new iPhone is out"
for message in classifierGPT.run_conversation(text):
print(message.to_markdown())

# > ## assistant
# > It's about \"Technology\" since it's about a new iPhone.
# > LABEL_ITEM(category="Technology")
# > ## function
# > NotImplementedError()
# > ## assistant
# > Seems like you didn't implement the function yet.
```
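The function spec loaded from `function_label.json` is a JSON schema describing the callable's name and parameters. The repo's actual file isn't shown here; a hypothetical version, following the common OpenAI function-calling schema, might look like this (all field values below are assumptions, not taken from the repository):

```python
# Hypothetical sketch of what ./examples/function_label.json could contain.
# The real file in the repo may differ; this follows the OpenAI
# function-calling schema that such specs commonly use.
FUNCTION_LABEL_ITEM = {
    "name": "LABEL_ITEM",
    "description": "Label a piece of text with a category.",
    "parameters": {
        "type": "object",
        "properties": {
            "category": {
                "type": "string",
                "description": "Category to assign, e.g. 'Technology'",
            }
        },
        "required": ["category"],
    },
}

print(FUNCTION_LABEL_ITEM["name"])  # LABEL_ITEM
```

When the assistant decides to call the function, the library parses the arguments against this schema and invokes the registered Python callable with them.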

## Template System

We provide a simple template system for defining the behavior of the chatbot, using markdown-like syntax.
@@ -52,52 +83,31 @@ Your name is Bob, your name is Bob!
You can then load the template file using the `from_template` method:

```python
-parrotGPT = ChatGPT.from_template("./parrot_template.txt")
+parrotGPT = Autochat.from_template("./parrot_template.txt")
```

-The template system also supports function calls. Check out the [examples/classify.py](examples/classify.py) for a complete example.
+The template system also supports function calls. Check out the [examples/demo_label.py](examples/demo_label.py) for a complete example.

-## Function Calls Handling
-
-The library supports function calls, handling the back-and-forth between the system and the assistant.
-
-```python
-from autochat import ChatGPT, Message
-import json
-
-def label_item(category: str, from_response: Message):
-    # TODO: Implement function
-    raise NotImplementedError()
-
-with open("./examples/function_label.json") as f:
-    FUNCTION_LABEL_ITEM = json.load(f)
-
-classifierGPT = ChatGPT.from_template("./examples/classify_template.txt")
-classifierGPT.add_function(label_item, FUNCTION_LABEL_ITEM)
-
-text = "The new iPhone is out"
-for message in classifierGPT.run_conversation(text):
-    print(message.to_markdown())
-
-# > ## assistant
-# > It's about \"Technology\" since it's about a new iPhone.
-# > LABEL_ITEM(category="Technology")
-# > ## function
-# > NotImplementedError()
-# > ## assistant
-# > Seems like you didn't implement the function yet.
-```
+## Use different API providers (only anthropic and openai are supported for now)
+
+Default provider is openai.
+
+Anthropic:
+
+```python
+chat = Autochat(provider="anthropic")
+```
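Such a provider switch can be thought of as a small dispatch table over per-vendor client classes. Here is a generic sketch of the pattern (illustrative only, not autochat's actual internals; the class and function names are invented):

```python
# Illustrative provider dispatch -- not autochat's real implementation.
class OpenAIClient:
    provider_name = "openai"

class AnthropicClient:
    provider_name = "anthropic"

_PROVIDERS = {"openai": OpenAIClient, "anthropic": AnthropicClient}

def make_client(provider: str = "openai"):
    # Unknown names fail fast, mirroring "only anthropic and openai
    # are supported for now".
    try:
        return _PROVIDERS[provider]()
    except KeyError:
        raise ValueError(f"unsupported provider: {provider!r}") from None

print(make_client().provider_name)             # openai
print(make_client("anthropic").provider_name)  # anthropic
```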

## Environment Variables

-The `AUTOCHAT_DEFAULT_MODEL` environment variable specifies the model to use. If not set, it defaults to "gpt-4-turbo".
+The `AUTOCHAT_MODEL` environment variable specifies the model to use. If not set, it defaults to "gpt-4-turbo".

```bash
export AUTOCHAT_MODEL="gpt-4-turbo"
export OPENAI_API_KEY=<your-key>
```
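The fallback behavior described above can be sketched as follows (illustrative only; `resolve_model` is an invented helper for this sketch, not part of the autochat API):

```python
import os

def resolve_model(explicit=None):
    # An explicitly passed model wins; otherwise fall back to the
    # AUTOCHAT_MODEL environment variable, then to the documented default.
    return explicit or os.environ.get("AUTOCHAT_MODEL", "gpt-4-turbo")

os.environ.pop("AUTOCHAT_MODEL", None)
print(resolve_model())  # gpt-4-turbo

os.environ["AUTOCHAT_MODEL"] = "claude-3-5-sonnet-20240620"
print(resolve_model())  # claude-3-5-sonnet-20240620
```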

-Use `AUTOCHAT_HOST` to use alternative providers that are openai compatible (openpipe, llama_cpp, ...)
+Use `AUTOCHAT_HOST` to select an alternative provider (openai, anthropic, openpipe, llama_cpp, ...).

## Support

Empty file added __init__.py
Empty file.