
feat: move the prompt to role module #189

Merged: 3 commits into main from feat/llama_new_action on Dec 13, 2023
Conversation

@imotai (Contributor) commented on Nov 12, 2023

PR-Codex overview

This PR moves the agent prompts into a new og_roles role module, along with readability and test updates in the og_agent code.

Detailed summary:

  • Added a new role module.
  • Updated the chat function in llama_client.py to include a default value for the stop parameter (see the sketch after this overview).
  • Updated the run_task function in agent_api_server.py for better code readability.
  • Added a new test case in tokenizer_test.py.
  • Updated the create_new_memory_with_default_prompt function in base_agent.py for better code readability.
  • Updated the extract_message function in base_agent.py for better code readability.
  • Updated the setup.py file to include package information.
  • Updated the FUNCTION_EXECUTE and FUNCTION_DIRECT_MESSAGE variables in prompt.py.
  • Added a new test case in base_agent.py.
  • Updated the llama_agent.py file for better code readability.

The following file was skipped due to too many changes: agent/src/og_agent/llama_agent.py
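
The actual contents of the new role module and the updated chat signature are not shown in this summary, so the sketch below is illustrative only. The names FUNCTION_EXECUTE, FUNCTION_DIRECT_MESSAGE, prompt.py, llama_client.py, and the stop parameter come from the change list above; the prompt text, the action string values, and the rest of the chat signature are assumptions.

```python
# Hypothetical layout for og_roles/prompt.py: the role prompt and the
# function-call action names live in the role module rather than inside
# each agent. The values below are placeholders, not the project's
# real prompts.
FUNCTION_EXECUTE = "execute"
FUNCTION_DIRECT_MESSAGE = "direct_message"

ROLE_PROMPT = (
    "You are a coding agent. Reply with one action: "
    f"{FUNCTION_EXECUTE} or {FUNCTION_DIRECT_MESSAGE}."
)


# Hypothetical og_agent/llama_client.py: chat() now gives `stop` a
# default, so callers that need no custom stop sequences can omit it.
def chat(messages, temperature=0.0, stop=None):
    return {
        "messages": messages,
        "temperature": temperature,
        # Fall back to an empty list when no stop sequences are supplied.
        "stop": stop if stop is not None else [],
    }


# Usage relying on the new default for `stop`:
request = chat([{"role": "system", "content": ROLE_PROMPT}])
```

Defaulting `stop` to None and normalizing it inside the function avoids the mutable-default-argument pitfall while keeping the call site short.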


@imotai merged commit e931df1 into main on Dec 13, 2023. 2 checks passed.
@imotai deleted the feat/llama_new_action branch on Dec 13, 2023 at 07:54.