
[FR]: Integration with crew ai #953

Open
rrfaria opened this issue Dec 22, 2024 · 3 comments
Labels
enhancement New feature or request

Comments


rrfaria commented Dec 22, 2024

Proposal summary

I would like to request a native integration with CrewAI.

Today I track data with Opik by using the `@track` decorator and by attaching `OpikTracer` to the model via callbacks. It works, but it doesn't show tokens spent, tools called, and many other CrewAI features.
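To illustrate the decorator-based approach described above, here is a minimal, self-contained sketch of how a `@track`-style decorator can record call metadata. This is a hypothetical illustration in plain Python, not the real Opik API; the `calls` store and the field names are assumptions for the sketch.

```python
import functools
import time

calls = []  # in-memory "trace" store for this sketch (hypothetical, not Opik)

def track(name=None):
    """Sketch of a @track-style decorator: records each call's
    name, duration, and output into `calls`. Not the real Opik API."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            # Record one trace entry per call
            calls.append({
                "name": name or fn.__name__,
                "duration_s": time.perf_counter() - start,
                "output": result,
            })
            return result
        return wrapper
    return decorator

@track(name="my_first_agent")
def agent_1(prompt):
    return f"answer to: {prompt}"

agent_1("hello")
print(calls[0]["name"])  # my_first_agent
```

The real Opik decorator additionally captures inputs, nesting, and sends the data to a backend; the sketch only shows the wrapping pattern.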

Motivation

CrewAI is one of the most widely used frameworks for building agents.

@rrfaria rrfaria added the enhancement New feature or request label Dec 22, 2024
@alexkuzmik
Collaborator

Hi @rrfaria! That's a good suggestion, thanks, we'll look into adding this soon.
We'll keep you posted!

@alexkuzmik
Collaborator

@rrfaria to give us more user context, could you please share an example code snippet showing how you are using CrewAI with Opik today?

@rrfaria
Author

rrfaria commented Dec 24, 2024

@alexkuzmik
My CrewAI implementation is a bit customized and doesn't follow all of their patterns.
You can find standard CrewAI examples here.

CrewAI has two main ways to implement it: using crews or using flows. In that repo you can find more examples.

They have documentation covering many other integrations.

The creator is Brazilian, @joaomdmoura.
If you need to, you can get in touch with him on LinkedIn; he is super friendly.

Here is an example of how I got it working with Opik. It doesn't track token counts and much other information, I guess because I'm using a local LLM with LM Studio as a server.

from opik.integrations.langchain import OpikTracer
from langchain_openai import ChatOpenAI
from opik import track
from crewai import Agent, Task

opik_tracer = OpikTracer()

model = ChatOpenAI(
    temperature=0.7,
    model="gpt-4o",
    callbacks=[opik_tracer],
)

@track(name="my_first_agent")
def agent_1(llm):
    agent = Agent(
        role="your role",
        goal="your goal",
        llm=llm,
        backstory="your backstory"
    )

    task = Task(
        description="your task description",
        expected_output="your expected output",
        agent=agent,  # a Task needs an agent assigned before it can execute
    )
    output = task.execute()
    return output

@track(name="my_project")
def main():
    result_ag_1 = agent_1(model)
    # other agents you desire
    return result_ag_1


if __name__ == "__main__":
    main()
