
[Bug]: litellm integration logs tag more than once #528

Open
dsblank opened this issue Oct 31, 2024 · 2 comments
Labels
bug Something isn't working

Comments


dsblank commented Oct 31, 2024

Willingness to contribute

Yes. I can contribute a fix for this bug independently.

What component(s) are affected?

  • Python SDK
  • Opik UI
  • Opik Server
  • Documentation

Opik version

  • Opik version: 1.0.3

Describe the problem

When I run the code below, the tag "openai" is repeated multiple times.

(screenshot omitted: trace detail showing duplicated tags)

In the screenshot the "openai" tag appears twice; sometimes it shows 15 copies.

Two other issues also appear when running the code:

Traceback (most recent call last):
  File "/home/dblank/miniconda3/envs/py310/lib/python3.10/site-packages/litellm/integrations/opik/opik.py", line 68, in __init__
    asyncio.create_task(self.periodic_flush())
  File "/home/dblank/miniconda3/envs/py310/lib/python3.10/asyncio/tasks.py", line 336, in create_task
    loop = events.get_running_loop()
RuntimeError: no running event loop

and:

/home/dblank/miniconda3/envs/py310/lib/python3.10/site-packages/litellm/integrations/opik/opik.py:74: RuntimeWarning: coroutine 'CustomBatchLogger.periodic_flush' was never awaited
  self.flush_lock = None
RuntimeWarning: Enable tracemalloc to get the object allocation traceback

Despite these errors, everything still appears to be logged.
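The RuntimeError is the standard behavior of asyncio.create_task when called outside a running event loop, and can be reproduced in isolation (a minimal sketch, independent of litellm):

```python
import asyncio

async def periodic_flush():
    await asyncio.sleep(0)

# In plain synchronous code there is no running loop, so create_task
# fails exactly as in the traceback above.
coro = periodic_flush()
try:
    asyncio.create_task(coro)
except RuntimeError as exc:
    print(f"caught: {exc}")  # caught: no running event loop
finally:
    coro.close()  # avoid the "coroutine was never awaited" warning
```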

Reproduction steps

Here is the stripped down code:

import litellm
from litellm.integrations.opik.opik import OpikLogger
import opik
import os
from uuid import uuid4

model = "openai/gpt-3.5-turbo"

# Set these:
os.environ["OPIK_WORKSPACE"] = "WORKSPACE"
os.environ["OPIK_API_KEY"] = "COMET-API-KEY"
os.environ["OPIK_PROJECT_NAME"] = "bug-test"
os.environ["OPENAI_API_KEY"] = "OPENAI-API-KEY"

def get_response(stream):
    for chunk in stream:
        if chunk["choices"][0]["delta"]["content"]:
            yield chunk["choices"][0]["delta"]["content"]

opik.configure(use_local=False)

logger = OpikLogger()
litellm.callbacks = [logger]

# Get a new prompt, and stream the output:
prompt = "Why do birds sing?"

stream = litellm.completion(
    model=model,
    messages=[
        {"role": "user", "content": prompt}
    ],
    metadata={
        "opik": {
            "tags": [model],
            "chat_id": str(uuid4()),
        },
    },
    stream=True,
)
response = "".join(list(get_response(stream)))
print(response)

Running this shows the tag "openai" added by the litellm integration twice. Sometimes it adds 15 or more copies.
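A possible mitigation on the integration side would be to de-duplicate tags before attaching them to the trace. A sketch, assuming order should be preserved (the helper name is hypothetical, not part of the current litellm code):

```python
def dedupe_tags(tags):
    # dict.fromkeys keeps the first occurrence of each tag in order.
    return list(dict.fromkeys(tags))

print(dedupe_tags(["openai", "openai", "openai"]))  # ['openai']
```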

@dsblank dsblank added the bug Something isn't working label Oct 31, 2024
@dsblank dsblank changed the title [Bug]: [Bug]: litellm integration logs tag more than once Oct 31, 2024

dsblank commented Oct 31, 2024

Also, it might be related to the logger itself: if I reuse the same logger instance, it keeps adding the "openai" tag over and over.
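If the integration appends to a tags list stored on the logger instance, reusing that instance would accumulate duplicates across requests. A minimal illustration of that suspected pattern (all names here are hypothetical, not the actual litellm code):

```python
class FakeLogger:
    def __init__(self):
        self.tags = []  # shared mutable state across calls

    def log(self, provider):
        # Appending on every request duplicates the tag when the
        # same logger instance handles multiple completions.
        self.tags.append(provider)
        return self.tags

logger = FakeLogger()
logger.log("openai")
print(logger.log("openai"))  # ['openai', 'openai']
```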


dsblank commented Nov 1, 2024

Regarding the traceback: looking at https://github.com/BerriAI/litellm/blob/main/litellm/integrations/opik/opik.py#L67-L74

This doesn't seem like a situation where we should show a full traceback; an info-level message would be enough. I don't fully understand the consequences, but everything seems to work correctly when async periodic flushing is not enabled.

        try:
            asyncio.create_task(self.periodic_flush())
            self.flush_lock = asyncio.Lock()
        except Exception as e:
            verbose_logger.exception(
                f"OpikLogger - Asynchronous processing not initialized as we are not running in an async context {str(e)}"
            )
            self.flush_lock = None
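One way to avoid the traceback entirely would be to probe for a running loop first and fall back with an info-level message when there is none. A sketch of that suggestion (start_periodic_flush and the message text are illustrative, not the litellm API):

```python
import asyncio

def start_periodic_flush(coro_factory):
    # Schedule the flush task only when an event loop is running;
    # otherwise degrade quietly instead of logging a traceback.
    try:
        loop = asyncio.get_running_loop()
    except RuntimeError:
        print("info: not in an async context; periodic flush disabled")
        return None
    return loop.create_task(coro_factory())

async def periodic_flush():
    await asyncio.sleep(0)

print(start_periodic_flush(periodic_flush))  # None outside a loop
```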
