
Bug Report: Issue with Dify AI Version 0.14.1 - OpenAI Bad Request Error (400) #11981

Open
5 tasks done
officialsuyogdixit opened this issue Dec 23, 2024 · 2 comments
Labels
🐞 bug (Something isn't working), good first issue (Good first issue for newcomers)

Comments

@officialsuyogdixit

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.14.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

  1. Create a chatbot:
    • Configure it to use the latest o1 model.
    • Try using the chatbot in the Preview Panel.
    • The error is triggered immediately in this mode.
  2. Test the chatbot in "Run App" mode:
    • The chatbot functions without any issues in this mode, suggesting the problem may be specific to the chatbot's preview/test workflow.
  3. Create a workflow:
    • Configure a workflow using the same o1 model.
    • Test the workflow in the workflow editor or through the API.
    • The same 400 Bad Request error is encountered here as well.
[Screenshot: error message captured 2024-12-23 at 8:12:38 AM]

✔️ Expected Behavior

  • The chatbot should function seamlessly in both the Preview Panel and Run App modes.
  • The workflow should execute without error, including when tested via the API.

❌ Actual Behavior

Description

While using the latest release of Dify AI (version 0.14.1) with the OpenAI o1 model, the following error is consistently encountered. Notably, o1-preview works fine:

[openai] Bad Request Error, Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Supported values are: false.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}

This issue occurs under the specific circumstances described in the reproduction steps above, despite a correct setup. The error suggests an incompatibility with the stream=true configuration for this model.
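
For reference, the failure can be reproduced against the OpenAI API directly, outside of Dify (a minimal sketch using the openai Python SDK v1.x; the model name and credentials are placeholders):

```python
# Minimal reproduction of the upstream 400 error (sketch, not Dify code).
# Assumes the official `openai` Python SDK v1.x and OPENAI_API_KEY in the env.
from openai import OpenAI

client = OpenAI()

try:
    client.chat.completions.create(
        model="o1",  # placeholder; any o1-family snapshot that rejects streaming
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,  # the parameter the API rejects for this model
    )
except Exception as exc:
    # Expected: 400 invalid_request_error, param='stream', code='unsupported_value'
    print(exc)
```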


  • The chatbot fails in the Preview Panel but works in Run App mode.
  • The workflow fails consistently, both in the editor and through API testing, throwing the same 400 error shown above.

Environment

  • Dify Version: 0.14.1
  • OpenAI Model: O1 (latest)
  • Configuration: Default configuration with streaming enabled. I also tried blocking mode through the API, but it still did not work (see the sketch below).
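
For context, the blocking-mode API test mentioned above looked roughly like the following (a sketch against Dify's documented chat-messages endpoint; the host and app key are placeholders):

```python
# Sketch of the blocking-mode API test (placeholder host and app key).
import requests

resp = requests.post(
    "http://localhost/v1/chat-messages",  # placeholder self-hosted base URL
    headers={"Authorization": "Bearer app-xxxxxxxx"},  # placeholder app API key
    json={
        "inputs": {},
        "query": "Hello",
        "response_mode": "blocking",  # non-streaming mode; the error persisted
        "user": "test-user",
    },
    timeout=120,
)
print(resp.status_code, resp.text)
```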

Additional Details

  • The error appears to be related to the stream=true configuration, which the o1 model does not support (see the sketch below).
  • The discrepancy between Preview Panel/Workflow testing and Run App functionality indicates that the handling of the stream parameter might be inconsistent across these modes.
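
Consistent with this, the same upstream request is expected to succeed once streaming is disabled (again a sketch, with the same placeholder model name):

```python
# Counterpart to the reproduction above: stream=False should be accepted,
# per the error message ("Supported values are: false").
from openai import OpenAI

client = OpenAI()
resp = client.chat.completions.create(
    model="o1",  # placeholder model name
    messages=[{"role": "user", "content": "Hello"}],
    stream=False,  # the only value the error message says is supported
)
print(resp.choices[0].message.content)
```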

Suggested Fix

  1. Update the Dify configuration validation to handle models that do not support streaming (stream=true) by:

    • Automatically setting stream=false for unsupported models (see the sketch after this list).
    • Providing clear error messages and configuration guidance in the UI.
  2. Ensure consistency in behavior between:

    • Preview Panel and Run App modes.
    • Workflow Editor and Workflow API Testing.
  3. Update the documentation to clearly list models and their compatibility with the stream parameter.
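
A minimal sketch of the coercion suggested in point 1, assuming a hypothetical capability table (MODELS_WITHOUT_STREAMING and prepare_invoke_args are illustrative names, not actual Dify identifiers):

```python
# Illustrative only: downgrade stream=True for models known not to support it.
MODELS_WITHOUT_STREAMING = {"o1"}  # assumed capability table

def prepare_invoke_args(model: str, stream: bool = True, **kwargs) -> dict:
    """Build invocation kwargs, forcing stream=False where unsupported."""
    if stream and model in MODELS_WITHOUT_STREAMING:
        # Alternatively, surface a clear configuration error in the UI here.
        stream = False
    return {"model": model, "stream": stream, **kwargs}

# Usage: prepare_invoke_args("o1", stream=True) -> {'model': 'o1', 'stream': False}
```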



dosubot bot commented Dec 23, 2024

Hello @officialsuyogdixit! I'm here to help you with bug reports, questions, and contributing. Let's get this issue sorted out together!

There is a similar open issue regarding the handling of the stream parameter in Dify. The suggested solution involves ensuring that the stream parameter is set to False for models that do not support streaming. In Dify, the UI differentiates between models that support streaming and those that do not by using a boolean parameter stream in the inference function. The default value is not_supported, indicating that streaming is not supported unless explicitly set to supported. You might want to check the configuration in your Dify UI to ensure that the streaming option is set to not_supported for models that do not support it [1].

Additionally, the stream parameter is set or modified in the _invoke method of the LargeLanguageModel base class, where it defaults to True. You may need to adjust this setting in your configuration to avoid the 400 Bad Request error when using models that do not support streaming [2].
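
For illustration, the default the bot refers to looks roughly like this in the model runtime's base class (a sketch; the exact signature may differ between Dify versions):

```python
# Sketch of the base-class signature described above (simplified).
from typing import Optional

class LargeLanguageModel:
    def _invoke(
        self,
        model: str,
        credentials: dict,
        prompt_messages: list,
        model_parameters: dict,
        stop: Optional[list] = None,
        stream: bool = True,  # streaming is the default, which o1 rejects
        user: Optional[str] = None,
    ):
        ...
```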

To continue talking to Dosu, mention @dosu.



@dosubot added the 🐞 bug label Dec 23, 2024
@crazywoola added the good first issue label Dec 23, 2024
@laipz8200 removed their assignment Dec 23, 2024
@laipz8200
Member

Oops, it looks like streaming mode isn’t supported for o1 yet. We’ll need to update the model configuration to make it work.
