I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thanks! :)
Please do not modify this template :) and fill in all the required fields.
Dify version
0.14.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Create a chatbot:
Configure it to use the latest O1 model.
Try using the chatbot in the Preview Panel.
The error is triggered immediately in this mode.
Test the chatbot in "Run App" mode: The chatbot functions without any issues in this mode, suggesting the issue may be specific to the chatbot's preview/test workflow.
Create a workflow:
Configure a workflow using the same O1 model.
Test the workflow in the workflow editor or through the API.
The same 400 Bad Request error is encountered here as well.
✔️ Expected Behavior
The chatbot should function seamlessly in both the Preview Panel and Run App modes.
The workflow should execute without error, including when tested via the API.
❌ Actual Behavior
Description
While using the latest release of Dify AI (version 0.14.1) with the OpenAI o1 model, the following error is consistently encountered; o1-preview, by contrast, works fine:
[openai] Bad Request Error, Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Supported values are: false.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}
This issue occurs under specific circumstances (described below) despite the correct setup. The error suggests an incompatibility with the stream=true configuration for the model.
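Until this is handled inside Dify, one defensive pattern for callers hitting the model directly is to catch this specific 400 and retry once without streaming. A minimal sketch, assuming a hypothetical `call_api` stand-in for the real client call that raises `ValueError` carrying the API error dict:

```python
def call_with_stream_fallback(call_api, model, messages, stream=True):
    """Try a streaming call first; on the 'unsupported_value' 400 for the
    stream parameter, retry once in blocking mode.

    call_api is a hypothetical stand-in for the real client call and is
    expected to raise ValueError carrying the API error dict on a 400.
    """
    try:
        return call_api(model=model, messages=messages, stream=stream)
    except ValueError as exc:
        err = exc.args[0] if exc.args else {}
        if (isinstance(err, dict)
                and err.get("param") == "stream"
                and err.get("code") == "unsupported_value"):
            # This model only accepts stream=false: retry without streaming.
            return call_api(model=model, messages=messages, stream=False)
        raise
```

Only the exact `param`/`code` pair from the error above triggers the retry; any other 400 is re-raised unchanged.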
Steps to Reproduce
Create a chatbot:
Configure it to use the latest O1 model.
Try using the chatbot in the Preview Panel.
The error is triggered immediately in this mode.
Test the chatbot in "Run App" mode:
The chatbot functions without any issues in this mode, suggesting the issue may be specific to preview/test workflows.
Create a workflow:
Configure a workflow using the same O1 model.
Test the workflow in the workflow editor or through the API.
The same 400 Bad Request error is encountered here as well.
Expected Behavior
The chatbot should function seamlessly in both the Preview Panel and Run App modes.
The workflow should execute without error, including when tested via the API.
Actual Behavior
The chatbot fails in the Preview Panel but works in the Run App mode.
The workflow fails consistently, both in the editor and through API testing, throwing the following error:
[openai] Bad Request Error, Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Supported values are: false.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}
Environment
Dify Version: 0.14.1
OpenAI Model: O1 (latest)
Configuration: Default configuration with stream enabled. I also tried blocking mode via the API; it still did not work.
Additional Details
The error appears to be related to the stream=true configuration, which the O1 model does not support.
The discrepancy between Preview Panel/Workflow testing and Run App functionality indicates that the handling of the stream parameter might be inconsistent across these modes.
Suggested Fix
Update the Dify configuration validation to handle models that do not support streaming (stream=true) by:
Automatically setting stream=false for unsupported models.
Providing clear error messages and configuration guidance in the UI.
Ensure consistency in behavior between:
Preview Panel and Run App modes.
Workflow Editor and Workflow API Testing.
Update the documentation to clearly list models and their compatibility with the stream parameter.
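The first suggested fix could look roughly like this inside the invocation path. The capability table and function name below are illustrative assumptions, not actual Dify internals; in Dify this information would come from the model provider's schema rather than a hard-coded dict:

```python
import logging

logger = logging.getLogger(__name__)

# Hypothetical capability table (illustrative only).
MODEL_SUPPORTS_STREAMING = {
    "gpt-4o": True,
    "o1-preview": True,  # reported working in this issue
    "o1": False,         # rejects stream=true with a 400
}

def normalize_stream(model: str, stream: bool) -> bool:
    """Downgrade stream=True to False for models that only accept False."""
    if stream and not MODEL_SUPPORTS_STREAMING.get(model, True):
        logger.warning(
            "%s does not support streaming; falling back to blocking mode",
            model,
        )
        return False
    return stream
```

Unknown models default to streaming-allowed, so the guard only changes behavior for models explicitly flagged as non-streaming.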
Hello @officialsuyogdixit! I'm here to help you with bug reports, questions, and contributing. Let's get this issue sorted out together!
There is a similar open issue regarding the handling of the stream parameter in Dify. The suggested solution involves ensuring that the stream parameter is set to False for models that do not support streaming. In Dify, the UI differentiates between models that support streaming and those that do not by using a boolean parameter stream in the inference function. The default value is not_supported, indicating that streaming is not supported unless explicitly set to supported. You might want to check the configuration in your Dify UI to ensure that the streaming option is set to not_supported for models that do not support it [1].
Additionally, the stream parameter is set or modified in the _invoke method of the LargeLanguageModel base class, where it defaults to True. You may need to adjust this setting in your configuration to avoid the 400 Bad Request error when using models that do not support streaming [2].
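For context, the default the bot refers to can be pictured like this. This is a simplified stand-in, not the actual Dify source:

```python
class LargeLanguageModel:
    """Simplified stand-in for Dify's base class (illustrative only)."""

    def _invoke(self, model: str, prompt_messages: list, stream: bool = True):
        # The default stream=True means any caller that omits the flag
        # sends a streaming request, even to models that reject it.
        return {"model": model, "stream": stream}

llm = LargeLanguageModel()
assert llm._invoke("o1", [])["stream"] is True                  # implicit default
assert llm._invoke("o1", [], stream=False)["stream"] is False   # explicit override
```

This is why the error appears even when the user never asked for streaming: any code path that forgets to pass the flag inherits `stream=True`.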