Add model parameter enable_enhance for hunyuan llm model (#6847)
Co-authored-by: sun <[email protected]>
maybemaynot and sun authored Jul 31, 2024
1 parent 13f5867 commit 4b41049
Showing 4 changed files with 32 additions and 1 deletion.
@@ -21,6 +21,16 @@ parameter_rules:
     default: 1024
     min: 1
     max: 32000
+  - name: enable_enhance
+    label:
+      zh_Hans: 功能增强
+      en_US: Enable Enhancement
+    type: boolean
+    help:
+      zh_Hans: 功能增强(如搜索)开关,关闭时将直接由主模型生成回复内容,可以降低响应时延(对于流式输出时的首字时延尤为明显)。但在少数场景里,回复效果可能会下降。
+      en_US: Allow the model to perform external search to enhance the generation results.
+    required: false
+    default: true
 pricing:
   input: '0.03'
   output: '0.10'
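The rule above declares a boolean switch that Dify exposes as a model parameter and forwards through model_parameters at invocation time under the key enable_enhance. Below is a minimal sketch of how a caller might turn the enhancement off, for example to reduce first-token latency on streamed responses as the help text describes; the invoke_llm call and its argument names are illustrative assumptions, not the exact Dify runtime interface. The same rule is repeated in the two model YAML files that follow; only the max token limit and pricing differ.

# Illustrative sketch only: the parameter key mirrors the YAML rule above,
# but the surrounding invocation API is an assumption, not Dify's exact interface.
model_parameters = {
    'temperature': 0.5,
    'top_p': 0.9,
    'max_tokens': 1024,
    'enable_enhance': False,  # turn off search enhancement to cut first-token latency
}
# result = model_instance.invoke_llm(
#     prompt_messages=prompt_messages,
#     model_parameters=model_parameters,
#     stream=True,
# )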
@@ -21,6 +21,16 @@ parameter_rules:
     default: 1024
     min: 1
     max: 256000
+  - name: enable_enhance
+    label:
+      zh_Hans: 功能增强
+      en_US: Enable Enhancement
+    type: boolean
+    help:
+      zh_Hans: 功能增强(如搜索)开关,关闭时将直接由主模型生成回复内容,可以降低响应时延(对于流式输出时的首字时延尤为明显)。但在少数场景里,回复效果可能会下降。
+      en_US: Allow the model to perform external search to enhance the generation results.
+    required: false
+    default: true
 pricing:
   input: '0.015'
   output: '0.06'
@@ -21,6 +21,16 @@ parameter_rules:
     default: 1024
     min: 1
     max: 32000
+  - name: enable_enhance
+    label:
+      zh_Hans: 功能增强
+      en_US: Enable Enhancement
+    type: boolean
+    help:
+      zh_Hans: 功能增强(如搜索)开关,关闭时将直接由主模型生成回复内容,可以降低响应时延(对于流式输出时的首字时延尤为明显)。但在少数场景里,回复效果可能会下降。
+      en_US: Allow the model to perform external search to enhance the generation results.
+    required: false
+    default: true
 pricing:
   input: '0.0045'
   output: '0.0005'
api/core/model_runtime/model_providers/hunyuan/llm/llm.py (2 additions, 1 deletion)
@@ -36,7 +36,8 @@ def _invoke(self, model: str, credentials: dict, prompt_messages: list[PromptMes
 
         custom_parameters = {
             'Temperature': model_parameters.get('temperature', 0.0),
-            'TopP': model_parameters.get('top_p', 1.0)
+            'TopP': model_parameters.get('top_p', 1.0),
+            'EnableEnhancement': model_parameters.get('enable_enhance', True)
         }
 
         params = {
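The new EnableEnhancement key is forwarded from the enable_enhance model parameter, defaulting to True to match the YAML rules above. The request assembly that consumes custom_parameters is outside this diff; the sketch below shows one plausible way the merged parameters would reach the Hunyuan ChatCompletions API through the Tencent Cloud SDK, and the params layout and client call should be read as assumptions rather than the actual llm.py code.

import json
from tencentcloud.hunyuan.v20230901 import models

# Sketch under assumptions: how custom_parameters (including 'EnableEnhancement')
# could be folded into a ChatCompletions request. The real assembly in llm.py is
# not shown in this diff, so everything beyond the parameter names is illustrative.
custom_parameters = {
    'Temperature': 0.0,
    'TopP': 1.0,
    'EnableEnhancement': False,  # skip the search-enhancement step for lower latency
}
params = {
    'Model': 'hunyuan-standard',
    'Messages': [{'Role': 'user', 'Content': 'Hello'}],
    'Stream': True,
    **custom_parameters,
}
request = models.ChatCompletionsRequest()
request.from_json_string(json.dumps(params))  # populate the SDK request object
# response = client.ChatCompletions(request)  # 'client' is a configured HunyuanClient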
