ai-tech / dify · Commits
Commit 3a3ca8e6 (unverified)
Authored Mar 07, 2024 by Yeuoly; committed by GitHub on Mar 07, 2024
fix: max tokens can only up to 2048 (#2734)
Parent: 27e67848
Showing 1 changed file with 1 addition and 0 deletions
api/core/model_runtime/model_providers/xinference/llm/llm.py (+1, -0)
@@ -308,6 +308,7 @@ class XinferenceAILargeLanguageModel(LargeLanguageModel):
                 type=ParameterType.INT,
                 use_template='max_tokens',
                 min=1,
+                max=credentials.get('context_length', 2048),
                 default=512,
                 label=I18nObject(
                     zh_Hans='最大生成长度',
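For context, the one-line addition sets the upper bound of the max_tokens parameter rule to the model's context_length taken from the Xinference credentials, falling back to 2048 only when that value is missing, so max tokens is no longer limited to 2048 (per the commit title). The sketch below is a minimal, self-contained illustration of that lookup and not dify's actual code: it uses a plain dict in place of the provider credentials and a hypothetical helper name (build_max_tokens_rule) instead of the real ParameterRule construction.

# Minimal sketch of the bound logic introduced by this commit.
# `credentials` stands in for the Xinference provider credentials dict;
# `build_max_tokens_rule` is a hypothetical helper, not part of dify's API.
def build_max_tokens_rule(credentials: dict) -> dict:
    return {
        'name': 'max_tokens',
        'type': 'int',
        'use_template': 'max_tokens',
        'min': 1,
        # Before this commit the rule set no explicit max here, and the
        # effective limit was 2048; now the cap follows context_length.
        'max': credentials.get('context_length', 2048),
        'default': 512,
    }

if __name__ == '__main__':
    print(build_max_tokens_rule({'context_length': 8192})['max'])  # 8192
    print(build_max_tokens_rule({})['max'])                        # 2048 fallback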