ai-tech / dify · Commits

Unverified commit 2f7b234c, authored Aug 17, 2023 by takatost; committed by GitHub on Aug 17, 2023.
fix: max token not exist in generate summary when calc rest tokens (#891)
Parent: 4f5f9506
Showing 1 changed file with 1 addition and 0 deletions.
api/core/generator/llm_generator.py (+1, -0) @ 2f7b234c
...
@@ -51,6 +51,7 @@ class LLMGenerator:
         prompt_with_empty_context = prompt.format(context='')
         prompt_tokens = model_instance.get_num_tokens([PromptMessage(content=prompt_with_empty_context)])
         max_context_token_length = model_instance.model_rules.max_tokens.max
+        max_context_token_length = max_context_token_length if max_context_token_length else 1500
         rest_tokens = max_context_token_length - prompt_tokens - max_tokens - 1
         context = ''
...
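
Why the one-line fix matters: per the commit message, model_instance.model_rules.max_tokens.max may not exist (i.e. be None) for some model providers, and subtracting integers from None in the rest_tokens calculation would raise a TypeError before any summary is generated. The added line falls back to a default budget of 1500 tokens in that case. Below is a minimal, self-contained sketch of the pattern; MaxTokensRule and calc_rest_tokens are simplified stand-ins for illustration, not dify's actual classes:

from dataclasses import dataclass
from typing import Optional


@dataclass
class MaxTokensRule:
    max: Optional[int]  # None when the provider declares no max-token limit


def calc_rest_tokens(rule: MaxTokensRule, prompt_tokens: int, max_tokens: int) -> int:
    max_context_token_length = rule.max
    # Without this fallback, rule.max being None would make the subtraction
    # below raise: TypeError: unsupported operand type(s) for -: 'NoneType' and 'int'
    max_context_token_length = max_context_token_length if max_context_token_length else 1500
    return max_context_token_length - prompt_tokens - max_tokens - 1


print(calc_rest_tokens(MaxTokensRule(max=None), prompt_tokens=200, max_tokens=100))   # -> 1199
print(calc_rest_tokens(MaxTokensRule(max=4096), prompt_tokens=200, max_tokens=100))  # -> 3795

Note that the fallback also triggers when the declared limit is 0, since the fix tests truthiness rather than comparing against None.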