Project: ai-tech / dify — Commits

Commit 7898937e (unverified), authored Aug 13, 2023 by takatost; committed by GitHub, Aug 13, 2023.

feat: optimize message return (#822)

Parent: 1bd0a76a

Showing 1 changed file with 6 additions and 3 deletions:

api/core/model_providers/models/llm/base.py (+6, -3)
    @@ -218,15 +218,18 @@ class BaseLLM(BaseProviderModel):
        def _get_prompt_from_messages(self, messages: List[PromptMessage],
                                      model_mode: Optional[ModelMode] = None) -> Union[str | List[BaseMessage]]:
            if len(messages) == 0:
                raise ValueError("prompt must not be empty.")

            if not model_mode:
                model_mode = self.model_mode

            if model_mode == ModelMode.COMPLETION:
                if len(messages) == 0:
                    return ''

                return messages[0].content
            else:
                if len(messages) == 0:
                    return []

                chat_messages = []
                for message in messages:
                    if message.type == MessageType.HUMAN:
                        ...
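The hunk above branches on the model mode: completion mode returns the first message's text as a plain string, while chat mode builds up a list of messages; the commit's "optimize message return" change appears to make an empty input yield a type-appropriate empty value (`''` or `[]`) rather than an error. A minimal, self-contained sketch of that branching is below. The `PromptMessage`, `ModelMode`, and `MessageType` classes here are hypothetical stand-ins reduced to the fields the snippet touches, not dify's real classes, and the function name is simplified.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Union

# Hypothetical stand-ins for dify's types, reduced to what the hunk uses.
class ModelMode(Enum):
    COMPLETION = "completion"
    CHAT = "chat"

class MessageType(Enum):
    HUMAN = "human"
    ASSISTANT = "assistant"

@dataclass
class PromptMessage:
    type: MessageType
    content: str

def get_prompt_from_messages(messages: List[PromptMessage],
                             model_mode: ModelMode) -> Union[str, List[PromptMessage]]:
    """Mirror the diff's branching: completion mode yields the first
    message's text, chat mode yields a list of messages; an empty input
    returns an empty value of the matching type instead of raising."""
    if model_mode == ModelMode.COMPLETION:
        if len(messages) == 0:
            return ''
        return messages[0].content
    else:
        if len(messages) == 0:
            return []
        # The real method converts each message by type; we just filter here.
        return [m for m in messages if m.type == MessageType.HUMAN]

# Each mode gets a type-appropriate empty value for empty input.
print(repr(get_prompt_from_messages([], ModelMode.COMPLETION)))  # ''
print(repr(get_prompt_from_messages([], ModelMode.CHAT)))        # []
```

Note that the return annotation in the diff, `Union[str | List[BaseMessage]]`, mixes the `Union[...]` and PEP 604 `|` spellings; `Union[str, List[BaseMessage]]` (used in the sketch) is the conventional form, though the mixed version happens to evaluate to the same type.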