ai-tech / dify

Commit 4350bb9a (unverified), authored May 23, 2023 by John Wang; committed by GitHub, May 23, 2023
Parent: fe688b50

Fix/human in answer (#174)

Showing 1 changed file with 8 additions and 8 deletions.

api/core/completion.py (+8, -8)
--- a/api/core/completion.py
+++ b/api/core/completion.py
-from typing import Optional, List, Union
+from typing import Optional, List, Union, Tuple
 from langchain.callbacks import CallbackManager
 from langchain.chat_models.base import BaseChatModel
@@ -97,7 +97,7 @@ class Completion:
         )

         # get llm prompt
-        prompt = cls.get_main_llm_prompt(
+        prompt, stop_words = cls.get_main_llm_prompt(
             mode=mode,
             llm=final_llm,
             pre_prompt=app_model_config.pre_prompt,
@@ -115,7 +115,7 @@ class Completion:
             mode=mode
         )

-        response = final_llm.generate([prompt])
+        response = final_llm.generate([prompt], stop_words)

         return response
@@ -123,7 +123,7 @@ class Completion:
     def get_main_llm_prompt(cls, mode: str, llm: BaseLanguageModel, pre_prompt: str, query: str, inputs: dict,
                             chain_output: Optional[str],
                             memory: Optional[ReadOnlyConversationTokenDBBufferSharedMemory]) -> \
-            Union[str | List[BaseMessage]]:
+            Tuple[Union[str | List[BaseMessage]], Optional[List[str]]]:
         # disable template string in query
         query_params = OutLinePromptTemplate.from_template(template=query).input_variables
         if query_params:
@@ -165,9 +165,9 @@ And answer according to the language of the user's question.
             if isinstance(llm, BaseChatModel):
                 # use chat llm as completion model
-                return [HumanMessage(content=prompt_content)]
+                return [HumanMessage(content=prompt_content)], None
             else:
-                return prompt_content
+                return prompt_content, None
         else:
             messages: List[BaseMessage] = []
@@ -236,7 +236,7 @@ And answer according to the language of the user's question.
             messages.append(human_message)

-        return messages
+        return messages, ['\nHuman:']

     @classmethod
     def get_llm_callback_manager(cls, llm: Union[StreamableOpenAI, StreamableChatOpenAI],
@@ -323,7 +323,7 @@ And answer according to the language of the user's question.
         )

         # get llm prompt
-        original_prompt = cls.get_main_llm_prompt(
+        original_prompt, _ = cls.get_main_llm_prompt(
             mode="completion",
             llm=llm,
             pre_prompt=pre_prompt,
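
The change threads one idea through both call sites: get_main_llm_prompt now returns a (prompt, stop_words) tuple, and the conversation-transcript path hands the '\nHuman:' stop sequence to final_llm.generate([prompt], stop_words). Without it, a completion model simply keeps extending the transcript and writes the human's next turn inside its own answer, which appears to be the "human in answer" bug the title refers to. Below is a minimal, self-contained sketch of that mechanism; build_prompt and fake_generate are hypothetical stand-ins for illustration, not dify or langchain APIs.

# Minimal sketch of the stop-sequence mechanism this commit wires through
# Completion.get_main_llm_prompt. build_prompt and fake_generate are
# hypothetical stand-ins, not dify or langchain APIs.
from typing import List, Optional, Tuple


def build_prompt(history: str, query: str) -> Tuple[str, Optional[List[str]]]:
    # Mirrors the new contract: return the prompt together with the stop
    # sequences the caller should forward to the LLM.
    prompt = f"{history}\nHuman: {query}\nAssistant:"
    return prompt, ['\nHuman:']


def fake_generate(prompt: str, stop: Optional[List[str]] = None) -> str:
    # Stand-in for final_llm.generate([prompt], stop_words). A completion
    # model keeps extending the transcript, so without a stop sequence it
    # invents the human's next turn.
    raw = " Sure, here is the answer.\nHuman: thanks, one more thing..."
    if stop:
        for s in stop:
            cut = raw.find(s)
            if cut != -1:
                raw = raw[:cut]  # truncate at the first stop sequence
    return raw


prompt, stop_words = build_prompt("Human: hi\nAssistant: hello", "help me")
print(repr(fake_generate(prompt)))              # a fabricated "Human:" turn leaks in
print(repr(fake_generate(prompt, stop_words)))  # output ends before '\nHuman:'

Note that ['\nHuman:'] is only returned on the branch that builds a Human/Assistant transcript; the two "return ..., None" branches and the "original_prompt, _ = ..." call site show that plain completion prompts opt out of the stop words.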