ai-tech / dify
Commit 3823ae58
authored Mar 12, 2024 by Joel
parent 14d71fb5

chore: prompt to prompt template

Showing 4 changed files with 5 additions and 5 deletions (+5 / -5)
web/app/components/workflow/nodes/llm/default.ts      +1  -1
web/app/components/workflow/nodes/llm/panel.tsx       +1  -1
web/app/components/workflow/nodes/llm/types.ts        +1  -1
web/app/components/workflow/nodes/llm/use-config.ts   +2  -2
web/app/components/workflow/nodes/llm/default.ts

@@ -12,7 +12,7 @@ const nodeDefault: NodeDefault<LLMNodeType> = {
       },
     },
     variables: [],
-    prompt: [{
+    prompt_template: [{
       role: PromptRole.system,
       text: '',
     }],
web/app/components/workflow/nodes/llm/panel.tsx

@@ -160,7 +160,7 @@ const Panel: FC<NodePanelProps<LLMNodeType>> = ({
       <ConfigPrompt
         readOnly={readOnly}
         isChatModel={isChatModel}
-        payload={inputs.prompt}
+        payload={inputs.prompt_template}
         variables={inputs.variables.map(item => item.variable)}
         onChange={handlePromptChange}
       />
web/app/components/workflow/nodes/llm/types.ts

@@ -4,7 +4,7 @@ import type { CommonNodeType, Memory, ModelConfig, PromptItem, ValueSelector, Va
 export type LLMNodeType = CommonNodeType & {
   model: ModelConfig
   variables: Variable[]
-  prompt: PromptItem[] | PromptItem
+  prompt_template: PromptItem[] | PromptItem
   memory: Memory
   context: {
     enabled: boolean
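To make the `PromptItem[] | PromptItem` union concrete, here is a minimal sketch of the two shapes `prompt_template` can take. The `PromptItem` shape is inferred from the defaults elsewhere in this commit, and the import path is a hypothetical stand-in, not the project's real one:

// Sketch only: PromptRole/PromptItem are assumed to match what the hunks
// imply (an optional role plus text); the import path is hypothetical.
import { PromptRole, type PromptItem } from './types'

// Chat models: prompt_template is a list of role-tagged messages.
const chatTemplate: PromptItem[] = [
  { role: PromptRole.system, text: '' },
]

// Completion models: prompt_template is a single item with no role.
const completionTemplate: PromptItem = { text: '' }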
web/app/components/workflow/nodes/llm/use-config.ts

@@ -27,7 +27,7 @@ const useConfig = (id: string, payload: LLMNodeType) => {
       draft.model.mode = model.mode!
       const isModeChange = model.mode !== inputs.model.mode
       if (isModeChange)
-        draft.prompt = model.mode === 'chat' ? [{ role: PromptRole.system, text: '' }] : { text: '' }
+        draft.prompt_template = model.mode === 'chat' ? [{ role: PromptRole.system, text: '' }] : { text: '' }
     })
     setInputs(newInputs)
   }, [inputs, setInputs])

@@ -65,7 +65,7 @@ const useConfig = (id: string, payload: LLMNodeType) => {
   const handlePromptChange = useCallback((newPrompt: PromptItem[] | PromptItem) => {
     const newInputs = produce(inputs, (draft) => {
-      draft.prompt = newPrompt
+      draft.prompt_template = newPrompt
     })
     setInputs(newInputs)
   }, [inputs, setInputs])
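Both hunks in this file funnel updates through `produce` (presumably immer's copy-on-write helper, though the import sits outside these hunks), which is why each rename is a one-line change per call site. A self-contained sketch of the same pattern, with illustrative stand-in types rather than the project's real ones:

import { produce } from 'immer'

// Illustrative stand-ins for the real LLMNodeType/PromptItem types.
type PromptItem = { role?: string; text: string }
type Inputs = { prompt_template: PromptItem[] | PromptItem }

// Mirrors handlePromptChange above: produce() exposes a mutable draft,
// records the assignment, and returns a new immutable inputs object.
function applyPromptChange(inputs: Inputs, newPrompt: PromptItem[] | PromptItem): Inputs {
  return produce(inputs, (draft) => {
    draft.prompt_template = newPrompt
  })
}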