ai-tech / dify — commit d7209d90 (unverified)

feat: add abab5.5s-chat (#2063)

Authored Jan 16, 2024 by crazywoola; committed by GitHub on Jan 16, 2024.
Parent: 5960103c

Showing 4 changed files, with 43 additions and 1 deletion (+43 −1).
Changed files:
- api/core/model_runtime/model_providers/minimax/llm/abab5.5-chat.yaml (+6 −0)
- api/core/model_runtime/model_providers/minimax/llm/abab5.5s-chat.yaml (+35 −0)
- api/core/model_runtime/model_providers/minimax/llm/chat_completion_pro.py (+1 −1)
- api/core/model_runtime/model_providers/minimax/llm/llm.py (+1 −0)
api/core/model_runtime/model_providers/minimax/llm/abab5.5-chat.yaml (+6 −0)

```diff
@@ -10,8 +10,14 @@ model_properties:
 parameter_rules:
   - name: temperature
     use_template: temperature
+    min: 0.01
+    max: 1
+    default: 0.9
   - name: top_p
     use_template: top_p
+    min: 0.01
+    max: 1
+    default: 0.95
   - name: max_tokens
     use_template: max_tokens
     required: true
```
api/core/model_runtime/model_providers/minimax/llm/abab5.5s-chat.yaml (new file, mode 100644, +35 −0)

```yaml
model: abab5.5s-chat
label:
  en_US: Abab5.5s-Chat
model_type: llm
features:
  - agent-thought
model_properties:
  mode: chat
  context_size: 8192
parameter_rules:
  - name: temperature
    use_template: temperature
    min: 0.01
    max: 1
    default: 0.9
  - name: top_p
    use_template: top_p
    min: 0.01
    max: 1
    default: 0.95
  - name: max_tokens
    use_template: max_tokens
    required: true
    default: 3072
    min: 1
    max: 8192
  - name: presence_penalty
    use_template: presence_penalty
  - name: frequency_penalty
    use_template: frequency_penalty
pricing:
  input: '0.00'
  output: '0.005'
  unit: '0.001'
  currency: RMB
```
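The `parameter_rules` entries in the YAML above declare per-parameter bounds and defaults. As a minimal sketch of how such rules could be applied to caller-supplied parameters (the rule schema mirrors the YAML; `apply_parameter_rules` is a hypothetical helper, not dify's actual loader):

```python
# Hypothetical helper: dify's real rule handling is not shown in this commit.
def apply_parameter_rules(rules, params):
    """Fill in declared defaults and range-check caller-supplied values."""
    out = dict(params)
    for rule in rules:
        name = rule['name']
        if name not in out:
            if 'default' in rule:
                out[name] = rule['default']
            elif rule.get('required'):
                raise ValueError(f'missing required parameter: {name}')
            continue
        value = out[name]
        if 'min' in rule and value < rule['min']:
            raise ValueError(f'{name} must be >= {rule["min"]}')
        if 'max' in rule and value > rule['max']:
            raise ValueError(f'{name} must be <= {rule["max"]}')
    return out

# Rules transcribed from the abab5.5s-chat.yaml entries above.
RULES = [
    {'name': 'temperature', 'min': 0.01, 'max': 1, 'default': 0.9},
    {'name': 'top_p', 'min': 0.01, 'max': 1, 'default': 0.95},
    {'name': 'max_tokens', 'required': True, 'default': 3072, 'min': 1, 'max': 8192},
]
```

With these rules, an omitted `top_p` falls back to 0.95 and a `temperature` of 2 is rejected for exceeding the declared maximum of 1.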
api/core/model_runtime/model_providers/minimax/llm/chat_completion_pro.py (+1 −1)

```diff
@@ -22,7 +22,7 @@ class MinimaxChatCompletionPro(object):
         """
         generate chat completion
         """
-        if model != 'abab5.5-chat':
+        if model not in ['abab5.5-chat', 'abab5.5s-chat']:
             raise BadRequestError(f'Invalid model: {model}')
         if not api_key or not group_id:
```
api/core/model_runtime/model_providers/minimax/llm/llm.py (+1 −0)

```diff
@@ -18,6 +18,7 @@ from core.model_runtime.model_providers.minimax.llm.types import MinimaxMessage
 class MinimaxLargeLanguageModel(LargeLanguageModel):
     model_apis = {
+        'abab5.5s-chat': MinimaxChatCompletionPro,
         'abab5.5-chat': MinimaxChatCompletionPro,
         'abab5-chat': MinimaxChatCompletion
     }
```
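The one-line change to `llm.py` registers the new model in a model-name-to-client dispatch table. A minimal sketch of that pattern, with stand-in client classes rather than the real Minimax clients (the `client_for` helper is an assumption, not code from this commit):

```python
# Stand-in clients; the real classes wrap Minimax's two completion endpoints.
class MinimaxChatCompletion:
    api_version = 'chatcompletion'

class MinimaxChatCompletionPro:
    api_version = 'chatcompletion_pro'

# Dispatch table mirroring the diff: both abab5.5 variants use the pro client.
model_apis = {
    'abab5.5s-chat': MinimaxChatCompletionPro,  # entry added by this commit
    'abab5.5-chat': MinimaxChatCompletionPro,
    'abab5-chat': MinimaxChatCompletion,
}

def client_for(model: str):
    """Return an API client instance for a supported model name."""
    try:
        return model_apis[model]()
    except KeyError:
        raise ValueError(f'Invalid model: {model}')
```

Keeping the mapping as class values (instantiated on lookup) means adding a model is a one-line table entry, which is exactly what this commit does.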