ai-tech / dify

Unverified commit bf704556, authored Jan 11, 2024 by takatost and committed by GitHub on Jan 11, 2024
fix: azure openai model parameters wrong when using hosting credentials (#1993)
parent ebd11e74
Showing 4 changed files with 20 additions and 5 deletions
api/core/entities/provider_configuration.py                                  +16 -1
api/core/model_runtime/model_providers/azure_openai/_constant.py              +2 -2
api/core/model_runtime/model_providers/openai/llm/gpt-4-1106-preview.yaml     +1 -1
api/core/model_runtime/model_providers/openai/llm/gpt-4-vision-preview.yaml   +1 -1
api/core/entities/provider_configuration.py
@@ -60,7 +60,22 @@ class ProviderConfiguration(BaseModel):
         :return:
         """
         if self.using_provider_type == ProviderType.SYSTEM:
-            return self.system_configuration.credentials
+            restrict_models = []
+            for quota_configuration in self.system_configuration.quota_configurations:
+                if self.system_configuration.current_quota_type != quota_configuration.quota_type:
+                    continue
+
+                restrict_models = quota_configuration.restrict_models
+
+            copy_credentials = self.system_configuration.credentials.copy()
+            if restrict_models:
+                for restrict_model in restrict_models:
+                    if (restrict_model.model_type == model_type
+                            and restrict_model.model == model
+                            and restrict_model.base_model_name):
+                        copy_credentials['base_model_name'] = restrict_model.base_model_name
+
+            return copy_credentials
         else:
             if self.custom_configuration.models:
                 for model_configuration in self.custom_configuration.models:
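For readers outside the codebase, here is a minimal, self-contained sketch of the behaviour this hunk adds, using stand-in dataclasses rather than Dify's real entity classes: under hosted (SYSTEM) credentials, the restrict_models of the active quota type supply an Azure OpenAI base_model_name, which is merged into a copy of the shared credentials so the correct model parameters can be resolved.

# Stand-in sketch (not Dify's actual classes or credential fields) of the new branch.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RestrictModel:
    model: str
    model_type: str
    base_model_name: Optional[str] = None

@dataclass
class QuotaConfiguration:
    quota_type: str
    restrict_models: list = field(default_factory=list)

@dataclass
class SystemConfiguration:
    credentials: dict
    current_quota_type: str
    quota_configurations: list

def get_hosted_credentials(system_configuration: SystemConfiguration,
                           model_type: str, model: str) -> dict:
    # Collect the restrict_models of the quota type currently in use.
    restrict_models = []
    for quota_configuration in system_configuration.quota_configurations:
        if system_configuration.current_quota_type != quota_configuration.quota_type:
            continue
        restrict_models = quota_configuration.restrict_models

    # Copy the shared hosting credentials and, if the requested model is listed,
    # override base_model_name so the Azure OpenAI provider picks the right rules.
    copy_credentials = system_configuration.credentials.copy()
    for restrict_model in restrict_models:
        if (restrict_model.model_type == model_type
                and restrict_model.model == model
                and restrict_model.base_model_name):
            copy_credentials['base_model_name'] = restrict_model.base_model_name
    return copy_credentials

# Example: a hypothetical "trial" quota that maps gpt-4 requests to a base model.
config = SystemConfiguration(
    credentials={'api_key': '...'},
    current_quota_type='trial',
    quota_configurations=[QuotaConfiguration(
        quota_type='trial',
        restrict_models=[RestrictModel(model='gpt-4', model_type='llm',
                                       base_model_name='gpt-4')],
    )],
)
print(get_hosted_credentials(config, 'llm', 'gpt-4'))
# {'api_key': '...', 'base_model_name': 'gpt-4'}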
api/core/model_runtime/model_providers/azure_openai/_constant.py
@@ -296,7 +296,7 @@ LLM_BASE_MODELS = [
                     name='frequency_penalty',
                     **PARAMETER_RULE_TEMPLATE[DefaultParameterName.FREQUENCY_PENALTY],
                 ),
-                _get_max_tokens(default=512, min_val=1, max_val=128000),
+                _get_max_tokens(default=512, min_val=1, max_val=4096),
                 ParameterRule(
                     name='seed',
                     label=I18nObject(
@@ -369,7 +369,7 @@ LLM_BASE_MODELS = [
                     name='frequency_penalty',
                     **PARAMETER_RULE_TEMPLATE[DefaultParameterName.FREQUENCY_PENALTY],
                 ),
-                _get_max_tokens(default=512, min_val=1, max_val=128000),
+                _get_max_tokens(default=512, min_val=1, max_val=4096),
                 ParameterRule(
                     name='seed',
                     label=I18nObject(
api/core/model_runtime/model_providers/openai/llm/gpt-4-1106-preview.yaml
@@ -22,7 +22,7 @@ parameter_rules:
     use_template: max_tokens
     default: 512
     min: 1
-    max: 128000
+    max: 4096
   - name: seed
     label:
       zh_Hans: 种子
api/core/model_runtime/model_providers/openai/llm/gpt-4-vision-preview.yaml
@@ -21,7 +21,7 @@ parameter_rules:
     use_template: max_tokens
     default: 512
     min: 1
-    max: 128000
+    max: 4096
   - name: seed
     label:
       zh_Hans: 种子
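Taken together, the _constant.py and YAML hunks lower the max_tokens ceiling for gpt-4-1106-preview and gpt-4-vision-preview from 128000 to 4096; these preview models take a 128k context window but cap completion output at 4,096 tokens, so the old ceiling presumably allowed values the API would reject. A minimal sketch of how such a min/max rule constrains a request (the helper below is illustrative only, not Dify's actual validator):

from typing import Optional

# Illustrative only: a rule equivalent to
# _get_max_tokens(default=512, min_val=1, max_val=4096) reduces to simple bounds
# that a requested max_tokens value must satisfy before the call is sent.
def validate_max_tokens(requested: Optional[int], default: int = 512,
                        min_val: int = 1, max_val: int = 4096) -> int:
    value = default if requested is None else requested
    if not (min_val <= value <= max_val):
        raise ValueError(f"max_tokens must be within [{min_val}, {max_val}], got {value}")
    return value

print(validate_max_tokens(None))    # 512  (falls back to the default)
print(validate_max_tokens(4096))    # 4096 (largest value the corrected rule allows)
# validate_max_tokens(128000) now raises ValueError instead of reaching the API.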