ai-tech / dify

Unverified commit d93365d4
authored May 24, 2023 by John Wang, committed by GitHub on May 24, 2023
fix: azure embedding not support batch (#188)
parent 8b44dba9
Showing 1 changed file with 14 additions and 0 deletions
api/core/embedding/openai_embedding.py  +14  -0
...
...
@@ -173,6 +173,13 @@ class OpenAIEmbedding(BaseEmbedding):
         Can be overriden for batch queries.

         """
+        if self.openai_api_type and self.openai_api_type == 'azure':
+            embeddings = []
+            for text in texts:
+                embeddings.append(self._get_text_embedding(text))
+            return embeddings
+
         if self.deployment_name is not None:
             engine = self.deployment_name
         else:
...
...
@@ -187,6 +194,13 @@ class OpenAIEmbedding(BaseEmbedding):
     async def _aget_text_embeddings(self, texts: List[str]) -> List[List[float]]:
         """Asynchronously get text embeddings."""
+        if self.openai_api_type and self.openai_api_type == 'azure':
+            embeddings = []
+            for text in texts:
+                embeddings.append(await self._aget_text_embedding(text))
+            return embeddings
+
         if self.deployment_name is not None:
             engine = self.deployment_name
         else:
...
...
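Both hunks apply the same pattern to the synchronous and asynchronous batch methods: when openai_api_type is 'azure', each text is embedded with its own request instead of a single batched call, since (per the commit title) the Azure embedding deployment did not support batch input. The following is a minimal, self-contained sketch of that pattern; the class name EmbeddingSketch and the stubbed single-text methods are illustrative placeholders, not code from the repository.

import asyncio
from typing import List


class EmbeddingSketch:
    """Sketch of the Azure per-text fallback added by this commit (illustrative only)."""

    def __init__(self, openai_api_type: str = 'azure'):
        self.openai_api_type = openai_api_type

    def _get_text_embedding(self, text: str) -> List[float]:
        # Stub: the real OpenAIEmbedding class calls the embeddings API here.
        return [float(len(text))]

    async def _aget_text_embedding(self, text: str) -> List[float]:
        # Stub for the asynchronous single-text call.
        return [float(len(text))]

    def _get_text_embeddings(self, texts: List[str]) -> List[List[float]]:
        if self.openai_api_type and self.openai_api_type == 'azure':
            # Azure: one request per text instead of one batched request.
            embeddings = []
            for text in texts:
                embeddings.append(self._get_text_embedding(text))
            return embeddings
        # Non-Azure providers keep the original batched path (elided from the diff).
        raise NotImplementedError

    async def _aget_text_embeddings(self, texts: List[str]) -> List[List[float]]:
        if self.openai_api_type and self.openai_api_type == 'azure':
            embeddings = []
            for text in texts:
                embeddings.append(await self._aget_text_embedding(text))
            return embeddings
        raise NotImplementedError


if __name__ == '__main__':
    sketch = EmbeddingSketch(openai_api_type='azure')
    print(sketch._get_text_embeddings(['hello', 'world']))       # two vectors, one per text
    print(asyncio.run(sketch._aget_text_embeddings(['hello'])))  # one vector

The per-text loop trades throughput for compatibility: each input becomes its own request on Azure, while other providers continue to use the original batched code path.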