feat(model): Add support for Azure openAI (#1137)
Co-authored-by: like <[email protected]>
Co-authored-by: csunny <[email protected]>
1 parent 208d91d · commit 3f70da4
Showing 3 changed files with 179 additions and 7 deletions.
docs/docs/installation/advanced_usage/More_proxyllms.md (164 additions & 0 deletions)
# ProxyLLMs

DB-GPT can be deployed on servers with limited hardware by using proxy LLMs. It currently supports many proxy LLMs, such as OpenAI, Azure, Wenxin, Tongyi, and Zhipu.

### Proxy model

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

<Tabs
  defaultValue="openai"
  values={[
    {label: 'Open AI', value: 'openai'},
    {label: 'Azure', value: 'Azure'},
    {label: 'Qwen', value: 'qwen'},
    {label: 'ChatGLM', value: 'chatglm'},
    {label: 'WenXin', value: 'erniebot'},
  ]}>
<TabItem value="openai" label="open ai"> | ||
Install dependencies | ||
|
||
```python | ||
pip install -e ".[openai]" | ||
``` | ||
|
||
Download embedding model | ||
|
||
```python | ||
cd DB-GPT | ||
mkdir models and cd models | ||
git clone https://huggingface.co/GanymedeNil/text2vec-large-chinese | ||
``` | ||
|
||
Configure the proxy by setting `LLM_MODEL`, `PROXY_SERVER_URL` and `PROXY_API_KEY` in the `.env` file.

```bash
# .env
LLM_MODEL=chatgpt_proxyllm
PROXY_API_KEY={your-openai-sk}
PROXY_SERVER_URL=https://api.openai.com/v1/chat/completions
# If you use gpt-4
# PROXYLLM_BACKEND=gpt-4
```
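
Before starting DB-GPT, it can help to confirm that the key above actually answers a chat request. The sketch below is not part of DB-GPT; it assumes the `openai` Python SDK (v1+) pulled in by the install step, and the placeholder key and model name are illustrative.

```python
# Minimal sanity check for the credentials configured in .env (assumes openai>=1.0).
from openai import OpenAI

client = OpenAI(api_key="{your-openai-sk}")  # same value as PROXY_API_KEY

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # or whatever PROXYLLM_BACKEND points at
    messages=[{"role": "user", "content": "Say hi in one word."}],
)
print(resp.choices[0].message.content)
```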
</TabItem>

<TabItem value="Azure" label="Azure">

Install dependencies

```bash
pip install -e ".[openai]"
```

Download embedding model

```bash
cd DB-GPT
mkdir models && cd models
git clone https://huggingface.co/GanymedeNil/text2vec-large-chinese # change this to another embedding model if needed
```

Configure the proxy by setting `LLM_MODEL`, `PROXY_API_KEY`, `PROXY_API_BASE` and the other Azure settings in the `.env` file.

```bash
# .env
LLM_MODEL=proxyllm
PROXY_API_KEY=xxxx
PROXY_API_BASE=https://xxxxxx.openai.azure.com/
PROXY_API_TYPE=azure
PROXY_SERVER_URL=xxxx
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=gpt-35-turbo
API_AZURE_DEPLOYMENT=xxxx  # your Azure deployment name
```
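
The Azure values are easy to mix up (endpoint vs. deployment name vs. API version), so a quick standalone call can confirm they fit together before they go into `.env`. This is only a sketch assuming the `openai` SDK v1+; the placeholder values mirror the variables above.

```python
# Verify the Azure OpenAI settings used above (assumes openai>=1.0).
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="xxxx",                                     # PROXY_API_KEY
    azure_endpoint="https://xxxxxx.openai.azure.com/",  # PROXY_API_BASE
    api_version="2023-05-15",                           # PROXY_API_VERSION
)

resp = client.chat.completions.create(
    model="xxxx",  # API_AZURE_DEPLOYMENT: the deployment name, not the model family
    messages=[{"role": "user", "content": "Say hi in one word."}],
)
print(resp.choices[0].message.content)
```

Note that Azure routes requests by deployment name, which is why the underlying model (`PROXYLLM_BACKEND`) and the deployment (`API_AZURE_DEPLOYMENT`) are configured separately.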
</TabItem>

<TabItem value="qwen" label="Tongyi Qianwen">

Install dependencies

```bash
pip install dashscope
```

Download embedding model

```bash
cd DB-GPT
mkdir models && cd models

# embedding model
git clone https://huggingface.co/GanymedeNil/text2vec-large-chinese
# or
git clone https://huggingface.co/moka-ai/m3e-large
```

Configure the proxy by setting `LLM_MODEL`, `TONGYI_PROXY_API_KEY` and `PROXY_SERVER_URL` in the `.env` file.

```bash
# .env
# Aliyun tongyiqianwen
LLM_MODEL=tongyi_proxyllm
TONGYI_PROXY_API_KEY={your-tongyi-sk}
PROXY_SERVER_URL={your_service_url}
```
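
To check that the DashScope key is valid before wiring it into DB-GPT, a one-off call with the `dashscope` SDK installed above works. This is only a sketch; `qwen-turbo` is just an example model name.

```python
# Quick DashScope smoke test (assumes the dashscope SDK from the install step).
import dashscope
from dashscope import Generation

dashscope.api_key = "{your-tongyi-sk}"  # same value as TONGYI_PROXY_API_KEY

resp = Generation.call(model="qwen-turbo", prompt="Say hi in one word.")
print(resp.output.text)
```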
</TabItem>

<TabItem value="chatglm" label="ChatGLM">

Install dependencies

```bash
pip install zhipuai
```

Download embedding model

```bash
cd DB-GPT
mkdir models && cd models

# embedding model
git clone https://huggingface.co/GanymedeNil/text2vec-large-chinese
# or
git clone https://huggingface.co/moka-ai/m3e-large
```

Configure the proxy by setting `LLM_MODEL`, `PROXY_SERVER_URL`, `ZHIPU_MODEL_VERSION` and `ZHIPU_PROXY_API_KEY` in the `.env` file.

```bash
# .env
LLM_MODEL=zhipu_proxyllm
PROXY_SERVER_URL={your_service_url}
ZHIPU_MODEL_VERSION={version}
ZHIPU_PROXY_API_KEY={your-zhipu-sk}
```
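
A standalone call can confirm the Zhipu key before starting DB-GPT. The sketch below assumes the v2 `zhipuai` SDK (the 1.x SDK exposes a different `model_api` interface), and `glm-4` is only an example model version.

```python
# Quick Zhipu AI smoke test (assumes zhipuai>=2.0).
from zhipuai import ZhipuAI

client = ZhipuAI(api_key="{your-zhipu-sk}")  # same value as ZHIPU_PROXY_API_KEY

resp = client.chat.completions.create(
    model="glm-4",  # example; match the version you set in ZHIPU_MODEL_VERSION
    messages=[{"role": "user", "content": "Say hi in one word."}],
)
print(resp.choices[0].message.content)
```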
</TabItem>

<TabItem value="erniebot" label="WenXin" default>

Download embedding model

```bash
cd DB-GPT
mkdir models && cd models

# embedding model
git clone https://huggingface.co/GanymedeNil/text2vec-large-chinese
# or
git clone https://huggingface.co/moka-ai/m3e-large
```

Configure the proxy by setting `LLM_MODEL`, `WEN_XIN_MODEL_VERSION`, `WEN_XIN_API_KEY` and `WEN_XIN_API_SECRET` in the `.env` file.

```bash
# .env
LLM_MODEL=wenxin_proxyllm
WEN_XIN_MODEL_VERSION={version}  # ERNIE-Bot or ERNIE-Bot-turbo
WEN_XIN_API_KEY={your-wenxin-sk}
WEN_XIN_API_SECRET={your-wenxin-sct}
```
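
Wenxin authenticates with an API key/secret pair that is exchanged for an access token via Baidu's standard OAuth endpoint. The sketch below only exercises that token exchange to confirm the pair is valid; it is not DB-GPT code and assumes the `requests` package is available.

```python
# Check that the Wenxin key/secret can obtain an access token (standard Baidu OAuth endpoint).
import requests

resp = requests.get(
    "https://aip.baidubce.com/oauth/2.0/token",
    params={
        "grant_type": "client_credentials",
        "client_id": "{your-wenxin-sk}",       # WEN_XIN_API_KEY
        "client_secret": "{your-wenxin-sct}",  # WEN_XIN_API_SECRET
    },
    timeout=10,
)
print(resp.json().get("access_token") or resp.json())
```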
</TabItem>
</Tabs>

:::info note

⚠️ Be careful not to overwrite the contents of the `.env` configuration file.
:::