doc:add how to use proxy api (#731)
csunny authored Oct 26, 2023
2 parents 0960c85 + 8698f07 commit 02de3d4
Showing 6 changed files with 300 additions and 127 deletions.
2 changes: 1 addition & 1 deletion docs/getting_started/install/deploy/deploy.md
@@ -77,7 +77,7 @@ macos:brew install git-lfs
```
##### Download LLM Model and Embedding Model

If you use OpenAI llm service, see [LLM Use FAQ](https://db-gpt.readthedocs.io/en/latest/getting_started/faq/llm/llm_faq.html)
If you use OpenAI llm service, see [How to Use LLM REST API](https://db-gpt.readthedocs.io/en/latest/getting_started/faq/llm/proxyllm/proxyllm.html)

```{tip}
If you use the OpenAI, Azure, or Tongyi LLM API service, you don't need to download an LLM model.
1 change: 1 addition & 0 deletions docs/getting_started/install/llm/llm.rst
@@ -28,6 +28,7 @@ Multi LLMs Support, Supports multiple large language models, currently supportin
:name: llama_cpp
:hidden:

./proxyllm/proxyllm.md
./llama/llama_cpp.md
./quantization/quantization.md
./vllm/vllm.md
74 changes: 74 additions & 0 deletions docs/getting_started/install/llm/proxyllm/proxyllm.md
@@ -0,0 +1,74 @@
Proxy LLM API
==================================
DB-GPT now supports connecting to LLM services through a proxy REST API.

The proxy LLM REST API currently supports the following providers:
```{note}
* OpenAI
* Azure
* Aliyun tongyi
* Baidu wenxin
* Zhipu
* Baichuan
* Bard
```


### How to integrate an LLM REST API service such as OpenAI, Azure, Tongyi, or Wenxin?
Update your `.env` file:
```commandline
#OpenAI
LLM_MODEL=chatgpt_proxyllm
PROXY_API_KEY={your-openai-sk}
PROXY_SERVER_URL=https://api.openai.com/v1/chat/completions
#Azure
LLM_MODEL=chatgpt_proxyllm
PROXY_API_KEY={your-openai-sk}
PROXY_SERVER_URL=https://xx.openai.azure.com/v1/chat/completions
#Aliyun tongyi
LLM_MODEL=tongyi_proxyllm
TONGYI_PROXY_API_KEY={your-tongyi-sk}
PROXY_SERVER_URL={your_service_url}
## Baidu wenxin
LLM_MODEL=wenxin_proxyllm
PROXY_SERVER_URL={your_service_url}
WEN_XIN_MODEL_VERSION={version}
WEN_XIN_API_KEY={your-wenxin-sk}
WEN_XIN_SECRET_KEY={your-wenxin-sct}
## Zhipu
LLM_MODEL=zhipu_proxyllm
PROXY_SERVER_URL={your_service_url}
ZHIPU_MODEL_VERSION={version}
ZHIPU_PROXY_API_KEY={your-zhipu-sk}
## Baichuan
LLM_MODEL=bc_proxyllm
PROXY_SERVER_URL={your_service_url}
BAICHUN_MODEL_NAME={version}
BAICHUAN_PROXY_API_KEY={your-baichuan-sk}
BAICHUAN_PROXY_API_SECRET={your-baichuan-sct}
## bard
LLM_MODEL=bard_proxyllm
PROXY_SERVER_URL={your_service_url}
# Obtain from https://bard.google.com/ : open DevTools (F12) -> Application -> Cookies -> __Secure-1PSID
BARD_PROXY_API_KEY={your-bard-token}
```
```{tip}
Only keep the block for the provider you use, and make sure your .env configuration is not overwritten by duplicate keys.
```
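
To sanity-check the proxy settings before starting DB-GPT, a minimal sketch like the one below (not part of the original docs) calls the configured endpoint directly. It assumes an OpenAI-compatible `/v1/chat/completions` endpoint and reads `PROXY_API_KEY` and `PROXY_SERVER_URL` from the environment; the model name and message are illustrative, and Azure, Tongyi, or Wenxin services use different authentication headers and payloads.

```python
# Minimal sketch (not part of DB-GPT): call the configured proxy endpoint
# directly to confirm that PROXY_API_KEY / PROXY_SERVER_URL are valid.
# Assumes an OpenAI-compatible chat completions API.
import os

import requests

api_key = os.environ["PROXY_API_KEY"]
url = os.environ.get("PROXY_SERVER_URL", "https://api.openai.com/v1/chat/completions")

payload = {
    "model": "gpt-3.5-turbo",  # illustrative; use the model your provider expects
    "messages": [{"role": "user", "content": "Reply with a single short greeting."}],
}
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {api_key}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If the call fails with an authentication or connection error, the same error will typically surface inside DB-GPT when `chatgpt_proxyllm` is selected.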

### How to integrate an embedding REST API service such as OpenAI or Azure?

```commandline
## OpenAI embedding model; see /pilot/model/parameter.py
EMBEDDING_MODEL=proxy_openai
proxy_openai_proxy_server_url=https://api.openai.com/v1
proxy_openai_proxy_api_key={your-openai-sk}
proxy_openai_proxy_backend=text-embedding-ada-002
```
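
As a hedged illustration (not from the original docs), the sketch below requests a single embedding from an OpenAI-compatible `/embeddings` endpoint using the same values configured above. The lowercase variable names mirror the `.env` keys and are assumed to be exported into the process environment.

```python
# Minimal sketch: check the proxy_openai embedding settings by requesting one
# embedding directly from the configured endpoint (OpenAI-compatible API assumed).
import os

import requests

base_url = os.environ.get("proxy_openai_proxy_server_url", "https://api.openai.com/v1")
api_key = os.environ["proxy_openai_proxy_api_key"]
model = os.environ.get("proxy_openai_proxy_backend", "text-embedding-ada-002")

resp = requests.post(
    f"{base_url}/embeddings",
    headers={"Authorization": f"Bearer {api_key}"},
    json={"model": model, "input": "DB-GPT proxy embedding smoke test"},
    timeout=30,
)
resp.raise_for_status()
vector = resp.json()["data"][0]["embedding"]
print(f"Got embedding of dimension {len(vector)}")
```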
