English | 简体中文
Azure OpenAI Proxy is a tool that converts OpenAI API requests into Azure OpenAI API requests, allowing OpenAI-compatible applications to use Azure OpenAI seamlessly.
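In essence, the proxy rewrites an OpenAI-style request path into the corresponding Azure OpenAI endpoint. A minimal sketch of that mapping (the function name is illustrative and not the proxy's actual code; the URL pattern follows Azure's documented REST format):

```javascript
// Sketch: build the Azure OpenAI endpoint that an OpenAI-style
// /v1/chat/completions request would be forwarded to.
// Azure's REST format:
//   https://{resource}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version={version}
function toAzureUrl(resourceId, deployment, apiVersion = '2024-02-01') {
  return (
    `https://${resourceId}.openai.azure.com` +
    `/openai/deployments/${deployment}` +
    `/chat/completions?api-version=${apiVersion}`
  );
}
```

The request body stays OpenAI-compatible; only the URL and authentication header change.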
An Azure OpenAI account is required to use Azure OpenAI Proxy.
Remember to:
- Select the region that matches your Azure OpenAI resource for best performance.
- If deployment fails because the 'proxywebapp' name is already taken, change the resource prefix and redeploy.
- The deployed proxy app runs on a B1 pricing tier Azure web app plan, which can be changed in the Azure Portal after deployment.
To deploy using Docker, execute the following command:

```bash
docker run -d -p 3000:3000 scalaone/azure-openai-proxy
```

To run locally, follow these steps:
- Install Node.js 20.
- Clone the repository in a command-line window.
- Run `npm install` to install the dependencies.
- Run `npm start` to start the application.
- Use the script below for testing. Replace `AZURE_RESOURCE_ID`, `AZURE_MODEL_DEPLOYMENT`, and `AZURE_API_KEY` before running. The `AZURE_API_VERSION` segment is optional and defaults to `2024-02-01`.
Test script:

```bash
curl -X "POST" "http://localhost:3000/v1/chat/completions" \
  -H 'Authorization: AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY:AZURE_API_VERSION' \
  -H 'Content-Type: application/json; charset=utf-8' \
  -d $'{
  "messages": [
    {
      "role": "system",
      "content": "You are an AI assistant that helps people find information."
    },
    {
      "role": "user",
      "content": "hi."
    }
  ],
  "temperature": 1,
  "model": "gpt-3.5-turbo",
  "stream": false
}'
```

The azure-openai-proxy has been tested and confirmed to work with the following applications:
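The `Authorization` header above packs the Azure settings into a single colon-separated string. A sketch of how such a key can be split into its parts (illustrative only; the proxy's actual parsing code may differ):

```javascript
// Split 'RESOURCE:DEPLOYMENT:KEY[:API_VERSION]' into its components.
// The trailing api-version segment is optional and defaults to 2024-02-01,
// matching the default documented above.
function parseProxyKey(key) {
  const [resourceId, deployment, apiKey, apiVersion = '2024-02-01'] = key.split(':');
  return { resourceId, deployment, apiKey, apiVersion };
}
```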
| Application Name | Docker-compose File for E2E Test | 
|---|---|
| chatgpt-lite | docker-compose.yml | 
| chatgpt-minimal | docker-compose.yml | 
| chatgpt-next-web | docker-compose.yml | 
| chatbot-ui | docker-compose.yml | 
| chatgpt-web | docker-compose.yml | 
To test locally, follow these steps:
- Clone the repository in a command-line window.
- Update the `OPENAI_API_KEY` environment variable with `AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY`. Alternatively, update the `OPENAI_API_KEY` value in the docker-compose.yml file directly.
- Navigate to the directory containing the `docker-compose.yml` file for the application you want to test.
- Execute the build command: `docker-compose build`.
- Start the service: `docker-compose up -d`.
- Access the application locally using the port defined in the `docker-compose.yml` file. For example, visit http://localhost:3000.
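For reference, the relevant portion of such a `docker-compose.yml` might look like the fragment below (the service name and port are illustrative assumptions; check the actual file in each application's directory):

```yaml
services:
  chatgpt-lite:            # illustrative service name
    ports:
      - "3000:3000"        # host port used at http://localhost:3000
    environment:
      # OpenAI-compatible key in the proxy's colon-separated format:
      - OPENAI_API_KEY=AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY
```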
Q: What are `AZURE_RESOURCE_ID`, `AZURE_MODEL_DEPLOYMENT`, and `AZURE_API_KEY`?
A: These can be found in the Azure management portal. See the image below for reference:

Q: How can I use the gpt-4 and gpt-4-32k models?
A: To use the gpt-4 and gpt-4-32k models, follow the key format below:

`AZURE_RESOURCE_ID:gpt-3.5-turbo|AZURE_MODEL_DEPLOYMENT,gpt-4|AZURE_MODEL_DEPLOYMENT,gpt-4-32k|AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY:AZURE_API_VERSION`

We welcome all PR submissions.
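In this multi-model key format, the middle segment maps each model name to its Azure deployment using `model|deployment` pairs separated by commas. A sketch of parsing that segment (illustrative only, not the proxy's actual code):

```javascript
// Parse a segment like 'gpt-3.5-turbo|dep-a,gpt-4|dep-b' into a lookup
// table from OpenAI model name to Azure deployment name.
function parseModelMappings(segment) {
  const mappings = {};
  for (const pair of segment.split(',')) {
    const [model, deployment] = pair.split('|');
    mappings[model] = deployment;
  }
  return mappings;
}
```

With such a table, a request for `"model": "gpt-4"` can be routed to the matching Azure deployment.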