docs: fix the ai-proxy override.endpoint (#11700)
vvidovic authored Nov 4, 2024
1 parent 45679ae commit a91e79a
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/en/latest/plugins/ai-proxy.md
```diff
@@ -62,7 +62,7 @@ Proxying requests to OpenAI is supported now. Other LLM services will be support
  | model.options.temperature | No | Number | Matching temperature for models. Range: 0.0 - 5.0 |
  | model.options.top_p | No | Number | Top-p probability mass. Range: 0 - 1 |
  | model.options.stream | No | Boolean | Stream response by SSE. Default: false |
- | model.override.endpoint | No | String | Override the endpoint of the AI provider |
+ | override.endpoint | No | String | Override the endpoint of the AI provider |
  | passthrough | No | Boolean | If enabled, the response from LLM will be sent to the upstream. Default: false |
  | timeout | No | Integer | Timeout in milliseconds for requests to LLM. Range: 1 - 60000. Default: 3000 |
  | keepalive | No | Boolean | Enable keepalive for requests to LLM. Default: true |
```
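The fix moves `override.endpoint` to the top level of the plugin configuration rather than nesting it under `model`. A minimal sketch of how an `ai-proxy` plugin block might look with the corrected parameter placement (the model name and endpoint URL are illustrative placeholders, and any fields not listed in the table above are omitted; this is an assumption-laden sketch, not a verified working config):

```json
{
  "plugins": {
    "ai-proxy": {
      "model": {
        "options": {
          "temperature": 0.7,
          "stream": false
        }
      },
      "override": {
        "endpoint": "https://llm-gateway.internal.example/v1/chat/completions"
      },
      "passthrough": false,
      "timeout": 3000,
      "keepalive": true
    }
  }
}
```

Note that `override` is a sibling of `model`, `timeout`, and `keepalive`, which is exactly what this docs change corrects: the table previously implied the path was `model.override.endpoint`.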
