feat(config): add base_url for open_ai in config #58

Open
S1M0N38 opened this issue Jan 27, 2024 · 3 comments
Comments

S1M0N38 commented Jan 27, 2024

The ability to set a base_url would allow this plugin to be used with any OpenAI-compatible API.

For instance, if we set base_url to the address of LiteLLM proxy, it would become possible to use this plugin with a multitude of different LLMs (including open-source local models with Ollama).

Naturally, to prevent breaking changes, the default value for such a setting should be "https://api.openai.com/v1".

In my opinion, this setting should be incorporated into the open_ai settings. This is because, similar to api_key, it's a parameter found in OpenAI’s official Python API.
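To illustrate the idea, here is a minimal sketch of how such a setting could be resolved. The config keys (`open_ai`, `base_url`) follow the proposal above; the `resolve_endpoint` helper and the proxy address are hypothetical, not taken from the plugin's code.

```python
# Hypothetical sketch: resolving a configurable base_url with the
# OpenAI default, so existing configs keep working unchanged.

DEFAULT_BASE_URL = "https://api.openai.com/v1"

def resolve_endpoint(config: dict, path: str) -> str:
    """Join the configured base_url (or the OpenAI default) with an API path."""
    base = config.get("open_ai", {}).get("base_url", DEFAULT_BASE_URL)
    return base.rstrip("/") + "/" + path.lstrip("/")

# No base_url set: behavior is unchanged.
resolve_endpoint({}, "chat/completions")
# -> "https://api.openai.com/v1/chat/completions"

# base_url pointed at a LiteLLM proxy (address is an example):
resolve_endpoint({"open_ai": {"base_url": "http://localhost:8000"}},
                 "chat/completions")
# -> "http://localhost:8000/chat/completions"
```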

S1M0N38 commented Jan 27, 2024

Here is the PR: #59

@ishaan-jaff

Hi @S1M0N38, do you use LiteLLM Proxy? Can we hop on a call to learn how we can make LiteLLM better for you?

Link to my cal for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat?month=2024-02

@christianjuth

@S1M0N38 do you think a change like this would allow integration with something that runs locally like LM Studio? That application allows you to interact with the model via a port on localhost. Would love to try a totally offline and free AI coding experience.
https://lmstudio.ai/
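For a local setup like the one described, the same base_url setting would just point at the local server. The config fragment below is hypothetical; LM Studio's local server typically listens on port 1234, but the port and path here are assumptions to verify against the app.

```python
# Hypothetical config fragment: routing the plugin to a local
# OpenAI-compatible server instead of api.openai.com.
config = {
    "open_ai": {
        # Local servers generally ignore the key, but the field may be required.
        "api_key": "not-needed-locally",
        # Assumed LM Studio default; check the port shown in the app.
        "base_url": "http://localhost:1234/v1",
    }
}
```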
