
Request for Custom API Endpoint Support in MatGPT #30

Open
Mingzefei opened this issue Feb 26, 2024 · 8 comments
Labels
enhancement New feature or request wontfix This will not be worked on

Comments

@Mingzefei

Hi,

I'm exploring MatGPT's integration capabilities with LLMs and am interested in extending its utility to custom models, particularly those deployed locally.

In Python projects, customizing the base_url, as seen in openai-python issue #913, is a straightforward approach to support custom API endpoints.

Although I'm not familiar with the specific construction of MatGPT, could a similar method be applied here to enable the use of in-house or custom LLMs via user-defined API endpoints? Your insights or guidance on this possibility would be greatly appreciated.

@toshiakit
Owner

Hello, MatGPT runs on the LLMs with MATLAB library, and unless that library supports custom API endpoints, MatGPT cannot either. Please open an issue on that repo.

@toshiakit
Owner

Hi @Mingzefei , can you provide more details about your use case?

@jonasendc

Hi @toshiakit ,
maybe something like:

    import openai

    client = openai.AzureOpenAI(
        api_version="2024-03-01-preview",
        azure_endpoint="",
        api_key=api_key,
    )

where azure_endpoint is some URL other than OpenAI's.
So somewhere in this app the URL is hardcoded; it should be dynamic instead.
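The "dynamic endpoint" idea above can be sketched in a few lines of Python. This is only an illustration of the pattern, not code from MatGPT or LLMs with MATLAB; the function name `resolve_endpoint` and the `LLM_BASE_URL` environment variable are hypothetical names chosen for this example.

```python
import os
from typing import Optional

def resolve_endpoint(override: Optional[str] = None,
                     default: str = "https://api.openai.com/v1") -> str:
    """Pick the API base URL at runtime instead of hardcoding it:
    an explicit override wins, then an environment variable, then
    the stock OpenAI endpoint."""
    if override:
        return override
    return os.environ.get("LLM_BASE_URL", default)
```

The resolved URL would then be passed as the client's base URL, so e.g. `resolve_endpoint("http://localhost:11434/v1")` would point the app at a local server without touching the rest of the code.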

@toshiakit
Owner

@jonasendc

I passed your comment to the maintainer of LLMs with MATLAB.

matlab-deep-learning/llms-with-matlab#14

@Mingzefei
Author

> Hi @Mingzefei , can you provide more details about your use case?

Hi, sorry for the late reply. @jonasendc has provided a case for AzureOpenAI, and I'd like to add another case about locally deployed LLMs.

For example, projects like ollama allow for easy local deployment and use of many open-source LLMs. After a successful deployment, ollama by default serves an API on local port 11434. This API is largely compatible with the OpenAI API, as shown below:

from openai import OpenAI

client = OpenAI(
    base_url = 'http://localhost:11434/v1',
    api_key='ollama', # required, but unused
)

response = client.chat.completions.create(
  model="llama3",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
    {"role": "assistant", "content": "The LA Dodgers won in 2020."},
    {"role": "user", "content": "Where was it played?"}
  ]
)
print(response.choices[0].message.content)

As you can see, generally only the base_url needs to be modified.

Additionally, being able to call ollama or other locally deployed LLMs might have the following advantages:

  1. The ability to use LLMs fine-tuned for specific tasks, such as MATLAB coding, which might yield better performance and lower cost than ChatGPT.
  2. Providing a solution for countries or regions where ChatGPT is not available.

I hope this adds to the discussion and look forward to your thoughts.

@toshiakit
Owner

@Mingzefei and @jonasendc

As I said earlier, MatGPT depends on the library called "LLMs with MATLAB", and the endpoint is defined in that library. There is nothing I can do unless the maintainer of LLMs with MATLAB makes the requested change.

Even if that library adds support for custom endpoints, MatGPT may not expose it, because doing so goes beyond the intended scope of the project.

The endpoint is hard coded in callOpenAIChatAPI.m

MatGPT
├── helpers
│   └── llms-with-matlab
│       ├── openAIChat.m
│       └── +llms
│           └── +internal
│               └── callOpenAIChatAPI.m
I suggest that you fork "LLMs with MATLAB" and customize the code there. I don't think you need MatGPT for that.

@toshiakit
Owner

@jonasendc You expressed frustration in the other thread about the pace of progress. I am just a contributor to that thread, but my concern about how the custom API endpoint support was implemented is that it is so open-ended and therefore untestable. I would be much more comfortable if it were implemented in a testable way. Hence, I am advocating a more use-case-specific approach.

@toshiakit
Owner

Azure and ollama support was added to "LLMs with MATLAB"; those who need this capability should please use LLMs with MATLAB directly. Supporting those platforms goes beyond the intended scope of MatGPT.

@toshiakit toshiakit added the wontfix This will not be worked on label Jun 26, 2024