
is it possible to access the underlying model via APIs, for example using the ollama APIs? #60

Open
msartiano opened this issue Sep 22, 2024 · 2 comments

Comments

@msartiano

Question: is it possible to access the underlying model via APIs, for example using the ollama APIs?

Thanks!

@hassan4702

hassan4702 commented Oct 14, 2024

Yes, it is possible to access the underlying models through APIs. Ollama exposes a local REST API that lets developers integrate and interact with large language models (LLMs) from their applications.
To use the Ollama API, you typically need to:

  • Run the Ollama server, which listens on http://localhost:11434 by default (no API key is required for the local API).
  • Use the documented endpoints, such as /api/generate and /api/chat, to interact with a model.
  • Send requests as JSON (the model name plus input text) and process the returned output (completions, chat replies, etc.).

More info here: https://github.com/ollama/ollama/blob/main/docs/api.md
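As a concrete illustration, here is a minimal sketch of calling Ollama's /api/generate endpoint from Python using only the standard library. It assumes an Ollama server is running on the default port 11434; the model name "llama3" is an assumption and must already be pulled locally.

```python
# Minimal sketch of calling Ollama's local REST API.
# Assumes a server on the default http://localhost:11434; the model
# name used by the caller is an assumption.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for /api/generate (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST the prompt and return the model's full response text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be something like `generate("llama3", "Why is the sky blue?")`; with `"stream": True` (the API's default) the server instead returns one JSON object per line as the tokens arrive.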

@tborchmann

If the author exposed port 11434 (Ollama's default API port), this should be possible.
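To verify whether that port is actually reachable from your machine, a quick TCP probe suffices. This is a generic sketch; the host and port below reflect Ollama's default and would change if the project maps the port differently.

```python
# Quick check whether a server (e.g. Ollama) is reachable on a TCP port.
# Host/port defaults reflect Ollama's standard setup and are assumptions.
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For example, `port_open("localhost", 11434)` returns True only when the Ollama server is up and the port is exposed.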
