
api_base not working #57

Open
smjure opened this issue May 27, 2024 · 3 comments

Comments


smjure commented May 27, 2024

Hey there! Thanks for the awesome app!!

I have a problem defining api_base. My llama3 model (served by Ollama) works well on a different port, e.g. 12345 rather than the default 11434, so I modified ~/.config/elia/config.toml as follows:

```toml
default_model = "gpt-4o"
system_prompt = "You are a helpful assistant who talks like a pirate."
message_code_theme = "dracula"

[[models]]
name = "ollama/llama3"
api_base = "http://localhost:12345" # I need this working because I will later point it at a remote server
```

That is, I simply added api_base = "http://localhost:12345". With that line present, elia does not work when the model is selected (ctrl+o); if I remove it, my llama3 model works fine. Could this be fixed so a custom api_base works? FYI, I installed the latest elia version from a fresh clone.
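Before blaming the config, it is worth confirming that Ollama is actually answering on the custom port. The helper below is a hypothetical sketch (not part of elia) that probes Ollama's native `/api/tags` endpoint, which lists the locally pulled models, using only the standard library:

```python
import json
import urllib.error
import urllib.request


def ollama_reachable(api_base: str, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at api_base, e.g. http://localhost:12345.

    Probes the native /api/tags endpoint, which lists locally pulled models.
    """
    url = api_base.rstrip("/") + "/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            json.load(resp)  # a healthy server returns a JSON model list
            return True
    except (urllib.error.URLError, ValueError, OSError):
        # Connection refused, timeout, or a non-JSON reply all mean "not usable"
        return False
```

If this returns False for your custom port, the problem is the server binding (e.g. the OLLAMA_HOST setting), not elia's config.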

darrenburns (Owner) commented May 27, 2024

When you say it doesn't work, what happens?


smjure commented May 27, 2024

Sorry. Here it goes:
[screenshot of the error attached]


tanc commented Oct 9, 2024

To get this working with llama3.1 in Ollama and a custom api_base, you can use the openai provider, since the Ollama server exposes an OpenAI-compatible API:

```toml
default_model = "openai/llama3.1"

[[models]]
name = "openai/llama3.1"
api_base = "http://192.168.1.145:11434/v1"
api_key = "test_or_anything_should_be_fine"
```

Change api_base to wherever your Ollama server is, and make sure it ends in /v1. The api_key can't be empty, but it can be any value.
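The `/v1` suffix is easy to forget. As an illustrative sketch (a hypothetical helper, not part of elia or Ollama), normalizing the base URL before writing it into the config avoids that mistake:

```python
def openai_base(api_base: str) -> str:
    """Normalize an Ollama api_base so it targets the OpenAI-compatible /v1 prefix.

    Trailing slashes are stripped first, so both "http://host:11434" and
    "http://host:11434/" normalize to "http://host:11434/v1".
    """
    base = api_base.rstrip("/")
    return base if base.endswith("/v1") else base + "/v1"
```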

Using the documented ollama method didn't work for me with a custom api_base.
