I have a problem with defining `api_base`: my llama3 model (served by Ollama) works fine, but it runs on a different port, e.g. 12345 instead of the default 11434. To account for this, I modified ~/.config/elia/config.toml as follows:
default_model = "gpt-4o"
system_prompt = "You are a helpful assistant who talks like a pirate."
message_code_theme = "dracula"
[[models]]
name = "ollama/llama3"
api_base = "http://localhost:12345" # need this to work; will later point it at a remote server
That is, I simply added the line `api_base = "http://localhost:12345"`, and with it elia stops working when the model is selected (Ctrl+O). If I remove this line, it works fine with my defined llama3 model. Could this be fixed so that a custom `api_base` works? FYI, I installed/cloned the latest elia version.
Hey there! Thanks for the awesome app!!