Would like to connect with ollama running on a different server #235
-
I stumbled upon this after installing Ollama on my home server. I'm just curious how to configure the extension so that the extension in VSC on my laptop can connect to the Ollama instance on my server. From the starter guide I understood that both are meant to run on the same machine. Am I right, or am I missing something? I think my goal and setup should be pretty common for people who really want to use local AI, so I would assume there are more people out there with this use case.
Replies: 2 comments 1 reply
-
It's possible for sure. You'd have to expose the Ollama API on your server and configure twinny to point at it. It does not have to be on the same machine.
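For reference, a minimal sketch of what "exposing the Ollama API" could look like. This assumes the default Ollama port 11434; `home-server` is a placeholder for your server's hostname or IP:

```shell
# On the home server: bind Ollama to all interfaces instead of localhost only
OLLAMA_HOST=0.0.0.0 ollama serve

# From the laptop: verify the API is reachable over the network
curl http://home-server:11434/api/tags
```

If you run Ollama as a systemd service, you'd set `OLLAMA_HOST` via an environment override for the service instead. Be careful exposing the port beyond your LAN, since the API has no authentication by default.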
-
In the provider manager. See here: #216 (comment)