Replies: 4 comments 5 replies
-
Hey @borthwick, thanks for reporting! This is entirely on me; I wasn't clear about local use. Currently you still need an internet connection for QA mode, because QA needs an embedding model to work, and there's no good local embedding provider solution at the moment. One solution we used to have in Copilot was LocalAI, but it's quite involved to set up for the average user. I'm actively looking into a local embedding solution right now and will publish a release as soon as I find one. In the meantime, you can use Chat mode with the small "upload" button to send your notes directly to the prompt. As long as you run a local model with a long context window, it's completely offline, and it actually works better than QA mode for more in-depth questions. Let me know if this clears up the confusion!
-
I still get the same error. The notification text should be clearer; "Fetch Error" means nothing to the user.
-
I'm having the same issue.
-
I am having the same issue. My internet is on; I am running Ollama on a WSL2 Ubuntu host and connecting from Obsidian on Windows. I can see the chat request in the Ollama server log, but the response code is 403 where 200 is expected, so I'm not sure what the problem is. The installation instructions for local use include the comment "Remember that you MUST set this parameter for Ollama models, or they will silently fail and you will think your long prompt successfully reached the model!" but I'm not sure whether that Ollama setting is related to this error. If it is, can you guide me through setting the parameter from the Unix prompt? Sorry for the newbie question.
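A hedged sketch of two things worth trying here. A 403 from Ollama when the client runs outside WSL2 is often the server rejecting the request's origin, not the context-window parameter the docs quote; `OLLAMA_ORIGINS`, `OLLAMA_HOST`, and `num_ctx` are real Ollama settings, but the exact origin string Obsidian sends, the model name `mistral`, and the `8192` value are assumptions for illustration.

```shell
# Possible fix for the 403: allow the plugin's origin and listen on all
# interfaces so Windows can reach the WSL2 host, then restart the server.
export OLLAMA_ORIGINS="*"            # "*" for testing; a specific origin is stricter
export OLLAMA_HOST="0.0.0.0:11434"
ollama serve

# The context-window parameter from the quoted docs can be set per-model
# from the Unix prompt via a Modelfile (Ollama's parameter name is num_ctx):
cat > Modelfile <<'EOF'
FROM mistral
PARAMETER num_ctx 8192
EOF
ollama create mistral-longctx -f Modelfile
```

After `ollama create`, point the plugin at the new `mistral-longctx` model so requests use the larger context window.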
-
I installed and configured the Copilot plugin on a Mac M2, using LM Studio and "mistral instruct v0.1 7B Q4_K_M gguf".
When I configure the plugin for QA as described in the documentation, I get the error "LangChain error: TypeError: Failed to fetch".
Thanks in advance.