# Instructions on how to get LLocalSearch working with your Ollama instance

## You're running Ollama on your host machine (without Docker)

### You're using Linux or macOS

1. Make sure Ollama is listening on all interfaces (`0.0.0.0`, or at least on the Docker network).
2. Add the following line to the `.env` file (create one if it doesn't exist) in the root of the project:

   ```
   OLLAMA_HOST=host.docker.internal:11434
   ```
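Step 1 above can be done like this on Linux; this is only a sketch, assuming the default `ollama.service` unit created by the official install script (a manual `ollama serve` works too):

```shell
# For a systemd-based install: add an environment override to the unit.
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Alternatively, when running Ollama manually in a terminal:
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

On macOS the Ollama app reads the same `OLLAMA_HOST` variable; setting it via `launchctl setenv OLLAMA_HOST "0.0.0.0"` and restarting the app should have the same effect.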

> [!WARNING]
> Some Linux users reported that this solution requires Docker Desktop to be installed. Please report back if that's the case for you. I don't have this issue on NixOS or my Ubuntu 22.04 test box.

### You're using Windows

Try the steps above and let me know whether they work; I will update these docs accordingly.

## You're running Ollama in a Docker container on the same machine as LLocalSearch

1. Make sure you're exposing Ollama on port 11434.
2. Add the following line to the `.env` file (create one if it doesn't exist) in the root of the project:

   ```
   OLLAMA_HOST=host.docker.internal:11434
   ```
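For step 1, a minimal sketch of running the official `ollama/ollama` image with the port published on the host (the container name and volume name here are just illustrative):

```shell
# Publish Ollama's API port 11434 on the host so that other containers,
# including the LLocalSearch stack, can reach it via host.docker.internal:
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama
```

If both stacks share a user-defined Docker network instead, you could also point `OLLAMA_HOST` at the container name directly rather than going through the host.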

## You're running Ollama on a server or a different machine

1. Make sure Ollama is reachable from the container.
2. Add the following line to the `.env` file (create one if it doesn't exist) in the root of the project, replacing `ollama-server-ip` with your server's actual address:

   ```
   OLLAMA_HOST=ollama-server-ip:11434
   ```
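To check reachability (step 1), a quick sketch using `curl`; `ollama-server-ip` is a placeholder for your server's address, and the Ollama root endpoint normally answers with a short status string:

```shell
# Run this from inside one of the LLocalSearch containers (or any host on
# the same network). A healthy server replies with "Ollama is running".
curl http://ollama-server-ip:11434/
```

If this times out, check the server's firewall and confirm Ollama is bound to an interface the container can reach, not just to `127.0.0.1`.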