A simple chat app using Flask and Ollama (local LLM).
- Clone the repo and navigate to the folder:

  ```sh
  git clone <repo-url>
  cd localchat
  ```
- Create and activate a virtual environment:

  ```sh
  python -m venv venv
  # On Windows: venv\Scripts\activate
  # On macOS/Linux: source venv/bin/activate
  ```
- Install dependencies:

  ```sh
  pip install -r requirements.txt
  ```
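  If you are rebuilding the environment by hand, a minimal sketch of what `requirements.txt` amounts to is shown below; treat the repo's actual file as authoritative, since it may pin versions or include more packages.

  ```text
  flask     # web framework serving the chat backend (assumed)
  requests  # HTTP client for calling Ollama's local API (assumed)
  ```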
- Install Ollama and the phi4 model:
  - Install Ollama.
  - Run:

    ```sh
    ollama pull phi4:latest
    ```
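  To confirm the model is available locally, list the installed models:

  ```sh
  ollama list   # phi4:latest should appear in the output
  ```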
- Start the Flask app:

  ```sh
  python app.py
  ```
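  Before chatting through the page, you can verify the model end to end by calling Ollama's local REST API directly (Ollama serves on port 11434 by default; this request bypasses the Flask app):

  ```sh
  curl http://localhost:11434/api/generate \
    -d '{"model": "phi4:latest", "prompt": "Say hello", "stream": false}'
  ```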
- Open `index.html` in your browser.
- Type your message and click "Send" to chat with the local AI model.
- Ollama must be running and the `phi4:latest` model must be available.
- The backend listens on `localhost:5000`.
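For orientation, here is a minimal sketch of the kind of backend `app.py` implements: accept a chat message, forward it to Ollama's local REST API, and return the reply. The `/chat` route name and JSON field names are assumptions for illustration; consult `app.py` and `index.html` for the actual contract.

```python
# Illustrative sketch only - the repo's app.py is authoritative.
# Assumes Flask and requests are installed and Ollama is serving on localhost:11434.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's local REST API
MODEL = "phi4:latest"

@app.route("/chat", methods=["POST"])  # hypothetical route name
def chat():
    # The frontend is assumed to POST JSON like {"message": "Hello"}.
    user_message = request.get_json(force=True).get("message", "")
    # Non-streaming call: Ollama returns a single JSON object whose
    # "response" field holds the generated text.
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": user_message, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return jsonify({"reply": resp.json()["response"]})

if __name__ == "__main__":
    # Matches the README: the backend listens on localhost:5000.
    app.run(host="localhost", port=5000)
```

A real `app.py` may also enable CORS (for example via the flask-cors package) so that a page opened directly from the filesystem can reach the backend.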