LocalChat

A simple chat app using Flask and Ollama (local LLM).

Setup

  1. Clone the repo and navigate to the folder:

    git clone <repo-url>
    cd localchat
  2. Create and activate a virtual environment:

    python -m venv venv
    # On Windows:
    venv\Scripts\activate
    # On macOS/Linux:
    source venv/bin/activate
  3. Install dependencies:

    pip install -r requirements.txt
  4. Install Ollama (from https://ollama.com) and pull the phi4 model (see the quick check after this list):

    ollama pull phi4
  5. Start the Flask app:

    python app.py
  6. Open index.html in your browser.
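
To confirm that Ollama is reachable and the model was pulled, a quick check along these lines should work. Ollama's REST API listens on localhost:11434 by default, and /api/tags lists the locally installed models:

    import requests

    # Ollama's default local API; /api/tags lists the models that have been pulled.
    tags = requests.get("http://localhost:11434/api/tags").json()
    names = [m["name"] for m in tags.get("models", [])]
    print("phi4 pulled:", any(n.startswith("phi4") for n in names))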

Usage

  • Type your message and click "Send" to chat with the local AI model.
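
You can also exercise the backend directly from a script. The snippet below assumes a /chat route that accepts a {"message": ...} JSON payload; that route name and payload shape are assumptions for illustration, not a documented contract of app.py:

    import requests

    # Hypothetical smoke test: the /chat route and {"message": ...} payload
    # are assumptions about app.py, not a documented contract.
    resp = requests.post("http://localhost:5000/chat",
                         json={"message": "Hello from a script"})
    print(resp.json())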

Notes

  • Ollama must be running and the phi4:latest model must be available.
  • The backend listens on localhost:5000.
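
For orientation, a backend in this style typically just forwards the user's message to Ollama's local generate API and returns the reply. The sketch below is an illustration, not the actual app.py: the /chat route and response shape are assumptions, while http://localhost:11434/api/generate is Ollama's standard endpoint:

    from flask import Flask, request, jsonify
    import requests

    app = Flask(__name__)

    # Ollama's standard local generate endpoint.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    # The /chat route and response shape are illustrative assumptions,
    # not taken from the actual app.py.
    @app.route("/chat", methods=["POST"])
    def chat():
        user_message = request.json.get("message", "")
        r = requests.post(OLLAMA_URL, json={
            "model": "phi4:latest",
            "prompt": user_message,
            "stream": False,  # one complete reply instead of a token stream
        })
        return jsonify({"reply": r.json().get("response", "")})

    if __name__ == "__main__":
        app.run(host="localhost", port=5000)

One caveat: a page opened directly from disk (file://) cannot read responses from another origin unless the server sends CORS headers, so the real app.py presumably handles that; the sketch leaves it out.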
