A ChatGPT-like web interface powered by Ollama for running large language models locally. This application provides a familiar chat interface while leveraging the power of locally hosted AI models through Ollama.
Demo video: `Cointab.webm`
- Node.js/Express.js - Server-side runtime and web framework
- Ollama - Local LLM inference engine
- Database - SQLite/PostgreSQL for chat history and user sessions
- WebSocket/Socket.io - Real-time communication for streaming responses
- Next.js - Frontend framework
- Tailwind CSS - Utility-first CSS framework
- Axios - HTTP client for API requests
- React Router - Client-side routing
- Vite - Build tool and development server
- ESLint & Prettier - Code linting and formatting
- Docker - Containerization (optional)
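Ollama streams completions as newline-delimited JSON (NDJSON), and the backend relays each chunk to the browser over the WebSocket. A minimal sketch of the chunk-parsing step (the function name is illustrative, not the repo's actual code):

```javascript
// Parse buffered NDJSON output from Ollama's streaming endpoints.
// Each line is a JSON object like {"response": "token", "done": false};
// the final line carries {"done": true}. Returns the concatenated text
// and whether the stream has finished.
function parseOllamaChunks(ndjson) {
  const tokens = [];
  let done = false;
  for (const line of ndjson.split('\n')) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const obj = JSON.parse(line);
    if (obj.response) tokens.push(obj.response);
    if (obj.done) done = true;
  }
  return { text: tokens.join(''), done };
}
```

On the backend, each parsed token can be emitted to the client with `socket.emit(...)` so the UI renders the reply incrementally instead of waiting for the full completion.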
Before setting up the project, ensure you have the following installed:
- Node.js (v18.0.0 or higher)
- npm or yarn package manager
- Git for version control
- Ollama (see setup instructions below)
```bash
# For macOS
brew install ollama

# For Linux
curl -fsSL https://ollama.com/install.sh | sh

# For Windows
# Download the installer from https://ollama.com/download/windows
```
```bash
# Start the Ollama server
ollama serve

# Pull a lightweight model (recommended for testing)
ollama pull gemma3:1b

# Or pull a more capable model (requires more resources)
ollama pull llama3.2:3b
ollama pull mistral:7b

# Verify the installed models
ollama list
```
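Models can also be checked programmatically: Ollama exposes a REST API on port 11434, and `GET /api/tags` returns the installed models. A small sketch of handling that response (the response shape shown matches Ollama's documented `/api/tags` output):

```javascript
// Extract model names from Ollama's GET /api/tags response, which
// looks like: { "models": [ { "name": "llama3.2:3b", ... }, ... ] }
function listModelNames(tagsResponse) {
  return (tagsResponse.models || []).map((m) => m.name);
}

// Usage against a running Ollama server (Node 18+ has global fetch):
// const res = await fetch('http://localhost:11434/api/tags');
// console.log(listModelNames(await res.json()));
```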
```bash
# Install PostgreSQL
# macOS
brew install postgresql
brew services start postgresql

# Ubuntu/Debian
sudo apt update
sudo apt install postgresql postgresql-contrib

# Create the database
createdb chatgpt_ollama
```
```bash
# Clone the repository
git clone https://github.com/Cvr421/ChatGPT-ollama.git
cd ChatGPT-ollama

# Install backend dependencies
cd backend
npm install

# Install frontend dependencies
cd ../frontend
npm install
```
Create `.env` files in both the backend and frontend directories.

Backend `.env`:
```env
PORT=3001
OLLAMA_URL=http://localhost:11434

# Database (SQLite by default)
DB_TYPE=sqlite
DB_PATH=./database.sqlite

# For PostgreSQL (uncomment if using)
# DB_TYPE=postgres
# DB_HOST=localhost
# DB_PORT=5432
# DB_NAME=chatgpt_ollama
# DB_USER=your_username
# DB_PASSWORD=your_password

NODE_ENV=development
CORS_ORIGIN=http://localhost:3000
```
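The `DB_TYPE` switch implies the backend selects its database driver from the environment at startup. A hedged sketch of how such a selection might look (illustrative only, not the repo's actual code):

```javascript
// Resolve database settings from environment variables, defaulting to
// SQLite so the app works with zero database setup. Keys mirror the
// backend .env file above.
function resolveDbConfig(env) {
  if (env.DB_TYPE === 'postgres') {
    return {
      type: 'postgres',
      host: env.DB_HOST || 'localhost',
      port: Number(env.DB_PORT || 5432),
      database: env.DB_NAME,
      user: env.DB_USER,
      password: env.DB_PASSWORD,
    };
  }
  // Default: file-backed SQLite, no server required.
  return { type: 'sqlite', path: env.DB_PATH || './database.sqlite' };
}

// Usage: const db = resolveDbConfig(process.env);
```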
Frontend `.env`:
```env
VITE_API_URL=http://localhost:3001
VITE_WS_URL=ws://localhost:3001
```
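In the frontend, Vite exposes these variables on `import.meta.env`; only keys prefixed with `VITE_` reach client code. A small helper for building request URLs from the base URL (the helper name is illustrative):

```javascript
// Join an API base URL (e.g. import.meta.env.VITE_API_URL) with a
// request path, normalizing slashes so neither a trailing slash on the
// base nor a leading slash on the path produces a double "//".
function apiUrl(base, path) {
  return `${base.replace(/\/+$/, '')}/${path.replace(/^\/+/, '')}`;
}

// In a component: fetch(apiUrl(import.meta.env.VITE_API_URL, '/api/chat'), ...)
```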
```bash
# Start Ollama (if not already running)
ollama serve

# Start the backend
cd backend
npm run dev
```

The backend will start on http://localhost:3001
```bash
cd frontend
npm run dev
```

The frontend will start on http://localhost:3000
Open your browser and navigate to http://localhost:3000
```bash
# Build the frontend for production
cd frontend
npm run build

# Start the backend in production mode
cd ../backend
npm start
```
```bash
docker-compose up --build
```

This will start all services, including the application and the database.
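If you need to write the compose file yourself, a minimal sketch might look like the following. Service names, build contexts, and the presence of Dockerfiles in `backend/` and `frontend/` are assumptions; adjust to the actual repository layout:

```yaml
# docker-compose.yml (illustrative sketch, not the repo's actual file)
services:
  backend:
    build: ./backend            # assumes a Dockerfile in backend/
    ports:
      - "3001:3001"
    environment:
      - OLLAMA_URL=http://ollama:11434   # reach Ollama by service name
      - DB_TYPE=postgres
      - DB_HOST=db
    depends_on: [db, ollama]
  frontend:
    build: ./frontend           # assumes a Dockerfile in frontend/
    ports:
      - "3000:3000"
  db:
    image: postgres:16
    environment:
      - POSTGRES_DB=chatgpt_ollama
      - POSTGRES_PASSWORD=your_password
  ollama:
    image: ollama/ollama        # official Ollama image
    ports:
      - "11434:11434"
```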
- Select Model: Choose from available Ollama models in the interface
- Start Chatting: Type your message and press Enter or click Send
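Under the hood, each turn is forwarded to Ollama's `POST /api/chat` endpoint, whose request body is a model name plus the running message history. A sketch of building that payload (the function name is illustrative):

```javascript
// Build a request body for Ollama's POST /api/chat endpoint:
// { model, messages: [{ role, content }, ...], stream }
function buildChatRequest(model, history, userMessage) {
  return {
    model,
    messages: [...history, { role: 'user', content: userMessage }],
    stream: true, // stream tokens back as NDJSON chunks
  };
}
```

Keeping the full `history` in the payload is what gives the model conversational context; trimming old turns bounds the prompt size for small models.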