cgoinglove/better-chatbot

MCP Supported · Local First

Deploy with Vercel

Our goal is to create the best possible chatbot UX, focusing on the joy and intuitiveness users feel when calling and interacting with AI tools.

See the experience in action in the preview below!

Built with Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces. It leverages the power of the Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience. You can also create custom workflows that become callable tools in chat, allowing you to chain multiple MCP tools, LLM interactions, and logic into powerful automated sequences.

Quick Start 🚀

# 1. Clone the repository
git clone https://github.com/cgoinglove/better-chatbot.git
cd better-chatbot

# 2. (Optional) Install pnpm if you don't have it
npm install -g pnpm

# 3. Install dependencies
pnpm i

# 4. Create the environment variable file and fill in your .env values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 5. (Optional) If you already have PostgreSQL running and .env is configured, skip this step
pnpm docker:pg

# 6. Run database migrations
pnpm db:migrate

# 7. Start the development server
pnpm dev

# 8. (Optional) Build & start for local production-like testing
pnpm build:local && pnpm start
# Use build:local for local start to ensure correct cookie settings

⚠️ Important: When updating to a new version of the project (after git pull), always run pnpm db:migrate to ensure your database schema is up to date.
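
For example, a typical update flow using only the commands above might look like this:

# Update an existing checkout to a new version
git pull
pnpm i           # install any new or updated dependencies
pnpm db:migrate  # bring the database schema up to date
pnpm dev         # or: pnpm build:local && pnpm start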

Open http://localhost:3000 in your browser to get started.

This project is evolving at lightning speed! ⚡️ We're constantly shipping new features and smashing bugs. Star this repo to join the ride and stay in the loop with the latest updates!

Preview

Get a feel for the UX; here's a quick look at what's possible.

🧩 Browser Automation with Playwright MCP

Example: Control a web browser using Microsoft's playwright-mcp tool.

  • The LLM autonomously decides how to use tools from the MCP server, calling them multiple times to complete a multi-step task and return a final message.

Sample prompt:

1. Use the @tool('web-search') to look up information about "modelcontextprotocol."

2. Then, using @mcp("playwright"):
   - Navigate to Google (https://www.google.com)
   - Click the "Login" button
   - Enter my email address ([email protected])
   - Click the "Next" button
   - Close the browser

🔗 Visual Workflows as Custom Tools

Example: Create custom workflows that become callable tools in your chat conversations.

  • Build visual workflows by connecting LLM nodes (for AI reasoning) and Tool nodes (for MCP tool execution)
  • Publish workflows to make them available as @workflow_name tools in chat
  • Chain complex multi-step processes into reusable, automated sequences
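
For example, once a workflow is published you can call it like any other tool in chat. The workflow name below (@weekly_report) is just a hypothetical placeholder.

Sample prompt:

Run @weekly_report and summarize the results in a short table.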

πŸŽ™οΈ Realtime Voice Assistant + MCP Tools

This demo showcases a realtime voice-based chatbot assistant built with OpenAI's Realtime API, now extended with full MCP tool integration. Talk to the assistant naturally, and watch it execute tools in real time.

⚡️ Quick Tool Mentions (@) & Presets

Quickly call tools during chat by typing @toolname. No need to memorize; just type @ and pick from the list!

Tool Selection vs. Mentions (@): When to Use What

  • Tool Selection: Make frequently used tools always available to the LLM across all chats. Great for convenience and maintaining consistent context over time.
  • Mentions (@): Temporarily bind only the mentioned tools for that specific response. Since only the mentioned tools are sent to the LLM, this saves tokens and can improve speed and accuracy.

Each method has its own strengths; use them together to balance efficiency and performance.

You can also create tool presets by selecting only the MCP servers or tools you need. Switch between presets instantly with a click β€” perfect for organizing tools by task or workflow.

🧭 Tool Choice Mode

Control how tools are used in each chat with Tool Choice Mode; switch anytime with ⌘P.

  • Auto: The model automatically calls tools when needed.
  • Manual: The model will ask for your permission before calling a tool.
  • None: Tool usage is disabled completely.

This lets you flexibly choose between autonomous, guided, or tool-free interaction depending on the situation.

πŸ› οΈ Default Tools

🌐 Web Search

Built-in web search powered by Tavily API. Search the web and extract content from URLs directly in your chats.

  • Optional: Add TAVILY_API_KEY to .env to enable web search
  • Free Tier: 1,000 requests/month at no cost
  • Easy Setup: Get your API key with one click at app.tavily.com
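
A minimal setup sketch (the key value below is a placeholder; use the real key from app.tavily.com):

# Add your Tavily key to .env (placeholder value shown), then restart the dev server
echo 'TAVILY_API_KEY=tvly-xxxxxxxx' >> .env
pnpm dev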

⚡️ JS Executor

A simple tool for executing JavaScript code directly in chat.

Additionally, many basic tools are provided, such as visualization tools for drawing charts and tables, and an HTTP tool.


…and there's even more waiting for you. Try it out and see what else it can do!


Getting Started

This project uses pnpm as the recommended package manager.

# If you don't have pnpm:
npm install -g pnpm

Quick Start (Docker Compose Version) 🐳

# 1. Install dependencies
pnpm i

# 2. Enter only the LLM PROVIDER API key(s) you want to use in the .env file at the project root.
# Example: The app works with just OPENAI_API_KEY filled in.
# (The .env file is automatically created when you run pnpm i.)

# 3. Build and start all services (including PostgreSQL) with Docker Compose
pnpm docker-compose:up

Quick Start (Local Version) 🚀

# 1. Install dependencies
pnpm i

# 2. Create the environment variable file and fill in your .env values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 3. (Optional) If you already have PostgreSQL running and .env is configured, skip this step
pnpm docker:pg

# 4. Run database migrations
pnpm db:migrate

# 5. Start the development server
pnpm dev

# 6. (Optional) Build & start for local production-like testing
pnpm build:local && pnpm start
# Use build:local for local start to ensure correct cookie settings

Open http://localhost:3000 in your browser to get started.


Environment Variables

The pnpm i command generates a .env file. Add your API keys there.

# === LLM Provider API Keys ===
# You only need to enter the keys for the providers you plan to use
GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
XAI_API_KEY=****
ANTHROPIC_API_KEY=****
OPENROUTER_API_KEY=****
OLLAMA_BASE_URL=http://localhost:11434/api


# Secret for Better Auth (generate with: npx @better-auth/cli@latest secret)
BETTER_AUTH_SECRET=****

# (Optional)
# URL for Better Auth (the URL you access the app from)
BETTER_AUTH_URL=

# === Database ===
# If you don't have PostgreSQL running locally, start it with: pnpm docker:pg
POSTGRES_URL=postgres://your_username:your_password@localhost:5432/your_database_name

# (Optional)
# === Tools ===
# Tavily for web search and content extraction (optional, but recommended for @web and research features)
TAVILY_API_KEY=your_tavily_api_key_here


# Whether to use file-based MCP config (default: false)
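# (a sketch of a file-based config file is shown after this block)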
FILE_BASED_MCP_CONFIG=false

# (Optional)
# === OAuth Settings ===
# Fill in these values only if you want to enable Google/GitHub login
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=

# Set this to 1 to disable user sign-ups.
DISABLE_SIGN_UP=

# Set this to 1 to disallow adding MCP servers.
NOT_ALLOW_ADD_MCP_SERVERS=
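
If you set FILE_BASED_MCP_CONFIG=true above, MCP server definitions are loaded from a config file instead of being managed inside the app. Below is a minimal sketch only, assuming a .mcp-config.json file at the project root and the common MCP stdio format; the filename and exact schema are assumptions, so check the MCP server guide for your version.

# Hypothetical example: the filename (.mcp-config.json) and schema are assumptions
cat > .mcp-config.json <<'EOF'
{
  "playwright": {
    "command": "npx",
    "args": ["@playwright/mcp@latest"]
  }
}
EOF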

📘 Guides

Step-by-step setup guides for running and configuring better-chatbot.

  • How to add and configure MCP servers in your environment
  • How to self-host the chatbot using Docker, including environment configuration
  • Deploy the chatbot to Vercel with simple setup steps for production use
  • Personalize your chatbot experience with custom system prompts, user preferences, and MCP tool instructions
  • Configure Google and GitHub OAuth for secure user login support
  • Add OpenAI-compatible AI providers

💡 Tips

Advanced use cases and extra capabilities that enhance your chatbot experience.

  • Use MCP servers and structured project instructions to build a custom assistant that helps with specific tasks.
  • Open lightweight popup chats for quick side questions or testing, separate from your main thread.

πŸ—ΊοΈ Roadmap

Planned features coming soon to better-chatbot:

  • File Attach & Image Generation
  • Collaborative Document Editing (like OpenAI Canvas: user & assistant co-editing)
  • RAG (Retrieval-Augmented Generation)
  • Web-based Compute (with WebContainers integration)

💡 If you have suggestions or need specific features, please create an issue!

🙌 Contributing

We welcome all contributions! Bug reports, feature ideas, code improvements: everything helps us build the best local AI assistant.

⚠️ Please read our Contributing Guide before submitting any Pull Requests or Issues. It contains the detailed contribution guidelines and helps us work together more effectively, saving time for everyone.

Language Translations: Help us make the chatbot accessible to more users by adding new language translations. See language.md for instructions on how to contribute translations.

Let's build it together 🚀

💬 Join Our Discord

Connect with the community, ask questions, and get support on our official Discord server!