XeracAI/Xerac

⚠️ WORK IN PROGRESS


An App Router-ready AI chatbot built on Next.js 15 and React 19.

An Open-Source AI Chatbot Template Built With Next.js and the AI SDK by Vercel.

Features · Model Providers · Deploy Your Own · Running locally · Extra Features · Future Roadmap


Features

  • Next.js App Router
    • Advanced routing for seamless navigation and performance
    • React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
  • AI SDK
    • Unified API for generating text, structured objects, and tool calls with LLMs (see the sketch after this list)
    • Hooks for building dynamic chat and generative user interfaces
    • Supports xAI (default), OpenAI, Fireworks, and other model providers
  • shadcn/ui
  • Data Persistence
  • NextAuth.js
    • Simple and secure authentication
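
To make the AI SDK bullets above concrete, here is a minimal sketch of a chat route handler built on the SDK's unified streamText API. It is illustrative only, not code from this repository: the route shape and message handling are assumptions, and the exact response helper varies between AI SDK versions.

  // Illustrative sketch, not this repo's actual route: stream a chat completion
  // through the AI SDK's unified API, assuming the @ai-sdk/xai provider package.
  import { streamText } from 'ai';
  import { xai } from '@ai-sdk/xai';

  export async function POST(req: Request) {
    const { messages } = await req.json();

    const result = streamText({
      model: xai('grok-2-1212'), // default model mentioned in this README
      messages,                  // chat history sent by the client-side chat hook
    });

    // Stream tokens back to the client as they are generated
    // (this helper's name differs across AI SDK major versions).
    return result.toDataStreamResponse();
  }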

Model Providers

This template ships with xAI grok-2-1212 as the default chat model. However, with the AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
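
As an illustration of that swap (package and model names below are examples, not this repository's configuration), changing providers mostly amounts to importing a different model factory:

  // Example only: call sites stay the same, only the model factory changes.
  import { xai } from '@ai-sdk/xai';
  import { openai } from '@ai-sdk/openai';

  const chatModel = xai('grok-2-1212');
  // const chatModel = openai('gpt-4o'); // switch providers by swapping this one line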

Deploy Your Own

You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:

Deploy with Vercel

Running locally

You will need to use the environment variables defined in .env.example to run Next.js AI Chatbot. It's recommended you use Vercel Environment Variables for this, but a .env file is all that is necessary.

Note: You should not commit your .env file or it will expose secrets that will allow others to control access to your various AI and authentication provider accounts.

  1. Install Vercel CLI: npm i -g vercel
  2. Link local instance with Vercel and GitHub accounts (creates .vercel directory): vercel link
  3. Download your environment variables: vercel env pull
  4. Install dependencies: pnpm install
  5. Start the development server: pnpm dev

Your app template should now be running on localhost:3000.

Extra Features

  • Right-to-left (RTL) layout support
  • Message editing and conversation branching
  • Support for different model providers
  • Dynamic model list fetched from the database instead of hard-coded options, with user customization
  • Support for different modalities
    • Image generation
      • DALL-E
      • Leonardo AI (Coming soon)
    • Voice mode (WIP)
  • Infinite scroll pagination for chat history
  • Authentication with phone number + OTP verification (see the sketch after this list)
  • User limit (WIP, currently static limit)
  • Telemetry and analytics
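
As a rough sketch of how the phone + OTP flow can plug into NextAuth.js (this is not the repository's implementation; verifyOtp and findOrCreateUserByPhone are hypothetical helpers):

  // Hedged sketch: phone + OTP sign-in through a NextAuth.js Credentials provider.
  // `verifyOtp` and `findOrCreateUserByPhone` are hypothetical placeholders.
  import CredentialsProvider from 'next-auth/providers/credentials';

  declare function verifyOtp(phone: string, code: string): Promise<boolean>;
  declare function findOrCreateUserByPhone(phone: string): Promise<{ id: string; name?: string }>;

  export const phoneOtpProvider = CredentialsProvider({
    id: 'phone-otp',
    name: 'Phone OTP',
    credentials: {
      phone: { label: 'Phone number', type: 'tel' },
      code: { label: 'One-time code', type: 'text' },
    },
    async authorize(credentials) {
      if (!credentials?.phone || !credentials?.code) return null;
      // Check the submitted code against the one sent via SMS, then load or create the user.
      const valid = await verifyOtp(credentials.phone, credentials.code);
      return valid ? findOrCreateUserByPhone(credentials.phone) : null;
    },
  });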

Future Roadmap

  • SMS OTP rate limit
  • Model, provider and global rate limits
  • PDF (and other file types) support (see the sketch after this list)
    • Models that support it out-of-the-box will receive the file
    • Models that don't support it will receive a converted text version using Markitdown
    • Image PDFs can be converted to text using OCR models
  • File manager (to avoid re-uploads)
  • Input area notifications
  • Internationalization (i18n)
  • Organizations, teams, and projects
  • Custom themes and color palettes
  • Pro mode: a toggle to add more features for power users
    • Token count
    • Parameter tuning (e.g. temperature)
    • Claude and Gemini cache control
    • Setting your own API key
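
A speculative sketch of the PDF routing described in the roadmap (not implemented here; supportsPdfInput, convertWithMarkitdown, and ocrPdf are hypothetical helpers):

  // Speculative sketch of roadmap behavior, not existing code.
  // All three helpers below are hypothetical placeholders.
  declare function supportsPdfInput(modelId: string): boolean;
  declare function convertWithMarkitdown(pdf: Uint8Array): Promise<string>;
  declare function ocrPdf(pdf: Uint8Array): Promise<string>;

  async function preparePdfForModel(modelId: string, pdf: Uint8Array, isScanned: boolean) {
    if (supportsPdfInput(modelId)) {
      // Models with native PDF support receive the file as-is.
      return { type: 'file', mimeType: 'application/pdf', data: pdf };
    }
    // Otherwise fall back to text: OCR for image-only PDFs, Markitdown for the rest.
    const text = isScanned ? await ocrPdf(pdf) : await convertWithMarkitdown(pdf);
    return { type: 'text', text };
  }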
