This project demonstrates how to build a companion app with Mem0 for memory management and OpenRouter for access to Large Language Models (LLMs).
- Interactive chat interface with a customizable personal AI companion
- Memory management for both the user and the agent via the Mem0 API
- LLM integration via the OpenRouter API (see the sketch after this list)
- Real-time memory display and refresh
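
As a rough illustration of how the two services fit together, the sketch below sends one chat turn to OpenRouter and then stores the exchange in Mem0. This is a minimal sketch, not the app's actual code: the helper name `askCompanion` is hypothetical, the model slug is only an example, and the endpoint paths reflect the public OpenRouter and Mem0 v1 REST APIs at the time of writing, so check the current docs before relying on them.

```js
// Hypothetical helper showing how a chat turn could flow through both APIs.
// Endpoint paths and the model slug are assumptions -- verify against the
// current OpenRouter and Mem0 documentation.
async function askCompanion(userMessage, { mem0Key, openRouterKey, userId }) {
  // 1. Ask the LLM for a reply via OpenRouter's chat completions endpoint.
  const completionRes = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${openRouterKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-4o-mini", // example model slug
      messages: [{ role: "user", content: userMessage }],
    }),
  });
  const completion = await completionRes.json();
  const reply = completion.choices[0].message.content;

  // 2. Store the exchange in Mem0 so future turns can recall it.
  await fetch("https://api.mem0.ai/v1/memories/", {
    method: "POST",
    headers: {
      Authorization: `Token ${mem0Key}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      messages: [
        { role: "user", content: userMessage },
        { role: "assistant", content: reply },
      ],
      user_id: userId,
    }),
  });

  return reply;
}
```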
- Clone the repository
- Install dependencies:

  ```bash
  npm install
  ```

- Set up your Mem0 and OpenRouter API keys in the settings panel (a persistence sketch follows these steps)
- Run the development server:

  ```bash
  npm run dev
  ```

- Open http://localhost:3000 in your browser to see the result
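
Entering keys in the settings panel implies some client-side persistence. The snippet below is a minimal sketch assuming the panel keeps its values in `localStorage`; the storage key, field names, and helper functions are hypothetical, not the project's actual implementation.

```js
// Hypothetical persistence helpers for the settings panel.
// The storage key and field names are placeholders.
const STORAGE_KEY = "companion-settings";

export function saveSettings({ mem0Key, openRouterKey, companionName }) {
  localStorage.setItem(
    STORAGE_KEY,
    JSON.stringify({ mem0Key, openRouterKey, companionName })
  );
}

export function loadSettings() {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw
    ? JSON.parse(raw)
    : { mem0Key: "", openRouterKey: "", companionName: "" };
}
```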
- `page.js`: Main chat interface and logic
- `settings-panel.js`: Settings management for API keys and companion customization
- `memories-panel.js`: Display and management of user and agent memories
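
To make the real-time memory display and refresh concrete, here is a hypothetical sketch of the kind of component `memories-panel.js` might contain: it lists a user's memories from Mem0 and re-fetches them on demand. The endpoint, response shape, and prop names are assumptions rather than the project's actual code.

```jsx
"use client";
import { useCallback, useEffect, useState } from "react";

// Hypothetical memories panel: lists a user's Mem0 memories and refreshes on demand.
// Endpoint, response shape, and prop names are assumptions.
export default function MemoriesPanel({ mem0Key, userId }) {
  const [memories, setMemories] = useState([]);

  const refresh = useCallback(async () => {
    const res = await fetch(
      `https://api.mem0.ai/v1/memories/?user_id=${encodeURIComponent(userId)}`,
      { headers: { Authorization: `Token ${mem0Key}` } }
    );
    const data = await res.json();
    // v1 returns an array of memory objects; some versions wrap them in `results`.
    setMemories(Array.isArray(data) ? data : (data.results ?? []));
  }, [mem0Key, userId]);

  useEffect(() => {
    refresh();
  }, [refresh]);

  return (
    <div>
      <button onClick={refresh}>Refresh memories</button>
      <ul>
        {memories.map((m) => (
          <li key={m.id}>{m.memory}</li>
        ))}
      </ul>
    </div>
  );
}
```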
If you encounter issues:
- Verify API keys are correct and have necessary permissions.
- Check your internet connection.
- For persistent problems, inspect the browser console for error messages (a logging sketch follows this list).
- Try refreshing the page or restarting the application.
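
If the raw console output is hard to interpret, a wrapper like the sketch below logs the HTTP status and response body of failed calls, which makes invalid or under-permissioned API keys easier to spot. The function name is hypothetical.

```js
// Hypothetical fetch wrapper that surfaces failed API calls in the console
// with the HTTP status and response body.
async function fetchWithLogging(url, options) {
  try {
    const res = await fetch(url, options);
    if (!res.ok) {
      console.error(`Request to ${url} failed with ${res.status}:`, await res.text());
    }
    return res;
  } catch (err) {
    console.error(`Network error calling ${url}:`, err);
    throw err;
  }
}
```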
This Next.js app can be deployed easily on platforms such as Vercel. Make sure to configure the environment variables for your API keys securely.
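
One common approach (a sketch, assuming you move key handling server-side) is to read the keys from environment variables rather than shipping them to the browser. The variable names below are placeholders, not names the app necessarily uses; set the same variables in your hosting platform's dashboard (e.g. Vercel's project settings).

```js
// Example .env.local contents (placeholder variable names):
//   MEM0_API_KEY=...
//   OPENROUTER_API_KEY=...
// Server-side code could then read them like this:
const mem0Key = process.env.MEM0_API_KEY;
const openRouterKey = process.env.OPENROUTER_API_KEY;

if (!mem0Key || !openRouterKey) {
  throw new Error("Missing MEM0_API_KEY or OPENROUTER_API_KEY environment variable");
}
```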