This project features a chat interface similar to OpenAI's website, utilizing the Llama 3.1 model and Ollama for generating responses. It includes a frontend built with React and TypeScript, and a backend developed using Django.
- **Frontend**: Located in the `frontend` directory. See the Frontend README.
- **Backend**: Located in the `backend` directory. See the Backend README.
- Real-time chat interface with a modern design.
- Integration with the Llama 3.1 model for chat responses (a rough sketch of this integration follows the list below).
- Ability to create new chat threads and manage existing ones.
- User registration and login functionality for personalized experiences.
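
The backend generates its responses by talking to a locally running Ollama server that hosts the Llama 3.1 model. The actual view code lives in the Django backend, but a minimal sketch of the kind of request involved might look like the following (this assumes Ollama's standard REST chat endpoint on its default port 11434 and the `requests` package; the `ask_llama` helper is illustrative, not code from this repository):

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local chat endpoint


def ask_llama(messages: list[dict]) -> str:
    """Send a chat history to the local Ollama server and return the reply text.

    `messages` is a list of {"role": ..., "content": ...} dicts in the format
    Ollama's chat endpoint expects (roles: "system", "user", "assistant").
    """
    payload = {
        "model": "llama3.1",  # model tag as registered with Ollama
        "messages": messages,
        "stream": False,      # ask for a single JSON response instead of a stream
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["message"]["content"]


if __name__ == "__main__":
    print(ask_llama([{"role": "user", "content": "Hello! Who are you?"}]))
```

A Django view would typically wrap a call like this, appending the user's new message to the stored thread history before forwarding it and saving the assistant's reply.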
- **Navigate to the Frontend Directory**:

  ```bash
  cd frontend
  ```
- **Install Frontend Dependencies**: Make sure you have all necessary dependencies installed for the frontend. Run:

  ```bash
  npm install
  ```
- **Navigate to the Backend Directory**:

  ```bash
  cd ../backend/llama_chatbot
  ```
- **Install Backend Dependencies**: Install the necessary Python packages for the backend. Run:

  ```bash
  pip install -r requirements.txt
  ```
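
  Depending on how the Django project is configured, you may also need to apply database migrations before the first run. This is standard Django tooling rather than anything specific to this repository: running `python manage.py migrate` from the `backend/llama_chatbot` directory would set up the database tables, including those Django's authentication system needs for registration and login.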
- **Navigate Back to the Frontend Directory**:

  ```bash
  cd ../../frontend
  ```
- **Start the Application**: Use the following command to start both the backend server and the frontend development server, and open the frontend application in your browser:

  ```bash
  npm start
  ```
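
  Note that chat responses come from a locally running Ollama instance. If you have not already done so, install Ollama and pull the model (using the standard Ollama CLI, e.g. `ollama pull llama3.1`), and make sure the Ollama server is running before you send your first message.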
- **Interact with the Chat Interface**: Once the servers are running, your default web browser will open with the frontend application. You can now register or log in and interact with the chat interface.
This project is licensed under the MIT License. See LICENSE for details.