# Your AI Paralegal Assistant

Revolutionize your legal operations with our open-source AI assistant. Ingest, summarize, question, and analyze thousands of case files with ease.
```mermaid
graph LR
    subgraph "Document Processing"
        A[User Uploads Documents PDF/Word] --> B(LLaMA 3.1 Summarization & Chunking)
        B --> C(BGE-en-small Sentence Embeddings)
        C --> D[ChromaDB Vector Storage]
        D --> E{"Organize into Collections (Cases)"}
    end
```
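The chunking step in the pipeline above can be sketched in plain Python. This is a minimal illustration only; the window size and overlap below are assumptions for the example, not the project's actual settings:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows, ready for embedding.

    Overlap keeps sentences that straddle a chunk boundary retrievable
    from at least one chunk.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks
```

Each chunk would then be embedded with BGE-en-small and written to a ChromaDB collection named after the case.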
```mermaid
graph LR
    A[User Queries the Database] --> B(BGE-en-small Question Embedding)
    B --> C[ChromaDB Similarity Search]
    C --> D{Retrieve Relevant Chunks}
    D --> E[LLaMA 3.1 Answer Generation with Context]
    E --> F[Precise Answer & Source Reference]
```
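The similarity-search step can be illustrated with plain cosine similarity over toy vectors. In the real pipeline the vectors come from BGE-en-small and the search is performed by ChromaDB; this sketch only shows the underlying idea:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query: list[float], chunks: dict[str, list[float]], k: int = 1) -> list[str]:
    """Return the ids of the k chunks most similar to the query embedding."""
    ranked = sorted(chunks, key=lambda cid: cosine_similarity(query, chunks[cid]), reverse=True)
    return ranked[:k]
```

The retrieved chunks are then passed to LLaMA 3.1 as context, which is what lets the answer cite its sources.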
## Install Dependencies

Install the required packages by running:

```bash
pip install -r requirements.txt
```
## Backend Setup

The backend is organized into two subfolders: `Gradio` and `Fastapi`.
### Gradio Folder

The main server logic resides in the `Gradio` folder. To start the Gradio server, navigate into that folder and run:

```bash
python3 app.py
```
### FastAPI Folder

After starting the Gradio server, head to the `Fastapi` folder and start the FastAPI server, which exposes REST API routes for interacting with the Gradio interface:

```bash
uvicorn main:app --reload
```
## Frontend Integration

With both servers running, you can build a frontend against the HTTP routes exposed by the FastAPI server.
- Add a frontend template.
- Implement the option to perform Q&A on specific documents.
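Any HTTP client works for talking to the backend. As a language-neutral sketch of what such a call looks like, here is the request shape using Python's standard library; the `/ask` path and JSON payload are hypothetical and should be replaced with the actual FastAPI routes:

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"  # uvicorn's default host and port

def build_request(question: str, collection: str, base_url: str = BASE_URL) -> urllib.request.Request:
    """Build a POST request for the (hypothetical) /ask route."""
    payload = json.dumps({"collection": collection, "question": question}).encode()
    return urllib.request.Request(
        f"{base_url}/ask",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask_backend(question: str, collection: str) -> dict:
    """Send the question to the backend and return the decoded JSON answer."""
    with urllib.request.urlopen(build_request(question, collection)) as resp:
        return json.loads(resp.read())
```

A browser frontend would issue the equivalent `fetch` call against the same route.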
You can try out a hosted demo at [rachelai.vercel.app](https://rachelai.vercel.app).
Contributions are welcome! Feel free to submit issues or pull requests.