A powerful local RAG (Retrieval Augmented Generation) application that lets you chat with your PDF documents using Ollama and LangChain. This project includes both a Jupyter notebook for experimentation and a Streamlit web interface for easy interaction.
```
ollama_pdf_rag/
├── src/                    # Source code
│   ├── app/                # Streamlit application
│   │   ├── components/     # UI components
│   │   │   ├── chat.py         # Chat interface
│   │   │   ├── pdf_viewer.py   # PDF display
│   │   │   └── sidebar.py      # Sidebar controls
│   │   └── main.py         # Main app
│   └── core/               # Core functionality
│       ├── document.py     # Document processing
│       ├── embeddings.py   # Vector embeddings
│       ├── llm.py          # LLM setup
│       └── rag.py          # RAG pipeline
├── data/                   # Data storage
│   ├── pdfs/               # PDF storage
│   │   └── sample/         # Sample PDFs
│   └── vectors/            # Vector DB storage
├── notebooks/              # Jupyter notebooks
│   └── experiments/        # Experimental notebooks
├── tests/                  # Unit tests
├── docs/                   # Documentation
└── run.py                  # Application runner
```
- 🔒 Fully local processing - no data leaves your machine
- 📄 PDF processing with intelligent chunking
- 🧠 Multi-query retrieval for better context understanding (see the sketch after this list)
- 🎯 Advanced RAG implementation using LangChain
- 🖥️ Clean Streamlit interface
- 📓 Jupyter notebook for experimentation
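Multi-query retrieval is the step that most distinguishes this pipeline from plain vector search: the LLM rewrites the user's question into several variants and retrieves documents for each. A minimal, self-contained sketch of the pattern with LangChain and Ollama (the sample texts and variable names are illustrative, not the app's actual code):

```python
from langchain.retrievers.multi_query import MultiQueryRetriever
from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma

# A tiny in-memory vector store standing in for the real document index.
vectordb = Chroma.from_texts(
    ["Ollama runs large language models locally.",
     "Chroma stores and searches vector embeddings."],
    embedding=OllamaEmbeddings(model="nomic-embed-text"),
)

# The LLM generates several rephrasings of the question; the union of the
# documents retrieved for each rephrasing is returned as context.
llm = ChatOllama(model="llama3.2")
retriever = MultiQueryRetriever.from_llm(
    retriever=vectordb.as_retriever(), llm=llm
)
docs = retriever.get_relevant_documents("How are documents searched?")
```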
1. **Install Ollama**
   - Visit Ollama's website to download and install
   - Pull the required models:
     ```bash
     ollama pull llama3.2  # or your preferred model
     ollama pull nomic-embed-text
     ```
2. **Clone Repository**
   ```bash
   git clone https://github.com/tonykipkemboi/ollama_pdf_rag.git
   cd ollama_pdf_rag
   ```
3. **Set Up Environment**
   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: .\venv\Scripts\activate
   pip install -r requirements.txt
   ```
   Key dependencies and their versions:
   ```
   ollama==0.4.4
   streamlit==1.40.0
   pdfplumber==0.11.4
   langchain==0.1.20
   langchain-core==0.1.53
   langchain-ollama==0.0.2
   chromadb==0.4.22
   ```
To launch the Streamlit app:
```bash
python run.py
```
Then open your browser to http://localhost:8501
*Streamlit interface showing PDF viewer and chat functionality*
To experiment in Jupyter instead:
```bash
jupyter notebook
```
Open `updated_rag_notebook.ipynb` to experiment with the code. A condensed version of the pipeline is sketched below.
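For orientation, the indexing half of a pipeline like this fits in a few lines. The loader, file name, and parameters below are illustrative assumptions, not necessarily what `src/core/document.py` does:

```python
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.document_loaders import PDFPlumberLoader
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma

# Load the PDF and split it into overlapping chunks.
pages = PDFPlumberLoader("data/pdfs/sample/example.pdf").load()
splitter = RecursiveCharacterTextSplitter(chunk_size=1500, chunk_overlap=200)
chunks = splitter.split_documents(pages)

# Embed the chunks and persist the index under data/vectors/.
vectordb = Chroma.from_documents(
    documents=chunks,
    embedding=OllamaEmbeddings(model="nomic-embed-text"),
    collection_name="local-rag",
    persist_directory="data/vectors",
)
print(f"Indexed {len(chunks)} chunks from {len(pages)} pages")
```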
- Upload PDF: Use the file uploader in the Streamlit interface or try the sample PDF
- Select Model: Choose from your locally available Ollama models
- Ask Questions: Start chatting with your PDF through the chat interface
- Adjust Display: Use the zoom slider to adjust PDF visibility
- Clean Up: Use the "Delete Collection" button when switching documents
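Under the hood, answering a question follows the standard LangChain retrieval pattern. A sketch that reuses the index persisted in the previous example (the prompt wording and collection name are assumptions):

```python
from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough

# Reopen the persisted index and expose it as a retriever.
vectordb = Chroma(
    collection_name="local-rag",
    embedding_function=OllamaEmbeddings(model="nomic-embed-text"),
    persist_directory="data/vectors",
)
retriever = vectordb.as_retriever()

def format_docs(docs):
    """Join retrieved chunks into one context string for the prompt."""
    return "\n\n".join(doc.page_content for doc in docs)

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only this context:\n{context}\n\nQuestion: {question}"
)
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOllama(model="llama3.2")
    | StrOutputParser()
)
print(chain.invoke("What is this document about?"))
```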
Feel free to:
- Open issues for bugs or suggestions
- Submit pull requests
- Comment on the YouTube video for questions
- Star the repository if you find it useful!
- Ensure Ollama is running in the background (a quick Python check is sketched below)
- Check that required models are downloaded
- Verify Python environment is activated
- For Windows users, ensure WSL2 is properly configured if using Ollama
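A quick way to verify the first two checks from Python (this uses the `ollama` client already pinned in `requirements.txt` and talks to the default local server):

```python
import ollama

try:
    # Lists locally available models; fails if the Ollama server isn't running.
    print(ollama.list())
except Exception as exc:
    print(f"Could not reach Ollama - is the server running? ({exc})")
```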
If you encounter this error:
```
DLL load failed while importing onnx_copy2py_export: a dynamic link library (DLL) initialization routine failed.
```
Try these solutions:
1. **Install Microsoft Visual C++ Redistributable:**
   - Download and install both x64 and x86 versions from Microsoft's official website
   - Restart your computer after installation
2. **If the error persists, try installing ONNX Runtime manually:**
   ```bash
   pip uninstall onnxruntime onnxruntime-gpu
   pip install onnxruntime
   ```
If you're running on a CPU-only system:
1. **Ensure you have the CPU version of ONNX Runtime:**
   ```bash
   pip uninstall onnxruntime-gpu  # Remove GPU version if installed
   pip install onnxruntime        # Install CPU-only version
   ```
2. **You may need to modify the chunk size in the code to prevent memory issues:**
   - Reduce `chunk_size` to 500-1000 if you experience memory problems
   - Increase `chunk_overlap` for better context preservation (see the sketch below)
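If the splitter in use is LangChain's `RecursiveCharacterTextSplitter` (an assumption here; the actual splitter lives in the project code), the adjustment is just two constructor arguments:

```python
from langchain.text_splitter import RecursiveCharacterTextSplitter

# More conservative settings for memory-constrained, CPU-only machines.
splitter = RecursiveCharacterTextSplitter(
    chunk_size=750,     # within the suggested 500-1000 range
    chunk_overlap=150,  # proportionally larger overlap preserves context
)
```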
Note: The application will run slower on CPU-only systems, but it will still work effectively.
```bash
# Run all tests
python -m unittest discover tests

# Run tests verbosely
python -m unittest discover tests -v
```
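`unittest` discovery picks up any `test_*.py` module under `tests/`. A minimal, hypothetical example of the pattern (this exact file is not part of the repo):

```python
# tests/test_chunking.py - hypothetical example of the unittest pattern
import unittest

from langchain.text_splitter import RecursiveCharacterTextSplitter

class TestChunking(unittest.TestCase):
    def test_respects_chunk_size(self):
        splitter = RecursiveCharacterTextSplitter(chunk_size=100, chunk_overlap=0)
        chunks = splitter.split_text("word " * 200)
        self.assertTrue(all(len(chunk) <= 100 for chunk in chunks))

if __name__ == "__main__":
    unittest.main()
```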
The project uses pre-commit hooks to ensure code quality. To set up:
```bash
pip install pre-commit
pre-commit install
```
This will:
- Run tests before each commit
- Run linting checks
- Ensure code quality standards are met
The project uses GitHub Actions for CI. On every push and pull request:
- Tests are run on multiple Python versions (3.9, 3.10, 3.11)
- Dependencies are installed
- Ollama models are pulled
- Test results are uploaded as artifacts
This project is open source and available under the MIT License.
Built with ❤️ by Tony Kipkemboi!