Simple Chainlit app to chat with your documents.
- Hugging Face model as the Large Language Model
- LangChain as the LLM framework
- Chainlit for the chat UI and deployment
- GGML to run the model on commodity hardware (CPU)
- CTransformers to load the model
You must have Python 3.9 or later installed; earlier versions of Python may fail to build the required packages. A rough sketch of how the pieces above fit together is shown below.
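For orientation only: this is roughly how a GGML Llama 2 model is loaded on CPU through LangChain's CTransformers wrapper. The model name and config values here are assumptions for illustration; the actual settings live in main.py.

```python
# Hedged sketch: load a GGML Llama 2 checkpoint on CPU via the LangChain
# CTransformers wrapper. Model name and config are illustrative assumptions;
# the real values are defined in main.py.
from langchain.llms import CTransformers

llm = CTransformers(
    model="TheBloke/Llama-2-7B-Chat-GGML",   # assumed GGML checkpoint
    model_type="llama",
    config={"max_new_tokens": 256, "temperature": 0.5},
)

print(llm("What is Chainlit?"))
```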
-
Fork this repository and create a codespace in GitHub as shown in the YouTube video, OR clone it locally:
git clone https://github.com/sudarshan-koirala/llama2-chat-with-documents.git
cd llama2-chat-with-documents
-
Rename example.env to .env with
cp example.env .env
and set the Hugging Face Hub API token as follows. Get the HuggingfaceHub API token from this URL; you need to create an account on the Hugging Face website if you haven't already.
HUGGINGFACEHUB_API_TOKEN=your_huggingface_api_token
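To confirm the token is actually picked up from .env, a quick check like the following can help. It assumes python-dotenv is installed (a common companion in LangChain setups); the variable name matches the entry above.

```python
# Hedged check that the Hugging Face token from .env is visible to Python.
# Assumes python-dotenv is installed; the variable name matches the .env entry above.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current directory
token = os.getenv("HUGGINGFACEHUB_API_TOKEN")
print("Token loaded" if token else "Token missing - check your .env file")
```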
-
Create a virtual environment with conda and activate it. First, make sure you have conda installed, then run the following command:
conda create -n .venv python=3.11 -y && conda activate .venv
-
Run the following command in the terminal to install the necessary Python packages:
pip install -r requirements.txt
-
Run the following command in your terminal to create the embeddings and store them locally:
python3 ingest.py
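ingest.py is what builds the local vector store from your documents; the exact loaders, chunk sizes, and store are defined in that file. As a hedged sketch of the usual LangChain ingestion flow (load, split, embed, persist), assuming PDFs in a data/ folder, a MiniLM embedding model, and a FAISS store:

```python
# Hedged sketch of a typical ingestion pipeline; the real steps are in ingest.py.
# The data/ folder, chunk sizes, embedding model, and FAISS store are assumptions.
from langchain.document_loaders import DirectoryLoader, PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS

docs = DirectoryLoader("data/", glob="*.pdf", loader_cls=PyPDFLoader).load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
FAISS.from_documents(chunks, embeddings).save_local("vectorstore")
```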
-
Run the following command in your terminal to launch the app UI:
chainlit run main.py -w
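The -w flag reloads the app whenever you edit the code, and Chainlit opens the chat UI in your browser. If you want to customize main.py, the skeleton of a Chainlit app is small; the following is a minimal hedged example, not the repository's actual handlers (which also wire in the retrieval chain over your ingested documents). Note that older Chainlit versions pass the incoming message as a plain string instead of a cl.Message object.

```python
# Minimal Chainlit app skeleton, for orientation only; main.py in this repo
# additionally connects the LangChain retrieval QA chain to these handlers.
import chainlit as cl

@cl.on_chat_start
async def start():
    await cl.Message(content="Hi! Ask me anything about your documents.").send()

@cl.on_message
async def on_message(message: cl.Message):
    # In the real app, message.content would be passed to the QA chain here.
    await cl.Message(content=f"You said: {message.content}").send()
```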