Deployed on Streamlit
https://questiongenerator-cagyflei86fyppgorbqk8p.streamlit.app/
Warning: do not run the chat and the question generator at the same time.
- Pinecone - for the vector store
- LangChain - for RAG with GPT
- Streamlit - for deployment
- ChatGPT with RAG
- LangChain chain: QAGenerationChain (looking at its code, it is essentially a RAG model that splits the documents and generates questions for each split)
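To illustrate the split-then-generate pattern mentioned above, here is a minimal pure-Python sketch. The names `split_text`, `generate_qa`, and `fake_llm` are hypothetical; the real notebook delegates both the splitting and the GPT call to LangChain's QAGenerationChain.

```python
from typing import Callable

def split_text(text: str, chunk_size: int = 200) -> list[str]:
    # Naive fixed-size splitter; the real chain uses a LangChain
    # text splitter instead of slicing by character count.
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def generate_qa(text: str, ask_llm: Callable[[str], dict]) -> list[dict]:
    # For each chunk, ask the LLM for one question/answer pair --
    # this mirrors what QAGenerationChain does per document split.
    return [ask_llm(chunk) for chunk in split_text(text)]

def fake_llm(chunk: str) -> dict:
    # Hypothetical stand-in for the GPT call made by the real chain.
    return {"question": "What does this passage describe?", "answer": chunk}
```

With a 250-character document and a 200-character chunk size, `generate_qa` returns two question/answer pairs, one per split.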
Streamlit Application - contains the Streamlit files used for deployment
Question Generator.ipynb - contains the code that generates the questions
questions.txt - question/answer pairs generated by Question Generator.ipynb
rag.ipynb - chatbot that uses RAG with memory to answer questions about the file
Set OPENAI_API_KEY in your environment variables.
Install the dependencies from requirements.txt:
pip install -r requirements.txt
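Before running either notebook it can help to fail fast if the key is missing; the sketch below shows one way to do that. `require_openai_key` is a hypothetical helper, not part of this repo.

```python
import os

def require_openai_key(env=None) -> str:
    # Raise a clear error if OPENAI_API_KEY is not set, instead of
    # letting the OpenAI client fail later with a vaguer message.
    env = os.environ if env is None else env
    key = env.get("OPENAI_API_KEY", "")
    if not key:
        raise RuntimeError("Set OPENAI_API_KEY in your environment variables")
    return key
```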
- Memory RAG, where you can ask questions about the document
- Question Generator, which generates 3 types of questions
Either the chatbot runs or the question generator, never both at once.
The notebook file is Question Generator.ipynb
It writes the generated questions in TXT format to questions.txt
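The generated pairs could be serialized along these lines; `write_questions` and the exact Q/A layout are assumptions for illustration, not necessarily the format the notebook produces.

```python
def write_questions(pairs: list[dict], path: str = "questions.txt") -> None:
    # One "Q: ... / A: ..." block per generated pair; the actual
    # layout of questions.txt in this repo may differ.
    with open(path, "w", encoding="utf-8") as f:
        for i, pair in enumerate(pairs, start=1):
            f.write(f"Q{i}: {pair['question']}\n")
            f.write(f"A{i}: {pair['answer']}\n\n")
```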
Streamlit Demo
Multiple questions of different types can be generated.
You can download the TXT file with the button or read the output directly in the app.
The notebook file is rag.ipynb
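A minimal sketch of the memory idea behind the chatbot: the running transcript is kept and prepended to each retrieval-augmented prompt so follow-up questions have context. `ChatMemory` here is a simplified stand-in for LangChain's ConversationBufferMemory, not the code in the notebook.

```python
class ChatMemory:
    # Keeps the chat transcript that gets prepended to each
    # RAG prompt, so the model can resolve follow-up questions.
    def __init__(self) -> None:
        self.turns: list[tuple[str, str]] = []

    def add(self, user: str, bot: str) -> None:
        # Record one completed exchange.
        self.turns.append((user, bot))

    def as_prompt(self) -> str:
        # Render the history in the "Human: ... / AI: ..." style
        # that buffer memories commonly use.
        return "\n".join(f"Human: {u}\nAI: {b}" for u, b in self.turns)
```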
Streamlit Demo