mrunal-z/llm-qna-system

Local LLM-based Question-Answering System

This project is a simple Question Answering system powered by local Large Language Models (LLMs), built with LangChain.

Overview

You can feed this system a single text document on any topic and then ask questions about its contents, answered by a local LLM of your choice. For demonstration purposes, "stellarcorp.txt" is included, which contains information about a fictional company and its employees. The primary LLM used is gpt4all-j-v1.3-groovy.
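To illustrate the idea, here is a minimal sketch of the retrieval step in plain Python, without LangChain. The chunking scheme, the keyword-overlap scoring, and the prompt template are illustrative assumptions, not the actual implementation in run-llm.py:

```python
# Hedged sketch: naive keyword-overlap retrieval over a text document.
# Chunk size, scoring, and the prompt wording are illustrative choices only.

def chunk_text(text: str, chunk_size: int = 200) -> list[str]:
    """Split the document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + chunk_size])
            for i in range(0, len(words), chunk_size)]

def best_chunk(question: str, chunks: list[str]) -> str:
    """Pick the chunk sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))

def build_prompt(question: str, context: str) -> str:
    """Combine the retrieved context and the question into one LLM prompt."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

doc = "StellarCorp is a fictional company. Its CEO leads the engineering team."
question = "Who leads the engineering team?"
context = best_chunk(question, chunk_text(doc, 10))
prompt = build_prompt(question, context)
```

In the actual project, LangChain wires the document and the question together and passes the resulting prompt to the local GPT4All model.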

Getting Started

Requirements

Python 3.10 or newer.

Setup & Run

  1. Install the required dependencies:

pip install -r requirements.txt

  2. Add your text document and update its path in 'run-llm.py' as INPUT_DOCUMENT_PATH.

  3. Place your desired LLM in the models/LLM_NAME directory and update the path in 'run-llm.py' under MODEL_PATH.

  4. Run the script:

python run-llm.py

Note: It may take some time to run.

  5. Ask your question and wait for the LLM to respond!
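The two paths mentioned in the steps above are set as constants in 'run-llm.py'. A sketch of what those assignments might look like (the file names shown are placeholders, not the repository's actual values):

```python
# Hypothetical configuration block for 'run-llm.py' (paths are examples only).
INPUT_DOCUMENT_PATH = "stellarcorp.txt"                # the text document to query
MODEL_PATH = "models/ggml-gpt4all-j-v1.3-groovy.bin"   # local LLM weights file
```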

Example

(Screenshot: an example query and the LLM's response.)
