🚀 Ollama x Streamlit Chat UI

This project demonstrates how to run and manage models locally with Ollama through an interactive chat UI built with Streamlit.

The app has a page for running chat-based models.

App in Action

(Demo GIF of the chat interface)

Features

  • Interactive UI: Utilize Streamlit to create a user-friendly interface.
  • Local Model Execution: Run your Ollama models locally without the need for external APIs.
  • Real-time Responses: Get real-time responses from your models directly in the UI.
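The app's own source is not reproduced in this README, but a minimal sketch of how such a chat page is typically wired together, using Streamlit's chat elements and the ollama Python client, could look like the following. The model name llama3 and the code itself are assumptions for illustration, not the repo's actual implementation:

import ollama
import streamlit as st

st.title("💬 Chat with a local LLM")

# Keep the conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask something..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    with st.chat_message("assistant"):
        # Stream the reply from the local Ollama server as it is generated.
        # "llama3" is a placeholder; use any model you have pulled locally.
        stream = ollama.chat(
            model="llama3",
            messages=st.session_state.messages,
            stream=True,
        )
        reply = st.write_stream(chunk["message"]["content"] for chunk in stream)

    st.session_state.messages.append({"role": "assistant", "content": reply})

Because Streamlit reruns the script on every interaction, the conversation history is kept in st.session_state, and the assistant's reply is streamed into the UI token by token as Ollama generates it.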

Installation

Before running the app, ensure you have Python installed on your machine. Then, clone this repository and install the required packages using pip:

git clone https://github.com/20481A5450/Chat_With_LLM-s.git
cd Chat_With_LLM-s
pip install -r requirements.txt
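The contents of requirements.txt are not shown here; assuming the app talks to Ollama through its official Python client, the file presumably lists at least:

streamlit
ollama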

Usage

To start the app, run the following command in your terminal:

streamlit run 01_💬_Chat_Demo.py

Navigate to the URL provided by Streamlit in your browser to interact with the app.

NB: Make sure you have Ollama installed on your system.
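The app also expects the Ollama server to be running and at least one model to be available locally. For example (llama3 is only an example model name; substitute whichever model you want to chat with):

ollama pull llama3
ollama serve    # not needed if the Ollama desktop app is already running in the background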

Contributing

Interested in contributing to this app? Great! I welcome contributions from everyone.

Got questions or suggestions? Feel free to open an issue or submit a pull request.

Acknowledgments

👏 Kudos to the Ollama team for their efforts in making open-source models more accessible!
