
Ollama Chat App

This Python application uses Ollama large language models (LLMs) to provide a dynamic and engaging chat experience. Users can interact with various Ollama models directly from the interface, making it a fun and informative way to explore their capabilities.

Requirements

Optional (for local development):

  • Ollama Server: If you prefer running Ollama locally, set up a server instance according to the Ollama documentation. Otherwise, set the BACKEND_URL variable to the URL of a remote server (see the sketch below).
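
A minimal sketch of how the backend address might be resolved. The BACKEND_URL name comes from this README; reading it from an environment variable and the localhost default are illustrative assumptions, not the repository's actual code:

```python
import os

# BACKEND_URL is the variable referenced in this README. Defaulting to
# Ollama's standard local port (11434) and allowing an environment-variable
# override are assumptions for illustration.
BACKEND_URL = os.environ.get("BACKEND_URL", "http://localhost:11434")
```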

Usage

  1. Clone or download the repository.
  2. Install dependencies: run pip install -r requirements.txt (assuming the repository ships a requirements.txt listing the necessary packages).
  3. (Optional) Configure the Ollama server: if using a local Ollama server, ensure it is running; otherwise, update the BACKEND_URL variable in the code with your remote server's address. A quick connectivity check is sketched after this list.
  4. Run the application: Execute streamlit run chat-app-ollama.py.
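
Before launching the app, you can confirm the server is reachable with a quick call to Ollama's standard /api/tags endpoint, which lists the locally available models. This is a hypothetical helper for verification, not code from this repository:

```python
import requests

def list_models(backend_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of the models available on an Ollama server."""
    response = requests.get(f"{backend_url}/api/tags", timeout=10)
    response.raise_for_status()
    return [model["name"] for model in response.json()["models"]]

if __name__ == "__main__":
    print(list_models())  # e.g. ['llama3:latest', 'mistral:latest']
```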

Features

  • Model Selection: Choose from a variety of Ollama models using the sidebar dropdown.
  • Streaming: Select between streaming responses (real-time updates as the model generates text) and non-streaming (the complete response is displayed once processing finishes); both modes are sketched below.
  • Chat Interface: Enter messages in the chat input box and receive responses from the chosen Ollama model.
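
Both modes map naturally onto Ollama's /api/chat endpoint, which returns a single JSON object when stream is false and newline-delimited JSON chunks when it is true. The sketch below illustrates the difference; it is an assumption about how such a toggle could be implemented, not the repository's actual code:

```python
import json
import requests

BACKEND_URL = "http://localhost:11434"  # assumed local default

def chat(model: str, messages: list[dict], stream: bool) -> str:
    """Send a chat request to an Ollama server in either response mode."""
    response = requests.post(
        f"{BACKEND_URL}/api/chat",
        json={"model": model, "messages": messages, "stream": stream},
        stream=stream,
    )
    response.raise_for_status()
    if not stream:
        # Non-streaming: one JSON object containing the complete reply.
        return response.json()["message"]["content"]
    # Streaming: each line is a JSON chunk carrying a fragment of the reply.
    parts = []
    for line in response.iter_lines():
        if line:
            parts.append(json.loads(line).get("message", {}).get("content", ""))
    return "".join(parts)
```

In the Streamlit interface, the streaming chunks would typically be rendered incrementally (for example with st.write_stream or by updating an st.empty placeholder), while the non-streaming result is displayed once complete.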

Customization

  • You can add more Ollama models to the model list in the code.
  • Consider incorporating error handling for potential API issues or unexpected responses; a minimal example follows this list.
  • Enhance the visual design using Streamlit's layout and styling options.
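
As a starting point for the error-handling suggestion above, the backend call could be wrapped like this. This fragment reuses the hypothetical chat helper and BACKEND_URL from the earlier sketches, and the st.session_state.messages chat history is an assumed convention:

```python
import requests
import streamlit as st

try:
    # chat(), model, and BACKEND_URL are defined in the earlier sketches.
    reply = chat(model, st.session_state.messages, stream=False)
except requests.ConnectionError:
    st.error(f"Could not reach the Ollama server at {BACKEND_URL}.")
except requests.HTTPError as exc:
    st.error(f"The Ollama server returned an error: {exc}")
```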

Further Development

  • Explore integrating with other Ollama functionalities beyond chat.
  • Implement user authentication for personalized interactions.
  • Build a more robust server-side component if needed for scalability or security.
