This Streamlit application provides a user interface for interacting with Unify models through chat. It allows users to select models and providers, input text, and view the conversation history with AI assistants.
![Screenshot](/BozHris/LLM_playground/raw/main/Screenshot%202024-04-23%20205316.png)
1. Clone this repository:

   ```bash
   git clone https://github.com/samthakur587/LLM_playground
   ```

2. Install the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the Streamlit app:

   ```bash
   streamlit run stream.py
   ```
```python
from unify import AsyncUnify
import os
import asyncio

async_unify = AsyncUnify(
    # This is the default and is optional to include.
    api_key=os.environ.get("UNIFY_KEY"),
    endpoint="llama-2-13b-chat@anyscale",
)

async def main():
    response = await async_unify.generate(user_prompt="Hello Llama! Who was Isaac Newton?")
    print(response)

asyncio.run(main())
```
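Because the client above is asynchronous, a playground like this one can send the same prompt to several endpoints concurrently rather than one at a time. The sketch below shows the fan-out pattern with `asyncio.gather`, using a stub coroutine in place of the real `AsyncUnify.generate` call (the `fake_generate` and `query_all` names are illustrative, not part of the app or the Unify library):

```python
import asyncio

# Stub standing in for the real client call, which would be
# `await async_unify.generate(user_prompt=...)` as shown above.
async def fake_generate(endpoint: str, user_prompt: str) -> str:
    await asyncio.sleep(0)  # stand-in for network latency
    return f"[{endpoint}] reply to: {user_prompt}"

async def query_all(endpoints: list[str], user_prompt: str) -> dict[str, str]:
    # Fan the same prompt out to every selected endpoint concurrently.
    replies = await asyncio.gather(
        *(fake_generate(ep, user_prompt) for ep in endpoints)
    )
    return dict(zip(endpoints, replies))

endpoints = ["llama-2-13b-chat@anyscale", "mistral-7b-instruct-v0.2@fireworks-ai"]
results = asyncio.run(query_all(endpoints, "Who was Isaac Newton?"))
for ep, reply in results.items():
    print(ep, "->", reply)
```

With real endpoints, the total wait is roughly that of the slowest provider instead of the sum of all of them.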
1. Input Unify API Key: Enter your Unify API key in the text input box in the sidebar.

2. Select Endpoints: Choose the models and providers from the sidebar dropdown menus.

3. Start Chatting: Type your message in the chat input box and press "Enter" or click the "Send" button.

4. View Conversation History: The conversation history with each model's AI assistant is displayed in a separate container.

5. Clear History: Click the "Clear History" button to clear the conversation history.
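Steps 4 and 5 above rely on keeping a separate message history per endpoint. A minimal sketch of one way to model that state, mirroring what a Streamlit app might store in `st.session_state` (the `histories` dict and helper names here are hypothetical, not the app's actual code):

```python
from collections import defaultdict

# One message list per endpoint; each message is a role/content dict,
# the shape commonly rendered with st.chat_message in Streamlit apps.
histories: dict[str, list[dict]] = defaultdict(list)

def add_message(endpoint: str, role: str, content: str) -> None:
    histories[endpoint].append({"role": role, "content": content})

def clear_history(endpoint: str) -> None:
    # What the "Clear History" button would trigger for one endpoint.
    histories[endpoint].clear()

add_message("llama-2-13b-chat@anyscale", "user", "Who was Isaac Newton?")
add_message("llama-2-13b-chat@anyscale", "assistant", "An English physicist and mathematician.")
print(len(histories["llama-2-13b-chat@anyscale"]))  # 2
clear_history("llama-2-13b-chat@anyscale")
print(len(histories["llama-2-13b-chat@anyscale"]))  # 0
```

Keeping histories keyed by endpoint is what lets each model's container show only its own conversation.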
- Chat UI: Interactive chat interface to communicate with AI assistants.
- Endpoint Selection: Choose from a variety of Unify models and providers.
- Conversation History: View and track the conversation history with each model.
- Clear History: Option to clear the conversation history for a fresh start.
- Streamlit
- Pandas
- Unify
This project is licensed under the MIT License - see the LICENSE file for details.