Release 0.0.10 Demos and LangChain Integration #15

Merged
merged 8 commits on May 23, 2024
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
-    ".": "0.0.9"
+    ".": "0.0.10"
}
27 changes: 27 additions & 0 deletions examples/cli/mirascope/README.md
@@ -0,0 +1,27 @@
# Simple CLI Chatbot with Mirascope

This is a quick demo that shows how to create a chatbot with Mirascope, using
Honcho as the storage engine.

It uses the command line as an interface and GPT-4o as the underlying
model. Follow the steps below to set up the demo.

1. Install the dependencies with `poetry`

```bash
poetry shell
poetry install
```

2. Add your OpenAI API key to an `.env` file

```bash
echo "OPENAI_API_KEY=<YOUR_API_KEY>" > .env
```

3. Run the demo from within the poetry environment

```bash
poetry shell
python main.py
```
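Step 2 writes a one-line `.env` file that `load_dotenv()` in `main.py` later reads into the process environment. A stdlib-only sketch of that parse step (`parse_env_line` is a hypothetical helper for illustration, not the python-dotenv implementation):

```python
def parse_env_line(line: str):
    """Parse a single KEY=VALUE line, as written by step 2 above."""
    line = line.strip()
    if not line or line.startswith("#") or "=" not in line:
        return None  # skip blanks, comments, and malformed lines
    key, _, value = line.partition("=")
    return key.strip(), value.strip().strip('"')

print(parse_env_line('OPENAI_API_KEY="sk-..."'))
# → ('OPENAI_API_KEY', 'sk-...')
```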
78 changes: 78 additions & 0 deletions examples/cli/mirascope/main.py
@@ -0,0 +1,78 @@
import asyncio
from typing import List

from dotenv import load_dotenv
from mirascope.openai import OpenAICall, OpenAICallParams

from honcho import Honcho

load_dotenv()
honcho = Honcho(environment="demo") # initialize the honcho client
app = honcho.apps.get_or_create("Mirascope Test") # Get your app instance
user = honcho.apps.users.get_or_create(app_id=app.id, name="test_user") # Get your user
session = honcho.apps.users.sessions.create(app_id=app.id, user_id=user.id, location_id="cli") # Make a new session


# Set up your OpenAI Call
class Conversation(OpenAICall):
    prompt_template = """
    SYSTEM:
    You are a helpful assistant that provides incredibly short and efficient
    responses.

    MESSAGES: {history}

    USER:
    {user_input}
    """
    user_input: str
    session_id: str
    app_id: str
    user_id: str

    @property
    def history(self) -> List[dict]:
        """Get the conversation history from Honcho"""
        history_list = []
        messages = honcho.apps.users.sessions.messages.list(
            session_id=self.session_id, app_id=self.app_id, user_id=self.user_id
        )
        for message in messages:
            if message.is_user:
                history_list.append({"role": "user", "content": message.content})
            else:
                history_list.append({"role": "assistant", "content": message.content})
        return history_list

    call_params = OpenAICallParams(model="gpt-4o-2024-05-13", temperature=0.4)


conversation = Conversation(user_input="", app_id=app.id, user_id=user.id, session_id=session.id)


async def chat():
    while True:
        conversation.user_input = input(">>> ")
        if conversation.user_input == "exit":
            honcho.apps.users.sessions.delete(session_id=session.id, app_id=app.id, user_id=user.id)
            break
        response = ""
        cstream = conversation.stream_async()
        print("\033[96mAI:\033[0m")
        async for chunk in cstream:
            print(f"\033[96m{chunk.content}\033[0m", end="", flush=True)
            response += chunk.content
        print("\n")

        # Save User and AI messages to Honcho
        honcho.apps.users.sessions.messages.create(
            session_id=session.id, app_id=app.id, user_id=user.id, content=conversation.user_input, is_user=True
        )
        honcho.apps.users.sessions.messages.create(
            session_id=session.id, app_id=app.id, user_id=user.id, content=response, is_user=False
        )


if __name__ == "__main__":
    asyncio.run(chat())
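The `history` property in `main.py` converts Honcho message records into OpenAI-style role dicts. A minimal, self-contained sketch of that conversion, using a hypothetical `Msg` dataclass as a stand-in for Honcho's message objects (which expose `content` and `is_user` per the code above):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Msg:
    # hypothetical stand-in for a Honcho message record
    content: str
    is_user: bool

def to_openai_history(messages: List[Msg]) -> List[dict]:
    """Map Honcho-style messages onto OpenAI chat roles."""
    return [
        {"role": "user" if m.is_user else "assistant", "content": m.content}
        for m in messages
    ]

msgs = [Msg("hi", True), Msg("hello!", False)]
print(to_openai_history(msgs))
# → [{'role': 'user', 'content': 'hi'}, {'role': 'assistant', 'content': 'hello!'}]
```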