# msglm


<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->

## Installation

Install the latest version from PyPI:

``` sh
$ pip install msglm
```

## Usage

To use an LLM we need to structure our messages in a particular format.

Here’s an example of a text chat from the OpenAI docs.

``` python
from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
model="gpt-4o",
messages=[
{"role": "user", "content": [{"type": "text", "text": "What's the Wild Atlantic Way?"}]}
]
)
```

Generating the correct format for a particular API can get tedious. The
goal of *msglm* is to make it easier.
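
For example, with *msglm*'s `mk_msg_openai` wrapper (covered below), the call above collapses to something like this:

``` python
from openai import OpenAI
from msglm import mk_msg_openai as mk_msg

client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[mk_msg("What's the Wild Atlantic Way?")]
)
```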

The examples below will show you how to use *msglm* for text and image
chats with OpenAI and Anthropic.

### Text Chats

For a text chat, simply pass a list of strings and the API format
(e.g. “openai”) to **mk_msgs**, and it will generate the correct format.

``` python
mk_msgs(["Hello, world!", "some LLM response"], api="openai")
```

``` js
[
{"role": "user", "content": [{"type": "text", "text": "Hello, world!"}]},
{"role": "assistant", "content": [{"type": "text", "text": "Some assistant response"}]}
]
```
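
For a longer back-and-forth, keep alternating strings in the list; judging by the example above, **mk_msgs** assigns the *user* and *assistant* roles in turn, so a hypothetical three-turn history would look like:

``` python
# roles are assumed to alternate: user, assistant, user, ...
mk_msgs(["What's 2+2?", "4", "Now double it"], api="openai")
```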

#### anthropic

``` python
from msglm import mk_msgs_anthropic as mk_msgs
from anthropic import Anthropic
client = Anthropic()

r = client.messages.create(
model="claude-3-haiku-20240307",
max_tokens=1024,
messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.content[0].text)
```

#### openai

``` python
from msglm import mk_msgs_openai as mk_msgs
from openai import OpenAI

client = OpenAI()
r = client.chat.completions.create(
model="gpt-4o-mini",
messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.choices[0].message.content)
```

### Image Chats

For an image chat, simply pass the raw image bytes in a list with your
question to *mk_msg*, and it will generate the correct format.

``` python
mk_msg([img, "What's in this image?"], api="anthropic")
```

``` js
{
    "role": "user",
    "content": [
        {"type": "image", "source": {"type": "base64", "media_type": media_type, "data": img}},
        {"type": "text", "text": "What's in this image?"}
    ]
}
```
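
Behind the scenes, *msglm* base64-encodes the bytes and fills in the media type for you. A rough sketch of the idea (illustrative only, not *msglm*'s actual implementation):

``` python
import base64

def guess_media_type(img_bytes):
    # crude magic-number sniffing, for illustration only
    if img_bytes.startswith(b"\xff\xd8\xff"): return "image/jpeg"
    if img_bytes.startswith(b"\x89PNG"): return "image/png"
    if img_bytes.startswith(b"GIF8"): return "image/gif"
    return "application/octet-stream"

def img_block(img_bytes):
    # build the Anthropic-style image content block shown above
    return {"type": "image",
            "source": {"type": "base64",
                       "media_type": guess_media_type(img_bytes),
                       "data": base64.b64encode(img_bytes).decode()}}
```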

#### anthropic

``` python
import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

r = client.messages.create(
model="claude-3-haiku-20240307",
max_tokens=1024,
messages=[mk_msg([img, "Describe the image"])]
)
print(r.content[0].text)
```

#### openai

``` python
import httpx
from msglm import mk_msg_openai as mk_msg
from openai import OpenAI

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

client = OpenAI()
r = client.chat.completions.create(
model="gpt-4o-mini",
messages=[mk_msg([img, "Describe the image"])]
)
print(r.choices[0].message.content)
```

### API Wrappers

To make your life a little easier, *msglm* comes with API-specific wrappers
for [`mk_msg`](https://AnswerDotAI.github.io/msglm/core.html#mk_msg) and
[`mk_msgs`](https://AnswerDotAI.github.io/msglm/core.html#mk_msgs).

For Anthropic use

``` python
from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs
```

For OpenAI use

``` python
from msglm import mk_msg_openai as mk_msg, mk_msgs_openai as mk_msgs
```
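
Because both wrappers share the same interface, the rest of your code stays provider-agnostic; only the import changes:

``` python
# swap providers by changing only this import
from msglm import mk_msg_openai as mk_msg
# from msglm import mk_msg_anthropic as mk_msg

msg = mk_msg("Hello, world!")  # identical call either way
```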

### Other use-cases

#### Prompt Caching

*msglm* supports [prompt
caching](https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching)
for Anthropic models. Simply pass *cache=True* to *mk_msg* or *mk_msgs*.

``` python
from msglm import mk_msg_anthropic as mk_msg

mk_msg("please cache my message", cache=True)
```

This generates the expected cache block below.

``` js
{
"role": "user",
"content": [
{"type": "text", "text": "Please cache my message", "cache_control": {"type": "ephemeral"}}
]
}
```
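
In practice you would cache something long, like a reference document, and then query against it. A minimal sketch, assuming the Anthropic SDK's *extra_headers* argument and the prompt-caching beta header that was current at the time of writing:

``` python
from anthropic import Anthropic
from msglm import mk_msg_anthropic as mk_msg

client = Anthropic()
big_doc = open("reference.txt").read()  # hypothetical long document worth caching

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},  # may be unnecessary on newer SDKs
    messages=[mk_msg(f"{big_doc}\n\nSummarise the document above", cache=True)]
)
print(r.content[0].text)
```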

#### Text only models

*msglm* supports text-only models, such as DeepSeek, that use the OpenAI
API format. Simply pass *text_only=True* to *mk_msg* or *mk_msgs*.

``` python
from msglm import mk_msg_openai as mk_msg

mk_msg("please format my text only prompt", text_only=True)
```

This generates the expected text-only message below.

``` js
{
"role": "user",
"content": "please format my text only prompt"
}
```
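
For instance, you can point the OpenAI client at DeepSeek's OpenAI-compatible endpoint (the base URL and model name below follow DeepSeek's docs; treat them as assumptions here):

``` python
from openai import OpenAI
from msglm import mk_msg_openai as mk_msg

client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY", base_url="https://api.deepseek.com")
r = client.chat.completions.create(
    model="deepseek-chat",
    messages=[mk_msg("please format my text only prompt", text_only=True)]
)
print(r.choices[0].message.content)
```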

### Summary

We hope *msglm* will make your life a little easier when chatting with
LLMs. To learn more about the package, please read this
[doc](./core.html).