diff --git a/00_core.ipynb b/00_core.ipynb
index 84ace39..cf1b971 100644
--- a/00_core.ipynb
+++ b/00_core.ipynb
@@ -58,7 +58,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "eaf89997",
+ "id": "18b892d5",
"metadata": {},
"outputs": [],
"source": [
@@ -78,7 +78,7 @@
},
{
"cell_type": "markdown",
- "id": "1f5aafff",
+ "id": "5251db7f",
"metadata": {},
"source": [
":::{.callout-tip}\n",
@@ -103,7 +103,7 @@
},
{
"cell_type": "markdown",
- "id": "896fdd9e",
+ "id": "a9be03ab",
"metadata": {},
"source": [
"These are the current versions of Anthropic's model at the time of writing."
@@ -166,7 +166,7 @@
},
{
"cell_type": "markdown",
- "id": "b0ad6fc9",
+ "id": "ea12a62a",
"metadata": {},
"source": [
":::{.callout-tip}\n",
@@ -184,8 +184,24 @@
"outputs": [
{
"data": {
+ "text/markdown": [
+ "- It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\n",
+ "\n",
+ "\n",
+ "\n",
+ "id: msg_01UTL4MnT7dYWULP2mEoZkDt\n",
+ "- content: [{'text': \"It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\", 'type': 'text'}]\n",
+ "- model: claude-3-haiku-20240307\n",
+ "- role: assistant\n",
+ "- stop_reason: end_turn\n",
+ "- stop_sequence: None\n",
+ "- type: message\n",
+ "- usage: {'input_tokens': 10, 'output_tokens': 54}\n",
+ "\n",
+ " "
+ ],
"text/plain": [
- "Message(id='msg_016cPzvCzt1nrLvGaQUCDBvC', content=[TextBlock(text=\"Nice to meet you, Jeremy! As an AI assistant, I don't have a specific identity, but I'm here to help you with any questions or tasks you might have. Please let me know if there's anything I can assist you with.\", type='text')], model='claude-3-haiku-20240307', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=Usage(input_tokens=10, output_tokens=53))"
+ "Message(id='msg_01UTL4MnT7dYWULP2mEoZkDt', content=[TextBlock(text=\"It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\", type='text')], model='claude-3-haiku-20240307', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=In: 10; Out: 54; Total: 64)"
]
},
"execution_count": null,
@@ -201,7 +217,7 @@
},
{
"cell_type": "markdown",
- "id": "5213058e",
+ "id": "443a1542",
"metadata": {},
"source": [
"### Formatting output"
@@ -218,7 +234,7 @@
},
{
"cell_type": "markdown",
- "id": "cf3386f6",
+ "id": "5a68b838",
"metadata": {},
"source": [
":::{.callout-tip}\n",
@@ -237,7 +253,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "10f6a359",
+ "id": "9a2269ac",
"metadata": {},
"outputs": [],
"source": [
@@ -249,7 +265,7 @@
},
{
"cell_type": "markdown",
- "id": "90d61490",
+ "id": "c2018448",
"metadata": {},
"source": [
"This makes it easier to grab the needed parts of Claude's responses, which can include multiple pieces of content. By default, we look for the first text block. That will generally have the content we want to display."
@@ -258,13 +274,13 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "c5c1497b",
+ "id": "4ae08004",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
- "TextBlock(text=\"Nice to meet you, Jeremy! As an AI assistant, I don't have a specific identity, but I'm here to help you with any questions or tasks you might have. Please let me know if there's anything I can assist you with.\", type='text')"
+ "TextBlock(text=\"It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\", type='text')"
]
},
"execution_count": null,
@@ -279,7 +295,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "9adbeae6",
+ "id": "94f488b4",
"metadata": {},
"outputs": [],
"source": [
@@ -293,7 +309,7 @@
},
{
"cell_type": "markdown",
- "id": "95dd0ac6",
+ "id": "48e549b2",
"metadata": {},
"source": [
"For display purposes, we often just want to show the text itself."
@@ -302,13 +318,13 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "f121b143",
+ "id": "85d4c5b9",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
- "\"Nice to meet you, Jeremy! As an AI assistant, I don't have a specific identity, but I'm here to help you with any questions or tasks you might have. Please let me know if there's anything I can assist you with.\""
+ "\"It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\""
]
},
"execution_count": null,
@@ -323,7 +339,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "b3160256",
+ "id": "b96cf803",
"metadata": {},
"outputs": [],
"source": [
@@ -335,14 +351,14 @@
"\n",
"\n",
"\n",
- "{det}\n",
+ "- {det}\n",
"\n",
" \"\"\""
]
},
{
"cell_type": "markdown",
- "id": "5aa6e93e",
+ "id": "5beb4b54",
"metadata": {},
"source": [
"Jupyter looks for a `_repr_markdown_` method in displayed objects; we add this in order to display just the content text, and collapse full details into a hideable section. Note that `patch` is from [fastcore](https://fastcore.fast.ai/), and is used to add (or replace) functionality in an existing class. We pass the class(es) that we want to patch as type annotations to `self`. In this case, `_repr_markdown_` is being added to Anthropic's `ToolsBetaMessage` and `Message` classes, so when we display the message now we just see the contents, and the details are hidden away in a collapsible details block."
@@ -351,29 +367,29 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "ac818aa1",
+ "id": "8f98a01f",
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
- "Nice to meet you, Jeremy! As an AI assistant, I don't have a specific identity, but I'm here to help you with any questions or tasks you might have. Please let me know if there's anything I can assist you with.\n",
+ "It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\n",
"\n",
"\n",
"\n",
- "id: msg_016cPzvCzt1nrLvGaQUCDBvC\n",
- "- content: [{'text': \"Nice to meet you, Jeremy! As an AI assistant, I don't have a specific identity, but I'm here to help you with any questions or tasks you might have. Please let me know if there's anything I can assist you with.\", 'type': 'text'}]\n",
+ "- id: msg_01UTL4MnT7dYWULP2mEoZkDt\n",
+ "- content: [{'text': \"It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\", 'type': 'text'}]\n",
"- model: claude-3-haiku-20240307\n",
"- role: assistant\n",
"- stop_reason: end_turn\n",
"- stop_sequence: None\n",
"- type: message\n",
- "- usage: {'input_tokens': 10, 'output_tokens': 53}\n",
+ "- usage: {'input_tokens': 10, 'output_tokens': 54}\n",
"\n",
" "
],
"text/plain": [
- "Message(id='msg_016cPzvCzt1nrLvGaQUCDBvC', content=[TextBlock(text=\"Nice to meet you, Jeremy! As an AI assistant, I don't have a specific identity, but I'm here to help you with any questions or tasks you might have. Please let me know if there's anything I can assist you with.\", type='text')], model='claude-3-haiku-20240307', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=Usage(input_tokens=10, output_tokens=53))"
+ "Message(id='msg_01UTL4MnT7dYWULP2mEoZkDt', content=[TextBlock(text=\"It's nice to meet you Jeremy! As an AI assistant, I don't have a personal identity, but I'm happy to chat and try my best to help you with any questions or tasks you may have. Please feel free to ask me anything.\", type='text')], model='claude-3-haiku-20240307', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=In: 10; Out: 54; Total: 64)"
]
},
"execution_count": null,
@@ -387,7 +403,7 @@
},
{
"cell_type": "markdown",
- "id": "02e66afb",
+ "id": "af3d8249",
"metadata": {},
"source": [
"One key part of the response is the `usage` key, which tells us how many tokens we used by returning a `Usage` object.\n",
@@ -398,7 +414,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "255a5c3e",
+ "id": "7ef9e575",
"metadata": {},
"outputs": [
{
@@ -419,7 +435,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "c94beb58",
+ "id": "ed34bcb0",
"metadata": {},
"outputs": [],
"source": [
@@ -431,7 +447,7 @@
},
{
"cell_type": "markdown",
- "id": "c36f772c",
+ "id": "9d872edc",
"metadata": {},
"source": [
"The constructor provided by Anthropic is rather verbose, so we clean it up a bit, using a lowercase version of the name."
@@ -440,7 +456,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "a6d68537",
+ "id": "1ff1ee30",
"metadata": {},
"outputs": [
{
@@ -461,7 +477,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "6cc0756d",
+ "id": "3d419171",
"metadata": {},
"outputs": [],
"source": [
@@ -472,7 +488,7 @@
},
{
"cell_type": "markdown",
- "id": "ee06e9ee",
+ "id": "3b512c1a",
"metadata": {},
"source": [
"Adding a `total` property to `Usage` makes it easier to see how many tokens we've used up altogether."
@@ -481,7 +497,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "bfe4eea3",
+ "id": "9e868527",
"metadata": {},
"outputs": [
{
@@ -502,7 +518,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "1cd7d600",
+ "id": "7101b6fa",
"metadata": {},
"outputs": [],
"source": [
@@ -513,7 +529,7 @@
},
{
"cell_type": "markdown",
- "id": "2d0c99bb",
+ "id": "6892b040",
"metadata": {},
"source": [
"In python, patching `__repr__` let's us change how an object is displayed."
@@ -522,7 +538,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "a518d83b",
+ "id": "7f5ea965",
"metadata": {},
"outputs": [
{
@@ -543,7 +559,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "b73f1ed0",
+ "id": "42fbdf6a",
"metadata": {},
"outputs": [],
"source": [
@@ -555,7 +571,7 @@
},
{
"cell_type": "markdown",
- "id": "b70b5fa8",
+ "id": "1c5b2297",
"metadata": {},
"source": [
"And, patching `__add__` let's make `+` work on a class."
@@ -564,7 +580,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "f437dcbb",
+ "id": "847c34ac",
"metadata": {},
"outputs": [
{
@@ -584,7 +600,7 @@
},
{
"cell_type": "markdown",
- "id": "1a95bdd5",
+ "id": "71cffe2b",
"metadata": {},
"source": [
"### Creating messages"
@@ -592,7 +608,7 @@
},
{
"cell_type": "markdown",
- "id": "6223a528",
+ "id": "22550354",
"metadata": {},
"source": [
"Creating correctly formatted `dict`s from scratch every time isn't very handy, so next up we'll add helpers for this."
@@ -619,7 +635,7 @@
},
{
"cell_type": "markdown",
- "id": "544feedd",
+ "id": "2aa3ad6c",
"metadata": {},
"source": [
":::{.callout-note}\n",
@@ -717,7 +733,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "ae6f8700",
+ "id": "0b20bb82",
"metadata": {},
"outputs": [],
"source": [
@@ -731,7 +747,7 @@
},
{
"cell_type": "markdown",
- "id": "f2dd964c",
+ "id": "0175f5de",
"metadata": {},
"source": [
":::{.callout-note}\n",
@@ -1687,7 +1703,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "3e152fc2",
+ "id": "8cb8602f",
"metadata": {},
"outputs": [],
"source": [
@@ -1701,7 +1717,7 @@
},
{
"cell_type": "markdown",
- "id": "96dea1de",
+ "id": "8a0e5a6d",
"metadata": {},
"source": [
"Claude supports adding an extra `assistant` message at the end, which contains the *prefill* -- i.e. the text we want Claude to assume the response starts with. However Claude doesn't actually repeat that in the response, so for convenience we'll add it."
@@ -1710,7 +1726,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "d7521df8",
+ "id": "f9f08eab",
"metadata": {},
"outputs": [],
"source": [
@@ -1759,7 +1775,7 @@
},
{
"cell_type": "markdown",
- "id": "a1be054a",
+ "id": "2390385e",
"metadata": {},
"source": [
"Let's try out prefill too:"
@@ -1768,7 +1784,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "49daabac",
+ "id": "ca827efa",
"metadata": {},
"outputs": [],
"source": [
@@ -1779,7 +1795,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "57080838",
+ "id": "d653be7a",
"metadata": {},
"outputs": [
{
@@ -1816,7 +1832,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "b7b5c0cc",
+ "id": "89ace13a",
"metadata": {},
"outputs": [],
"source": [
@@ -1835,7 +1851,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "fd472c0f",
+ "id": "bf8754f8",
"metadata": {},
"outputs": [
{
@@ -1854,7 +1870,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "bd94fe19",
+ "id": "59ee7154",
"metadata": {},
"outputs": [
{
@@ -1871,7 +1887,7 @@
},
{
"cell_type": "markdown",
- "id": "2551eafd",
+ "id": "2b373e29",
"metadata": {},
"source": [
"### Chat tool use"
@@ -2022,7 +2038,7 @@
},
{
"cell_type": "markdown",
- "id": "29e471ec",
+ "id": "905c9ed1",
"metadata": {},
"source": [
"Claude can handle image data as well. As everyone knows, when testing image APIs you have to use a cute puppy."
@@ -2031,7 +2047,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "4546517d",
+ "id": "a459bd90",
"metadata": {},
"outputs": [
{
@@ -2059,7 +2075,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "bf9811f4",
+ "id": "71f2df78",
"metadata": {},
"outputs": [],
"source": [
@@ -2069,7 +2085,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "1b7eccc4",
+ "id": "db2e60cf",
"metadata": {},
"outputs": [],
"source": [
@@ -2084,7 +2100,7 @@
},
{
"cell_type": "markdown",
- "id": "950a8ca1",
+ "id": "38ef607e",
"metadata": {},
"source": [
"Anthropic have documented the particular `dict` structure that expect image data to be in, so we have a little function to create that for us."
@@ -2093,7 +2109,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "382f6cd4",
+ "id": "a8303c9c",
"metadata": {},
"outputs": [],
"source": [
@@ -2105,7 +2121,7 @@
},
{
"cell_type": "markdown",
- "id": "8ba96473",
+ "id": "205b8e15",
"metadata": {},
"source": [
"A Claude message can be a list of image and text parts. So we've also created a helper for making the text parts."
@@ -2114,7 +2130,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "dc6b5b16",
+ "id": "2f12c74d",
"metadata": {},
"outputs": [],
"source": [
@@ -2125,7 +2141,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "bb19d1bc",
+ "id": "6cbc184e",
"metadata": {},
"outputs": [
{
@@ -2162,7 +2178,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "8aca197c",
+ "id": "aa8f3e30",
"metadata": {},
"outputs": [],
"source": [
@@ -2174,7 +2190,7 @@
},
{
"cell_type": "markdown",
- "id": "cc01aed2",
+ "id": "216d648c",
"metadata": {},
"source": [
"Since we know that `bytes` are images and `str` is text, we can autogenerate a part."
@@ -2183,7 +2199,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "3dfedca4",
+ "id": "80209768",
"metadata": {},
"outputs": [],
"source": [
@@ -2195,7 +2211,7 @@
},
{
"cell_type": "markdown",
- "id": "e5a7a343",
+ "id": "87903f03",
"metadata": {},
"source": [
"...and now we can use that on all the parts!\""
@@ -2204,7 +2220,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "81634431",
+ "id": "f9904cd8",
"metadata": {},
"outputs": [
{
@@ -2241,7 +2257,7 @@
},
{
"cell_type": "markdown",
- "id": "cd1652c4",
+ "id": "b4519c43",
"metadata": {},
"source": [
"Claude also supports uploading an image without any text, in which case it'll make a general comment about what it sees. You can then use `Chat` to ask questions:"
@@ -2250,7 +2266,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "468e19eb",
+ "id": "42b108f8",
"metadata": {},
"outputs": [
{
@@ -2288,7 +2304,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "66a9103f",
+ "id": "e1a03e6c",
"metadata": {},
"outputs": [
{
@@ -2325,7 +2341,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "5a4254bf",
+ "id": "48a3db50",
"metadata": {},
"outputs": [
{
@@ -2361,7 +2377,7 @@
},
{
"cell_type": "markdown",
- "id": "eb30b83c",
+ "id": "406bed25",
"metadata": {},
"source": [
"## XML helpers"
@@ -2369,7 +2385,7 @@
},
{
"cell_type": "markdown",
- "id": "b630360b",
+ "id": "b57cdb88",
"metadata": {},
"source": [
"TODO: Document this bit."
@@ -2378,7 +2394,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "f87de6db",
+ "id": "a9e70b81",
"metadata": {},
"outputs": [],
"source": [
@@ -2392,7 +2408,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "1b308f7e",
+ "id": "3050c10c",
"metadata": {},
"outputs": [],
"source": [
@@ -2414,7 +2430,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "b3ab9521",
+ "id": "db19300f",
"metadata": {},
"outputs": [],
"source": [
@@ -2428,7 +2444,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "42407477",
+ "id": "9b17a3bc",
"metadata": {},
"outputs": [],
"source": [
@@ -2441,7 +2457,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "afa1601d",
+ "id": "d6723992",
"metadata": {},
"outputs": [],
"source": [
@@ -2460,7 +2476,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "b3134ca4",
+ "id": "0c565713",
"metadata": {},
"outputs": [
{
@@ -2495,7 +2511,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "05bcc4d3",
+ "id": "5128d854",
"metadata": {},
"outputs": [],
"source": [
@@ -2517,7 +2533,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "e44dd9be",
+ "id": "44702a10",
"metadata": {},
"outputs": [
{
diff --git a/claudio/core.py b/claudio/core.py
index 0fe4fcf..09457b1 100644
--- a/claudio/core.py
+++ b/claudio/core.py
@@ -44,7 +44,7 @@ def _repr_markdown_(self:(ToolsBetaMessage,Message)):
-{det}
+- {det}
"""