Commit f464127

chore: merge main

William Bakst committed Oct 11, 2024
2 parents 979487c + 8dcb4db

Showing 360 changed files with 19,979 additions and 9,684 deletions.

.github/workflows/lint.yml (2 additions, 0 deletions)

@@ -5,9 +5,11 @@ on:
   push:
     branches:
       - main
+      - release/*
   pull_request:
     branches:
       - main
+      - release/*
 
 jobs:
   lints:

.github/workflows/tests.yml (2 additions, 0 deletions)

@@ -5,9 +5,11 @@ on:
   push:
     branches:
       - main
+      - release/*
   pull_request:
     branches:
       - main
+      - release/*
 
 jobs:
   tests:

README.md (1 addition, 1 deletion)

@@ -16,7 +16,7 @@
 
 ---
 
-Mirascope is a powerful, flexible, and user-friendly library that simplifies the process of working with LLMs through a unified interface that works across various supported providers, including [OpenAI](https://openai.com/), [Anthropic](https://www.anthropic.com/), [Mistral](https://mistral.ai/), [Gemini](https://gemini.google.com), [Groq](https://groq.com/), [Cohere](https://cohere.com/), [LiteLLM](https://www.litellm.ai/), [Azure AI](https://azure.microsoft.com/en-us/solutions/ai), and [Vertex AI](https://cloud.google.com/vertex-ai).
+Mirascope is a powerful, flexible, and user-friendly library that simplifies the process of working with LLMs through a unified interface that works across various supported providers, including [OpenAI](https://openai.com/), [Anthropic](https://www.anthropic.com/), [Mistral](https://mistral.ai/), [Gemini](https://gemini.google.com), [Groq](https://groq.com/), [Cohere](https://cohere.com/), [LiteLLM](https://www.litellm.ai/), [Azure AI](https://azure.microsoft.com/en-us/solutions/ai), [Vertex AI](https://cloud.google.com/vertex-ai), and [Bedrock](https://aws.amazon.com/bedrock/).
 
 Whether you're generating text, extracting structured information, or developing complex AI-driven agent systems, Mirascope provides the tools you need to streamline your development process and create powerful, robust applications.
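
For context, the unified interface the README sentence describes looks roughly like this for the newly added Bedrock provider. This is a minimal sketch, assuming the `bedrock` module follows the same decorator pattern as Mirascope's other providers; the model ID is illustrative, and AWS credentials are assumed to be configured:

```python
from mirascope.core import bedrock


@bedrock.call("anthropic.claude-3-haiku-20240307-v1:0")  # illustrative model ID
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.content)
```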

docs/WHY.md (2 additions, 0 deletions)

@@ -39,6 +39,8 @@ Let's compare structured outputs using Mirascope vs. the official SDKs:
 ```python hl_lines="26-46 51"
 {% elif provider == "Vertex AI" %}
 ```python hl_lines="23-62 67"
+{% elif provider == "Bedrock" %}
+```python hl_lines="17-48 53"
 {% else %}
 ```python hl_lines="18-39 44"
 {% endif %}
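
The structured-output comparison this hunk extends centers on `response_model`. A minimal sketch of the pattern, assuming the OpenAI provider; the model and field names are illustrative:

```python
from mirascope.core import openai
from pydantic import BaseModel


class Book(BaseModel):
    title: str
    author: str


@openai.call("gpt-4o-mini", response_model=Book)
def extract_book(text: str) -> str:
    return f"Extract the book from: {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
print(book)  # e.g. title='The Name of the Wind' author='Patrick Rothfuss'
```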

docs/api/core/bedrock/call.md (new file, 3 additions)

@@ -0,0 +1,3 @@
+# mirascope.core.bedrock.call
+
+::: mirascope.core.bedrock.call

docs/api/core/bedrock/call_params.md (new file, 3 additions)

@@ -0,0 +1,3 @@
+# mirascope.core.bedrock.call_params
+
+::: mirascope.core.bedrock.call_params

docs/api/core/bedrock/call_response.md (new file, 3 additions)

@@ -0,0 +1,3 @@
+# mirascope.core.bedrock.call_response
+
+::: mirascope.core.bedrock.call_response

docs/api/core/bedrock/call_response_chunk.md (new file, 3 additions)

@@ -0,0 +1,3 @@
+# mirascope.core.bedrock.call_response_chunk
+
+::: mirascope.core.bedrock.call_response_chunk

docs/api/core/bedrock/dynamic_config.md (new file, 3 additions)

@@ -0,0 +1,3 @@
+# mirascope.core.bedrock.dynamic_config
+
+::: mirascope.core.bedrock.dynamic_config

docs/api/core/bedrock/stream.md (new file, 3 additions)

@@ -0,0 +1,3 @@
+# mirascope.core.bedrock.stream
+
+::: mirascope.core.bedrock.stream

docs/api/core/bedrock/tool.md (new file, 3 additions)

@@ -0,0 +1,3 @@
+# mirascope.core.bedrock.tool
+
+::: mirascope.core.bedrock.tool

docs/extra/tweaks.css (7 additions, 5 deletions)

@@ -1,14 +1,16 @@
+/* Color Palette */
+:root {
+  --md-primary-fg-color: #6366f1;
+  --md-primary-fg-color--light: #6366f1;
+  --md-primary-fg-color--dark: #6366f1;
+}
+
 /*
  * Original credit for this admonition implementation goes to the team at Pydantic
  * https://github.com/pydantic/pydantic/blob/main/docs/extra/tweaks.css
  */
 
 /* API documentation link admonition */
-:root {
-  --md-primary-fg-color: #6366f1;
-  --md-primary-fg-color--light: #6366f1;
-  --md-primary-fg-color--dark: #6366f1;
-}
 
 :root {
   --md-admonition-icon--api: url('data:image/svg+xml;charset=utf-8,<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path d="M7 7H5a2 2 0 0 0-2 2v8h2v-4h2v4h2V9a2 2 0 0 0-2-2m0 4H5V9h2m7-2h-4v10h2v-4h2a2 2 0 0 0 2-2V9a2 2 0 0 0-2-2m0 4h-2V9h2m6 0v6h1v2h-4v-2h1V9h-1V7h4v2Z"/></svg>')

docs/index.md (3 additions, 1 deletion)

@@ -11,7 +11,7 @@
 
 LLM abstractions that aren't obstructions.
 
-Mirascope is a powerful, flexible, and user-friendly library that simplifies the process of working with LLMs through a unified interface that works across various supported providers, including [OpenAI](https://openai.com/), [Anthropic](https://www.anthropic.com/), [Mistral](https://mistral.ai/), [Gemini](https://gemini.google.com), [Groq](https://groq.com/), [Cohere](https://cohere.com/), [LiteLLM](https://www.litellm.ai/), [Azure AI](https://azure.microsoft.com/en-us/solutions/ai), and [Vertex AI](https://cloud.google.com/vertex-ai).
+Mirascope is a powerful, flexible, and user-friendly library that simplifies the process of working with LLMs through a unified interface that works across various supported providers, including [OpenAI](https://openai.com/), [Anthropic](https://www.anthropic.com/), [Mistral](https://mistral.ai/), [Gemini](https://gemini.google.com), [Groq](https://groq.com/), [Cohere](https://cohere.com/), [LiteLLM](https://www.litellm.ai/), [Azure AI](https://azure.microsoft.com/en-us/solutions/ai), [Vertex AI](https://cloud.google.com/vertex-ai), and [Bedrock](https://aws.amazon.com/bedrock/).
 
 Whether you're generating text, extracting structured information, or developing complex AI-driven agent systems, Mirascope provides the tools you need to streamline your development process and create powerful, robust applications.

@@ -49,6 +49,8 @@ Install Mirascope, specifying the provider(s) you intend to use, and set your AP
 {% elif provider == "Vertex AI" %}
 gcloud init
 gcloud auth application-default login
+{% elif provider == "Bedrock" %}
+aws configure
 {% else %}
 {% if os == "Windows" %}set {{ upper(provider | provider_dir) }}_API_KEY=XXXXX
 {% else %}export {{ upper(provider | provider_dir) }}_API_KEY=XXXXX

docs/learn/async.md (2 additions, 0 deletions)

@@ -175,6 +175,8 @@ It's important to note that you must use the correct client that supports asynch
 ```python hl_lines="2 5"
 {% elif provider == "Azure AI" %}
 ```python hl_lines="1-2 8-10"
+{% elif provider == "Bedrock" %}
+```python hl_lines="7-10 14"
 {% else %}
 ```python hl_lines="1 5"
 {% endif %}
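
The async docs stress using a client that supports asynchronous calls; the Bedrock `hl_lines` above suggest its example constructs such a client across several lines. A minimal sketch of the decorator-level async pattern (model ID illustrative; custom async client construction omitted):

```python
import asyncio

from mirascope.core import bedrock


@bedrock.call("anthropic.claude-3-haiku-20240307-v1:0")  # illustrative model ID
async def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


async def main() -> None:
    # Awaiting the decorated async function runs the call asynchronously
    response = await recommend_book("fantasy")
    print(response.content)


asyncio.run(main())
```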

docs/learn/calls.md (2 additions, 0 deletions)

@@ -303,6 +303,8 @@ To use a custom client, you can pass it to the `call` decorator using the `clien
 ```python hl_lines="2 5"
 {% elif provider == "Azure AI" %}
 ```python hl_lines="1-2 8-10"
+{% elif provider == "Bedrock" %}
+```python hl_lines="1 6"
 {% else %}
 ```python hl_lines="1 5"
 {% endif %}
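
For the custom-client pattern this hunk extends, a sketch of what the Bedrock variant presumably looks like, assuming the decorator's `client` parameter accepts a boto3 `bedrock-runtime` client (the region and model ID are illustrative):

```python
import boto3

from mirascope.core import bedrock


@bedrock.call(
    "anthropic.claude-3-haiku-20240307-v1:0",  # illustrative model ID
    client=boto3.client("bedrock-runtime", region_name="us-east-1"),
)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"
```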

docs/learn/output_parsers.md (8 additions, 1 deletion)

@@ -27,12 +27,19 @@ Let's take a look at a basic example:
 
 {% for provider in supported_llm_providers %}
 === "{{ provider }}"
 
 {% if method == "base_message_param" %}
+{% if provider == "Bedrock" %}
+```python hl_lines="10 22"
+{% else %}
 ```python hl_lines="9 20"
+{% endif %}
 {% else %}
+{% if provider == "Bedrock" %}
+```python hl_lines="10 17"
+{% else %}
 ```python hl_lines="9 15"
+{% endif %}
 {% endif %}
 --8<-- "examples/learn/output_parsers/basic_usage/{{ provider | provider_dir }}/{{ method }}.py"
 ```
 {% endfor %}
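
The basic usage these tabs point at passes an `output_parser` to the call decorator, which post-processes the raw response. A minimal sketch assuming the OpenAI provider; the parsing logic is illustrative:

```python
from mirascope.core import openai


def parse_title(response: openai.OpenAICallResponse) -> str:
    # Naive parser: take whatever follows "Title:" in the raw content
    return response.content.split("Title:")[-1].strip()


@openai.call("gpt-4o-mini", output_parser=parse_title)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book. Reply in the form 'Title: <title>'"


print(recommend_book("fantasy"))
```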

docs/learn/response_models.md (31 additions, 2 deletions)

@@ -190,8 +190,11 @@ By default, `response_model` uses [Tools](./tools.md) under the hood. You can in
 
 {% for provider in supported_llm_providers %}
 === "{{ provider }}"
 
+{% if provider == "Bedrock" %}
+```python hl_lines="13"
+{% else %}
 ```python hl_lines="12"
+{% endif %}
 --8<-- "examples/learn/response_models/json_mode/{{ provider | provider_dir }}/{{ method }}.py"
 ```
 {% endfor %}
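
The hunk above covers the JSON-mode variant of each provider's example. A sketch of combining `json_mode=True` with `response_model`, assuming the OpenAI provider and illustrative names:

```python
from mirascope.core import openai
from pydantic import BaseModel


class Book(BaseModel):
    title: str
    author: str


@openai.call("gpt-4o-mini", response_model=Book, json_mode=True)
def extract_book(text: str) -> str:
    return f"Extract the book from: {text}"


print(extract_book("The Name of the Wind by Patrick Rothfuss"))
```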

@@ -238,8 +241,11 @@ If you set `stream=True` when `response_model` is set, your LLM call will return
 
 {% for provider in supported_llm_providers %}
 === "{{ provider }}"
 
+{% if provider == "Bedrock" %}
+```python hl_lines="11 18-19"
+{% else %}
 ```python hl_lines="10 16-17"
+{% endif %}
 --8<-- "examples/learn/response_models/streaming/{{ provider | provider_dir }}/{{ method }}.py"
 ```
 {% endfor %}
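
Per the surrounding docs, setting `stream=True` together with `response_model` yields a stream of partial model instances. A minimal sketch under the same assumptions as above:

```python
from mirascope.core import openai
from pydantic import BaseModel


class Book(BaseModel):
    title: str
    author: str


@openai.call("gpt-4o-mini", response_model=Book, stream=True)
def extract_book(text: str) -> str:
    return f"Extract the book from: {text}"


for partial_book in extract_book("The Name of the Wind by Patrick Rothfuss"):
    print(partial_book)  # fields fill in as chunks arrive
```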

@@ -250,6 +256,29 @@ Once exhausted, you can access the final, full response model through the `const
 
 You can also use the `stream` property to access the `BaseStream` instance and [all of its properties](./streams.md#common-stream-properties-and-methods).
 
+## FromCallArgs
+
+Fields annotated with `FromCallArgs` will be populated with the corresponding argument from the function call rather than expecting it from the LLM's response. This enables seamless validation of LLM outputs against function inputs:
+
+!!! mira "Mirascope"
+
+{% for method, method_title in zip(prompt_writing_methods, prompt_writing_method_titles) %}
+=== "{{ method_title }}"
+
+{% for provider in supported_llm_providers %}
+=== "{{ provider }}"
+
+{% if method == "string_template" %}
+```python hl_lines="14 26"
+{% else %}
+```python hl_lines="14 25"
+{% endif %}
+--8<-- "examples/learn/response_models/from_call_args/{{ provider | provider_dir }}/{{ method }}.py"
+```
+{% endfor %}
+
+{% endfor %}
+
 ## Next Steps
 
 By following these best practices and leveraging Response Models effectively, you can create more robust, type-safe, and maintainable LLM-powered applications with Mirascope.
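
A sketch of the `FromCallArgs` feature this hunk introduces, assuming the annotation is importable from `mirascope.core` and works with the standard decorator pattern (names are illustrative). The annotated field is filled from the function argument rather than generated by the LLM, so it can anchor validation of the output against the input:

```python
from typing import Annotated

from mirascope.core import FromCallArgs, openai
from pydantic import BaseModel


class BookSummary(BaseModel):
    title: Annotated[str, FromCallArgs()]  # populated from the call argument
    summary: str


@openai.call("gpt-4o-mini", response_model=BookSummary)
def summarize_book(title: str) -> str:
    return f"Summarize the book {title}"


result = summarize_book("The Name of the Wind")
assert result.title == "The Name of the Wind"  # came from the argument, not the LLM
```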

docs/learn/tools.md (8 additions, 0 deletions)

@@ -223,10 +223,18 @@ Mirascope supports streaming responses with tools, which is useful for long-runn
 === "{{ provider }}"
 
 {% if tool_method == "function" %}
+{% if provider == "Bedrock" %}
+```python hl_lines="19 26-28"
+{% else %}
 ```python hl_lines="18 24-26"
+{% endif %}
 {% else %}
+{% if provider == "Bedrock" %}
+```python hl_lines="20 27-29"
+{% else %}
 ```python hl_lines="19 25-27"
+{% endif %}
 {% endif %}
 --8<-- "examples/learn/tools/streams/{{ provider | provider_dir }}/{{ tool_method }}/{{ method }}.py"
 ```
 {% endfor %}
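
The streaming-tools pattern these tabs document interleaves content chunks with tool calls. A minimal sketch, assuming the OpenAI provider and that the stream yields `(chunk, tool)` pairs as in Mirascope's streaming docs (the tool logic is illustrative):

```python
from mirascope.core import openai


def get_weather(city: str) -> str:
    """Returns the (mock) current weather for a city."""
    return f"It is sunny in {city}"


@openai.call("gpt-4o-mini", tools=[get_weather], stream=True)
def forecast(city: str) -> str:
    return f"What's the weather in {city}?"


for chunk, tool in forecast("Tokyo"):
    if tool:
        print(tool.call())  # execute the tool the model requested
    else:
        print(chunk.content, end="", flush=True)
```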

(Diffs for the remaining changed files are not shown.)