From 4de2ca0a5cd045c2f00a5e93bfcd6e7ac74dc6bb Mon Sep 17 00:00:00 2001
From: Collin Dutter
Date: Tue, 3 Sep 2024 15:37:37 -0700
Subject: [PATCH] Better wording

---
 CHANGELOG.md | 11 ++++++-----
 MIGRATION.md |  2 +-
 README.md    |  3 ++-
 3 files changed, 9 insertions(+), 7 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index b2f3e5a9b..e7d833612 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -8,28 +8,29 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [0.31.0] - 2024-09-03
 
+**Note**: This release includes breaking changes. Please refer to the [Migration Guide](./MIGRATION.md#030x-to-031x) for details.
+
 ### Added
 
 - Parameter `meta: dict` on `BaseEvent`.
 
 ### Changed
 
-- **BREAKING**: Drivers, Loaders, and Engines will now raises exceptions rather than returning `ErrorArtifacts`.
+- **BREAKING**: Drivers, Loaders, and Engines now raise exceptions rather than returning `ErrorArtifacts`.
 - **BREAKING**: Parameter `driver` on `BaseConversationMemory` renamed to `conversation_memory_driver`.
 - **BREAKING**: `BaseConversationMemory.add_to_prompt_stack` now takes a `prompt_driver` parameter.
-- **BREAKING**: `BaseConversationMemoryDriver.load` now returns `tuple[list[Run], Optional[dict]]`.
-- **BREAKING**: `BaseConversationMemoryDriver.store` now takes `runs: list[Run]` and `metadata: Optional[dict]` as input.
+- **BREAKING**: `BaseConversationMemoryDriver.load` now returns `tuple[list[Run], dict]`. This represents the runs and metadata.
+- **BREAKING**: `BaseConversationMemoryDriver.store` now takes `runs: list[Run]` and `metadata: dict` as input.
 - **BREAKING**: Parameter `file_path` on `LocalConversationMemoryDriver` renamed to `persist_file` and is now type `Optional[str]`.
 - `Defaults.drivers_config.conversation_memory_driver` now defaults to `LocalConversationMemoryDriver` instead of `None`.
 - `CsvRowArtifact.to_text()` now includes the header.
 
 ### Fixed
 
-- Parsing streaming response with some OpenAi compatible services.
+- Parsing streaming response with some OpenAI compatible services.
 - Issue in `PromptSummaryEngine` if there are no artifacts during recursive summarization.
 - Issue in `GooglePromptDriver` using Tools with no schema.
 - Missing `maxTokens` inference parameter in `AmazonBedrockPromptDriver`.
 - Incorrect model in `OpenAiDriverConfig`'s `text_to_speech_driver`.
 - Crash when using `CohereRerankDriver` with `CsvRowArtifact`s.
 
-**Note**: This release includes breaking changes. Please refer to the [Migration Guide](./MIGRATION.md#030x-to-031x) for details.
 
 ## [0.30.2] - 2024-08-26
diff --git a/MIGRATION.md b/MIGRATION.md
index 89ba95494..af8835e5b 100644
--- a/MIGRATION.md
+++ b/MIGRATION.md
@@ -6,7 +6,7 @@ This document provides instructions for migrating your codebase to accommodate b
 
 ### Exceptions Over `ErrorArtifact`s
 
-Drivers, Loaders, and Engines will now raises exceptions rather than returning `ErrorArtifact`s.
+Drivers, Loaders, and Engines now raise exceptions rather than returning `ErrorArtifact`s.
 Update any logic that expects `ErrorArtifact` to handle exceptions instead.
 
 #### Before
diff --git a/README.md b/README.md
index c127a6077..95f6326dd 100644
--- a/README.md
+++ b/README.md
@@ -170,7 +170,8 @@ The important thing to note here is that no matter how big the webpage is it can
 In the above example, we set [off_prompt](https://docs.griptape.ai/stable/griptape-framework/structures/task-memory.md#off-prompt) to `True`, which means that the LLM can never see the data it manipulates, but can send it to other Tools.
 
 > [!IMPORTANT]
-> This example uses Griptape's [ToolkitTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#toolkit-task), which requires a highly capable LLM to function correctly. If you're using a less powerful LLM, consider using the [ToolTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#tool-task) instead, as the `ToolkitTask` might not work properly or at all.
+> This example uses Griptape's [ToolkitTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#toolkit-task), which requires a highly capable LLM to function correctly. By default, Griptape uses the [OpenAiChatPromptDriver](https://docs.griptape.ai/stable/griptape-framework/drivers/prompt-drivers/#openai-chat); for another powerful LLM, try swapping to the [AnthropicPromptDriver](https://docs.griptape.ai/stable/griptape-framework/drivers/prompt-drivers/#anthropic)!
+If you're using a less powerful LLM, consider using the [ToolTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#tool-task) instead, as the `ToolkitTask` might not work properly or at all.
 
 [Check out our docs](https://docs.griptape.ai/stable/griptape-framework/drivers/prompt-drivers/) to learn more about how to use Griptape with other LLM providers like Anthropic, Claude, Hugging Face, and Azure.
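
The central breaking change in this patch — Drivers, Loaders, and Engines raising exceptions instead of returning `ErrorArtifact`s — can be sketched as follows. This is an illustrative before/after, not Griptape's actual code: the loader classes, the `ValueError` choice, and the simplified `ErrorArtifact`/`TextArtifact` definitions are all stand-ins.

```python
# Simplified stand-ins for illustration only; Griptape's real artifact and
# loader classes are richer than this.

class TextArtifact:
    def __init__(self, value: str) -> None:
        self.value = value

class ErrorArtifact:
    def __init__(self, value: str) -> None:
        self.value = value

class OldStyleLoader:
    """Pre-0.31 style: failures come back as ErrorArtifacts."""
    def load(self, source: str):
        if not source:
            return ErrorArtifact("empty source")
        return TextArtifact(source)

class NewStyleLoader:
    """0.31+ style: failures raise exceptions instead."""
    def load(self, source: str) -> TextArtifact:
        if not source:
            raise ValueError("empty source")
        return TextArtifact(source)

# Before: callers type-check the returned value.
result = OldStyleLoader().load("")
if isinstance(result, ErrorArtifact):
    message = f"error: {result.value}"

# After: callers handle an exception instead.
try:
    NewStyleLoader().load("")
except ValueError as e:
    message_new = f"error: {e}"
```

The upside of the new style is that errors can no longer be silently passed along as ordinary artifacts; a forgotten error check becomes a loud failure rather than corrupted downstream data.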
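
The changelog also reshapes the conversation-memory driver contract: `store` now takes `runs: list[Run]` and `metadata: dict`, and `load` returns a `(runs, metadata)` tuple. A minimal sketch of that shape, assuming a toy in-memory driver — the `Run` dataclass and driver here are hypothetical stand-ins, not Griptape's actual `BaseConversationMemoryDriver` implementations:

```python
from dataclasses import dataclass, field

@dataclass
class Run:
    # Stand-in for a single conversation run (input/output pair).
    input: str
    output: str

@dataclass
class InMemoryConversationMemoryDriver:
    """Toy driver following the 0.31-style store/load contract."""
    runs: list = field(default_factory=list)
    metadata: dict = field(default_factory=dict)

    def store(self, runs: list, metadata: dict) -> None:
        # 0.31+: runs and metadata are passed in separately.
        self.runs = list(runs)
        self.metadata = dict(metadata)

    def load(self) -> tuple:
        # 0.31+: load returns both the runs and the metadata.
        return self.runs, self.metadata

driver = InMemoryConversationMemoryDriver()
driver.store([Run("hi", "hello")], {"session": "abc"})
runs, metadata = driver.load()
```

Callers migrating from 0.30.x should unpack both values from `load` rather than expecting a bare list of runs.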