
Commit

Better wording
collindutter committed Sep 3, 2024
1 parent f785172 commit 4de2ca0
Showing 3 changed files with 9 additions and 7 deletions.
11 changes: 6 additions & 5 deletions CHANGELOG.md
@@ -8,28 +8,29 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [0.31.0] - 2024-09-03

**Note**: This release includes breaking changes. Please refer to the [Migration Guide](./MIGRATION.md#030x-to-031x) for details.

### Added
- Parameter `meta: dict` on `BaseEvent`.

### Changed
- **BREAKING**: Drivers, Loaders, and Engines will now raises exceptions rather than returning `ErrorArtifacts`.
- **BREAKING**: Drivers, Loaders, and Engines now raise exceptions rather than returning `ErrorArtifacts`.
- **BREAKING**: Parameter `driver` on `BaseConversationMemory` renamed to `conversation_memory_driver`.
- **BREAKING**: `BaseConversationMemory.add_to_prompt_stack` now takes a `prompt_driver` parameter.
- **BREAKING**: `BaseConversationMemoryDriver.load` now returns `tuple[list[Run], Optional[dict]]`.
- **BREAKING**: `BaseConversationMemoryDriver.store` now takes `runs: list[Run]` and `metadata: Optional[dict]` as input.
- **BREAKING**: `BaseConversationMemoryDriver.load` now returns `tuple[list[Run], dict]`. This represents the runs and metadata.
- **BREAKING**: `BaseConversationMemoryDriver.store` now takes `runs: list[Run]` and `metadata: dict` as input.
- **BREAKING**: Parameter `file_path` on `LocalConversationMemoryDriver` renamed to `persist_file` and is now type `Optional[str]`.
- `Defaults.drivers_config.conversation_memory_driver` now defaults to `LocalConversationMemoryDriver` instead of `None`.
- `CsvRowArtifact.to_text()` now includes the header.

### Fixed
- Parsing streaming response with some OpenAi compatible services.
- Parsing streaming response with some OpenAI compatible services.
- Issue in `PromptSummaryEngine` if there are no artifacts during recursive summarization.
- Issue in `GooglePromptDriver` using Tools with no schema.
- Missing `maxTokens` inference parameter in `AmazonBedrockPromptDriver`.
- Incorrect model in `OpenAiDriverConfig`'s `text_to_speech_driver`.
- Crash when using `CohereRerankDriver` with `CsvRowArtifact`s.

**Note**: This release includes breaking changes. Please refer to the [Migration Guide](./MIGRATION.md#030x-to-031x) for details.

## [0.30.2] - 2024-08-26

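The `BaseConversationMemoryDriver` entries above describe the updated load/store contract. A minimal sketch of a custom driver matching those signatures follows; the class name, fields, and import paths are assumptions for illustration, not part of this commit:

```python
from __future__ import annotations

from attrs import define, field

from griptape.drivers import BaseConversationMemoryDriver  # import path assumed
from griptape.memory.structure import Run  # import path assumed


@define
class DictConversationMemoryDriver(BaseConversationMemoryDriver):
    """Illustrative driver that keeps runs and metadata in memory."""

    runs: list[Run] = field(factory=list)
    metadata: dict = field(factory=dict)

    def store(self, runs: list[Run], metadata: dict) -> None:
        # Updated contract: receives the full list of runs plus conversation metadata.
        self.runs = list(runs)
        self.metadata = dict(metadata)

    def load(self) -> tuple[list[Run], dict]:
        # Updated contract: returns the runs together with their metadata.
        return self.runs, self.metadata
```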
2 changes: 1 addition & 1 deletion MIGRATION.md
@@ -6,7 +6,7 @@ This document provides instructions for migrating your codebase to accommodate b

### Exceptions Over `ErrorArtifact`s

Drivers, Loaders, and Engines will now raises exceptions rather than returning `ErrorArtifact`s.
Drivers, Loaders, and Engines now raise exceptions rather than returning `ErrorArtifact`s.
Update any logic that expects `ErrorArtifact` to handle exceptions instead.

#### Before
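A minimal sketch of that migration, assuming a hypothetical `WebLoader().load()` call; the loader choice and the exception types you catch are illustrative:

```python
from griptape.loaders import WebLoader  # loader choice is illustrative

# Before (0.30.x): failures came back as an ErrorArtifact to inspect.
# artifact = WebLoader().load("https://example.com")
# if isinstance(artifact, ErrorArtifact):
#     handle_error(artifact)

# After (0.31.0): failures are raised, so wrap the call instead.
try:
    artifact = WebLoader().load("https://example.com")
except Exception as e:  # catch the specific exception types your code expects
    print(f"Failed to load page: {e}")
```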
3 changes: 2 additions & 1 deletion README.md
@@ -170,7 +170,8 @@ The important thing to note here is that no matter how big the webpage is it can
In the above example, we set [off_prompt](https://docs.griptape.ai/stable/griptape-framework/structures/task-memory.md#off-prompt) to `True`, which means that the LLM can never see the data it manipulates, but can send it to other Tools.

> [!IMPORTANT]
> This example uses Griptape's [ToolkitTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#toolkit-task), which requires a highly capable LLM to function correctly. If you're using a less powerful LLM, consider using the [ToolTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#tool-task) instead, as the `ToolkitTask` might not work properly or at all.
> This example uses Griptape's [ToolkitTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#toolkit-task), which requires a highly capable LLM to function correctly. By default, Griptape uses the [OpenAiChatPromptDriver](https://docs.griptape.ai/stable/griptape-framework/drivers/prompt-drivers/#openai-chat); for another powerful LLM try swapping to the [AnthropicPromptDriver](https://docs.griptape.ai/stable/griptape-framework/drivers/prompt-drivers/#anthropic)!
If you're using a less powerful LLM, consider using the [ToolTask](https://docs.griptape.ai/stable/griptape-framework/structures/tasks/#tool-task) instead, as the `ToolkitTask` might not work properly or at all.

[Check out our docs](https://docs.griptape.ai/stable/griptape-framework/drivers/prompt-drivers/) to learn more about how to use Griptape with other LLM providers like Anthropic, Claude, Hugging Face, and Azure.
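As a rough sketch of swapping providers (assuming `AnthropicPromptDriver` is importable from `griptape.drivers`, that `Agent` accepts a `prompt_driver` argument, and that the model name is still current):

```python
from griptape.drivers import AnthropicPromptDriver  # import path assumed
from griptape.structures import Agent

# Swap the default OpenAI driver for Anthropic; the model name is illustrative.
agent = Agent(
    prompt_driver=AnthropicPromptDriver(model="claude-3-5-sonnet-20240620"),
)
agent.run("Summarize the Griptape README in one sentence.")
```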


