Fix missing maxTokens in AmazonBedrockPromptDriver
collindutter committed Aug 29, 2024
1 parent 49fb104 commit c9c7a1d
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 1 addition & 0 deletions CHANGELOG.md

@@ -21,6 +21,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Parsing streaming response with some OpenAi compatible services.
 - Issue in `PromptSummaryEngine` if there are no artifacts during recursive summarization.
 - Issue in `GooglePromptDriver` using Tools with no schema.
+- Missing `maxTokens` inference parameter in `AmazonBedrockPromptDriver`.

 **Note**: This release includes breaking changes. Please refer to the [Migration Guide](./MIGRATION.md#030x-to-031x) for details.

2 changes: 1 addition & 1 deletion griptape/drivers/prompt/amazon_bedrock_prompt_driver.py

@@ -98,7 +98,7 @@ def _base_params(self, prompt_stack: PromptStack) -> dict:
             "modelId": self.model,
             "messages": messages,
             "system": system_messages,
-            "inferenceConfig": {"temperature": self.temperature},
+            "inferenceConfig": {"temperature": self.temperature, "maxTokens": self.max_tokens},
             "additionalModelRequestFields": self.additional_model_request_fields,
             **(
                 {"toolConfig": {"tools": self.__to_bedrock_tools(prompt_stack.tools), "toolChoice": self.tool_choice}}
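The effect of the one-line fix can be sketched in isolation: a hypothetical standalone `base_params` helper (illustrative only, not Griptape's actual method) that builds a Bedrock Converse-style request body, now forwarding the max-token limit as `maxTokens` alongside `temperature` instead of silently dropping it.

```python
# Hypothetical sketch of the fixed request-body construction; the
# function name and parameters are illustrative, not Griptape's API.

def base_params(model: str, messages: list, temperature: float, max_tokens: int) -> dict:
    """Build a Bedrock Converse-style request body."""
    return {
        "modelId": model,
        "messages": messages,
        # Before the fix only temperature was forwarded and the
        # configured token limit never reached the request; now both
        # inference parameters are included.
        "inferenceConfig": {"temperature": temperature, "maxTokens": max_tokens},
    }

params = base_params("anthropic.claude-3-sonnet-20240229-v1:0", [], 0.1, 1024)
print(params["inferenceConfig"])  # {'temperature': 0.1, 'maxTokens': 1024}
```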
