fix: tool api call broken when user answers no when asked for confirmation #371
base: master
Conversation
@@ -235,9 +235,6 @@ def step(
if msg_response:
yield msg_response.replace(quiet=True)
yield from execute_msg(msg_response, confirm)
except KeyboardInterrupt:
This is now handled in `execute_msg`, as we need the `call_id` in the message response.
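For illustration, here is a minimal sketch of the idea, assuming a simplified stand-in for gptme's `Message` and a `ToolUse`-like object with `execute()` and `call_id`; this is not the actual implementation from the PR:

```python
from dataclasses import dataclass, replace as dc_replace

@dataclass(frozen=True)
class Message:
    # simplified stand-in for gptme's Message; only the fields used here
    role: str
    content: str
    call_id: str | None = None

    def replace(self, **kwargs) -> "Message":
        return dc_replace(self, **kwargs)

def execute_tooluse(tooluse, confirm):
    """Sketch: catch Ctrl+c where the ToolUse is in scope, so the interruption
    message can still carry the call_id of the tool call it answers."""
    try:
        yield from tooluse.execute(confirm)
    except KeyboardInterrupt:
        yield Message("system", "Interrupted by user", call_id=tooluse.call_id)
```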
@@ -290,10 +290,9 @@ def execute(self, confirm: ConfirmFunc) -> Generator[Message, None, None]:
confirm,
)
if isinstance(ex, Generator):
for msg in ex:
yield msg.replace(call_id=self.call_id)
yield from ex
`call_id` attribution moved to `execute_msg`.
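Put differently, the per-message `replace(call_id=...)` now lives with the caller rather than inside `ToolUse.execute`. A hypothetical sketch of that caller-side attribution (the helper name is made up; only `execute`, `replace`, and `call_id` come from the diff above):

```python
def attribute_call_id(tooluse, confirm):
    # Sketch: ToolUse.execute simply yields the tool's raw replies
    # (`yield from ex`); the caller stamps each reply with the call_id
    # so downstream API conversion can match the result to its tool call.
    for reply in tooluse.execute(confirm):
        yield reply.replace(call_id=tooluse.call_id)
```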
👍 Looks good to me! Reviewed everything up to 2d25236 in 1 minute and 12 seconds
More details
- Looked at 223 lines of code in 7 files
- Skipped 0 files when reviewing.
- Skipped posting 2 drafted comments based on config settings.
1. gptme/llm/llm_anthropic.py:203
- Draft comment: The change from `message["role"] == "system"` to `message["role"] == "user"` seems incorrect. The original condition was likely correct as it was handling system messages with `call_id`. Please verify the intent of this change.
- Reason this comment was not posted: Comment did not seem useful.
2. gptme/llm/llm_openai.py:287
- Draft comment: The function `_merge_tool_results_with_same_call_id` is defined but not used in this file. Consider integrating it where necessary or removing it if not needed.
- Reason this comment was not posted: Comment looked like it was already resolved.
Workflow ID: wflow_Iq0Ct3a5VFKQWQwU
You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
This MR fixes a few bugs regarding tool calls in the tool format, such as the tool call breaking when using `Ctrl+c` to interrupt gptme. The underlying problems were that sometimes the `call_id` wasn't set in certain situations, or multiple messages were returned by the `tooluse.execute` call but all of them were the response from the tool, so we needed to merge them.
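As an illustration of the merging described above, here is a minimal, hypothetical sketch of what a helper like `_merge_tool_results_with_same_call_id` could do; the `Message` stand-in and the joining strategy are assumptions, not the code from this PR:

```python
from dataclasses import dataclass
from itertools import groupby

@dataclass(frozen=True)
class Message:
    # simplified stand-in for gptme's Message; only the fields used here
    role: str
    content: str
    call_id: str | None = None

def merge_tool_results_with_same_call_id(messages: list[Message]) -> list[Message]:
    """Collapse consecutive tool-result messages sharing a call_id into one,
    since the provider APIs expect exactly one result per tool call."""
    merged: list[Message] = []
    for (role, call_id), group in groupby(messages, key=lambda m: (m.role, m.call_id)):
        group = list(group)
        if call_id is not None and len(group) > 1:
            merged.append(Message(role, "\n\n".join(m.content for m in group), call_id))
        else:
            merged.extend(group)
    return merged

# Example: two replies from the same tool call become one API-visible result.
msgs = [
    Message("user", "stdout: hello", call_id="call_1"),
    Message("user", "return code: 0", call_id="call_1"),
]
assert len(merge_tool_results_with_same_call_id(msgs)) == 1
```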
Important
Fixes tool API call issues by ensuring `call_id` is set and merging tool responses, with updates to message handling and tests.
- Fixes tool call handling when interrupting with `Ctrl+c`.
- `call_id` is set correctly in `execute_msg()` in `tools/__init__.py` and `ToolUse.execute()` in `tools/base.py`.
- Merges tool results with the same `call_id` in `_merge_tool_results_with_same_call_id()` in `llm_openai.py` and `llm_anthropic.py`.
- Updates `_handle_tools()` in `llm_anthropic.py` and `llm_openai.py` to handle user role messages with `call_id` (see the sketch below).
- Updates `_transform_system_messages()` in `llm_anthropic.py` to merge consecutive user messages with the same `call_id`.
- Updates `test_llm_anthropic.py` and `test_llm_openai.py` to verify message conversion and tool handling with `call_id`.

This description was created by Ellipsis for 2d25236. It will automatically update as commits are pushed.
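To illustrate the `_handle_tools()` change mentioned above, here is a minimal, hypothetical sketch of converting a user-role message that carries a `call_id` into the tool-result shape the OpenAI chat completions API expects; the stand-in `Message` and the exact conversion are assumptions, not the code from this PR:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Message:
    # simplified stand-in for gptme's Message; only the fields used here
    role: str
    content: str
    call_id: str | None = None

def handle_tools(messages: list[Message]) -> list[dict]:
    """Sketch: user messages that carry a call_id are tool results, so they
    are emitted as role "tool" entries referencing the originating call."""
    out: list[dict] = []
    for msg in messages:
        if msg.role == "user" and msg.call_id is not None:
            out.append({
                "role": "tool",
                "tool_call_id": msg.call_id,
                "content": msg.content,
            })
        else:
            out.append({"role": msg.role, "content": msg.content})
    return out

# Example: the tool's output is attributed to the assistant's tool call.
print(handle_tools([Message("user", "stdout: hello", call_id="call_1")]))
```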