Describe the bug
When using bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0, the interpreter crashes if two consecutive user messages are sent.
Reproduce
▌ Model set to bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
Open Interpreter will require approval before running code.
Use interpreter -y to bypass this.
Press CTRL-C to exit.
What is the current directory?
I'll help you check the current directory using a shell command.
pwd
Would you like to run this code? (y/n)
n
Hi!
21:33:46 - LiteLLM:WARNING: factory.py:2485 - Potential consecutive user/tool blocks. Trying to merge. If error occurs, please set a 'assistant_continue_message' or set 'modify_params=True' to insert a dummy assistant message for bedrock calls.
21:33:46 - LiteLLM:WARNING: factory.py:2485 - Potential consecutive user/tool blocks. Trying to merge. If error occurs, please set a 'assistant_continue_message' or set 'modify_params=True' to insert a dummy assistant message for bedrock calls.
21:33:46 - LiteLLM:WARNING: factory.py:2485 - Potential consecutive user/tool blocks. Trying to merge. If error occurs, please set a 'assistant_continue_message' or set 'modify_params=True' to insert a dummy assistant message for bedrock calls.
21:33:46 - LiteLLM:WARNING: factory.py:2485 - Potential consecutive user/tool blocks. Trying to merge. If error occurs, please set a 'assistant_continue_message' or set 'modify_params=True' to insert a dummy assistant message for bedrock calls.
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/main.py", line 2574, in completion
response = bedrock_converse_chat_completion.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/bedrock/chat/converse_handler.py", line 388, in completion
completion_stream = make_sync_call(
^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/bedrock/chat/converse_handler.py", line 59, in make_sync_call
response = client.post(
^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 372, in post
raise e
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 358, in post
response.raise_for_status()
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/httpx/_models.py", line 763, in raise_for_status
raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://bedrock-runtime.us-west-2.amazonaws.com/model/anthropic.claude-3-5-sonnet-20241022-v2:0/converse-stream'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.12/bin/interpreter", line 8, in
sys.exit(main())
^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 612, in main
start_terminal_interface(interpreter)
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 578, in start_terminal_interface
interpreter.chat()
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 191, in chat
for _ in self._streaming_chat(message=message, display=display):
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 223, in _streaming_chat
yield from terminal_interface(self, message)
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/terminal_interface/terminal_interface.py", line 162, in terminal_interface
for chunk in interpreter.chat(message, display=False, stream=True):
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 259, in _streaming_chat
yield from self._respond_and_store()
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 318, in _respond_and_store
for chunk in respond(self):
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/respond.py", line 87, in respond
for chunk in interpreter.llm.run(messages_for_llm):
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 322, in run
yield from run_tool_calling_llm(self, params)
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/run_tool_calling_llm.py", line 178, in run_tool_calling_llm
for chunk in llm.completions(**request_params):
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 466, in fixed_litellm_completions
raise first_error # If all attempts fail, raise the first error
^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 443, in fixed_litellm_completions
yield from litellm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 1006, in wrapper
raise e
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 896, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/main.py", line 3009, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2116, in exception_type
raise e
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 743, in exception_type
raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: BedrockException - b'{"message":"messages.2.content: Conversation blocks and tool result blocks cannot be provided in the same turn."}'
. Enable 'litellm.modify_params=True' (for PROXY do: litellm_settings::modify_params: True) to insert a dummy assistant message and fix this error.
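For what it's worth, the failure can probably be reproduced with LiteLLM alone, outside Open Interpreter. The sketch below approximates the conversation Open Interpreter builds after the declined run; the tool-call payload, tool-result text, and ids are assumptions rather than the literal messages it sends.

# Minimal reproduction sketch using LiteLLM directly (assumes AWS Bedrock
# credentials for us-west-2 are configured in the environment; the exact
# tool-call arguments below are hypothetical):
import litellm

messages = [
    {"role": "user", "content": "What is the current directory?"},
    {
        "role": "assistant",
        "content": "I'll help you check the current directory using a shell command.",
        "tool_calls": [
            {
                "id": "toolu_01",  # hypothetical id
                "type": "function",
                "function": {
                    "name": "execute",
                    "arguments": '{"language": "shell", "code": "pwd"}',
                },
            }
        ],
    },
    # The declined run is reported back as a tool result...
    {"role": "tool", "tool_call_id": "toolu_01", "content": "User declined to run this code."},
    # ...immediately followed by the next user message, with no assistant turn in between.
    {"role": "user", "content": "Hi!"},
]

# Without litellm.modify_params = True, LiteLLM merges the tool result and the
# user text into one Bedrock Converse turn, which the API rejects with the
# 400 error shown above.
for chunk in litellm.completion(
    model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
    messages=messages,
    stream=True,
):
    print(chunk)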
Expected behavior
As a workaround, I added litellm.modify_params = True to /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py.
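Roughly, the edit is a single global flag (a sketch; the exact placement inside llm.py shouldn't matter, as long as it runs before any completion call):

import litellm

# Workaround: allow LiteLLM to insert a dummy assistant message between
# consecutive user/tool turns so the Bedrock Converse API accepts the request.
litellm.modify_params = True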
With that change in place, the interpreter works:
▌ Model set to bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
Open Interpreter will require approval before running code.
Use interpreter -y to bypass this.
Press CTRL-C to exit.
What is the current directory?
I'll help you check the current directory using a shell command.
pwd
Would you like to run this code? (y/n)
n
Hi!
Hello! Since you declined the code execution, I understand you may have concerns about running commands. That's perfectly fine!
I can help answer questions and assist with tasks while being mindful of any security preferences you have. Please let me know what you'd like to do, and I'll make sure to explain what any suggested code would do before executing it.
Is there something specific you'd like help with?
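Instead of patching the installed package, a similar flag could presumably be set by Open Interpreter itself whenever a Bedrock model is in use. A rough, hypothetical sketch of the idea (the helper name and placement are mine; fixed_litellm_completions in interpreter/core/llm/llm.py, which appears in the traceback above, would be one candidate spot):

import litellm

def _enable_bedrock_param_fixups(params):
    # Hypothetical helper: for bedrock/* models, let LiteLLM insert a dummy
    # assistant message between consecutive user/tool turns instead of
    # sending a request the Converse API rejects with a 400.
    if str(params.get("model", "")).startswith("bedrock/"):
        litellm.modify_params = True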
Screenshots
No response
Open Interpreter version
0.4.3
Python version
3.12
Operating System name and version
macOS 15.1 (Sequoia)
Additional context
No response