[Bug]: using pydantic model for structured output returning error for Anthropic models. #6766

Closed
dannylee1020 opened this issue Nov 16, 2024 · 4 comments · Fixed by #6820 · May be fixed by #6863

Comments

@dannylee1020

What happened?

Description

litellm.completion returns an error when a pydantic model is used for structured output via response_format with Anthropic models. I tested with both Anthropic and OpenAI models, and only OpenAI models work without an error. It also works on previous versions (< 1.52.8), so I suspect something changed in this release.

  • litellm version: 1.52.8

Relevant log output

Traceback (most recent call last):
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 567, in completion
    response = client.post(
               ^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 389, in post
    raise e
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 375, in post
    response.raise_for_status()
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/httpx/_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/main.py", line 1770, in completion
    response = anthropic_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 582, in completion
    raise AnthropicError(
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"invalid_request_error","message":"tools.0.input_schema: JSON schema is invalid - please consult https://json-schema.org or our documentation at https://docs.anthropic.com/en/docs/tool-use"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/dannylee1020/repos/peony/tests/test.py", line 74, in <module>
    res = litellm.completion(
          ^^^^^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/utils.py", line 960, in wrapper
    raise e
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/utils.py", line 849, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/main.py", line 3034, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2125, in exception_type
    raise e
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 469, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"tools.0.input_schema: JSON schema is invalid - please consult https://json-schema.org or our documentation at https://docs.anthropic.com/en/docs/tool-use"}}


@dannylee1020 dannylee1020 added the bug Something isn't working label Nov 16, 2024
@ishaan-jaff
Contributor

hi @dannylee1020 - can you share the request you're making with litellm?

We recently moved to using Anthropic tool use for JSON responses - this might be related
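
For anyone unfamiliar with that mechanism, the translation is roughly the following (a minimal sketch, not litellm's actual code; the tool name here is an assumption):

from pydantic import BaseModel

class TestModel(BaseModel):
    first_response: str

# Sketch: a pydantic response_format becomes a forced Anthropic tool call.
# The model's JSON schema is used as the tool's input_schema, and tool_choice
# pins the model to that tool so its arguments come back as structured output.
json_tool = {
    "name": "json_tool_call",  # illustrative name, not necessarily litellm's
    "input_schema": TestModel.model_json_schema(),  # must be valid JSON Schema
}
tool_choice = {"type": "tool", "name": "json_tool_call"}

The 400 above says Anthropic rejected tools.0.input_schema, so whatever schema litellm built from the pydantic model was not a valid standalone JSON Schema.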

@dannylee1020
Author

import litellm
from pydantic import BaseModel

# Placeholder values; the original prompt and query are not shown in the issue.
PROMPT = "You are a helpful assistant."
QUERY = "Say hello."

class TestModel(BaseModel):
    first_response: str

res = litellm.completion(
    model="claude-3-5-sonnet-20240620",
    messages=[
        {"role": "system", "content": PROMPT},
        {"role": "user", "content": QUERY},
    ],
    response_format=TestModel,
)
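
A quick way to confirm that the pydantic-generated schema itself is valid JSON Schema (a sketch; uses the jsonschema package, which is not a litellm dependency). If this passes, the problem is in how the schema is wrapped on the way to Anthropic, not in the model definition:

from jsonschema import Draft202012Validator  # pip install jsonschema
from pydantic import BaseModel

class TestModel(BaseModel):
    first_response: str

schema = TestModel.model_json_schema()
Draft202012Validator.check_schema(schema)  # raises SchemaError if malformed
print(schema)  # {'type': 'object', 'properties': {'first_response': ...}, ...}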

@DaveDeCaprio

I'm seeing this also. Downgrading to 1.52.5 fixed it.
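
For anyone else hitting this, pinning is a workable stopgap until a fixed release ships, e.g. pip install "litellm==1.52.5".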

@krrishdholakia krrishdholakia self-assigned this Nov 19, 2024
@krrishdholakia
Contributor

able to repro this

krrishdholakia added a commit that referenced this issue Nov 19, 2024
fix(anthropic/chat/transformation.py): add json schema as values: json_schema

fixes passing pydantic obj to anthropic

Fixes #6766
ishaan-jaff pushed a commit that referenced this issue Nov 22, 2024
fix(anthropic/chat/transformation.py): add json schema as values: json_schema

fixes passing pydantic obj to anthropic

Fixes #6766
ishaan-jaff pushed a commit that referenced this issue Nov 22, 2024
* fix(anthropic/chat/transformation.py): add json schema as values: json_schema

fixes passing pydantic obj to anthropic

Fixes #6766

* (feat): Add timestamp_granularities parameter to transcription API (#6457)

* Add timestamp_granularities parameter to transcription API

* add param to the local test

* fix(databricks/chat.py): handle max_retries optional param handling for openai-like calls

Fixes issue with calling finetuned vertex ai models via databricks route

* build(ui/): add team admins via proxy ui

* fix: fix linting error

* test: fix test

* docs(vertex.md): refactor docs

* test: handle overloaded anthropic model error

* test: remove duplicate test

* test: fix test

* test: update test to handle model overloaded error

---------

Co-authored-by: Show <[email protected]>
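
With the fix applied, the original snippet should round-trip. A sketch of typed access to the result (assuming litellm surfaces the forced tool call's arguments as the message content, as it does for JSON-mode responses):

import litellm
from pydantic import BaseModel

class TestModel(BaseModel):
    first_response: str

res = litellm.completion(
    model="claude-3-5-sonnet-20240620",
    messages=[{"role": "user", "content": "Reply with a short greeting."}],
    response_format=TestModel,
)
# Validate the returned JSON back into the pydantic model for typed access.
parsed = TestModel.model_validate_json(res.choices[0].message.content)
print(parsed.first_response)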