Added MCP support for chat Completion endpoint. #14564
Conversation
Why is this specific to Bedrock? @Simon-Lind-glitch
We've only gotten this error while using Bedrock and did not find any bug regarding this, so we thought we'd isolate this to Bedrock.
The issue you're closing refers to Ollama and all non-OpenAI providers.
The fix would be incorrect if it only applies to Bedrock. Can we please have this be a more generic fix across providers?
Modifying the request body should happen via an async_pre_call_hook - this is cleaner than modifying the routing logic - https://docs.litellm.ai/docs/proxy/call_hooks
see: `async def async_pre_call_hook(`
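A minimal sketch of such a hook, following the pattern in the linked call-hooks docs; the class name, instance name, and the `tools` mutation are placeholders rather than this PR's actual code, and the exact `call_type` literals may differ between LiteLLM versions:

```python
from typing import Literal

from litellm.integrations.custom_logger import CustomLogger
from litellm.proxy.proxy_server import DualCache, UserAPIKeyAuth


class MCPToolHandler(CustomLogger):
    async def async_pre_call_hook(
        self,
        user_api_key_dict: UserAPIKeyAuth,
        cache: DualCache,
        data: dict,
        call_type: Literal[
            "completion",
            "text_completion",
            "embeddings",
            "image_generation",
            "moderation",
            "audio_transcription",
        ],
    ):
        # Rewrite the request body before it is routed to any provider,
        # instead of special-casing Bedrock in the routing logic.
        if call_type == "completion":
            # Placeholder mutation: attach/normalize MCP tool definitions here.
            data.setdefault("tools", [])
        return data


proxy_handler_instance = MCPToolHandler()
```

The handler would then be registered via the proxy config's callbacks setting (per the linked docs), so the rewrite applies uniformly across providers rather than only to Bedrock.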
I'll get on it.
I need to tinker with this a bit more. I'll get back to you once it's ready.
Title
Added MCP execution for chat completion endpoint
Relevant issues
Fixes bug #14268
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement - see details)
- My PR passes all unit tests on `make test-unit`
Type
🐛 Bug Fix
Changes
I've added support for MCP tool execution to the chat completions pipeline.
While I was at it, I also fixed the tool format in the requests towards Bedrock, since that was the provider I was testing with.
These are reasonably related issues.
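The diff itself isn't shown here, so the request schema this PR accepts is unknown. The following is only a hypothetical sketch of what an MCP tool entry on the OpenAI-compatible chat completions endpoint could look like; the model name, proxy URL, API key, and the `server_label`/`server_url` field names (borrowed from how MCP tools are described elsewhere) are all assumptions and may not match this PR:

```python
from openai import OpenAI

# LiteLLM proxy exposes an OpenAI-compatible /chat/completions endpoint;
# URL and key below are placeholders.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",  # example Bedrock model id
    messages=[{"role": "user", "content": "List the open tickets in our tracker."}],
    tools=[
        {
            # Hypothetical MCP tool entry - the schema this PR actually accepts may differ.
            "type": "mcp",
            "server_label": "ticket_tracker",
            "server_url": "https://example.com/mcp",
        }
    ],
)

print(response.choices[0].message)
```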