
LiteLLM Minor Fixes & Improvements (11/26/2024) #6913

Open: wants to merge 18 commits into base: main
Conversation

@krrishdholakia (Contributor) commented Nov 26, 2024

Title

Relevant issues

The UI failing to load the team list causes downstream errors
Closes #6914

Type

🆕 New Feature
🐛 Bug Fix
🧹 Refactoring
📖 Documentation
🚄 Infrastructure
✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes


…test to base_llm_unit_tests

adds complete coverage for all 'response_format' values to ci/cd
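As a rough sketch of what "complete coverage for all 'response_format' values" could look like, the snippet below enumerates the common OpenAI-style response_format shapes. The fixture values and helper name are illustrative assumptions, not the repo's actual base_llm_unit_tests code:

```python
# Hypothetical fixtures for the three common response_format shapes;
# a real unit test would pass each one to the model call and assert
# on the parsed response, not just on the fixture's structure.
RESPONSE_FORMATS = [
    {"type": "text"},
    {"type": "json_object"},
    {
        "type": "json_schema",
        "json_schema": {
            "name": "answer",
            "schema": {
                "type": "object",
                "properties": {"answer": {"type": "string"}},
            },
        },
    },
]

def check_response_format(fmt: dict) -> bool:
    # Minimal structural check: every response_format declares a type.
    return isinstance(fmt.get("type"), str)

results = [check_response_format(f) for f in RESPONSE_FORMATS]
```

In a real suite these fixtures would typically drive a parametrized test so each shape gets its own pass/fail entry in CI.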
…ll gemini models

Allows for ratelimit tracking for gemini models even with wildcard routing enabled

Addresses #6914
rpm_key = RouterCacheEnum.RPM.value.format(
    id=id, current_minute=current_minute, model=deployment_name
)
await self.cache.async_increment_cache(
Contributor

Could you please make sure this new op does not impact latency/perf @krrishdholakia

In my tests, adding 1 increment operation increased median latency by ~300ms at 100 RPS
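One common way to keep a counter update off a request's critical path is to schedule the increment as a background task instead of awaiting it inline. The sketch below shows that mitigation under stated assumptions; it is an illustration of the technique, not necessarily how LiteLLM resolved this concern:

```python
import asyncio

async def slow_increment(store: dict, key: str) -> None:
    await asyncio.sleep(0.05)  # stand-in for remote-cache network latency
    store[key] = store.get(key, 0) + 1

async def handle_request(store: dict) -> tuple[str, asyncio.Task]:
    # Fire-and-forget: the response is returned without waiting the ~50ms
    # the increment takes; the counter is updated in the background.
    task = asyncio.create_task(slow_increment(store, "rpm"))
    return "ok", task

async def demo() -> tuple[str, int]:
    store: dict = {}
    response, task = await handle_request(store)
    await task  # a server would await pending tasks at drain/shutdown
    return response, store["rpm"]

response, rpm = asyncio.run(demo())
```

The trade-off is weaker consistency: a crash can drop an in-flight increment, and a real deployment must hold a reference to the task (or use a task group) so it is not garbage-collected before it runs.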

@krrishdholakia krrishdholakia changed the title Litellm dev 11 26 2024 LiteLLM Minor Fixes & Improvements (11/26/2024) Nov 27, 2024
Development

Successfully merging this pull request may close these issues.

[Feature]: support tracking remaining tpm/rpm for gemini models