Issues: BerriAI/litellm

[Feature]: aiohttp migration - 10-100x Higher RPS Master ti...
#7544 opened Jan 4, 2025 by ishaan-jaff

[Feature]: Allow custom embeddings models via CustomLLM (enhancement)
#8660 opened Feb 19, 2025 by mattmalcher

[Feature]: Add Support for Requesty.ai in LiteLLM (enhancement)
#8659 opened Feb 19, 2025 by ViezeVingertjes

[Feature]: Support for 'Retrieve run' and 'Submit tool outputs to run' which are part of OpenAI's Assistant APIs Runs Feature (enhancement, mlops user request)
#8658 opened Feb 19, 2025 by Nikhil101

[Bug]: No module named tzdata (bug, mlops user request)
#8657 opened Feb 19, 2025 by bharath-muthineni

[Bug]: output_cost_per_image for vllm (bug)
#8656 opened Feb 19, 2025 by Mte90

[Bug]: Reasoning Content as default is not expected. (bug)
#8653 opened Feb 19, 2025 by tonysy

[Bug]: Stream Timeout doesn't work for Bedrock models (bug)
#8652 opened Feb 19, 2025 by jonas-lyrebird-health

[Bug]: Helm chart ignores PROXY_MASTER_KEY environment variable and always generates its own (bug)
#8650 opened Feb 19, 2025 by hnykda

[Bug]: Migration job in the helm chart has inconsistent DB specification (bug)
#8649 opened Feb 19, 2025 by hnykda

[Feature]: Improving Retry Mechanism Consistency and Logging for Streamed Responses in LiteLLM Proxy (enhancement)
#8648 opened Feb 19, 2025 by fengjiajie

[Feature]: more conventional + configurable python logging (enhancement, mlops user request)
#8641 opened Feb 19, 2025 by NorthIsUp

[Bug]: [Nit] max_completion_tokens not supported on Azure when api_version is not specified (bug)
#8638 opened Feb 19, 2025 by enyst

[Feature]: support adding litellm_metadata to gemini passthrough (enhancement)
#8634 opened Feb 18, 2025 by trashhalo

[Bug]: gemini fallback not working (bug)
#8632 opened Feb 18, 2025 by clarity99

[Bug]: Reasoning with OpenRouter is not available while streaming the completion (bug)
#8631 opened Feb 18, 2025 by maykcaldas

[Bug]: async_post_call_streaming_hook (bug)
#8628 opened Feb 18, 2025 by tony-tvu

LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.
#8621 opened Feb 18, 2025 by VamshikrishnaAluwala

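The title of #8621 is litellm's generic debug hint rather than a descriptive summary. For context, a minimal sketch of what that hint asks for, assuming a standard litellm install (the model name and prompt below are placeholders, not taken from the issue):

    import litellm

    # Turn on litellm's verbose debug logging, as the hint in the
    # issue title suggests, before reproducing the failing call.
    litellm._turn_on_debug()

    # Placeholder call whose request/response details will now be logged.
    response = litellm.completion(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "hello"}],
    )
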
[Bug]: Memory Leak in completion() with stream=True (bug, feb 2025)
#8620 opened Feb 18, 2025 by iwamot

[Feature]: Refactor litellm proxy's custom auth flow (enhancement)
#8602 opened Feb 18, 2025 by wagnerjt

[Bug]: VertexAI custom model does not pick up uploaded token (bug, mlops user request)
#8597 opened Feb 17, 2025 by suresiva

[Bug]: KeyError: 'name' error with local ollama models (bug)
#8594 opened Feb 17, 2025 by hajdul88

[Bug]: _return_huggingface_tokenizer missing models [Patch] (bug)
#8587 opened Feb 17, 2025 by Mte90

[Bug]: Usage UI doesn't have a monthly filter (bug)
#8586 opened Feb 17, 2025 by Mte90