Issues: BerriAI/litellm
- [Bug]: v1/realtime endpoint not checking api key (Vulnerability) · bug · #6926 · opened Nov 26, 2024 by mirodrr2
- [Bug]: missing together ai serverless models · bug · #6925 · opened Nov 26, 2024 by CharlieJCJ
- [Bug]: AsyncIO RuntimeError in stream success log when running in gevent loop (from celery) · bug · #6921 · opened Nov 26, 2024 by TheJKM
- [Bug]: Proxy customer API endpoint triggers an internal error when attempting to update the max_budget value of a customer · bug · #6920 · opened Nov 26, 2024 by mbertrand
- [Feature]: Support n (number of completions) in together_ai models · enhancement · #6919 · opened Nov 26, 2024 by slobodan-nf
- [Bug]: remove traceback.print_stack · bug · #6918 · opened Nov 26, 2024 by zaaferani
- [Feature]: support tracking remaining tpm/rpm for gemini models · enhancement, mlops user request · #6914 · opened Nov 26, 2024 by krrishdholakia
- [Bug]: Different Behavior with Image Input on GROQ/Llama 3.2 Vision Model vs Qwen · bug · #6912 · opened Nov 26, 2024 by NEWbie0709
- [Feature]: Support OAuth token with Vertex AI · enhancement · #6906 · opened Nov 26, 2024 by sean-tr
- [Bug]: Bedrock Cross-Region Inference not working - apac models · bug · #6905 · opened Nov 25, 2024 by MarcusRosen-Rio
- Replace deprecated Pydantic Config class with model_config · #6902 · opened Nov 25, 2024 by danielsiwiec
- [Feature]: Implement /api/generate for Continue.dev FIM / autocompletion with Ollama? · awaiting: user response, enhancement · #6900 · opened Nov 25, 2024 by deliciousbob
- [Feature]: Context Caching for Vertex AI · enhancement · #6898 · opened Nov 25, 2024 by DreamGenX
- [Bug]: Facing deadlocks on LiteLLM · awaiting: user response, bug · #6895 · opened Nov 25, 2024 by xyannie
- [Feature]: gemini-1.5-flash-8b support · enhancement · #6894 · opened Nov 25, 2024 by vuongngo
- [Bug]: Anthropic images fail with unified anthropic endpoint · bug · #6893 · opened Nov 25, 2024 by Cyberes
- [Bug]: can't add model / Cannot access 'lS' before initialization · bug · #6892 · opened Nov 24, 2024 by vid
- [Bug]: Gemini Pro context caching · bug · #6891 · opened Nov 24, 2024 by kamalojasv181
- [Feature]: Elasticsearch Inference API support · enhancement · #6889 · opened Nov 24, 2024 by stevapple
- [Bug]: Example Kubernetes Config is Not Used Properly · bug · #6882 · opened Nov 23, 2024 by RamboRogers
- [Bug]: The "Add new model" dialog doesn't have a provider option for HuggingFace 🤗 · bug · #6879 · opened Nov 23, 2024 by Rami-Sabbagh
- [Bug]: Proxy | [BETA] Request Prioritization | Polling logic for scheduler.queue · bug · #6867 · opened Nov 22, 2024 by thevogoncoder
- Unable to save the LiteLLM custom logger logs in a log file · awaiting: user response · #6866 · opened Nov 22, 2024 by manali04shyam
- [Bug]: Langsmith async integration doesn't register traces · bug · #6862 · opened Nov 21, 2024 by tomukmatthews
- [Bug]: Langsmith Synchronous Integration in Docs Doesn't Work · bug · #6861 · opened Nov 21, 2024 by tomukmatthews