
Releases: BerriAI/litellm

v1.27.8

27 Feb 06:41

What's Changed

  • fix(utils.py): support returning caching streaming response for function calling streaming calls by @krrishdholakia in #2203
  • build(proxy_server.py): fix /spend/logs query bug by @krrishdholakia in #2212

Full Changelog: v1.27.7...v1.27.8

v1.27.7

27 Feb 03:33

Use ClickhouseDB for low latency LLM Analytics / Spend Reports

(sub 1s analytics, with 100M logs)

Getting started with ClickHouse DB + LiteLLM Proxy

Docs + Docker compose for getting started with clickhouse: https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput---clickhouse

Step 1: Create a config.yaml file and set litellm_settings: success_callback

model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
litellm_settings:
  success_callback: ["clickhouse"]

Step 2: Set Required env variables for clickhouse

Env variables for self-hosted ClickHouse

CLICKHOUSE_HOST = "localhost"
CLICKHOUSE_PORT = "8123"
CLICKHOUSE_USERNAME = "admin"
CLICKHOUSE_PASSWORD = "admin"

Step 3: Start the proxy, make a test request
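A minimal sketch of this step, assuming the config from Step 1 is saved as config.yaml and the proxy listens on its default port (4000 on recent versions; the port is an assumption here):

```shell
# start the proxy with the config from Step 1 (assumes litellm[proxy] is installed)
litellm --config config.yaml

# in another terminal, send a test request through the proxy
curl http://0.0.0.0:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "hello from litellm"}]
  }'
```

If the success_callback is configured, the request and response should be logged to ClickHouse.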

New Models

Mistral on Azure AI Studio

Sample Usage

Ensure you have the /v1 in your api_base

from litellm import completion
import os

response = completion(
    model="mistral/Mistral-large-dfgfj",
    api_base="https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1",
    api_key="JGbKodRcTp****",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)

[LiteLLM Proxy] Using Mistral Models

Set this on your litellm proxy config.yaml

Ensure you have the /v1 in your api_base

model_list:
  - model_name: mistral
    litellm_params:
      model: mistral/Mistral-large-dfgfj
      api_base: https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1
      api_key: JGbKodRcTp****

What's Changed

Full Changelog: v1.27.6...v1.27.7

v1.27.6

26 Feb 21:32

New Models

  • azure/text-embedding-3-large
  • azure/text-embedding-3-small
  • mistral/mistral-large-latest
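A hedged usage sketch for the new Azure embedding models via litellm's standard embedding call; the key, endpoint, and API version below are placeholder assumptions, so replace them with your own deployment's values:

```python
import os
from litellm import embedding

# placeholder Azure credentials -- these are assumptions, not real values
os.environ["AZURE_API_KEY"] = "my-azure-key"
os.environ["AZURE_API_BASE"] = "https://my-endpoint.openai.azure.com"
os.environ["AZURE_API_VERSION"] = "2023-07-01-preview"

# request embeddings from the new azure/text-embedding-3-large model
response = embedding(
    model="azure/text-embedding-3-large",
    input=["hello from litellm"],
)
print(response)
```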

Log LLM Output in ClickHouse DB

import asyncio
import litellm

litellm.success_callback = ["clickhouse"]

async def main():
    await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "This is a test"}],
        max_tokens=10,
        temperature=0.7,
        user="ishaan-2",
    )

asyncio.run(main())

What's Changed

Full Changelog: v1.27.4...v1.27.6

v1.27.4

25 Feb 11:20

What's Changed

Full Changelog: v1.27.1...v1.27.4

v1.27.1

24 Feb 08:01

What's Changed

Full Changelog: v1.26.13...v1.27.1

v1.26.13

23 Feb 19:24

What's Changed

Sample Usage

from litellm import completion
import os

os.environ['GROQ_API_KEY'] = ""
response = completion(
    model="groq/llama2-70b-4096",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)


New Contributors

Full Changelog: v1.26.10...v1.26.13

v1.26.11

23 Feb 07:32

What's Changed

Full Changelog: v1.26.9...v1.26.11

v1.26.10

23 Feb 06:44

What's Changed

Full Changelog: v1.26.9...v1.26.10

v1.26.9

23 Feb 05:48

What's Changed

New Contributors

Full Changelog: v1.26.8...v1.26.9

v1.26.8

22 Feb 07:15

Enterprise - Admin UI - Use Custom Branding

Use your company's custom branding on the LiteLLM Admin UI.
Docs here: https://docs.litellm.ai/docs/proxy/ui#custom-branding-admin-ui
We allow you to:

  • Customize the UI Logo
  • Customize the UI color theme

What's Changed

Full Changelog: v1.26.7...v1.26.8