[Bug]: Example Kubernetes Config is Not Used Properly #6882

Open
RamboRogers opened this issue Nov 23, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@RamboRogers

What happened?

When launching the Kubernetes example, the config file defined in the ConfigMap is not actually used by litellm.

Relevant log output

apiVersion: v1
kind: ConfigMap
metadata:
  name: litellm-config-file
data:
  config.yaml: |
      model_list: 
        - model_name: gpt-3.5-turbo
          litellm_params:
            model: azure/gpt-turbo-small-ca
            api_base: https://my-endpoint-canada-berri992.openai.azure.com/
            api_key: os.environ/CA_AZURE_OPENAI_API_KEY
---
apiVersion: v1
kind: Secret
type: Opaque
metadata:
  name: litellm-secrets
data:
  CA_AZURE_OPENAI_API_KEY: bWVvd19pbV9hX2NhdA== # your api key in base64
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: litellm-deployment
  labels:
    app: litellm
spec:
  selector:
    matchLabels:
      app: litellm
  template:
    metadata:
      labels:
        app: litellm
    spec:
      containers:
      - name: litellm
        image: ghcr.io/berriai/litellm:main-latest # it is generally recommended to pin a specific version
        ports:
        - containerPort: 4000
        volumeMounts:
        - name: config-volume
          mountPath: /app/proxy_server_config.yaml
          subPath: config.yaml
        envFrom:
        - secretRef:
            name: litellm-secrets
      volumes:
        - name: config-volume
          configMap:
            name: litellm-config-file
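As a side note, the Secret's `data` values must be base64-encoded, as the inline comment says. The placeholder value above can be produced like this (a minimal sketch using standard shell tools):

```shell
# Encode the placeholder API key for the Secret's data field.
# Use -n so the trailing newline is not included in the encoding.
echo -n "meow_im_a_cat" | base64
# → bWVvd19pbV9hX2NhdA==
```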

Twitter / LinkedIn details

@RamboRogers

@RamboRogers RamboRogers added the bug Something isn't working label Nov 23, 2024
@RamboRogers
Author

I resolved this by making the following change:

spec:
  containers:
  - name: litellm
    image: ghcr.io/berriai/litellm:main-latest
    # Override the image's default entrypoint so litellm is started with an
    # explicit --config pointing at the mounted ConfigMap file.
    command: ["/usr/local/bin/python"]
    args: ["/usr/local/bin/litellm", "--port", "4000", "--config", "/app/proxy_server_config.yaml"]
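For completeness, here is a sketch of the full pod spec with that override folded into the original Deployment (same names and mount path as the manifest above; a pinned image tag is still advisable):

```yaml
spec:
  containers:
  - name: litellm
    image: ghcr.io/berriai/litellm:main-latest  # pin a specific tag in practice
    # Start litellm explicitly with --config pointing at the mounted ConfigMap file.
    command: ["/usr/local/bin/python"]
    args: ["/usr/local/bin/litellm", "--port", "4000", "--config", "/app/proxy_server_config.yaml"]
    ports:
    - containerPort: 4000
    volumeMounts:
    - name: config-volume
      mountPath: /app/proxy_server_config.yaml
      subPath: config.yaml
    envFrom:
    - secretRef:
        name: litellm-secrets
  volumes:
  - name: config-volume
    configMap:
      name: litellm-config-file
```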

@RamboRogers
Author

The link to the bad config example is here:

https://docs.litellm.ai/docs/proxy/deploy
