What happened?

When launching the Kubernetes deployment example, the config file defined in the ConfigMap is not used by litellm.
apiVersion: v1
kind: ConfigMap
metadata:
  name: litellm-config-file
data:
  config.yaml: |
    model_list:
      - model_name: gpt-3.5-turbo
        litellm_params:
          model: azure/gpt-turbo-small-ca
          api_base: https://my-endpoint-canada-berri992.openai.azure.com/
          api_key: os.environ/CA_AZURE_OPENAI_API_KEY
---
apiVersion: v1
kind: Secret
type: Opaque
metadata:
  name: litellm-secrets
data:
  CA_AZURE_OPENAI_API_KEY: bWVvd19pbV9hX2NhdA== # your api key in base64
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: litellm-deployment
  labels:
    app: litellm
spec:
  selector:
    matchLabels:
      app: litellm
  template:
    metadata:
      labels:
        app: litellm
    spec:
      containers:
        - name: litellm
          image: ghcr.io/berriai/litellm:main-latest # it is generally recommended to pin a version
          ports:
            - containerPort: 4000
          volumeMounts:
            - name: config-volume
              mountPath: /app/proxy_server_config.yaml
              subPath: config.yaml
          envFrom:
            - secretRef:
                name: litellm-secrets
      volumes:
        - name: config-volume
          configMap:
            name: litellm-config-file
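The mount itself looks fine: with this manifest the file does land at /app/proxy_server_config.yaml inside the container. But nothing in the pod spec tells litellm to load that path, so the proxy presumably falls back to whatever the image's default command does. The relevant fragment (taken from the manifest above, nothing added) is just:

      containers:
        - name: litellm
          image: ghcr.io/berriai/litellm:main-latest
          # no command/args here, so --config is never passed explicitly
          volumeMounts:
            - name: config-volume
              mountPath: /app/proxy_server_config.yaml
              subPath: config.yaml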
@RamboRogers
I resolved this by overriding the container's command and args so that litellm is started with an explicit --config flag:
spec:
  containers:
    - name: litellm
      image: ghcr.io/berriai/litellm:main-latest
      command: ["/usr/local/bin/python"]
      args: ["/usr/local/bin/litellm", "--port", "4000", "--config", "/app/proxy_server_config.yaml"]
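For completeness, here is a sketch of how that override sits in the full Deployment from the example above, assuming everything else is left unchanged (the /usr/local/bin paths come from the comment, not from the litellm docs):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: litellm-deployment
  labels:
    app: litellm
spec:
  selector:
    matchLabels:
      app: litellm
  template:
    metadata:
      labels:
        app: litellm
    spec:
      containers:
        - name: litellm
          image: ghcr.io/berriai/litellm:main-latest
          # override the image default so litellm is pointed at the mounted config
          command: ["/usr/local/bin/python"]
          args: ["/usr/local/bin/litellm", "--port", "4000", "--config", "/app/proxy_server_config.yaml"]
          ports:
            - containerPort: 4000
          volumeMounts:
            - name: config-volume
              mountPath: /app/proxy_server_config.yaml
              subPath: config.yaml
          envFrom:
            - secretRef:
                name: litellm-secrets
      volumes:
        - name: config-volume
          configMap:
            name: litellm-config-file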
The link to the bad config example is here:
https://docs.litellm.ai/docs/proxy/deploy