How to use IAM role to access Amazon Bedrock models correctly? #5873
-
I am able to follow this tutorial to run Amazon Bedrock -> LiteLLM -> Open WebUI successfully on my local machine using Docker Compose: https://willdady.com/streamlining-ai-experimentation-with-open-webui-litellm-and-amazon-bedrock

In this case, LiteLLM is using my local AWS credentials, and querying LiteLLM's OpenAI-compatible model list endpoint returns the configured models:

{
"data": [
{
"id": "bedrock-claude-3-5-sonnet",
"object": "model",
"created": 1677610602,
"owned_by": "openai"
},
{
"id": "bedrock-claude-3-haiku",
"object": "model",
"created": 1677610602,
"owned_by": "openai"
}
],
"object": "list"
} Now I am trying to deploy LiteLLM to the Amazon EKS and I try to use IAM role to access Amazon Bedrock models. Here is my IAM Role
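(The actual role definition is not reproduced here. As a generic sketch only, not the author's policy: an IRSA role like this usually has a trust policy scoped to the cluster's OIDC provider and the service account, roughly as follows, with the account ID and OIDC provider ID as placeholders.)

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::xxxxxxxxxxxx:oidc-provider/oidc.eks.us-west-2.amazonaws.com/id/EXAMPLE"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "oidc.eks.us-west-2.amazonaws.com/id/EXAMPLE:sub": "system:serviceaccount:production-hm-litellm:hm-litellm-service-account"
        }
      }
    }
  ]
}
```

The role's permissions policy would then allow the Bedrock invocation actions, typically `bedrock:InvokeModel` and `bedrock:InvokeModelWithResponseStream`, on the foundation-model ARNs you intend to use.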
Here are my Kubernetes manifest files:

---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: hm-litellm-service-account
  namespace: production-hm-litellm
  annotations:
    # https://docs.aws.amazon.com/eks/latest/userguide/associate-service-account-role.html
    eks.amazonaws.com/role-arn: arn:aws:iam::xxxxxxxxxxxx:role/LiteLLMRole-hm-litellm-service-account
  labels:
    app.kubernetes.io/name: hm-litellm-service-account
    app.kubernetes.io/part-of: production-hm-litellm
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: hm-litellm-config-map
  namespace: production-hm-litellm
  labels:
    app.kubernetes.io/name: hm-litellm-config-map
    app.kubernetes.io/part-of: production-hm-litellm
data:
  proxy_server_config.yaml: |
    # https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html
    model_list:
      - model_name: bedrock-claude-3-5-sonnet
        litellm_params:
          model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
          aws_region_name: us-west-2
          aws_session_name: litellm
          aws_role_name: arn:aws:iam::xxxxxxxxxxxx:role/LiteLLMRole-hm-litellm-service-account
      - model_name: bedrock-claude-3-haiku
        litellm_params:
          model: bedrock/anthropic.claude-3-haiku-20240307-v1:0
          aws_region_name: us-west-2
          aws_session_name: litellm
          aws_role_name: arn:aws:iam::xxxxxxxxxxxx:role/LiteLLMRole-hm-litellm-service-account
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hm-litellm-deployment
  namespace: production-hm-litellm
  labels:
    app.kubernetes.io/name: hm-litellm-deployment
    app.kubernetes.io/part-of: production-hm-litellm
spec:
  replicas: 1
  selector:
    matchLabels:
      app: hm-litellm
  template:
    metadata:
      labels:
        app: hm-litellm
    spec:
      serviceAccountName: hm-litellm-service-account # Not sure if this is really needed, since `model_list` in the ConfigMap also sets an IAM role per model
      containers:
        - name: litellm
          image: ghcr.io/berriai/litellm:main-v1.46.8
          ports:
            - name: litellm
              protocol: TCP
              containerPort: 4000
          volumeMounts:
            - name: litellm-volume
              mountPath: /app/proxy_server_config.yaml
              subPath: proxy_server_config.yaml
      volumes:
        - name: litellm-volume
          configMap:
            name: hm-litellm-config-map
---
apiVersion: v1
kind: Service
metadata:
  name: hm-litellm-service
  namespace: production-hm-litellm
  labels:
    app.kubernetes.io/name: hm-litellm-service
    app.kubernetes.io/part-of: production-hm-litellm
spec:
  type: ClusterIP
  selector:
    app: hm-litellm
  ports:
    - name: litellm
      protocol: TCP
      targetPort: litellm
      port: 80

However, right now LiteLLM is not able to access the Bedrock models when deployed to EKS this way.
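On the serviceAccountName question in the Deployment above: with IRSA, the web-identity credentials that come in through the annotated service account are what give the pod its base AWS credentials, so it is most likely needed. As an illustration of standard EKS behavior (not taken from this thread), the EKS pod identity webhook injects roughly the following into the container, which is what lets the AWS SDK inside LiteLLM pick up the role without static credentials:

```yaml
# Injected automatically by the EKS pod identity webhook (illustrative)
env:
  - name: AWS_ROLE_ARN
    value: arn:aws:iam::xxxxxxxxxxxx:role/LiteLLMRole-hm-litellm-service-account
  - name: AWS_WEB_IDENTITY_TOKEN_FILE
    value: /var/run/secrets/eks.amazonaws.com/serviceaccount/token
volumeMounts:
  - name: aws-iam-token
    mountPath: /var/run/secrets/eks.amazonaws.com/serviceaccount
```

If those environment variables are missing from the running pod, the service account binding is not taking effect.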
Any guidance would be appreciated. Thanks!
Replies: 2 comments 1 reply
-
What do your server logs show?
-
I think somehow LiteLLM does not read /app/proxy_server_config.yaml by default. I put the config at a new location, /app/config.yaml, and added command: ["litellm", "--port", "4000", "--config", "/app/config.yaml", "--detailed_debug"] to the deployment, and now it works! Here is the updated version:

---
apiVersion: v1
kind: ConfigMap
metadata:
  name: hm-litellm-config-map
  namespace: production-hm-litellm
  labels:
    app.kubernetes.io/name: hm-litellm-config-map
    app.kubernetes.io/part-of: production-hm-litellm
data:
  config.yaml: | # Updated
    # https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html
    model_list:
      - model_name: bedrock-claude-3-5-sonnet
        litellm_params:
          model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
          aws_region_name: us-west-2
          aws_session_name: litellm
          aws_role_name: arn:aws:iam::xxxxxxxxxxxx:role/LiteLLMRole-hm-litellm-service-account
      - model_name: bedrock-claude-3-haiku
        litellm_params:
          model: bedrock/anthropic.claude-3-haiku-20240307-v1:0
          aws_region_name: us-west-2
          aws_session_name: litellm
          aws_role_name: arn:aws:iam::xxxxxxxxxxxx:role/LiteLLMRole-hm-litellm-service-account
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hm-litellm-deployment
  namespace: production-hm-litellm
  labels:
    app.kubernetes.io/name: hm-litellm-deployment
    app.kubernetes.io/part-of: production-hm-litellm
spec:
  replicas: 1
  selector:
    matchLabels:
      app: hm-litellm
  template:
    metadata:
      labels:
        app: hm-litellm
    spec:
      serviceAccountName: hm-litellm-service-account # Not sure if this is really needed, since `model_list` in the ConfigMap also sets an IAM role per model
      containers:
        - name: litellm
          image: ghcr.io/berriai/litellm:main-v1.46.8
          ports:
            - name: litellm
              protocol: TCP
              containerPort: 4000
          command: ["litellm", "--port", "4000", "--config", "/app/config.yaml", "--detailed_debug"] # Newly added
          volumeMounts:
            - name: litellm-volume
              mountPath: /app/config.yaml # Updated
              subPath: config.yaml # Updated
      volumes:
        - name: litellm-volume
          configMap:
            name: hm-litellm-config-map
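Once the proxy is up, it can be exercised through its OpenAI-compatible API. A minimal sketch of building a chat-completion request body for one of the models above (the endpoint path and in-cluster DNS name mentioned below are standard OpenAI/Kubernetes conventions, not taken from this thread):

```python
import json

def chat_request(model: str, prompt: str) -> str:
    """Build an OpenAI-compatible chat-completions payload for the LiteLLM proxy."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

body = chat_request("bedrock-claude-3-5-sonnet", "Hello!")
print(body)
```

From inside the cluster, this body can then be POSTed to http://hm-litellm-service.production-hm-litellm/v1/chat/completions to confirm that the Bedrock role assumption works end to end.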