Issue in setting "maximumConcurrentTasks" while installing SCDF through Helm #6056
Comments
@VikasMGowda05
  server:
    configuration:
      extraEnvVars:
        name: 'SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUM_CONCURRENT_TASKS'
        value: 50

You can format a file in issue text by using markdown.
@corneil It should be the platform name. If you are using the default account, that segment will be default, since default is the account name. The deployment option is only for Cloud Foundry. You can add more accounts if you target other namespaces or clusters.
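For reference, a minimal sketch of how that property nests when written out as YAML in an application.yaml read by the Data Flow server, assuming the kubernetes platform type and the default account named above (the limit of 50 is only an example value):

  spring:
    cloud:
      dataflow:
        task:
          platform:
            kubernetes:
              accounts:
                default:
                  maximumConcurrentTasks: 50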
Hi @corneil @cppwfs, I have tried adding the above as you suggested, but I am getting the error below. Could you please help here?
[Error screenshot]
@VikasMGowda05 Please try:
  server:
    extraEnvVars:
      - name: 'SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUMCONCURRENTTASKS'
        value: '100'
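Note that the value is quoted ('100'): environment variable values in a Kubernetes container spec must be strings, so an unquoted number causes a type error when the manifest is applied. Assuming the Bitnami chart passes server.extraEnvVars through to the Data Flow server container unchanged, the rendered Deployment should then contain an entry like the sketch below, which you can confirm in the Environment section of kubectl describe pod for the server pod:

  # Sketch of the rendered container env entry
  env:
    - name: SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUMCONCURRENTTASKS
      value: "100"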
Description:
We have seen the SCDF documentation, which says we can set the maximumConcurrentTasks property as below:
spring.cloud.dataflow.task.platform.<platform-type>.accounts[<account-name>].deployment.maximumConcurrentTasks
However, when installing SCDF through Helm this has to be set in the values.yaml file, and that information is missing from the Helm installation documentation. Could anyone help here?
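For reference, a sketch of how that documented property maps to an environment variable through Spring Boot relaxed binding, assuming the kubernetes platform type and the default account, and dropping the deployment segment, which per the comments above applies only to Cloud Foundry:

  # Property form with the placeholders filled in:
  spring.cloud.dataflow.task.platform.kubernetes.accounts[default].maximumConcurrentTasks
  # Equivalent environment variable (dots and brackets become underscores, upper-cased):
  SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_KUBERNETES_ACCOUNTS_DEFAULT_MAXIMUMCONCURRENTTASKS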
Release versions:
We are installing bitnami/spring-cloud-dataflow:2.11.5-debian-12-r2.
Additional context:
Below is our values.yaml file:
server:
  image:
    registry: docker.io
    repository: bitnami/spring-cloud-dataflow
    tag: 2.11.5-debian-12-r2
    digest: ""
    pullPolicy: IfNotPresent
    pullSecrets: []
    debug: false
  composedTaskRunner:
    image:
      registry: docker.io
      repository: bitnami/spring-cloud-dataflow-composed-task-runner
      tag: 2.11.5-debian-12-r2
      digest: ""
  configuration:
    streamingEnabled: false
    batchEnabled: true
    accountName: default
    trustK8sCerts: false
  containerPorts:
    http: 8080
    jdwp: 5005
  replicaCount: 1
  updateStrategy:
    type: RollingUpdate
  startupProbe:
    enabled: false
    initialDelaySeconds: 120
    timeoutSeconds: 1
    periodSeconds: 20
    failureThreshold: 6
    successThreshold: 1
  livenessProbe:
    enabled: true
    initialDelaySeconds: 120
    timeoutSeconds: 1
    periodSeconds: 20
    failureThreshold: 6
    successThreshold: 1
  readinessProbe:
    enabled: true
    initialDelaySeconds: 120
    timeoutSeconds: 1
    periodSeconds: 20
    failureThreshold: 6
    successThreshold: 1
  networkPolicy:
    enabled: false
    allowExternal: false
    allowExternalEgress: false
  service:
    type: ClusterIP
    ports:
      http: 8080
  ingress:
    enabled: true
    path: /
    pathType: ImplementationSpecific
    hostname: "xyz.com"
  pdb:
    create: false
    minAvailable: ""
    maxUnavailable: ""
pdb:
  create: false
skipper:
  enabled: false
rabbitmq:
  enabled: false
mariadb:
  enabled: false
metrics:
  enabled: false
  pdb:
    create: false
externalDatabase:
  host: "{{RDS-endpoint}}.rds.amazonaws.com"
  driver: com.mysql.cj.jdbc.Driver
  dataflow:
    url: "{Database url}"
    username:
    password: