
Integrate Katib Rocks for CKF 1.8 #152

Open
orfeas-k opened this issue Nov 13, 2023 · 2 comments
Labels
Kubeflow 1.8 This issue affects the Charmed Kubeflow 1.8 release

Comments

orfeas-k commented Nov 13, 2023

This issue will document the effort of integrating Katib ROCKs into the katib-operator charms.

orfeas-k changed the title from "Integrate Katib Rocks" to "Integrate Katib Rocks for CKF 1.8" on Nov 13, 2023

orfeas-k commented Nov 16, 2023

While integrating the katib-controller rock and running the charm's integration tests, the pod created by the charm goes into an error state with the errors below (see the full logs further down):

docker run --rm katib-controller_v0.16.0-22.04-1_amd64:rock exec /katib-controller
2023-11-15T10:37:11.186Z [pebble] Started daemon.
2023-11-15T10:37:11.190Z [pebble] POST /v1/exec 3.74121ms 202
2023-11-15T10:37:11.194Z [pebble] GET /v1/tasks/1/websocket/control 2.590142ms 200
2023-11-15T10:37:11.194Z [pebble] GET /v1/tasks/1/websocket/stdio 76.181µs 200
2023-11-15T10:37:11.194Z [pebble] GET /v1/tasks/1/websocket/stderr 57.01µs 200
{"level":"info","ts":"2023-11-15T10:37:11Z","logger":"entrypoint","msg":"Config:","experiment-suggestion-name":"default","webhook-port":8443,"metrics-addr":":8080","healthz-addr":":18080","inject-security-context":false,"enable-grpc-probe-in-suggestion":true,"trial-resources":[{"Group":"batch","Version":"v1","Kind":"Job"}]}
{"level":"error","ts":"2023-11-15T10:37:11Z","logger":"controller-runtime.client.config","msg":"unable to load in-cluster config","error":"unable to load in-cluster configuration, KUBERNETES_SERVICE_HOST and KUBERNETES_SERVICE_PORT must be defined","stacktrace":"sigs.k8s.io/controller-runtime/pkg/client/config.loadConfig.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/client/config/config.go:133\nsigs.k8s.io/controller-runtime/pkg/client/config.loadConfig\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/client/config/config.go:155\nsigs.k8s.io/controller-runtime/pkg/client/config.GetConfigWithContext\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/client/config/config.go:97\nsigs.k8s.io/controller-runtime/pkg/client/config.GetConfig\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/client/config/config.go:77\nmain.main\n\t/root/parts/katib-controller/src/cmd/katib-controller/v1beta1/main.go:105\nruntime.main\n\t/snap/go/10426/src/runtime/proc.go:267"}
{"level":"error","ts":"2023-11-15T10:37:11Z","logger":"entrypoint","msg":"Fail to get the config","error":"invalid configuration: no configuration has been provided, try setting KUBERNETES_MASTER environment variable","errorCauses":[{"error":"no configuration has been provided, try setting KUBERNETES_MASTER environment variable"}],"stacktrace":"main.main\n\t/root/parts/katib-controller/src/cmd/katib-controller/v1beta1/main.go:107\nruntime.main\n\t/snap/go/10426/src/runtime/proc.go:267"}
2023-11-15T10:37:11.232Z [pebble] GET /v1/changes/1/wait 37.65843ms 200

which means that katib-controller expects the in-cluster Kubernetes environment variables (KUBERNETES_SERVICE_HOST and KUBERNETES_SERVICE_PORT). What is odd to me at the moment is that I do not see our charm providing such configuration.
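
For context, inside a real pod the kubelet injects KUBERNETES_SERVICE_HOST and KUBERNETES_SERVICE_PORT into every container automatically, which is why neither the charm nor the rock normally has to provide them; the errors above appear because the entrypoint was executed outside a cluster. If we ever did need to set them explicitly (e.g. for a standalone smoke test of the rock), a rough sketch of a Pebble layer doing so could look like the following. This is a hypothetical illustration with placeholder values, not the charm's actual layer:

```python
# Hypothetical sketch, not the charm's actual layer: in-cluster these variables
# are injected by the kubelet, so setting them here is normally unnecessary.
from ops.pebble import Layer

layer = Layer(
    {
        "summary": "katib-controller layer",
        "services": {
            "katib-controller": {
                "override": "replace",
                "summary": "Katib controller",
                "command": "/katib-controller",
                "startup": "enabled",
                "environment": {
                    # Placeholder values; in a cluster these point at the API server.
                    "KUBERNETES_SERVICE_HOST": "10.152.183.1",
                    "KUBERNETES_SERVICE_PORT": "443",
                },
            }
        },
    }
)
```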

Here are the full logs from the pod:

$ kl -n test katib-controller-657995fc6d-kp5pc
Defaulted container "katib-controller" out of: katib-controller, juju-pod-init (init)
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Config:","experiment-suggestion-name":"default","webhook-port":443,"metrics-addr":":8080","healthz-addr":":18080","inject-security-context":false,"enable-grpc-probe-in-suggestion":true,"trial-resources":[{"Group":"batch","Version":"v1","Kind":"Job"},{"Group":"kubeflow.org","Version":"v1","Kind":"TFJob"},{"Group":"kubeflow.org","Version":"v1","Kind":"PyTorchJob"},{"Group":"kubeflow.org","Version":"v1","Kind":"MPIJob"},{"Group":"kubeflow.org","Version":"v1","Kind":"XGBoostJob"},{"Group":"kubeflow.org","Version":"v1","Kind":"MXJob"}]}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.metrics","msg":"Metrics server is starting to listen","addr":":8080"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Registering Components."}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Certs ready"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Setting up controller."}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"experiment-controller","msg":"Using the default suggestion implementation"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Setting up health checker."}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Starting the manager."}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting server","kind":"health probe","addr":"[::]:18080"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"starting server","path":"/metrics","kind":"metrics","addr":"[::]:8080"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"experiment-controller","msg":"Experiment controller created"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"suggestion-controller","msg":"Suggestion controller created"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"trial-controller","msg":"Job watch added successfully","CRD Group":"batch","CRD Version":"v1","CRD Kind":"Job"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"trial-controller","msg":"Job watch error. CRD might be missing. Please install CRD and restart katib-controller","CRD Group":"kubeflow.org","CRD Version":"v1","CRD Kind":"TFJob"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"trial-controller","msg":"Job watch error. CRD might be missing. Please install CRD and restart katib-controller","CRD Group":"kubeflow.org","CRD Version":"v1","CRD Kind":"PyTorchJob"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"trial-controller","msg":"Job watch error. CRD might be missing. Please install CRD and restart katib-controller","CRD Group":"kubeflow.org","CRD Version":"v1","CRD Kind":"MPIJob"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"trial-controller","msg":"Job watch error. CRD might be missing. Please install CRD and restart katib-controller","CRD Group":"kubeflow.org","CRD Version":"v1","CRD Kind":"XGBoostJob"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"trial-controller","msg":"Job watch error. CRD might be missing. Please install CRD and restart katib-controller","CRD Group":"kubeflow.org","CRD Version":"v1","CRD Kind":"MXJob"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"trial-controller","msg":"Trial controller created"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Setting up webhooks."}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"experiment-controller","source":"kind source: *v1beta1.Experiment"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"experiment-controller","source":"kind source: *v1beta1.Trial"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"trial-controller","source":"kind source: *v1beta1.Trial"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"trial-controller","source":"kind source: *unstructured.Unstructured"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting Controller","controller":"trial-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"experiment-controller","source":"kind source: *v1beta1.Suggestion"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"suggestion-controller","source":"kind source: *v1beta1.Suggestion"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"suggestion-controller","source":"kind source: *v1.Deployment"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.webhook.webhooks","msg":"Starting webhook server"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"suggestion-controller","source":"kind source: *v1.Service"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting Controller","controller":"experiment-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting EventSource","controller":"suggestion-controller","source":"kind source: *v1.PersistentVolumeClaim"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting Controller","controller":"suggestion-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.webhook","msg":"Registering webhook","path":"/validate-experiment"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.webhook","msg":"Registering webhook","path":"/mutate-experiment"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.webhook","msg":"Registering webhook","path":"/mutate-pod"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.certwatcher","msg":"Updated current TLS certificate"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Stopping and waiting for non leader election runnables"}
{"level":"info","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.certwatcher","msg":"Starting certificate watcher"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"shutting down server","path":"/metrics","kind":"metrics","addr":"[::]:8080"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Stopping and waiting for leader election runnables"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1beta1.Trial Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1beta1.Experiment Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting workers","controller":"trial-controller","worker count":1}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Shutdown signal received, waiting for all workers to finish","controller":"trial-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting workers","controller":"experiment-controller","worker count":1}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Shutdown signal received, waiting for all workers to finish","controller":"experiment-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Starting workers","controller":"suggestion-controller","worker count":1}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"All workers finished","controller":"experiment-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Shutdown signal received, waiting for all workers to finish","controller":"suggestion-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"All workers finished","controller":"suggestion-controller"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1.Service Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1beta1.Suggestion Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1.Deployment Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1beta1.Trial Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"All workers finished","controller":"trial-controller"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Stopping and waiting for caches"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1.PersistentVolumeClaim Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *v1beta1.Suggestion Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"controller-runtime.source.EventHandler","msg":"failed to get informer from cache","error":"Timeout: failed waiting for *unstructured.Unstructured Informer to sync","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1.1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:68\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext.func1\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:49\nk8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/loop.go:50\nk8s.io/apimachinery/pkg/util/wait.PollUntilContextCancel\n\t/root/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/poll.go:33\nsigs.k8s.io/controller-runtime/pkg/internal/source.(*Kind).Start.func1\n\t/root/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/source/kind.go:56"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Stopping and waiting for webhooks"}
{"level":"info","ts":"2023-11-16T16:30:34Z","msg":"Wait completed, proceeding to shutdown the manager"}
{"level":"error","ts":"2023-11-16T16:30:34Z","logger":"entrypoint","msg":"Unable to run the manager","error":"listen tcp :443: bind: permission denied","stacktrace":"main.main\n\t/root/parts/katib-controller/src/cmd/katib-controller/v1beta1/main.go:165\nruntime.main\n\t/snap/go/10426/src/runtime/proc.go:267"}


ca-scribner commented Nov 23, 2023

I'm in the middle of rebuilding the katib-ui image as per canonical/katib-rocks#32. Mostly done, but it still needs some more massaging.

NohaIhab added the "Kubeflow 1.8" label (This issue affects the Charmed Kubeflow 1.8 release) on Nov 30, 2023