Hi there, I've been using the old Google Spark Operator and am now looking at this one too. One thing I'm interested in, but which wasn't clear from reading the docs, is whether (and how) this operator supports running different versions of Spark applications in the same k8s cluster. E.g. we have many Spark apps running, and it would be difficult to upgrade them all at the same time. Ideally, we'd like an operator that allows us to run multiple versions of Spark at the same time without causing any issues. This would allow us to upgrade the Spark apps one at a time. Is this possible with this Spark Operator?
Thanks for your question @mmm-micro. As you can see from our integration tests (e.g. here), we run product images for different versions using the same operator. For Spark, we test using 3.4.1, 3.4.2, 3.5.0, and 3.5.1 (the nightly tests use just 3.5.1). So yes, the operator can run products in different versions, but only for the versions that are supported.
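To illustrate, here is a minimal sketch of two SparkApplication resources pinned to different Spark versions under the same operator. Field names like `spec.sparkImage.productVersion` follow the Stackable SparkApplication CRD as I understand it (check the operator's CRD reference for your version; older releases used a full image string instead), and the class/jar names are purely hypothetical:

```yaml
---
# App still on the older Spark version, not yet upgraded
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: legacy-job
spec:
  mode: cluster
  sparkImage:
    productVersion: 3.4.2          # pinned per application
  mainClass: org.example.LegacyJob               # hypothetical
  mainApplicationFile: local:///app/legacy.jar   # hypothetical
---
# App already upgraded to the newer Spark version
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: new-job
spec:
  mode: cluster
  sparkImage:
    productVersion: 3.5.1          # different version, same operator
  mainClass: org.example.NewJob                  # hypothetical
  mainApplicationFile: local:///app/new.jar      # hypothetical
```

Because the version is set per SparkApplication, you can migrate apps one at a time, as you described.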
Thank you, I see how it works now. A new pod is created to do the spark-submit, and that pod's Spark version should align with the Spark app you're running. That seems like a good way of doing it.