
How can you use an existing custom Spark image, with the Spark app and its dependencies built into the image, with the Spark K8s Operator? #418

Answered by adwk67
mmm-micro asked this question in Q&A

So to summarise, the same image is used for job/driver/executor, but there are different ways of preparing this image:

  • if you have your own repository, you can take one of our base images and add dependencies to it as needed; see the Dockerfile sketch after this list (edit: as @razvan mentions above, this is the best practice)
  • if not, you can use an S3 bucket (recommended) or a PVC (not recommended) to make these resources available.
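For the first option, here is a minimal sketch of what such a custom image could look like. The base image name and tag, the jar file names, and the /stackable/spark/... paths are illustrative placeholders, not exact values; check the operator documentation for the image coordinates and Spark layout that match your operator and Spark versions.

```dockerfile
# Minimal sketch: extend a Spark base image so that the application jar
# and its extra dependencies are baked into the image itself.
# NOTE: the image tag, jar names and target paths are placeholders.
FROM oci.stackable.tech/sdp/spark-k8s:3.5.1-stackable24.7.0

# Copy your Spark application jar into the image
COPY target/my-spark-app.jar /stackable/spark/jobs/my-spark-app.jar

# Add extra runtime dependencies (e.g. a JDBC driver) to Spark's classpath
COPY libs/postgresql-42.7.3.jar /stackable/spark/jars/

# For PySpark applications you could additionally install Python packages,
# if the base image ships pip:
# RUN pip install --no-cache-dir my-package==1.2.3
```

You would then push this image to your own registry and reference it from the SparkApplication in place of the stock image, so the job, driver and executor pods all start with the app and its dependencies already present.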

We also have an open issue for using HDFS in place of S3, but that work has not been planned yet.

Answer selected by mmm-micro