not all jars getting added to classpath in spark 3.5.0 #333

Answered by paulpaul1076
paulpaul1076 asked this question in Q&A
  1. Yes.
  2. Driver.
  3. I did add them to my extraClassPath, as you can see in the YAML that I sent.

I actually figured it out.

So, I have a pod called testmount-lib, and it has two paths:

/dependencies/jars (only 2 jars here)
/dependencies/test_jars (around 10 jars here)

In my YAML for the Spark job I was specifying /dependencies/test_jars, but testmount-lib only exposes the /dependencies/jars path. So no matter what path I write into spec.driver.config.volumeMounts.mountPath, the jars are still taken from /dependencies/jars, because that is the only path testmount-lib exposes.
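For illustration, here is a minimal sketch of the kind of manifest fragment described above. The field names follow the Spark operator's SparkApplication CRD, and the volume/mount names are assumptions reconstructed from this thread, not the exact YAML that was posted:

```yaml
# Hypothetical SparkApplication fragment illustrating the issue.
spec:
  volumes:
    - name: dependencies          # assumed name; backed by testmount-lib
  driver:
    volumeMounts:
      - name: dependencies
        # This mountPath was set to /dependencies/test_jars, but because
        # testmount-lib only exposes /dependencies/jars, the two jars from
        # /dependencies/jars are what actually end up on the classpath,
        # regardless of the mountPath written here.
        mountPath: /dependencies/test_jars
```

In other words, the mountPath only controls where the volume appears inside the container; it cannot make the volume expose contents the underlying source does not provide.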

Answer selected by paulpaul1076