not all jars getting added to classpath in spark 3.5.0 #333
-
I have a pod with jars, located at the path "/dependencies/test_jars". This path is mounted into the pods that run Spark. For some odd reason, only 2 of the jars get picked up from there, even though the directory contains many more. Here's the yml for running my Spark application:
Here's the list of jars that I have in "/dependencies/test_jars":
And here's what I get when I print out the classpath. Why does this happen?
-
Hey,
I actually figured it out.
So, I have a pod called testmount-lib,
and it has 2 paths:
/dependencies/jars (only 2 jars here)
/dependencies/test_jars (around 10 jars here)
In the yml for my Spark job I was specifying /dependencies/test_jars, but the problem is that testmount-lib only exposes the /dependencies/jars path. So no matter what path I write into spec.driver.config.volumeMounts.mountPath, the jars are still taken from /dependencies/jars, because that is what testmount-lib actually exposes.
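To make the failure mode concrete, here is a hypothetical sketch of the relevant part of a SparkApplication manifest (all names here are assumed for illustration; they are not copied from my original yml). The mountPath only controls *where inside the pod* the volume's content appears; it does not select which directory of the source pod gets mounted, so pointing it at a path the volume never exported yields only whatever the volume actually exposes:

```yaml
# Hypothetical SparkApplication fragment (Kubeflow Spark Operator style).
# Names like "example-job" and "lib-volume" are made up for this sketch.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: example-job
spec:
  sparkVersion: "3.5.0"
  volumes:
    - name: lib-volume                  # assumed: volume backed by testmount-lib's storage
      persistentVolumeClaim:
        claimName: testmount-lib
  driver:
    volumeMounts:
      - name: lib-volume
        # Whatever mountPath you pick, the content that shows up here is
        # what the volume exposes (in this case, the 2 jars from
        # /dependencies/jars) -- NOT the contents of /dependencies/test_jars.
        mountPath: /dependencies/test_jars
  executor:
    volumeMounts:
      - name: lib-volume
        mountPath: /dependencies/test_jars
```

So the symptom (only 2 jars on the classpath) matches exactly what the volume exports, regardless of the mountPath written in the manifest.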