Replies: 1 comment
-
It seems that versions are indeed causing issues. I had to revert to pyspark==4.0.0dev1, because Delta Lake 4 is currently built on dev1.
-
Dear Community, has any of you been able to get a working Spark Session (4.0.0-preview2) with Delta 4.0.0rc1?
I am on an air-gapped server, so I need to bring in the needed JARs manually (which is really fun!), but I am not able to get a working SparkSession that can write a Spark DataFrame to Delta files ...
The initialization part goes well, but
sdf.write.format("delta").save("sdf")
or variations with .parquet, a full path, or a full path with .delta
fail every time (error below ...)
Details:
Instantiate:
with list of jars:
Mainly these ones:
Does anyone have a clue about which JAR could be the wrong version / missing?
Many thanks for your help :-)
But I got the following error (which is usually related to a bad JAR):
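For anyone comparing setups: below is a minimal sketch of the kind of Delta-enabled session configuration described above. The jar file names and paths are illustrative assumptions for an air-gapped install, not the exact artifacts from this setup; the two `spark.sql.*` settings are the ones Delta Lake's documentation requires for its SQL extensions and catalog.

```python
# Sketch of a Delta-enabled SparkSession config for an air-gapped machine.
# NOTE: the jar names/paths below are assumptions for illustration only;
# match them to the exact artifacts you copied onto the server.
delta_jars = [
    "/opt/spark/jars/delta-spark_2.13-4.0.0rc1.jar",
    "/opt/spark/jars/delta-storage-4.0.0rc1.jar",
]

conf = {
    # Point Spark at local jars instead of resolving spark.jars.packages online.
    "spark.jars": ",".join(delta_jars),
    # Both settings below are required for Delta to register itself.
    "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
    "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog",
}

# With pyspark installed, the session would then be built roughly as:
#   from pyspark.sql import SparkSession
#   builder = SparkSession.builder.appName("delta-test")
#   for key, value in conf.items():
#       builder = builder.config(key, value)
#   spark = builder.getOrCreate()
#   spark.range(5).write.format("delta").save("/tmp/sdf")
```

If both `spark.sql.extensions` and `spark.sql.catalog.spark_catalog` are set and the write still fails, the usual culprit is a Scala-version or Spark-version mismatch in the delta-spark jar itself.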