I have a problem using native Delta table support: even after adding DATALAKE_FORMATS: delta to my environment, which adds the jars for delta-core, it still can't import the delta module.
A workaround I found is to add --py-files /home/glue_user/aws-glue-libs/datalake-connectors/delta-2.1.0/delta-core_2.12-2.1.0.jar to my arguments, and in pytest I added spark.sparkContext.addPyFile("/home/glue_user/aws-glue-libs/datalake-connectors/delta-2.1.0/delta-core_2.12-2.1.0.jar").
Is there a way to automatically load the jars or am I missing something?
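For reference, a minimal conftest.py sketch of that workaround might look like this (the fixture name and session scope are illustrative, not from my actual setup):

```python
import pytest
from pyspark.sql import SparkSession

# Jar shipped inside the aws-glue-libs image when DATALAKE_FORMATS=delta.
DELTA_JAR = (
    "/home/glue_user/aws-glue-libs/datalake-connectors/"
    "delta-2.1.0/delta-core_2.12-2.1.0.jar"
)

@pytest.fixture(scope="session")
def spark():
    spark = SparkSession.builder.getOrCreate()
    # The jar is already on the JVM classpath via DATALAKE_FORMATS,
    # but its bundled Python `delta` module must be shipped explicitly
    # so that `import delta` resolves.
    spark.sparkContext.addPyFile(DELTA_JAR)
    yield spark
    spark.stop()
```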
Well, it was a bit tricky. I tried including the jars for core and storage directly, but for some reason that didn't register the classes. I ended up including them via the packages config instead, which did work for me.
Here is an example of my pytest fixture that initiates the Spark session for the tests.
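A minimal sketch of what such a fixture might look like, assuming Delta Lake 2.1.0 on Scala 2.12 to match the jars above (the app name and fixture scope are illustrative):

```python
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    spark = (
        SparkSession.builder.appName("glue-delta-tests")
        # Resolving delta-core as a package pulls in its dependencies
        # (e.g. delta-storage) and registers the classes, unlike
        # pointing at the raw jar files on disk.
        .config("spark.jars.packages", "io.delta:delta-core_2.12:2.1.0")
        # Standard Delta Lake session configuration.
        .config("spark.sql.extensions",
                "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()
    )
    yield spark
    spark.stop()
```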