Hi,
I need to run the spark-redshift-2.11 connector on a cluster that uses hadoop-aws 3.x.
The last connector version that supports Scala 2.11 is 4.1.1, which is compiled against hadoop-aws 2.7.7.
So, for my use case, I need a version that supports Scala 2.11 and is compiled against hadoop-aws 3.x.
In version 4.1.1, the RedshiftWriter class creates val fsDataOut = fs.create(new Path(manifestPath)), where manifestPath is the S3 bucket location without the s3a:// scheme.
So I need the fix that was applied in version 5.1.1 to be applied to a new version (4.1.2?) that works with Scala 2.11.
Is this possible? Thanks.
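To illustrate the kind of fix being requested: the problem is that the manifest location is handed to fs.create without an explicit filesystem scheme, so Hadoop 3.x does not route it through the s3a connector. A minimal sketch of a scheme-normalizing helper is below; the object and method names are hypothetical illustrations, not the actual spark-redshift code or the 5.1.1 patch.

```scala
// Hypothetical helper sketching the manifest-path issue described above.
// Not the real spark-redshift code: names and logic are assumptions.
object ManifestPathFix {
  // If the location already carries a scheme (e.g. s3a://, hdfs://),
  // leave it alone; otherwise prefix it with s3a:// so Hadoop resolves
  // it with the S3A filesystem instead of the cluster's default FS.
  def withS3aScheme(location: String): String =
    if (location.matches("^[a-zA-Z][a-zA-Z0-9+.-]*://.*")) location
    else s"s3a://$location"
}
```

With a helper like this, the write path would call fs.create(new Path(ManifestPathFix.withS3aScheme(manifestPath))) so the scheme is always present.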