
The Gazelle jars cannot support Spark 3.1 and Spark 3.2 at the same time #236

Open
yao531441 opened this issue Mar 21, 2022 · 0 comments

@yao531441 (Collaborator)

We need to provide two sets of jars so that Gazelle can support the different Spark versions.

The following is an example configuration for Spark 3.2:

spark.executor.extraClassPath /home/sparkuser/nativesql_jars/spark-columnar-core-1.4.0-SNAPSHOT-jar-with-dependencies.jar:/home/sparkuser/nativesql_jars/spark-arrow-datasource-standard-1.4.0-SNAPSHOT-jar-with-dependencies.jar:/home/sparkuser/nativesql_jars/spark-sql-columnar-shims-common-1.4.0-SNAPSHOT.jar:/home/sparkuser/nativesql_jars/spark-sql-columnar-shims-spark321-1.4.0-SNAPSHOT.jar
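
For Spark 3.1, presumably only the shims jar would change. A sketch of the analogous classpath, assuming the 3.1 shims artifact follows the same naming pattern as the 3.2 one (spark-sql-columnar-shims-spark311; the exact artifact name is an assumption, not confirmed above):

spark.executor.extraClassPath /home/sparkuser/nativesql_jars/spark-columnar-core-1.4.0-SNAPSHOT-jar-with-dependencies.jar:/home/sparkuser/nativesql_jars/spark-arrow-datasource-standard-1.4.0-SNAPSHOT-jar-with-dependencies.jar:/home/sparkuser/nativesql_jars/spark-sql-columnar-shims-common-1.4.0-SNAPSHOT.jar:/home/sparkuser/nativesql_jars/spark-sql-columnar-shims-spark311-1.4.0-SNAPSHOT.jar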