[KYUUBI #5196][FOLLOWUP] Extract spark core scala version lazily and respect engine env

### _Why are the changes needed?_

Extract the Spark core Scala version only when the `SPARK_SCALA_VERSION` env variable is blank, and read that variable from the engine env (the builder's `env`) rather than `System.getenv`, so engine-level overrides are respected.
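
For illustration, here is a minimal, self-contained Scala sketch (not the Kyuubi source) of why the new form is lazy: `Option.getOrElse` evaluates its default by name, so the jar scan only runs when the env entry is missing or blank. The object name, the regex-based extractor body, and the sample jar name are hypothetical; `_.trim.nonEmpty` stands in for commons-lang3 `StringUtils.isNotBlank`.

```scala
// Standalone sketch: Option.getOrElse takes its default by name, so the jar
// scan below only runs when the env value is absent or blank.
object LazyScalaVersionSketch extends App {
  // Hypothetical stand-in for the builder's per-engine env map.
  val env: Map[String, String] = Map("SPARK_SCALA_VERSION" -> "2.12")

  // Hypothetical stand-in for extractSparkCoreScalaVersion: parse the Scala
  // binary version out of a jar name such as spark-core_2.12-3.4.1.jar.
  private val sparkCoreJar = """spark-core_(\d+\.\d+)-.*\.jar""".r
  def extractSparkCoreScalaVersion(jarNames: Seq[String]): String = {
    println("scanning jars ...") // never printed while the env value is set
    jarNames.collectFirst { case sparkCoreJar(scalaVer) => scalaVer }
      .getOrElse(sys.error("spark-core jar not found"))
  }

  // _.trim.nonEmpty stands in for StringUtils.isNotBlank.
  val scalaVersion: String =
    env.get("SPARK_SCALA_VERSION").filter(_.trim.nonEmpty).getOrElse {
      extractSparkCoreScalaVersion(Seq("spark-core_2.12-3.4.1.jar"))
    }

  println(scalaVersion) // "2.12", resolved without touching the jars directory
}
```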

### _How was this patch tested?_
- [ ] Add some test cases that check the changes thoroughly, including negative and positive cases if possible

- [ ] Add screenshots for manual tests if appropriate

- [x] [Run test](https://kyuubi.readthedocs.io/en/master/contributing/code/testing.html#running-tests) locally before making a pull request

### _Was this patch authored or co-authored using generative AI tooling?_

No.

Closes #5434 from turboFei/lazy_scala_version.

Closes #5196

fdccef7 [fwang12] lazy extract spark core scala version

Authored-by: fwang12 <[email protected]>
Signed-off-by: fwang12 <[email protected]>
turboFei committed Oct 16, 2023
1 parent b24d94e commit c60f5b7
Showing 1 changed file with 2 additions and 2 deletions.
```diff
@@ -108,9 +108,9 @@ class SparkProcessBuilder(
   }
 
   override protected val engineScalaBinaryVersion: String = {
-    val sparkCoreScalaVersion =
+    env.get("SPARK_SCALA_VERSION").filter(StringUtils.isNotBlank).getOrElse {
       extractSparkCoreScalaVersion(Paths.get(sparkHome, "jars").toFile.list())
-    StringUtils.defaultIfBlank(System.getenv("SPARK_SCALA_VERSION"), sparkCoreScalaVersion)
+    }
   }
 
   override protected lazy val engineHomeDirFilter: FileFilter = file => {
```
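
The "respect engine env" half of the change comes from swapping `System.getenv` for the builder's `env` map, so a `SPARK_SCALA_VERSION` supplied to the engine is consulted instead of only the server's own process environment. A hedged, simplified sketch of the difference; the object name, map contents, and values are hypothetical:

```scala
// Sketch only (simplified, hypothetical values): contrasts the old and the
// new lookup for SPARK_SCALA_VERSION.
object EngineEnvLookupSketch extends App {
  // Hypothetical per-engine env map, similar in spirit to the builder's `env`;
  // the server process itself may not export SPARK_SCALA_VERSION at all.
  val engineEnv: Map[String, String] = Map("SPARK_SCALA_VERSION" -> "2.13")

  // Old behaviour (simplified): only the server process environment was read,
  // so the per-engine value above was ignored.
  val oldLookup: Option[String] =
    Option(System.getenv("SPARK_SCALA_VERSION")).filter(_.trim.nonEmpty)

  // New behaviour (simplified): the engine env map is consulted first.
  val newLookup: Option[String] =
    engineEnv.get("SPARK_SCALA_VERSION").filter(_.trim.nonEmpty)

  println(s"old lookup: $oldLookup, new lookup: $newLookup")
}
```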
