WIP: Cross-compile 213 #3363

Closed · wants to merge 20 commits

Commits (20)
62a5be1
resolve ambiguity in spark and raster Implicits classes with explicit…
philvarner Mar 11, 2021
f4fbe4f
refactor spire dependency to support using different versions between…
philvarner Mar 11, 2021
ae97d90
replace two uses of to[T] with equivalent toT
philvarner Mar 11, 2021
4ad25dc
disable eviction warnings, since there are many of them and likely al…
philvarner Mar 12, 2021
c7d155f
fixes for non-fqn Seq being immutable in Scala 2.13
philvarner Mar 12, 2021
2a947d1
fix two more instances of resolution of wildcard imports being differ…
philvarner Mar 12, 2021
90ae75e
fix unnecessarily modified imports in Seq update commit
philvarner Mar 12, 2021
9e707ff
Merge branch 'implicits-relative-package-names' into cross-compile-213
philvarner Mar 12, 2021
52e4fc5
Merge branch 'refactor-spire-dep' into cross-compile-213
philvarner Mar 12, 2021
90874fa
Merge branch 'fix-uses-of-to-with-brackets' into cross-compile-213
philvarner Mar 12, 2021
1fe9713
Merge branch 'disable-eviction-warnings' into cross-compile-213
philvarner Mar 12, 2021
aded030
Merge branch 'fixes-for-immutable-seq-change-in-213' into cross-compi…
philvarner Mar 12, 2021
b041e1f
2.13 build delta
philvarner Mar 12, 2021
fe6ce70
resolve a few more problems with ambiguous wildcard resolution in 2.13
philvarner Mar 12, 2021
48fa368
convert use of .to[Vector] to .toVector
philvarner Mar 12, 2021
24a9bb2
replace one private use of method return type of Seq with actual resu…
philvarner Mar 12, 2021
7e0e820
Merge branch 'fixes-for-immutable-seq-change-in-213' into cross-compi…
philvarner Mar 12, 2021
46b8798
Merge branch 'fix-uses-of-to-with-brackets' into cross-compile-213
philvarner Mar 12, 2021
5759af9
Merge branch 'implicits-relative-package-names' into cross-compile-213
philvarner Mar 12, 2021
9ec592c
structure with crosscompile at the project level
philvarner Mar 12, 2021
29 changes: 29 additions & 0 deletions .locationtech/deploy-213.sh
@@ -0,0 +1,29 @@
#!/usr/bin/env bash

set -Eeuo pipefail
set -x

./sbt -213 "project macros" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project vector" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project proj4" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project raster" publish -no-colors -J-Drelease=locationtech \
# && ./sbt -213 "project spark" publish -no-colors -J-Drelease=locationtech \
# && ./sbt -213 "project spark-pipeline" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project s3" publish -no-colors -J-Drelease=locationtech \
# && ./sbt -213 "project s3-spark" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project accumulo" publish -no-colors -J-Drelease=locationtech \
# && ./sbt -213 "project accumulo-spark" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project hbase" publish -no-colors -J-Drelease=locationtech \
# && ./sbt -213 "project hbase-spark" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project cassandra" publish -no-colors -J-Drelease=locationtech \
# && ./sbt -213 "project cassandra-spark" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project geotools" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project shapefile" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project layer" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project store" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project util" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project vectortile" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project raster-testkit" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project vector-testkit" publish -no-colors -J-Drelease=locationtech \
# && ./sbt -213 "project spark-testkit" publish -no-colors -J-Drelease=locationtech \
&& ./sbt -213 "project gdal" publish -no-colors -J-Drelease=locationtech
3 changes: 1 addition & 2 deletions build.sbt
@@ -1,8 +1,7 @@
import sbt.Keys._

ThisBuild / scalaVersion := "2.12.13"
ThisBuild / organization := "org.locationtech.geotrellis"
ThisBuild / crossScalaVersions := List("2.12.13", "2.11.12")
ThisBuild / scalaVersion := Settings.scala212

lazy val root = Project("geotrellis", file("."))
.aggregate(
@@ -84,7 +84,7 @@ trait BufferTiles {
addSlice(SpatialKey(col+1, row+1), TopLeft)
addSlice(SpatialKey(col-1, row+1), TopRight)

parts
parts.toSeq
}

def bufferWithNeighbors[
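This one-line fix recurs throughout the PR (see the commit "fixes for non-fqn Seq being immutable in Scala 2.13"): in 2.13 the unqualified `Seq` is an alias for `scala.collection.immutable.Seq`, so returning a mutable buffer where a `Seq` is declared no longer compiles. A minimal sketch of the failure mode and the fix, using a hypothetical method rather than the PR's code:

```scala
import scala.collection.mutable.ArrayBuffer

// Unqualified Seq means collection.Seq on 2.11/2.12 but immutable.Seq on 2.13.
def slices(): Seq[String] = {
  val parts = ArrayBuffer.empty[String]
  parts += "TopLeft"
  parts += "TopRight"
  // Returning bare `parts` compiles on 2.11/2.12 but not on 2.13;
  // .toSeq yields an immutable Seq and compiles on all three.
  parts.toSeq
}
```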
2 changes: 1 addition & 1 deletion proj4/src/main/scala/geotrellis/proj4/io/wkt/WKT.scala
@@ -22,7 +22,7 @@ import scala.io.Source

object WKT {
private val wktResourcePath = "/proj4/wkt/epsg.properties"
lazy val parsed: Map[Int, WktCS] = records.mapValues(WKTParser.apply)
lazy val parsed: Map[Int, WktCS] = records.mapValues(WKTParser.apply).toMap
Review comment:

mapValues is deprecated in 2.13.
"Use .view.mapValues(f). A future version will include a strict version of this method (for now, .view.mapValues(f).toMap)."

philvarner (Contributor, Author) replied:

Yes -- this is just to maintain backwards compatibility with 2.11 and 2.12 with the fewest changes possible. There are hundreds if not thousands of deprecation warnings when compiling with 2.13 😬
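A small sketch of the two spellings the deprecation message contrasts, over a plain `Map[Int, String]` (illustrative, not code from this PR):

```scala
val records: Map[Int, String] = Map(4326 -> " GEOGCS[...] ")

// Compiles everywhere, deprecated on 2.13. mapValues returns a lazy
// view, so .toMap is what forces a strict Map.
val parsedOld: Map[Int, String] = records.mapValues(_.trim).toMap

// The 2.13-preferred spelling of the same computation; on 2.11/2.12
// it needs the scala-collection-compat shims this PR wires in.
val parsedNew: Map[Int, String] = records.view.mapValues(_.trim).toMap
```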

lazy val projections: Set[WktCS] = parsed.values.toSet
lazy val records: Map[Int, String] = parseWktEpsgResource

@@ -43,7 +43,7 @@ object GenerateTestCases {
.filter { _ startsWith "<" }
.map { s => s.tail.take(s.indexOf('>') - 1) }
.filterNot { _ == "4326" }
.to[Vector]
.toVector
}
val output = new java.io.FileWriter("proj4/src/test/resources/proj4-epsg.csv");

@@ -49,7 +49,7 @@ object MetaCRSTestFileReader {
.filter(r => r.nonEmpty && !r.head.startsWith("#"))
.drop(1)
.map(parseTest)
.to[List]
.toList
}

private def parseTest(cols: Array[String]): MetaCRSTestCase = {
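Both of these edits dodge the same removal: the `CanBuildFrom`-based `.to[C]` syntax is gone in 2.13, while the dedicated conversions (`.toVector`, `.toList`) exist unchanged on every supported version. A quick sketch with hypothetical values:

```scala
val lines = Iterator("<4326>", "<3857>")

// lines.to[Vector]        // 2.11/2.12 only: relies on CanBuildFrom, removed in 2.13
val codes = lines.toVector // identical result; compiles on 2.11, 2.12, and 2.13
```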
27 changes: 27 additions & 0 deletions project/CrossCompileAutoPlugin.scala
@@ -0,0 +1,27 @@
import sbt._
import sbt.Keys._

object CrossCompileAutoPlugin extends AutoPlugin {

override def trigger: sbt.PluginTrigger = allRequirements

override def projectSettings: Seq[Def.Setting[_]] =
Seq(
libraryDependencies ++= (CrossVersion.partialVersion(scalaVersion.value) match {
case Some((2, 13)) => Seq.empty
case Some((2, 11 | 12)) => Seq(
compilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full),
"org.scala-lang.modules" %% "scala-collection-compat" % "2.4.2"
)
case x => sys.error(s"Encountered unsupported Scala version ${x.getOrElse("undefined")}")
}),
Compile / scalacOptions ++= (CrossVersion.partialVersion(scalaVersion.value) match {
case Some((2, 13)) => Seq(
"-Ymacro-annotations", // replaces paradise in 2.13
"-Wconf:cat=deprecation&msg=Auto-application:silent" // there are many of these, silence until fixed
)
case Some((2, 11 | 12)) => Seq("-Ypartial-unification") // required by Cats
case x => sys.error(s"Encountered unsupported Scala version ${x.getOrElse("undefined")}")
})
)
}
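Because the trigger is `allRequirements`, this plugin is enabled for every subproject without being wired into build.sbt. Its dispatch hinges on `CrossVersion.partialVersion`, which extracts the (major, minor) pair from a full version string; a small illustration of the values it produces and the branch each one selects:

```scala
import sbt.CrossVersion

CrossVersion.partialVersion("2.13.5")  // Some((2, 13)): -Ymacro-annotations, no paradise
CrossVersion.partialVersion("2.12.13") // Some((2, 12)): paradise plugin + collection-compat
CrossVersion.partialVersion("2.11.12") // Some((2, 11)): same branch as 2.12
CrossVersion.partialVersion("bogus")   // None: the sys.error branch fires
```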
18 changes: 13 additions & 5 deletions project/Dependencies.scala
@@ -18,7 +18,6 @@ import sbt._

object Version {
val geotools = "24.2"
val spire = "0.13.0"
val accumulo = "1.9.3"
val cassandra = "3.7.2"
val hbase = "2.2.5"
@@ -34,11 +33,13 @@
import sbt.Keys._

object Dependencies {
private def ver(for211: String, for212: String) = Def.setting {

private def ver(for211: String, for212: String, for213: Option[String] = None) = Def.setting {
CrossVersion.partialVersion(scalaVersion.value) match {
case Some((2, 11)) => for211
case Some((2, 12)) => for212
case _ => sys.error("not good")
case Some((2, 13)) => for213.getOrElse(for212)
case x => sys.error(s"Encountered unsupported Scala version ${x.getOrElse("undefined")}")
}
}

@@ -67,6 +68,15 @@

def scalaReflect(version: String) = "org.scala-lang" % "scala-reflect" % version

def spire(module: String) = Def.setting {
CrossVersion.partialVersion(scalaVersion.value) match {
case Some((2, 11)) => "org.spire-math" %% "spire" % "0.13.0"
case Some((2, 12)) => "org.spire-math" %% "spire" % "0.13.0" // 0.17.0 exists for 2.12
case Some((2, 13)) => "org.typelevel" %% "spire" % "0.17.0"
case x => sys.error(s"Encountered unsupported Scala version ${x.getOrElse("undefined")}")
}
}

val sparkCore = "org.apache.spark" %% "spark-core" % Version.spark
val sparkSql = "org.apache.spark" %% "spark-sql" % Version.spark
val pureconfig = "com.github.pureconfig" %% "pureconfig" % "0.14.0"
@@ -77,8 +87,6 @@
val jts = "org.locationtech.jts" % "jts-core" % "1.17.1"
val proj4j = "org.locationtech.proj4j" % "proj4j" % "1.1.1"
val openCSV = "com.opencsv" % "opencsv" % "5.3"
val spire = "org.spire-math" %% "spire" % Version.spire
val spireMacro = "org.spire-math" %% "spire-macros" % Version.spire
val apacheIO = "commons-io" % "commons-io" % "2.8.0"
val apacheLang3 = "org.apache.commons" % "commons-lang3" % "3.12.0"
val apacheMath = "org.apache.commons" % "commons-math3" % "3.6.1"
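The widened `ver` helper keeps all existing two-argument call sites compiling: `for213` is optional and falls back to the 2.12 coordinates when omitted. A hypothetical call site, with illustrative names and versions that are not part of this diff:

```scala
// Hypothetical dependency: 0.9.x on 2.11, 1.0.x on 2.12, 1.1.x on 2.13.
def fooLib(module: String) = Def.setting {
  "com.example" %% s"foo-$module" % ver("0.9.5", "1.0.2", Some("1.1.0")).value
}

// An untouched two-argument call site; on 2.13 it resolves to "1.0.2".
def barLib = Def.setting("com.example" %% "bar" % ver("0.9.5", "1.0.2").value)
```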
51 changes: 37 additions & 14 deletions project/Settings.scala
@@ -17,7 +17,7 @@
import Dependencies._
import GTBenchmarkPlugin.Keys._
import sbt._
import sbt.Keys._
import sbt.Keys.{crossScalaVersions, _}
import sbtassembly.AssemblyPlugin.autoImport._
import com.typesafe.tools.mima.plugin.MimaKeys._
import de.heikoseeberger.sbtheader.{CommentStyle, FileType}
@@ -40,12 +40,19 @@ object Settings {
val all = external ++ local
}

lazy val scala211 = "2.11.12"
lazy val scala212 = "2.12.13"
lazy val scala213 = "2.13.5"

lazy val crossScalaVersionsAll = List(scala213, scala212, scala211)
lazy val crossScalaVersionsSparkOnly = List(scala212, scala211)

lazy val noForkInTests = Seq(
Test / fork := false,
Test / parallelExecution := false
)

val commonScalacOptions = Seq(
lazy val commonScalacOptions = Seq(
"-deprecation",
"-unchecked",
"-feature",
@@ -56,16 +63,17 @@
"-language:existentials",
"-language:experimental.macros",
"-feature",
"-Ypartial-unification", // required by Cats
// "-Yrangepos", // required by SemanticDB compiler plugin
// "-Ywarn-unused-import", // required by `RemoveUnused` rule
"-target:jvm-1.8")
"-target:jvm-1.8"
)

lazy val commonSettings = Seq(
description := "geographic data processing library for high performance applications",
licenses := Seq("Apache-2.0" -> url("http://www.apache.org/licenses/LICENSE-2.0.html")),
homepage := Some(url("https://geotrellis.io")),
scmInfo := Some(ScmInfo(url("https://github.com/locationtech/geotrellis"), "scm:git:[email protected]:locationtech/geotrellis.git")),
crossScalaVersions := crossScalaVersionsAll,
scalacOptions ++= commonScalacOptions,
publishMavenStyle := true,
Test / publishArtifact := false,
@@ -93,7 +101,6 @@
).filter(_.asFile.canRead).map(Credentials(_)),

addCompilerPlugin("org.typelevel" %% "kind-projector" % "0.11.3" cross CrossVersion.full),
addCompilerPlugin("org.scalamacros" %% "paradise" % "2.1.1" cross CrossVersion.full),
addCompilerPlugin("org.scalameta" % "semanticdb-scalac" % "4.4.10" cross CrossVersion.full),

pomExtra := (
@@ -121,7 +128,11 @@
existingText.flatMap(_ => existingText.map(_.trim)).getOrElse(newText)
} }
)
)
),
evictionWarningOptions in update := EvictionWarningOptions.default
.withWarnTransitiveEvictions(false)
.withWarnDirectEvictions(false)
.withWarnScalaVersionEviction(false)
)

lazy val accumulo = Seq(
@@ -144,6 +155,7 @@

lazy val `accumulo-spark` = Seq(
name := "geotrellis-accumulo-spark",
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
accumuloCore
exclude("org.jboss.netty", "netty")
@@ -166,6 +178,7 @@
) ++ commonSettings ++ noForkInTests

lazy val bench = Seq(
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies += sl4jnop,
jmhIterations := Some(5),
jmhTimeUnit := None, // Each benchmark should determine the appropriate time unit.
@@ -196,6 +209,7 @@

lazy val `cassandra-spark` = Seq(
name := "geotrellis-cassandra-spark",
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
cassandraDriverCore
excludeAll(
@@ -223,6 +237,7 @@

lazy val `doc-examples` = Seq(
name := "geotrellis-doc-examples",
crossScalaVersions := crossScalaVersionsSparkOnly,
scalacOptions ++= commonScalacOptions,
libraryDependencies ++= Seq(
sparkCore,
@@ -376,6 +391,7 @@

lazy val `hbase-spark` = Seq(
name := "geotrellis-hbase-spark",
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
hadoopClient % Provided,
sparkCore % Provided,
@@ -399,7 +415,7 @@
name := "geotrellis-macros",
Compile / sourceGenerators += (Compile / sourceManaged).map(Boilerplate.genMacro).taskValue,
libraryDependencies ++= Seq(
spireMacro,
spire("spire-macros").value,
scalaReflect(scalaVersion.value)
)
) ++ commonSettings
@@ -410,7 +426,8 @@
mdocOut := new File("website/docs"),
mdocVariables := Map(
"VERSION" -> (ThisBuild / version).value
)
),
crossScalaVersions := crossScalaVersionsSparkOnly
)

lazy val proj4 = Seq(
@@ -430,7 +447,7 @@
name := "geotrellis-raster",
libraryDependencies ++= Seq(
squants,
monocle("core").value,
monocle("core").value,
monocle("macro").value,
scalaXml,
scalaURI.value,
@@ -497,6 +514,7 @@

lazy val `s3-spark` = Seq(
name := "geotrellis-s3-spark",
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
hadoopClient % Provided,
sparkCore % Provided,
@@ -533,6 +551,7 @@

lazy val spark = Seq(
name := "geotrellis-spark",
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
sparkCore % Provided,
hadoopClient % Provided,
@@ -556,7 +575,8 @@

lazy val `spark-pipeline` = Seq(
name := "geotrellis-spark-pipeline",
libraryDependencies ++= Seq(
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
circe("generic-extras").value,
hadoopClient % Provided,
sparkCore % Provided,
@@ -587,6 +607,7 @@

lazy val `spark-testkit` = Seq(
name := "geotrellis-spark-testkit",
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
hadoopClient % Provided,
sparkCore % Provided,
@@ -600,19 +621,20 @@
libraryDependencies ++= Seq(
log4s,
scalaj,
spire,
spire("spire").value,
scalatest % Test
)
) ++ commonSettings

lazy val vector = Seq(
name := "geotrellis-vector",
libraryDependencies ++= Seq(
scalaReflect(scalaVersion.value),
jts,
shapeless,
pureconfig,
circe("core").value,
circe("generic").value,
circe("core").value,
circe("generic").value,
circe("parser").value,
cats("core").value,
apacheMath,
@@ -669,7 +691,7 @@
uzaygezenCore,
scalaXml,
apacheLang3,
fs2("core").value,
fs2("core").value,
fs2("io").value,
cats("effect").value,
scalatest % Test
@@ -691,6 +713,7 @@

lazy val `gdal-spark` = Seq(
name := "geotrellis-gdal-spark",
crossScalaVersions := crossScalaVersionsSparkOnly,
libraryDependencies ++= Seq(
gdalWarp,
hadoopClient % Provided,
8 changes: 8 additions & 0 deletions publish/publish-to-sonatype-213.sh
@@ -0,0 +1,8 @@
#!/usr/bin/env bash

# Publish to Sonatype for Scala 2.13

set -Eeuo pipefail
set -x

./sbt -213 publishSigned -no-colors -J-Drelease=sonatype
2 changes: 1 addition & 1 deletion raster/src/main/scala/geotrellis/raster/GridBounds.scala
@@ -162,7 +162,7 @@ case class GridBounds[@specialized(Int, Long) N: Integral](
if(overlapRowMax < rowMax) {
result += GridBounds(overlapColMin, overlapRowMax + 1, overlapColMax, rowMax)
}
result
result.toSeq
}

/**