Releases: basho/spark-riak-connector
1.6.3
Artifacts
This release is available in the Maven Central Repository for both Scala 2.10 and Scala 2.11. Please choose the version that matches the version of Scala that your Spark installation is using.
This release is also available via Spark Packages.
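For example, an sbt project can declare the dependency as follows. This is a sketch: the `com.basho.riak` group ID and `spark-riak-connector` artifact name are assumed from the Maven Central publication described above; verify the exact coordinates before use.

```scala
// build.sbt fragment (assumed coordinates; check Maven Central).
// The %% operator appends your project's Scala binary version
// (_2.10 or _2.11) to the artifact name, selecting the matching
// one of the two published builds.
libraryDependencies += "com.basho.riak" %% "spark-riak-connector" % "1.6.3"
```

Equivalently, for a Spark Packages style launch, the same (assumed) coordinates can be passed to `spark-submit --packages`, e.g. `com.basho.riak:spark-riak-connector_2.11:1.6.3` for a Scala 2.11 Spark installation.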
Change Log
- Data locality support for Coverage Plan Based Partitioning (#230)
- Improve Coverage Plan Based Partitioning: smarter split calculation and more accurate coverage entry distribution across partitions (#231)
- Critical: fix Python serialization for empty JSON objects (#226)
- Fix double filtering for DataFrames (#228)
1.6.2
Artifacts
This release is available in the Maven Central Repository for both Scala 2.10 and Scala 2.11. Please choose the version that matches the version of Scala that your Spark installation is using.
This release is also available via Spark Packages.
Change Log
- Critical: fix Python KV: an exception was thrown when object values were JSON objects with list fields, whether empty or not (#219)
1.6.1
Artifacts
This release is available in the Maven Central Repository for both Scala 2.10 and Scala 2.11. Please choose the version that matches the version of Scala that your Spark installation is using.
This release is also available via Spark Packages.
Change Log
- Make Namespace optional for KV operations (#212)
- Run the Python tests on TravisCI for both TS and KV (#213)
- Critical: fix improper Spark partition order (#211)
- Critical: fix Python serialization for JSON values with more than 4 keys (#210)
- Critical: fix empty writes (#205)
- Fix value mapping for JSON array values on KV reads (#215)
- Fix the Python .partitionBy2iKeys() error (#216)
1.6.0
Artifacts
This release is available in the Maven Central Repository for both Scala 2.10 and Scala 2.11. Please choose the version that matches the version of Scala that your Spark installation is using.
This release is also available via Spark Packages.
Change Log
- Add Python support for KV buckets
- Support the Spark Streaming context in the connector
- Fail over to another Riak node when a node becomes unavailable during a Spark job
- Remove the separate java-connector and incorporate its Java classes into the main connector project
- Build with SBT instead of Maven
- Build artifacts for both Scala 2.10 and 2.11
- Use Docker for builds and tests
- Update docs and examples
- Miscellaneous enhancements and bug fixes
1.5.1
1.5.0
Artifacts
The latest version of the Spark-Riak Connector can be found in Bintray:
- Scala (uber jar) - https://dl.bintray.com/basho/data-platform/com/basho/riak/spark-riak-connector/1.5.0/spark-riak-connector-1.5.0-uber.jar
- Scala - https://dl.bintray.com/basho/data-platform/com/basho/riak/spark-riak-connector/1.5.0/spark-riak-connector-1.5.0.jar
- Java - https://dl.bintray.com/basho/data-platform/com/basho/riak/spark-riak-connector-java/1.5.0/spark-riak-connector-java-1.5.0.jar
Change Log
1.5.0-rc1
v1.5.0-rc1: bump version to 1.5.0-rc1
v1.3.0-beta1
Merge pull request #92 from basho/develop: Release 1.3.0-beta1
Release v1.1.0
Merge pull request #68 from basho/pex-without-streaming-values-for-oss: Disable FBR Streaming values for OSS
PreRelease v1.1.0-RC1
Merge pull request #68 from basho/pex-without-streaming-values-for-oss: Disable FBR Streaming values for OSS