
support spark 2.0+ #204

Open
bsphere opened this issue Nov 25, 2016 · 6 comments

Comments

@bsphere
Contributor

bsphere commented Nov 25, 2016

Please add support for Spark 2.0+. Running against Spark 2.0.2 fails with:

File "/usr/local/spark-2.0.2-bin-hadoop2.7/python/lib/py4j-0.10.3-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o32.riakBucket.
: java.lang.NoClassDefFoundError: org/apache/spark/Logging
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at com.basho.riak.spark.SparkContextFunctions.riakBucket$default$3(SparkContextFunctions.scala:44)
	at com.basho.riak.spark.util.python.RiakPythonHelper.riakBucket(RiakPythonHelper.scala:37)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:237)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:280)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:214)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 25 more
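For what it's worth, `org.apache.spark.Logging` was removed from Spark's public API in 2.0, which is what triggers this `NoClassDefFoundError` for a connector built against Spark 1.x. A connector targeting 2.x would typically switch to a logging trait of its own. Below is a minimal, hypothetical sketch of such a replacement trait (names are illustrative, and `java.util.logging` is used here only to keep the sketch dependency-free; a real connector would more likely use slf4j, as Spark itself does):

```scala
import java.util.logging.{Level, Logger}

// Hypothetical connector-local replacement for the org.apache.spark.Logging
// trait that Spark 2.0 removed from its public API.
trait Logging {
  // Lazy so the logger is only created on first use; transient so it is
  // not captured when the enclosing class is serialized to executors.
  @transient private lazy val log: Logger =
    Logger.getLogger(this.getClass.getName)

  // By-name parameters avoid building the message string when the
  // corresponding level is disabled.
  protected def logDebug(msg: => String): Unit =
    if (log.isLoggable(Level.FINE)) log.fine(msg)

  protected def logInfo(msg: => String): Unit =
    if (log.isLoggable(Level.INFO)) log.info(msg)
}

// Tiny demonstration that the trait mixes in and logs without Spark on
// the classpath.
object Demo extends Logging {
  def main(args: Array[String]): Unit = {
    logInfo("connector initialized")
    println("ok")
  }
}
```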
@srgg
Contributor

srgg commented Nov 25, 2016

Thank you for your feedback. Spark 2.0 support is a top priority, so it will be added soon.

@courtneycouch

How's this coming? It's been 3 months.

@bsphere
Contributor Author

bsphere commented Apr 12, 2017

Apparently it's not a top priority (or a priority at all) for Basho.

@nikolaypavlov
Contributor

+1. I'm already using the spark-2.0-support branch with Spark 2.1, but can we have an official build?

@bsphere
Contributor Author

bsphere commented May 8, 2017

The low activity in this project doesn't at all reflect the amount of marketing Basho put into Riak + Spark.

@darkredz

The spark-2.0-support branch doesn't seem to work with Spark 2.3.2; it throws an exception when inserting from a spark-submit job:

Exception in thread "main" java.lang.AbstractMethodError
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
	at com.basho.riak.spark.util.DataMapper$.initializeLogIfNecessary(DataMapper.scala:28)
	at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
	at com.basho.riak.spark.util.DataMapper$.log(DataMapper.scala:28)
	at org.apache.spark.internal.Logging$class.logDebug(Logging.scala:58)
	at com.basho.riak.spark.util.DataMapper$.logDebug(DataMapper.scala:28)
	at com.basho.riak.spark.util.DataMapper$.ensureInitialized(DataMapper.scala:35)
	at com.basho.riak.spark.util.DataMapper$class.$init$(DataMapper.scala:25)
	at com.basho.riak.spark.writer.mapper.SqlDataMapper.<init>(SqlDataMapper.scala:16)
	at com.basho.riak.spark.writer.mapper.SqlDataMapper$$anon$1.dataMapper(SqlDataMapper.scala:32)
	at com.basho.riak.spark.writer.RiakWriter$.tsWriter(RiakWriter.scala:203)
	at com.basho.riak.spark.rdd.RDDFunctions.saveToRiakTS(RDDFunctions.scala:64)
	at org.apache.spark.sql.riak.RiakRelation.insert(RiakRelation.scala:93)
	at org.apache.spark.sql.riak.DefaultSource.createRelation(DefaultSource.scala:54)
	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:656)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
	at tv.test.aggregate.MainKt.runAggregateFollower(Main.kt:223)
	at tv.test.aggregate.MainKt.main(Main.kt:66)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
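An `AbstractMethodError` in `Logging$class.initializeLogIfNecessary` usually points to a binary mismatch: the branch was compiled against an earlier Spark 2.x whose `org.apache.spark.internal.Logging` trait encoding differs from 2.3's. One workaround (an assumption on my part, not an official fix) is to rebuild the connector locally, pinning the Spark dependencies to the exact runtime version, roughly like this build.sbt fragment:

```scala
// Hypothetical build.sbt fragment for rebuilding the spark-2.0-support
// branch against the Spark version actually used at runtime (2.3.2 here).
// "Provided" keeps Spark off the assembly; the cluster supplies it.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.2" % Provided,
  "org.apache.spark" %% "spark-sql"  % "2.3.2" % Provided
)
```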
