I'm trying to run one of the examples in Morpheus on a Spark cluster. I edited the creation of the Morpheus session using the following code:
```scala
// Import paths below assume the Morpheus 0.4.x package layout
import java.util.UUID

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.opencypher.morpheus.api.MorpheusSession
import org.opencypher.morpheus.api.io.{MorpheusNodeTable, MorpheusRelationshipTable}

val conf = new SparkConf(true)
conf.set("spark.sql.codegen.wholeStage", "true")
conf.set("spark.sql.shuffle.partitions", "12")
conf.set("spark.default.parallelism", "8")

val spark = SparkSession
  .builder()
  .config(conf)
  .master("spark://172.17.67.122:7077")
  .appName(s"morpheus-local-${UUID.randomUUID()}")
  .enableHiveSupport()
  .getOrCreate()
spark.sparkContext.setLogLevel("error")

implicit val morpheus: MorpheusSession = MorpheusSession.create(spark)

import spark.sqlContext.implicits._

// Node table: one row per node
val nodesDF = spark.createDataset(Seq(
  (0L, "Alice", 42L),
  (1L, "Bob", 23L),
  (2L, "Eve", 84L)
)).toDF("id", "name", "age")

// Relationship table: one row per edge, with source/target node ids
val relsDF = spark.createDataset(Seq(
  (0L, 0L, 1L, "23/01/1987"),
  (1L, 1L, 2L, "12/12/2009")
)).toDF("id", "source", "target", "since")

val personTable = MorpheusNodeTable(Set("Person"), nodesDF)
val friendsTable = MorpheusRelationshipTable("KNOWS", relsDF)

val graph = morpheus.readFrom(personTable, friendsTable)
val result = graph.cypher("MATCH (n:Person) RETURN n.name")
result.show
```
The only change from the original example is the SparkSession master URL: `spark://172.17.67.122:7077` instead of `local`. The problem occurs when I run the example with `./gradlew morpheus-examples:runApp -PmainClass=org.opencypher.morpheus.examples.DataFrameInputExample`. While debugging, the error points at the `result.show` line:
```
Caused by: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
    at org.opencypher.okapi.trees.AbstractTreeNode.<init>(AbstractTreeNode.scala:69)
    at org.opencypher.okapi.ir.api.expr.Expr.<init>(Expr.scala:50)
    ... 88 more
```
When I change the master back to `local`, the example runs correctly and no problems arise. From what I found while searching, the problem seems to be a Scala version mismatch, although all 3 machines in my cluster run:
- OS: CentOS 7
- Spark: 2.4.2
- Scala: 2.12.8
I have also put the required Morpheus jars in the Spark classpath directory (spark/jars) on all cluster machines, master and workers:

- morpheus-spark-cypher-0.4.3-SNAPSHOT.jar
- okapi-api-0.4.3-SNAPSHOT.jar
- okapi-relational-0.4.3-SNAPSHOT.jar
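One quick way to look for a binary mismatch among the jars on the Spark classpath is the `_2.xx` suffix that Scala artifacts conventionally carry in their names (e.g. `spark-sql_2.12-2.4.2.jar`). Below is a minimal sketch of such an audit; the helper object and the `SPARK_HOME` default are illustrative, not part of Morpheus:

```scala
import java.io.File

// Hypothetical helper: collect the Scala binary versions encoded in jar
// names under Spark's jars directory. Seeing both "2.11" and "2.12" in
// the output is the classic setup that produces
// NoSuchMethodError: scala.Predef$.refArrayOps at runtime.
object JarAudit {
  // Matches the "_2.12-" style suffix in artifact names
  private val suffix = """_(2\.\d+)-""".r

  def binaryVersions(jarNames: Seq[String]): Set[String] =
    jarNames.flatMap(n => suffix.findFirstMatchIn(n).map(_.group(1))).toSet

  def main(args: Array[String]): Unit = {
    val jarsDir = new File(sys.env.getOrElse("SPARK_HOME", "/opt/spark") + "/jars")
    val names = Option(jarsDir.list()).map(_.toSeq).getOrElse(Seq.empty)
    println(s"Scala binary versions found in $jarsDir: ${binaryVersions(names)}")
  }
}
```

Note that jars without the suffix (such as the `okapi-api-0.4.3-SNAPSHOT.jar` above) don't reveal their Scala version in the name, so this only covers conventionally named artifacts like Spark's own.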
Please help me figure out the problem; I have spent a lot of time trying to solve this issue!
Thanks in advance for your help and support!
Actually, my Spark 2.4.2 is compiled against Scala 2.12.8 (it ships `spark-sql_2.12-2.4.2.jar`), but the same problem occurs. Which version of Spark do you have? Could you help, please?
Hi Mohamed,
the latest release of Morpheus uses Spark 2.4.3 with Scala 2.12.8. Could you try it with that version? You can download the correct Spark version here: https://archive.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-without-hadoop-scala-2.12.tgz
Unfortunately, this build does not include Hadoop, so you have to add the missing packages yourself if you need them, or build Spark directly from source.
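For completeness: what has to agree between Spark, Morpheus, and the cluster JVMs is the Scala *binary* version (the major.minor pair, e.g. 2.12), not the full patch version. A small, Spark-independent sketch of that check (the object name is illustrative):

```scala
// Minimal sketch: compare two Scala version strings on their binary
// (major.minor) component, which is what binary compatibility hinges on.
object BinaryCompat {
  // e.g. "2.12.8" -> "2.12"
  def binary(v: String): String = v.split("\\.").take(2).mkString(".")

  // true when both versions share the same Scala binary version
  def compatible(a: String, b: String): Boolean = binary(a) == binary(b)

  def main(args: Array[String]): Unit = {
    // The Scala version the running JVM actually loaded, e.g. "2.12.8" --
    // this is what matters, not the OS-level `scala` installation
    val here = scala.util.Properties.versionNumberString
    println(s"Driver runs Scala $here; compatible with the 2.12.8 this " +
      s"Morpheus release targets: ${compatible(here, "2.12.8")}")
  }
}
```

Running this snippet on the driver and inside an executor (e.g. from a `map` task) is a quick way to confirm both sides loaded the same Scala binary version.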