This repository has been archived by the owner on Dec 4, 2023. It is now read-only.

java.lang.ClassNotFoundException: scala.$less$colon$less #178

Open
coperator opened this issue May 17, 2021 · 4 comments

coperator commented May 17, 2021

I get the error java.lang.ClassNotFoundException: scala.$less$colon$less at
S3Mock api = new S3Mock.Builder().withPort(8001).withInMemoryBackend().build();

My pom.xml:

    <dependency>
        <groupId>io.findify</groupId>
        <artifactId>s3mock_2.13</artifactId>
        <version>0.2.6</version>
        <scope>test</scope>
    </dependency>

What could be the reason?

shuttie (Contributor) commented May 21, 2021

Can you show your other dependencies? It looks like you have a Scala 2.12 library somewhere on your classpath. You can try the s3mock_2.12 version and see if that helps.
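
One way to confirm this, assuming a standard Maven build, is to run mvn dependency:tree -Dincludes=org.scala-lang:scala-library and check which dependency drags in a 2.12 scala-library. A minimal sketch of a fix, with a purely hypothetical placeholder for whatever culprit the tree reveals, is to exclude the transitive 2.12 library so that the 2.13 one s3mock_2.13 was compiled against wins:

    <!-- Hypothetical culprit; substitute whatever dependency
         mvn dependency:tree shows pulling in Scala 2.12. -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>some-library-built-for-scala-2.12</artifactId>
        <version>1.0.0</version>
        <exclusions>
            <exclusion>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-library</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

With the 2.12 library excluded, the scala-library 2.13 that s3mock_2.13 pulls in transitively should be the only one left on the classpath.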

zaxeer commented Jul 22, 2021

I also get this error, after upgrading my Spring Boot application from OpenJDK 8 to OpenJDK 11 and from Spring Boot 2.2.4.RELEASE to 2.4.5. These were mandatory upgrades, so I had to make them; even going down to s3mock_2.12 still causes the Scala error. I have no know-how of Scala, so I'm writing here.
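
If the Spring Boot upgrade changed which transitive scala-library Maven resolves, one hedged sketch of a fix is to pin a single 2.13 version in dependencyManagement so the whole classpath agrees with s3mock_2.13 (the 2.13.6 below is only an example patch version, not something this thread confirms):

    <dependencyManagement>
        <dependencies>
            <!-- Force one Scala 2.13 library for the whole build;
                 the exact 2.13.x patch version is an assumption. -->
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-library</artifactId>
                <version>2.13.6</version>
            </dependency>
        </dependencies>
    </dependencyManagement>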

NagulSaida commented

I am running into the same issue. Please provide a solution.

patiludayk commented Jan 10, 2023

With the dependencies below, I am getting the same error while running the simple hello-world program from the README file:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.13</artifactId>
        <version>3.3.1</version>
    </dependency>

    import org.apache.spark.api.java.function.FilterFunction;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.SparkSession;

    String logFile = "/java/spark/spark/README.md"; // Should be some file on your system
    SparkSession spark = SparkSession.builder().appName("Simple Application").config("spark.master", "local").getOrCreate();
    Dataset<String> logData = spark.read().textFile(logFile).cache();

    long numAs = logData.filter((FilterFunction<String>) s -> s.contains("a")).count();
    long numBs = logData.filter((FilterFunction<String>) s -> s.contains("b")).count();

    System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);

    spark.stop();
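
Note that the two Spark artifacts above mix Scala binary versions: spark-core_2.12 is built for Scala 2.12 while spark-sql_2.13 is built for 2.13, which is exactly the kind of mismatch that produces this ClassNotFoundException. A minimal sketch of a fix, assuming Scala 2.13 is the target, aligns both artifacts on the same suffix:

    <!-- Both artifacts share the _2.13 suffix, so only one
         scala-library version ends up on the classpath. -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.13</artifactId>
        <version>3.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.13</artifactId>
        <version>3.3.1</version>
    </dependency>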
