java.lang.NoSuchMethodError: scala.Predef$.refArrayOps

Scala Problem Overview


I have the following class:

import scala.util.{Success, Failure, Try}


class MyClass {

  def openFile(fileName: String): Try[String]  = {
    Failure( new Exception("some message"))
  }

  def main(args: Array[String]): Unit = {
    openFile(args.head)
  }

}

Which has the following unit test:

class MyClassTest extends org.scalatest.FunSuite {

  test("pass inexistent file name") {
    val myClass = new MyClass()
    assert(myClass.openFile("./noFile").failed.get.getMessage == "Invalid file name")
  }

}

When I run sbt test I get the following error:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.scalatest.tools.FriendlyParamsTranslator$.translateArguments(FriendlyParamsTranslator.scala:174)
        at org.scalatest.tools.Framework.runner(Framework.scala:918)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:533)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:527)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.immutable.Map$Map1.foreach(Map.scala:109)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at sbt.Defaults$.createTestRunners(Defaults.scala:527)
        at sbt.Defaults$.allTestGroupsTask(Defaults.scala:543)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:35)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:34)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[error] (test:executeTests) java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

Build definitions:

version := "1.0"

scalaVersion := "2.12.0"

// https://mvnrepository.com/artifact/org.scalatest/scalatest_2.11
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.0"

I can't figure out what causes this. My class and unit test seem simple enough. Any ideas?

Scala Solutions


Solution 1 - Scala

I had an SDK in Global Libraries that pointed at a different Scala version (IntelliJ IDEA).
File -> Project Structure -> Global Libraries -> remove the stale SDK -> Rebuild. That fixed the exception for me.

Solution 2 - Scala

scalatest_2.11 is the version of ScalaTest compatible only with Scala 2.11.x. Write libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test" (note the %%) instead to pick the correct version automatically, and switch to Scala 2.11.8 until scalatest_2.12 is released (it should be very soon). See http://www.scala-sbt.org/0.13/docs/Cross-Build.html for more.
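
Putting that together, a minimal build.sbt sketch of the fix (keeping the question's version and layout) would be:

```scala
version := "1.0"

// Stay on Scala 2.11.x until a scalatest_2.12 artifact is published
scalaVersion := "2.11.8"

// %% appends the Scala binary version suffix (_2.11) to the artifact name,
// so the dependency always matches the project's Scala version
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
```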

Solution 3 - Scala

I used IntelliJ: I closed the open project and imported it again (as a Maven or SBT project). Note: I selected "Import Maven projects automatically". The error disappeared.

Solution 4 - Scala

This error occurs when you use a Scala JAR file that was compiled with Scala 2.11 for a Scala 2.12 project.

Scala libraries are generally cross compiled against different versions of Scala, so a separate JAR file is published to Maven for each supported Scala version. For example, ScalaTest 3.2.3 publishes separate JAR files to Maven for Scala 2.10, 2.11, 2.12, and 2.13, as you can see here.

Lots of Spark programmers will run into this error when they attach a JAR file that was compiled with Scala 2.11 to a cluster that's running Scala 2.12. See here for a detailed guide on how to migrate Spark projects from Scala 2.11 to Scala 2.12.

As the accepted answer mentions, the SBT %% operator should be used when specifying Scala dependencies, so that SBT automatically grabs the library JAR corresponding to your project's Scala version. The %% operator won't help you, however, if the library doesn't publish a JAR for the Scala version you need. Look at the Spark releases for example:
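
As a sketch of what %% does under the hood: it simply appends the project's Scala binary version suffix to the artifact name, so these two declarations resolve the same artifact when scalaVersion is a 2.11.x release:

```scala
// With scalaVersion := "2.11.12", these two lines are equivalent:
libraryDependencies += "org.apache.spark" %% "spark-sql"      % "2.4.7"
libraryDependencies += "org.apache.spark" %  "spark-sql_2.11" % "2.4.7"
```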

[Image: Spark releases on Maven, listed by Scala version]

This build.sbt file will work because there is a Scala 2.12 release for Spark 3.0.1:

scalaVersion := "2.12.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

This code will not work because there isn't a Scala 2.11 release for Spark 3.0.1:

scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

You can cross compile your project and build JAR files for different Scala versions if your library dependencies are also cross compiled. Spark 2.4.7 is cross compiled with Scala 2.11 and Scala 2.12, so you can cross compile your project with this code:

scalaVersion := "2.11.12"
crossScalaVersions := Seq("2.11.12", "2.12.10")
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.7"

The sbt +assembly command will build two JAR files for your project: one compiled with Scala 2.11 and another compiled with Scala 2.12. Libraries that release multiple JAR files follow a similar cross-compilation workflow.

Solution 5 - Scala

In my experience, if you still get errors after matching the ScalaTest version and the Scala version in build.sbt, you have to think about the actual Scala version running on your machine. You can check it by running $ scala and reading the welcome banner:

Welcome to Scala 2.12.1 ...
Type in expressions for evaluation. Or try :help.

The Scala version shown there (e.g. 2.12.1 here) needs to match the one in build.sbt.
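
A quick way to check the same thing from code (a small sketch using the standard library's scala.util.Properties): print the version of the scala-library JAR actually on the classpath, which is what matters for binary compatibility.

```scala
// Prints the version of the scala-library on the classpath, e.g. "2.12.1".
// This is the version that must match the _2.xx suffix of your dependencies.
object ScalaVersionCheck extends App {
  println(scala.util.Properties.versionNumberString)
}
```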

Solution 6 - Scala

In my case, the Spark version was the incompatible one. Changing to Spark 2.4.0 worked for me.

Solution 7 - Scala

This was happening to me in Databricks. The problem was the same as noted in previous answers: an incompatibility between the Spark and Scala versions. For Databricks, I had to change the cluster's Databricks Runtime Version. The default was Scala 2.11 / Spark 2.4.5; bump this up to at least Scala 2.12 / Spark 3.0.0.

Click Clusters > Cluster_Name > Edit > Databricks Runtime Version


Solution 8 - Scala

When you use Spark, Hadoop, Scala, and Java together, version incompatibilities can arise, so pick a version of each that is compatible with the others. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12, which are compatible with each other.
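
As a sketch, pinning those mutually compatible versions in build.sbt might look like the following (the Spark module names are the usual spark-core/spark-sql artifacts; adjust to the modules you actually use):

```scala
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.1",
  "org.apache.spark" %% "spark-sql"  % "2.4.1"
)
```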

Solution 9 - Scala

Try adding the following line to your build.sbt:

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

Your build.sbt should then look like this:

libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

With this, the error was solved for me.

Solution 10 - Scala

In the Eclipse IDE, the project tends to be preselected with the "Latest 2.12 bundle (dynamic)" Scala installation. If your Scala project is not actually using 2.12 and you attempt to run it through the IDE, this issue will manifest itself.

I've also noticed that rebuilding my Eclipse project with the sbt command "eclipse with-source" has the side effect of resetting the project's Scala installation back to the 2.12 setting (even though my build.sbt file is configured for a 2.11 version of Scala). So be on the lookout for both of those scenarios.

Solution 11 - Scala

In my case, I had a project JAR dependency that itself depended on a different Scala version. This was found under Project Structure -> Modules -> (selected project) -> Dependencies tab. Everything else in the project and its libraries lined up on Scala 2.12, but the other JAR was hiding a transitive dependency on an older version (2.11).
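
One way to handle such a hidden transitive dependency in sbt is to exclude the mismatched artifact from the offending JAR and depend on the correctly suffixed build explicitly. A hedged sketch ("com.example" / "legacy-lib" / "transitive-dep" are hypothetical placeholders, not real artifacts):

```scala
// Hypothetical names: drop the 2.11-compiled artifact the legacy JAR drags
// in transitively, and pull in its 2.12 build explicitly instead.
libraryDependencies += ("com.example" %% "legacy-lib" % "1.0")
  .exclude("com.example", "transitive-dep_2.11")
libraryDependencies += "com.example" % "transitive-dep_2.12" % "1.0"
```

Running the built-in evicted task in the sbt shell can help surface which artifact is pulling in the older Scala version in the first place.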

Solution 12 - Scala

I am doing a PoC using Apache Spark 3.1.1 and Apache Ignite 2.10, trying to load data from Spark into an Ignite cluster. But while saving the data I get the error below.

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

My code is as below:

df.write
  .format(FORMAT_IGNITE)
  .option(OPTION_CONFIG_FILE, CONFIG)
  .option(OPTION_TABLE, "connect")
  .option(OPTION_CREATE_TABLE_PRIMARY_KEY_FIELDS, "id")
  .option(OPTION_CREATE_TABLE_PARAMETERS, "template=replicated")
  .mode(SaveMode.Append)
  .save()

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type        | Original Author | Original Content on Stackoverflow
Question            | bsky            | View Question on Stackoverflow
Solution 1 - Scala  | Anton Tkachov   | View Answer on Stackoverflow
Solution 2 - Scala  | Alexey Romanov  | View Answer on Stackoverflow
Solution 3 - Scala  | HHH             | View Answer on Stackoverflow
Solution 4 - Scala  | Powers          | View Answer on Stackoverflow
Solution 5 - Scala  | MyounghoonKim   | View Answer on Stackoverflow
Solution 6 - Scala  | louis l         | View Answer on Stackoverflow
Solution 7 - Scala  | ibaralf         | View Answer on Stackoverflow
Solution 8 - Scala  | MeirDayan       | View Answer on Stackoverflow
Solution 9 - Scala  | Yousef Irman    | View Answer on Stackoverflow
Solution 10 - Scala | Andrew Norman   | View Answer on Stackoverflow
Solution 11 - Scala | voxoid          | View Answer on Stackoverflow
Solution 12 - Scala | Kirti Gupta     | View Answer on Stackoverflow