Why “could not find implicit” error in Scala + IntelliJ

Published 2019-02-16 14:40

I have this code that works 100% from sbt, executing sbt test, but throws a compilation error in IntelliJ IDEA.

import org.scalatest.{BeforeAndAfter, FunSuite, GivenWhenThen}

class SimpleTest extends FunSuite with GivenWhenThen with BeforeAndAfter {
  test("Simple Test") {
    Given("Why this error?")
    assert("ok" === "ok")
  }
}

The error is:

Error:(5, 10) could not find implicit value for parameter pos: org.scalactic.source.Position
    Given("Why this error?")
Error:(5, 10) not enough arguments for method Given: (implicit pos: org.scalactic.source.Position)Unit.
Unspecified value parameter pos.
    Given("Why this error?")
Error:(6, 11) could not find implicit value for parameter prettifier: org.scalactic.Prettifier
    assert("ok" === "ok")
Error:(6, 11) macro applications do not support named and/or default arguments
    assert("ok" === "ok")
Error:(6, 11) not enough arguments for macro method assert: (implicit prettifier: org.scalactic.Prettifier, implicit pos: org.scalactic.source.Position)org.scalatest.Assertion.
Unspecified value parameters prettifier, pos.
    assert("ok" === "ok")
Error:(4, 23) could not find implicit value for parameter pos: org.scalactic.source.Position
  test("Simple Test") {

After refreshing and reloading as suggested:

Error:(6, 11) exception during macro expansion: 
java.lang.NoSuchMethodError: org.scalactic.BooleanMacro.genMacro(Lscala/reflect/api/Exprs$Expr;Ljava/lang/String;Lscala/reflect/api/Exprs$Expr;)Lscala/reflect/api/Exprs$Expr;
    at org.scalatest.AssertionsMacro$.assert(AssertionsMacro.scala:34)
    assert("ok" === "ok")

I am using:

IntelliJ IDEA 2016.3.2
Build #IU-163.10154.41, built on December 21, 2016

scalaVersion := "2.11.0"

libraryDependencies ++= Seq(
  "org.scalactic" %% "scalactic" % "3.0.1" % "test",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test"
)

Notes:

- Using File -> Invalidate Caches / Restart does not fix the problem.
- Example that reproduces the error: Example on GitHub.

5 Answers
欢心
#2 · 2019-02-16 14:56

Workarounds are at the bottom of this answer. ;)

This problem is related to this list of bugs:

The problem is that there are dependencies in the project that use, in the test scope, other versions of scalatest and scalactic.

IntelliJ IDEA mixes the compile and test scopes, while sbt handles them correctly. The IntelliJ IDEA team said in the bug report that they are working on this.

My workaround, for the moment, has been to move to the same older version that the other libraries use for testing.

Notes:

@justin-kaeser is assigned and working on a fix. Thanks!

There have been a lot of improvements to the Scala plugin in the latest previews.

Example to reproduce the error : https://github.com/angelcervera/idea-dependencies-bug

A few workarounds:

  1. Remove the problematic dependencies from Project Structure -> Modules.
  2. Exclude the conflicting libraries in sbt (a sketch follows this list).
  3. Use the same version everywhere.
  4. Try the latest EAP: https://www.jetbrains.com/idea/nextversion/
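For workaround 2, a minimal sketch of what the exclusion could look like in build.sbt. It assumes the conflicting scalatest/scalactic versions arrive transitively through spark-core; that carrier dependency is purely illustrative, so substitute whatever your own dependency tree actually shows:

// build.sbt (sketch) - strip the transitive scalatest/scalactic from the
// compile-scope dependency so only the test-scope versions remain
libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-core" % "2.1.0")
    .exclude("org.scalatest", "scalatest_2.11")
    .exclude("org.scalactic", "scalactic_2.11"),
  "org.scalactic" %% "scalactic" % "3.0.1" % "test",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test"
)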
男人必须洒脱
#3 · 2019-02-16 14:56

As mentioned in issue 170, it can be an issue with a mix-up of the spark-testing-base dependencies.

Make sure you are not mixing dependency versions.

I had the following dependencies

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.0",
  "org.apache.spark" % "spark-streaming_2.11" % "2.1.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.1.0",
  "com.holdenkarau" %% "spark-testing-base" % "2.1.0_0.8.0" % "test",
  "org.scalatest" % "scalatest_2.11" % "2.1.0" % "test",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0" classifier "models"
)

And when I tried to run the test classes I was getting:

Error:(32, 14) could not find implicit value for parameter pos: org.scalactic.source.Position
    test("lsi"){
Error:(32, 14) not enough arguments for method test: (implicit pos: org.scalactic.source.Position)Unit.
Unspecified value parameter pos.
    test("lsi"){
..........

Then I changed the dependencies to:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
  "org.apache.spark" % "spark-streaming_2.11" % "2.2.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.2.0",
  "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
  "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0" classifier "models"
)

Then I re-imported my project (as clean and package didn't work), and the test classes passed.

Luminary・发光体
#4 · 2019-02-16 15:08

It's possible that some dependencies are transitively including incompatible versions of Scalactic or Scalatest in the compile scope, which are also included in the test scope.

You can check this in Project Structure, under the Project Settings / Modules / Dependencies tab, and analyze it more closely with the sbt-dependency-graph plugin (a setup sketch follows).
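If you want to try the sbt-dependency-graph route, a minimal setup could look like the following; the plugin coordinates and version are quoted from memory, so check them against the plugin's README:

// project/plugins.sbt (sketch) - enables the dependency-graph tasks
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

Then, from the sbt shell, dependencyTree prints the resolved tree, and whatDependsOn org.scalactic scalactic_2.11 shows which top-level dependencies pull that artifact in (older plugin versions also require the version number as a third argument).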

sbt does, however, perform dependency evictions, which IntelliJ does not (issue); this can cause additional problems when compiling from the IDE. If sbt-dependency-graph shows that the conflicting versions are evicted, then it is probably an instance of this issue.

Workaround: when you find the offending transitive dependency, exclude it from the root dependency in your build.sbt. For example:

"org.apache.spark" %% "spark-core" % "2.1.0" % "provided" exclude("org.scalatest", "scalatest_2.11")
Viruses.
#5 · 2019-02-16 15:20

I had a similar issue.

For me, the simplest way to solve it was just removing the .idea folder and re-importing the project.

够拽才男人
#6 · 2019-02-16 15:23

Not sure if this was an IDE bug, but for me upgrading the IDE to the latest version didn't help. After wasting a few hours, here is my approach to resolving this error, which reads:

could not find implicit value for parameter prettifier: org.scalactic.Prettifier

Solution:

In IntelliJ press Ctrl+Alt+Shift+S -> Modules -> Dependencies and search for org.scalactic:3.0.0.jar (test scope); most probably there will be another version, such as 2.x.x, in the compile scope. Right-click the 2.x.x entry, select Edit, choose the 3.0.0 version for the compile scope, and apply the new settings.

P.S. Depending on your case there may be only one entry, but make sure you use 3.0.0 in the compile scope to get rid of that weird error.
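If you would rather pin the version in the build than edit the module settings by hand (IDE-side changes are typically overwritten on the next sbt re-import), here is a sketch using dependencyOverrides, assuming 3.0.1 is the version you actually want everywhere:

// build.sbt (sketch) - force a single scalactic/scalatest version in every scope
// sbt 0.13 syntax; on sbt 1.x dependencyOverrides takes a Seq instead of a Set
dependencyOverrides ++= Set(
  "org.scalactic" %% "scalactic" % "3.0.1",
  "org.scalatest" %% "scalatest" % "3.0.1"
)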