I have the following code:
source
.mapValues(value -> value + " Stream it!!!")
.print(Printed.toSysOut());
As you can see, mapValues expects a lambda expression.
I am using the Java library, but the application is written in Scala. How do I pass a Scala lambda to Java code?
I tried the following:
source
.mapValues(value => value + "hello")
.print(Printed.toSysOut)
But the compiler complains:
[error] (x$1: org.apache.kafka.streams.kstream.Printed[String,?0(in value x$1)])Unit <and>
[error] (x$1: org.apache.kafka.streams.kstream.KeyValueMapper[_ >: String, _ >: ?0(in value x$1), String])Unit <and>
[error] (x$1: String)Unit
[error] cannot be applied to (org.apache.kafka.streams.kstream.Printed[Nothing,Nothing])
[error] .print(Printed.toSysOut)
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed Nov 19, 2017 7:53:44 PM
The error message lists the overloads of print that the compiler considered. The relevant one is:

void print(Printed<K,V> printed)

From the error message you can see that you're providing Printed.toSysOut with a type of:

org.apache.kafka.streams.kstream.Printed[Nothing,Nothing]

According to the Kafka 1.0 javadoc (Printed was not present before 1.0), toSysOut is defined as:

public static <K,V> Printed<K,V> toSysOut()

So the problem is that Scala is inferring K and V as Nothing. You need to provide the types explicitly. The following will probably work:

source
  .mapValues(value => value + "hello")
  .print(Printed.toSysOut[String, String])
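Kafka aside, this Nothing inference is easy to reproduce on its own. The sketch below (my own, not from the answer) mirrors the shape of Printed.toSysOut: a generic method with no value parameters gives the compiler nothing to constrain K and V with, so they are inferred as Nothing unless supplied explicitly.

```scala
object InferenceDemo {
  final case class Printed[K, V](label: String)

  // Mirrors the shape of Printed.toSysOut: no value arguments,
  // so nothing constrains K and V at the call site
  def toSysOut[K, V]: Printed[K, V] = Printed[K, V]("sysout")

  def main(args: Array[String]): Unit = {
    val inferred = toSysOut                  // inferred as Printed[Nothing, Nothing]
    val explicit = toSysOut[String, String]  // Printed[String, String]
    println(explicit.label)                  // prints "sysout"
  }
}
```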
It depends on your version of Scala.
In 2.12, Scala lambdas can be used where Java functional interfaces (SAM types) are expected, and vice versa.
App1.java
App.scala
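The App1.java and App.scala sources aren't included above; a minimal self-contained sketch of the 2.12 behavior (using java.util.function.Function as a stand-in for the Java side) could look like this:

```scala
import java.util.function.{Function => JFunction}

object SamDemo {
  // Stand-in for a Java API method that expects java.util.function.Function
  def applyFn(f: JFunction[String, String], v: String): String = f.apply(v)

  def main(args: Array[String]): Unit = {
    // In Scala 2.12+ the lambda literal is SAM-converted to the
    // Java functional interface automatically
    val out = applyFn(value => value + " Stream it!!!", "Kafka")
    println(out) // Kafka Stream it!!!
  }
}
```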
In 2.11 you can use
scala-java8-compat
App1.java
App.scala
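Again, the snippets are missing here. With scala-java8-compat the conversion is explicit via asJava; the following is a sketch, assuming the library (the 0.8.x line supports 2.11) is on the classpath:

```scala
// build.sbt:
// libraryDependencies += "org.scala-lang.modules" %% "scala-java8-compat" % "0.8.0"
import java.util.function.{Function => JFunction}
import scala.compat.java8.FunctionConverters._

object CompatDemo {
  // Stand-in for a Java API method that expects java.util.function.Function
  def applyFn(f: JFunction[String, String], v: String): String = f.apply(v)

  def main(args: Array[String]): Unit = {
    val scalaFn: String => String = _ + "hello"
    // asJava wraps the Scala Function1 in a java.util.function.Function
    println(applyFn(scalaFn.asJava, "say ")) // say hello
  }
}
```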
Alternatively, in 2.11 you can define implicit converters between java.util.function.Function and scala.Function1 yourself. So if you use 2.11, try one of those two approaches.
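The concrete code after "try" and "or" was lost; a hand-rolled converter in the direction this use case needs might look like the following (my own sketch, not the answer's exact code):

```scala
import java.util.function.{Function => JFunction}
import scala.language.implicitConversions

object ConverterDemo {
  // Implicit conversion from scala.Function1 to java.util.function.Function
  implicit def asJavaFunction[A, B](f: A => B): JFunction[A, B] =
    new JFunction[A, B] { override def apply(a: A): B = f(a) }

  // Stand-in for a Java API method that expects java.util.function.Function
  def applyFn(f: JFunction[String, String], v: String): String = f.apply(v)

  def main(args: Array[String]): Unit = {
    val scalaFn: String => String = _ + "hello"
    // The implicit conversion kicks in at the call site
    println(applyFn(scalaFn, "say ")) // say hello
  }
}
```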