Using Java with Apache Spark (which is written in Scala), I am faced with an old API method (the org.apache.spark.rdd.JdbcRDD constructor) that takes an AbstractFunction1 as one of its arguments:
abstract class AbstractFunction1[@scala.specialized -T1, @scala.specialized +R]() extends scala.AnyRef with scala.Function1[T1, R] {}
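For context, the call site looks roughly like this (a minimal sketch; sc, connectionFactory, mapRow and the SQL string are placeholders, and the ClassTag is obtained the way Java callers usually do it):

import java.sql.Connection;
import java.sql.ResultSet;

import org.apache.spark.SparkContext;
import org.apache.spark.rdd.JdbcRDD;

import scala.Function0;
import scala.Function1;
import scala.reflect.ClassTag;
import scala.reflect.ClassTag$;

public class JdbcExample {

    // Sketch of the JdbcRDD call site; only the mapRow argument matters here.
    static JdbcRDD<Object[]> rows(SparkContext sc,
                                  Function0<Connection> connectionFactory,
                                  Function1<ResultSet, Object[]> mapRow) {
        ClassTag<Object[]> tag = ClassTag$.MODULE$.apply(Object[].class);
        return new JdbcRDD<>(sc, connectionFactory,
                "SELECT id, name FROM users WHERE ? <= id AND id <= ?",
                1L, 1000L, 10,
                mapRow, // the Function1 argument I would like to pass as a lambda
                tag);
    }
}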
Because AbstractFunction1 is an abstract class, I can't implement it with a Java 8 lambda, so I decided to wrap the scala.Function1 trait instead. Conceptually it is the same as java.util.function.Function, except that, seen from Java, its andThen and compose methods are abstract rather than default. So I created this interface (stubbing out the two methods I don't need):
import java.io.Serializable;

import scala.Function1;

@FunctionalInterface
public interface Funct<T, R> extends Function1<T, R>, Serializable {

    // Deliberately stubbed out: composition is not needed in my case.
    @Override
    default <A> Function1<A, R> compose(Function1<A, T> before) {
        return null;
    }

    // Deliberately stubbed out as well.
    @Override
    default <A> Function1<T, A> andThen(Function1<R, A> g) {
        return null;
    }
}
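The idea was that a plain lambda could then be used wherever a Function1 is expected, e.g.:

// What I hoped would compile once Funct counts as a functional interface
Funct<String, Integer> length = s -> s.length();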
The IDE has no problem with this interface, but compilation fails with:

[ERROR] Funct is not a functional interface
[ERROR] multiple non-overriding abstract methods found in interface Funct

(Presumably javac sees more abstract methods on the compiled Function1 interface than just apply, compose and andThen; with Scala 2.11 and earlier, traits keep their method bodies in a separate Function1$class, so the specialized apply$mc...$sp variants are abstract in the interface as well.)
Is it possible to wrap Scala's trait in such a way that I can use lambdas for a method like:

void doMagic(scala.Function1<T, V> arg)
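If a direct lambda is impossible, the kind of wrapping I would settle for is a small factory over the concrete scala.runtime.AbstractFunction1, which already carries the compose and andThen implementations, so only apply has to be supplied (a minimal untested sketch; the Serializable part is only there because Spark serializes functions, and the captured Java lambda would itself have to be serializable for that to work):

import java.io.Serializable;
import java.util.function.Function;

import scala.Function1;
import scala.runtime.AbstractFunction1;

public final class ScalaFunctions {

    private ScalaFunctions() {}

    // Wraps a Java lambda into a scala.Function1. AbstractFunction1 supplies
    // compose/andThen, so only apply needs to be implemented here.
    public static <T, R> Function1<T, R> fun(Function<T, R> f) {
        return new Wrapper<>(f);
    }

    private static final class Wrapper<T, R> extends AbstractFunction1<T, R>
            implements Serializable {

        // Note: for Spark, this delegate must itself be serializable,
        // e.g. a lambda cast to (Function<T, R> & Serializable).
        private final Function<T, R> delegate;

        Wrapper(Function<T, R> delegate) {
            this.delegate = delegate;
        }

        @Override
        public R apply(T t) {
            return delegate.apply(t);
        }
    }
}

so that the call would become doMagic(ScalaFunctions.fun(s -> s.length())) instead of a bare lambda. But is there a way to avoid even that and pass the lambda directly?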