I want to define a UDF in Spark (Scala) like the pseudocode below:
def transformUDF(size: Int): UserDefinedFunction = udf((input: Seq[T]) => {
  if (input != null)
    Vectors.dense(input.map(_.toDouble).toArray)
  else
    Vectors.dense(Array.fill[Double](size)(0.0))
})
If input is not null, cast every element to Double; if input is null, return an all-zero vector.
I also want T to be limited to numeric types, like java.lang.Number in Java. But it seems that Seq[java.lang.Number] does not work with toDouble, since java.lang.Number has no such method.
Is there any appropriate way?
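For reference, here is a sketch of the conversion part I have been considering (an assumption on my side, not a verified fix): java.lang.Number does expose doubleValue(), which sidesteps toDouble, since toDouble is only defined on Scala's own numeric value types.

```scala
// Sketch: convert a Seq of boxed numbers to Array[Double] via doubleValue(),
// which is defined on java.lang.Number (Integer, Long, Double, ...);
// a null input yields an all-zero array of the requested size.
def toDenseValues(input: Seq[java.lang.Number], size: Int): Array[Double] =
  if (input != null) input.map(_.doubleValue()).toArray
  else Array.fill[Double](size)(0.0)
```

Wrapping this in udf and Vectors.dense as in the pseudocode seems straightforward, but whether Spark accepts Seq[java.lang.Number] as a UDF argument type is part of what I am unsure about.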