I am trying to execute the following Scala code on Spark, but for some reason the function selective does not seem to be called:
var lines = sc.textFile(inputPath + fileName, 1)
val lines2 = lines.map(l => selective(func, List(2, 3), List(1, 26), l, ";", 0))
lines2.toArray().foreach(l => out.write(l))
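For context on the laziness involved here: RDD transformations such as map do nothing by themselves; the closure only runs when an action (collect, count, toArray, etc.) forces it. The same behavior can be seen without Spark using Scala's own lazy Iterator.map, as in this dependency-free sketch (the names and data below are illustrative, not from the original code):

```scala
// Sketch of lazy evaluation, analogous to RDD.map: the body of map
// does not run until a terminal operation forces the iterator.
object LazySketch {
  def main(args: Array[String]): Unit = {
    var calls = 0
    val it = Iterator("a;b", "c;d").map { l =>
      calls += 1                              // side effect records each call
      l.split(";", -1).mkString("|")
    }
    println(s"calls before forcing: $calls")  // still 0: map is lazy
    val result = it.toList                    // forcing, like collect()
    println(s"calls after forcing: $calls")   // now 2
    println(result.mkString(","))
  }
}
```

Because toArray() is an action, it should force the map and call selective; if it does not, the problem likely lies elsewhere (for example, an exception on an executor, or side effects happening on executors rather than the driver).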
.......
The selective function is defined as follows:
def selective(f: (String, Boolean) => String, phoneFields: Seq[Int], codeFields: Seq[Int],
              in: String, delimiter: String, iMode: Int /* 0 for enc, 1 for dec */): String =
  in.split(delimiter, -1).zipWithIndex
    .map {
      case (str, ix) if phoneFields.contains(ix) || codeFields.contains(ix) =>
        val output = f(str, codeFields.contains(ix))
        val sTemp = str + ":" + output + "\n"
        if (iMode == 0 && codeFields.contains(ix) && str.nonEmpty)
          CodeDictString += sTemp
        else if (str.nonEmpty)
          PhoneDictString += sTemp
        output
      case other => other._1
    }
    .mkString(";") + "\n"
The println statement never executes, and the function does not appear to return anything. sc is the SparkContext object.