Zeppelin with Spark interpreter ignores imports

Published 2019-06-06 03:15

I'm trying to use some Scala code in Zeppelin 0.8.0 with Spark interpreter:

%spark
import scala.beans.BeanProperty

class Node(@BeanProperty val parent: Option[Node]) {
}

But the imports do not seem to be taken into account:

import scala.beans.BeanProperty
<console>:14: error: not found: type BeanProperty
                  @BeanProperty val parent: Option[Node]) {
                   ^

EDIT: I found out that the following code works:

class Node(@scala.beans.BeanProperty val parent: Option[Node]) {
}

This also works fine:

def loadCsv(CSVPATH: String): DataFrame = {
    import org.apache.spark.sql.types._
    //[...] some code
    val schema = StructType(
        firstRow.map(s => StructField(s, StringType))
    )
    //[...] some code again
}

So it seems everything works fine when the import is placed inside braces (a local scope) or when the class is referenced directly by its fully qualified path.to.package.Class name at the point of use.
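As a sanity check of the fully-qualified workaround (a minimal sketch, not from the original post), the annotation still does its usual job: `@BeanProperty` on a `val` generates a JavaBean-style getter.

```scala
// Sketch: no top-level import needed, because the annotation is
// referenced by its fully qualified name.
class Node(@scala.beans.BeanProperty val parent: Option[Node])

val root = new Node(None)
val child = new Node(Some(root))
// @BeanProperty on a val generates a getParent() getter
// alongside the normal Scala accessor `parent`.
child.getParent
```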

QUESTION: How do I import outside of a class/function definition?

1 answer
狗以群分
#2 · 2019-06-06 03:31

Importing by path.to.package.Class works well in Zeppelin. You can try it by importing and using java.sql.Date:

import java.sql.Date
val date = Date.valueOf("2019-01-01")

The problem is related to the Zeppelin context. If you try the following code snippet in Zeppelin, you will see that it works fine:

object TestImport {
    import scala.beans.BeanProperty
    class Node(@BeanProperty val parent: Option[Node])
}
val testObj = new TestImport.Node(None)
testObj.getParent
//prints Option[Node] = None

I hope it helps!
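For completeness, a variant of the same object-wrapper trick (the `BeanDemo` name is illustrative, not from the answer): with a `var` field, `@BeanProperty` generates a setter in addition to the getter.

```scala
// Sketch: the wrapper object keeps the import and the class in one
// compilation unit; @BeanProperty on a var generates both
// getParent() and setParent().
object BeanDemo {
  import scala.beans.BeanProperty
  class Node(@BeanProperty var parent: Option[Node])
}

val root = new BeanDemo.Node(None)
val child = new BeanDemo.Node(None)
child.setParent(Some(root))
child.getParent
```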
