How to load IPython shell with PySpark

Published 2019-02-01 08:11

I want to load an IPython shell (not an IPython notebook) in which I can use PySpark from the command line. Is that possible? I have installed Spark 1.4.1.

7 answers
仙女界的扛把子
Answer #2 · 2019-02-01 08:57

If you use Spark < 1.2, you can simply execute bin/pyspark with the environment variable IPYTHON=1 set:

IPYTHON=1 /path/to/bin/pyspark

or

export IPYTHON=1
/path/to/bin/pyspark

While the above will still work on Spark 1.2 and later, the recommended way to set the Python environment for these versions is the PYSPARK_DRIVER_PYTHON variable:

PYSPARK_DRIVER_PYTHON=ipython /path/to/bin/pyspark

or

export PYSPARK_DRIVER_PYTHON=ipython
/path/to/bin/pyspark

You can replace ipython with a path to the interpreter of your choice.
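Once the shell starts, you can confirm that you are actually inside IPython rather than the plain Python REPL. A minimal sketch of such a check (a generic IPython idiom, not PySpark-specific; the function name `in_ipython` is my own):

```python
def in_ipython():
    """Return True when running under an IPython shell.

    IPython injects a `get_ipython` function into the interactive
    namespace; the plain CPython REPL does not define it.
    """
    try:
        get_ipython  # noqa: F821 - defined only inside IPython
        return True
    except NameError:
        return False
```

Run from a plain `python` interpreter this returns False; from the IPython shell launched by pyspark it returns True.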
