I want to load an IPython shell (not an IPython notebook) in which I can use PySpark from the command line. Is that possible? I have installed Spark 1.4.1.
If you use Spark < 1.2 you can simply launch `bin/pyspark` with the environment variable `IPYTHON=1` set, as shown below.
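For example (a minimal sketch; `/path/to/spark` stands in for your actual Spark installation directory):

```bash
# Set IPYTHON=1 just for this command, so PySpark starts an IPython shell (Spark < 1.2)
IPYTHON=1 /path/to/spark/bin/pyspark
```

or

```bash
# Export it for the whole session, then launch PySpark
export IPYTHON=1
/path/to/spark/bin/pyspark
```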
While the above will still work on Spark 1.2 and later, the recommended way to set the Python environment for these versions is the `PYSPARK_DRIVER_PYTHON` variable, for example:
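Assuming `ipython` is on your `PATH`:

```bash
# Tell PySpark to use IPython as the driver-side Python (Spark 1.2+)
PYSPARK_DRIVER_PYTHON=ipython /path/to/spark/bin/pyspark
```

or

```bash
# Equivalent, exported for the whole session
export PYSPARK_DRIVER_PYTHON=ipython
/path/to/spark/bin/pyspark
```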
You can replace `ipython` with a path to the interpreter of your choice.
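For instance, to use the IPython binary from a specific environment (the path below is illustrative, not a real requirement):

```bash
# Point PYSPARK_DRIVER_PYTHON at a specific interpreter binary instead of relying on PATH
PYSPARK_DRIVER_PYTHON=/opt/anaconda/bin/ipython /path/to/spark/bin/pyspark
```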