
OSError: [Errno 'jupyter-notebook' not found]


Question:

Hi, I have installed "Anaconda3-4.3.1-Windows-x86_64" on my desktop, but I get the error below when I run the command "jupyter notebook" from CMD.

Error:

C:\Users\my pc>jupyter notebook
Traceback (most recent call last):
  File "C:\Users\pr275959\AppData\Local\Continuum\Anaconda3\Scripts\jupyter-script.py", line 5, in <module>
    sys.exit(jupyter_core.command.main())
  File "C:\Users\pr275959\AppData\Local\Continuum\Anaconda3\lib\site-packages\jupyter_core\command.py", line 186, in main
    _execvp(command, sys.argv[1:])
  File "C:\Users\pr275959\AppData\Local\Continuum\Anaconda3\lib\site-packages\jupyter_core\command.py", line 104, in _execvp
    raise OSError('%r not found' % cmd, errno.ENOENT)
OSError: [Errno 'jupyter-notebook' not found] 2
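
The traceback shows the jupyter launcher trying to exec a separate jupyter-notebook executable (the _execvp call), so the error means that executable cannot be found on the PATH. A quick way to check from CMD (assuming Windows, as in the question):

:: "where" lists every location of jupyter-notebook on PATH;
:: if it prints nothing and reports an error, the executable is
:: missing, which matches the OSError above.
where jupyter-notebook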

Answer 1:

After installing Anaconda, create a new environment using the following command:

conda create -n yourenvname python=x.x anaconda

Then activate the environment using:

source activate yourenvname

Now install Jupyter:

conda install jupyter

Then run jupyter-notebook.
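
Put together, the whole sequence might look like this on Windows CMD (myenv and python=3.6 are hypothetical placeholder values, and note that on Windows the activation command is activate rather than source activate):

:: "myenv" and "python=3.6" are placeholders; pick your own values
conda create -n myenv python=3.6 anaconda
:: Windows CMD form; use "source activate myenv" on Linux/macOS instead
activate myenv
conda install jupyter
jupyter notebook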



Answer 2:

For me, just installing jupyter globally solved it:

conda install jupyter


Answer 3:

I tried this and it worked for me:

For Python 2

pip install --upgrade --force-reinstall --no-cache-dir jupyter

For Python 3

pip3 install --upgrade --force-reinstall --no-cache-dir jupyter

This reinstalls everything from PyPI, and it should solve the problem; I suspect that running pip install "ipython[notebook]" is what messed things up in the first place.
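
To confirm the reinstall worked, you can check that the notebook entry point is back (a quick sanity check, assuming Windows CMD as in the question):

:: Both commands should now succeed if the reinstall restored the
:: jupyter-notebook entry point on PATH
where jupyter-notebook
jupyter notebook --version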




Answer 4:

I ran into a similar issue and was able to solve it by using the Anaconda prompt.

In CMD, when I ran jupyter notebook, I got the error message "Jupyter is not recognized as an internal or external command", but the same command run in the Anaconda Prompt launched the notebook.
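
The difference is most likely PATH: the Anaconda Prompt prepends the Anaconda directories before starting. If you want plain CMD to behave the same way, something like this should work (the install path below is taken from the traceback in the question; adjust it to your own machine):

:: Prepend the Anaconda install directory and its Scripts folder to PATH
:: (path copied from the question's traceback; yours may differ)
set PATH=C:\Users\pr275959\AppData\Local\Continuum\Anaconda3;C:\Users\pr275959\AppData\Local\Continuum\Anaconda3\Scripts;%PATH%
jupyter notebook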



Answer 5:

If you installed Anaconda 3.x on Windows and get the error "Error executing Jupyter command 'notebook': [Errno 'jupyter-notebook' not found] 2" when you run pyspark --master local[*], try:

pip install jupyter

and then run pyspark --master local[*] again.
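
For context, pyspark only tries to run Jupyter when the driver is configured that way. PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS are standard Spark settings, but whether they are set like this on your machine is an assumption; when they are, the jupyter notebook launch they trigger is exactly what fails if Jupyter is missing:

:: When these variables are set, "pyspark" starts the driver as
:: "jupyter notebook", which fails with the error above if Jupyter
:: is not installed or not on PATH
set PYSPARK_DRIVER_PYTHON=jupyter
set PYSPARK_DRIVER_PYTHON_OPTS=notebook
pyspark --master local[*]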