I am authoring a Jupyter notebook on my local machine that will eventually be run on a remote server (which is running Ubuntu). Every time I need to make a change, I must export the notebook as a .py file and then call it from the command line of the server.
I'd like to be able to run this on the fly: calling one command that takes the current .ipynb file and executes it on the command line as if it were a .py, showing all the print statements and output you'd expect if the .py were run. I thought nbconvert might do the trick using something like the following command:
jupyter nbconvert --to script --execute nbconvert_test.ipynb
As it turns out, this does not convert the .ipynb to a .py file to be executed on the command line as I would like. Rather, it creates a new file called nbconvert_test.py in the same directory, which I would then have to run with a separate command. I'd really like to prevent the creation of that file every time I make even a small change, and to skip the extra step on the command line.
Any help is appreciated!
A workaround is a small shell script that has three parts. Create a file runnb.sh and call it with the notebook name, like this: ./runnb.sh nbconvert_test.ipynb
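The script itself did not survive above; a minimal sketch follows. It assumes one plausible reading of the "three parts": convert the notebook, run the generated script, then delete it (since the asker wants no leftover .py file). The notebook name is passed as the first argument.

```shell
# Write runnb.sh: a three-step helper script (sketch, not the original answer's exact script)
cat > runnb.sh <<'EOF'
#!/bin/bash
set -e
jupyter nbconvert --to script "$1"   # 1) create foo.py next to foo.ipynb
python3 "${1%.ipynb}.py"             # 2) run the generated script
rm "${1%.ipynb}.py"                  # 3) remove it again
EOF
chmod +x runnb.sh
```

Then run it as ./runnb.sh nbconvert_test.ipynb from the notebook's directory.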
EDIT: According to this answer, this command should do just fine
jupyter nbconvert --execute test_nbconvert.ipynb
(just leave out the --to flag).

With the boar package, you can run your notebook from within Python code. For more information, see:
https://github.com/alexandreCameron/boar/blob/master/USAGE.md
You can send the jupyter nbconvert output to standard output and pipe that to python.