When I call
from scrapy import cmdline
cmdline.execute("scrapy crawl website".split())
print("Hello World")
the script exits right after cmdline.execute and never runs the rest of the script or prints "Hello World". How do I fix this?
One option is to run the crawl with subprocess.call, so it executes in a separate process and the calling script continues afterwards. For example, on Windows with PowerShell:
import subprocess
# Launch the crawl through PowerShell; -o items.json -t json writes the scraped items to a JSON file.
subprocess.call([r'C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe', '-ExecutionPolicy', 'Unrestricted', 'scrapy crawl website -o items.json -t json'])
I have just tried the following code, and it works for me:
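For instance, something along these lines, assuming the crawl is launched through subprocess so that control returns to the script afterwards (the spider name is taken from the question):

import subprocess

# Run the crawl in a child process; the script resumes when it exits.
subprocess.call(["scrapy", "crawl", "website"])
print("Hello World")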
By taking a look at the execute function in Scrapy's cmdline.py, you'll see that the final line (in current Scrapy versions) is:
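sys.exit(cmd.exitcode)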
There really is no way around this sys.exit call if you call the execute function directly, at least not without changing it. Monkey-patching is one option, albeit not a good one! A better option is to avoid calling the execute function entirely, and instead use the custom function below:
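A minimal sketch of such a function, assuming a standard Scrapy project layout so that get_project_settings() can locate your settings (the wrapper name crawl is arbitrary):

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

def crawl(spider_name, settings=None):
    # Run the named spider in-process. Unlike cmdline.execute, this returns
    # normally when the crawl finishes instead of calling sys.exit.
    process = CrawlerProcess(settings or get_project_settings())
    process.crawl(spider_name)
    process.start()  # blocks until the crawl is done, then returns

And you can call it like this:

crawl("website")
print("Hello World")  # now reached, because nothing calls sys.exit

Note that process.start() runs the Twisted reactor, which can only be started once per process, so this helper can be called only once per script run.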