I am working on Scrapy 0.20 with Python 2.7. I found that PyCharm has a good Python debugger, and I want to use it to test my Scrapy spiders. Does anyone know how to do that?
What I have tried
Actually, I tried to run the spider as a script, so I built that script. Then I tried to add my Scrapy project to PyCharm as a module, like this:
File -> Settings -> Project Structure -> Add Content Root.
But I don't know what else I have to do.
I use this simple script:
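A minimal sketch of such a launcher script, assuming a spider registered under the hypothetical name "myspider" (put the file next to scrapy.cfg and point a PyCharm Run/Debug Configuration at it):

```python
# runner.py - debug this file from PyCharm instead of the "scrapy" executable
from scrapy import cmdline

# Equivalent to typing "scrapy crawl myspider" in a terminal;
# "myspider" is a placeholder for your spider's name.
cmdline.execute("scrapy crawl myspider".split())
```

Breakpoints set anywhere in the spider or pipelines are then hit when you debug this file.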
I am also using PyCharm, but I am not using its built-in debugging features.
For debugging I am using `ipdb`. I set up a keyboard shortcut to insert `import ipdb; ipdb.set_trace()` on any line where I want the breakpoint to happen. Then I can type `n` to execute the next statement, `s` to step into a function, type any object name to see its value, alter the execution environment, or type `c` to continue execution. This is very flexible, and it works in environments other than PyCharm, where you don't control the execution environment.
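As an illustration, such a breakpoint placed inside a spider callback might look like this sketch (the spider, start URL, and CSS selector are hypothetical):

```python
import scrapy

class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        import ipdb; ipdb.set_trace()  # execution pauses here; inspect `response` interactively
        for quote in response.css("div.quote"):
            yield {"text": quote.css("span.text::text").get()}
```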
Just type `pip install ipdb` in your virtual environment and place `import ipdb; ipdb.set_trace()` on the line where you want the execution to pause.

I am running Scrapy in a virtualenv with Python 3.5.0, and setting the "Script" parameter of the PyCharm run configuration to /path_to_project_env/env/bin/scrapy solved the issue for me.

You just need to do this.
Create a Python file in the crawler folder of your project; I used main.py. Inside your main.py, put the code below.
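One possible way to write that main.py is to start the crawl through Scrapy's CrawlerProcess API (available in Scrapy 1.0+); the spider name "quotes" below is a placeholder:

```python
# main.py - run or debug this file from PyCharm
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

if __name__ == "__main__":
    # Load the project settings (run from inside the Scrapy project,
    # i.e. where scrapy.cfg lives) and start the spider by name.
    process = CrawlerProcess(get_project_settings())
    process.crawl("quotes")  # replace with your spider's name
    process.start()          # blocks until the crawl finishes
```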
Then create a "Run Configuration" to run your main.py. Having done this, if you put a breakpoint in your code, the debugger will stop there.
IntelliJ IDEA also works. Create a main.py as shown below:
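One possible main.py, again with the spider name as a placeholder, wraps the command-line invocation in a main() function:

```python
# main.py - debug this file directly from IntelliJ IDEA / PyCharm
from scrapy import cmdline

def main():
    # Build the same command you would type in a terminal and hand it to Scrapy.
    cmdline.execute("scrapy crawl myspider".split())  # "myspider" is a placeholder

if __name__ == "__main__":
    main()
```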
To add a bit to the accepted answer, after almost an hour I found I had to select the correct Run Configuration from the dropdown list (near the center of the icon toolbar), then click the Debug button in order to get it to work. Hope this helps!