This is Windows 7 with Python 2.7.
I have a scrapy project in a directory called caps (this is where scrapy.cfg is)
My spider is located in caps\caps\spiders\campSpider.py
I cd into the scrapy project and try to run
scrapy crawl campSpider -o items.json -t json
I get an error that the spider can't be found. The class name is campSpider
...
spider = self.crawler.spiders.create(spname, **opts.spargs)
File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\spidermanager.py", line 43, in create
raise KeyError("Spider not found: %s" % spider_name)
KeyError: 'Spider not found: campSpider'
Am I missing some configuration item?
Make sure you have set the "name" property of the spider. Example:
Without the name property, the scrapy manager will not be able to find your spider.
Also make sure that your project is not called
scrapy
! I made that mistake and renaming it fixed the problem.

I also had this problem, and it turned out to be rather small. Be sure your class inherits from
scrapy.Spider
For anyone who might have the same problem: not only do you need to set the name of the spider and check for SPIDER_MODULES and NEWSPIDER_MODULE in your scrapy settings; if you are running a scrapyd service, you also need to restart it in order to apply any changes you have made.

You have to give a name to your spider.
However, BaseSpider is deprecated, use Spider instead.
The project should have been created by the startproject command:
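Assuming the project name from the question (caps), that would be:

```shell
scrapy startproject caps
```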
Which gives you the following directory tree:
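A sketch of the expected layout; the project and spider names are taken from the question, and the other files are what startproject generates by default:

```
caps/
    scrapy.cfg
    caps/
        __init__.py
        items.py
        pipelines.py
        settings.py
        spiders/
            __init__.py
            campSpider.py
```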
Make sure that settings.py has the definition of your spider module. eg:
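Something along these lines; the module paths assume the project is named caps as in the question:

```python
# settings.py -- generated by "scrapy startproject caps"
BOT_NAME = 'caps'

# Where scrapy looks for spider classes when resolving "scrapy crawl <name>"
SPIDER_MODULES = ['caps.spiders']

# Where "scrapy genspider" places newly created spiders
NEWSPIDER_MODULE = 'caps.spiders'
```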
You should have no problems running your spider locally or on Scrapinghub.
Also, it is possible that you have not deployed your spider. So first use scrapyd to start the server, then use scrapyd-deploy to deploy the project, and then run the command.
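The flow above can be sketched as follows; "default" is the deploy target name assumed from a typical [deploy] section in scrapy.cfg, and the project/spider names are taken from the question:

```shell
# Start the scrapyd server (leave it running in its own terminal)
scrapyd

# Deploy the project to the server
scrapyd-deploy default -p caps

# Schedule a run of the spider via scrapyd's JSON API (default port 6800)
curl http://localhost:6800/schedule.json -d project=caps -d spider=campSpider
```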