I want to crawl a website that has two parts, and my script is not as fast as I need it to be.
Is it possible to launch two spiders, one to scrape the first part and the other for the second?
I tried writing two different spider classes and running them with
scrapy crawl firstSpider
scrapy crawl secondSpider
but I don't think that's a smart approach.
I read the scrapyd documentation, but I don't know whether it fits my case.
I think what you are looking for is something like this:
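Here is a minimal sketch using Scrapy's CrawlerProcess. It assumes your two spiders are registered in the project under the names firstSpider and secondSpider (the same names you already pass to scrapy crawl) and that you save the script inside the project, next to scrapy.cfg:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Load the project settings so Scrapy can locate the spiders by name.
process = CrawlerProcess(get_project_settings())

# Schedule both spiders; they run concurrently in the same process.
process.crawl("firstSpider")
process.crawl("secondSpider")

# Start the reactor; this call blocks until both crawls have finished.
process.start()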
You can read more in the Scrapy docs under running-multiple-spiders-in-the-same-process.
A better solution, if you have many spiders, is to fetch the spider list dynamically and run them all.
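Something along these lines (a sketch for Scrapy versions before 1.4, again run from the project directory):

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

settings = get_project_settings()
process = CrawlerProcess(settings)

# process.spiders is the project's spider loader in Scrapy < 1.4;
# list() returns the name of every spider defined in the project.
for spider_name in process.spiders.list():
    print("Running spider %s" % spider_name)
    process.crawl(spider_name)

process.start()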
(Second solution): because spiders.list() is deprecated in Scrapy 1.4, Yuda's solution should be converted to use SpiderLoader, roughly as sketched below. Either way, you need to save the script in the same directory as scrapy.cfg and run it from there (my Scrapy version is 1.3.3).
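A sketch of the converted script (assuming, as above, it is run from the project root so get_project_settings() can locate your spiders):

from scrapy.crawler import CrawlerProcess
from scrapy.spiderloader import SpiderLoader
from scrapy.utils.project import get_project_settings

settings = get_project_settings()
process = CrawlerProcess(settings)

# SpiderLoader replaces the deprecated process.spiders attribute.
spider_loader = SpiderLoader.from_settings(settings)

for spider_name in spider_loader.list():
    print("Running spider %s" % spider_name)
    process.crawl(spider_name)

process.start()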