I am running Scrapy in a Python script:
```python
from twisted.internet import reactor
from scrapy import signals
from scrapy.crawler import Crawler
from scrapy.utils.project import get_project_settings
from scrapy.xlib.pydispatch import dispatcher

def stop_reactor():
    reactor.stop()

def setup_crawler(domain):
    dispatcher.connect(stop_reactor, signal=signals.spider_closed)  # stop the reactor when the spider closes
    spider = ArgosSpider(domain=domain)  # the project's spider
    settings = get_project_settings()
    crawler = Crawler(settings)
    crawler.configure()
    crawler.crawl(spider)
    crawler.start()
    reactor.run()
```
It runs successfully and stops, but where is the result? I want the result in JSON format. How can I get something like

```python
result = responseInJSON
```

like we do using the command:

```
scrapy crawl argos -o result.json -t json
```
I managed to make it work simply by adding `FEED_FORMAT` and `FEED_URI` to the `CrawlerProcess` constructor, using the basic Scrapy API tutorial code. Easy!
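A minimal sketch of that approach, assuming `ArgosSpider` from the question is importable and `result.json` is the desired output file:

```python
from scrapy.crawler import CrawlerProcess

# ArgosSpider is the spider from the question; import it from wherever it lives
process = CrawlerProcess({
    'FEED_FORMAT': 'json',      # serialize scraped items as JSON
    'FEED_URI': 'result.json',  # write them to this file
})
process.crawl(ArgosSpider)
process.start()  # blocks here until the crawl is finished
```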
Put that script where you put `scrapy.cfg`.
You need to set the `FEED_FORMAT` and `FEED_URI` settings manually:
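A sketch of the idea against the question's setup code; `settings.set` is used here, while older Scrapy versions exposed the same thing as `settings.overrides`:

```python
from scrapy.crawler import Crawler
from scrapy.utils.project import get_project_settings

settings = get_project_settings()
settings.set('FEED_FORMAT', 'json')      # same effect as -t json
settings.set('FEED_URI', 'result.json')  # same effect as -o result.json
crawler = Crawler(settings)              # then configure/crawl/start as in the question
```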
If you want to get the results into a variable, you can define a `Pipeline` class that collects items into a list, and use the `spider_closed` signal handler to see the results:
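A sketch of that pattern; `ItemCollectorPipeline` and the module-level `items` list are hypothetical names:

```python
from scrapy import signals
from scrapy.xlib.pydispatch import dispatcher

items = []  # filled by the pipeline as the spider scrapes

class ItemCollectorPipeline(object):
    """Hypothetical pipeline: appends every scraped item to the items list."""

    def process_item(self, item, spider):
        items.append(item)
        return item

def spider_closed(spider):
    # fires once the spider finishes; items now holds the full result set
    print(items)

dispatcher.connect(spider_closed, signal=signals.spider_closed)
```

The pipeline still has to be enabled, e.g. by setting `ITEM_PIPELINES` in the settings you pass to the crawler before starting it.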
FYI, look at how Scrapy parses command-line arguments. Also see: Capturing stdout within the same process in Python.