JSON not working in Scrapy when calling spider through a script

Posted 2019-06-04 23:16

Question:

When I call my spider through a Python script, which is as follows:

import os
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'project.settings')
from twisted.internet import reactor
from scrapy import log, signals
from scrapy.crawler import Crawler
from scrapy.settings import CrawlerSettings
from scrapy.xlib.pydispatch import dispatcher
from spiders.image import aqaqspider

def stop_reactor():
    # stop the Twisted reactor once the spider has closed
    reactor.stop()

dispatcher.connect(stop_reactor, signal=signals.spider_closed)
spider = aqaqspider(domain='aqaq.com')
crawler = Crawler(CrawlerSettings())
crawler.configure()
crawler.crawl(spider)
crawler.start()
log.start()
log.msg('Running reactor...')
reactor.run()  # the script will block here until the spider is closed
log.msg('Reactor stopped.')
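
(Note: the script above uses the pre-1.0 Scrapy API; scrapy.log, scrapy.xlib.pydispatch and Crawler.configure() have since been deprecated or removed. On a recent Scrapy version, a rough equivalent of this runner script, shown here only as a sketch, would be:)

import os
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'project.settings')

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
from spiders.image import aqaqspider

# CrawlerProcess manages the reactor itself, so no Twisted boilerplate is needed
process = CrawlerProcess(get_project_settings())  # loads project.settings, including ITEM_PIPELINES
process.crawl(aqaqspider, domain='aqaq.com')
process.start()  # blocks here until the crawl is finished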

My JSON file is not being created. My pipelines.py has the following code:

import json
import codecs

class JsonWithEncodingPipeline(object):

    def __init__(self):
        self.file = codecs.open('scraped_data_utf8.json', 'w', encoding='utf-8')

    def process_item(self, item, spider):
        line = json.dumps(dict(item), ensure_ascii=False) + "\n"
        self.file.write(line)
        return item

    def spider_closed(self, spider):
        self.file.close()
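
(One thing worth noting about the pipeline itself: Scrapy only calls open_spider/close_spider on item pipelines automatically; a method that merely happens to be named spider_closed is never invoked unless it is connected to the spider_closed signal. A sketch of that wiring, using the same old-style dispatcher as the runner script; renaming the method to close_spider would work as well:)

import codecs

from scrapy import signals
from scrapy.xlib.pydispatch import dispatcher

class JsonWithEncodingPipeline(object):

    def __init__(self):
        self.file = codecs.open('scraped_data_utf8.json', 'w', encoding='utf-8')
        # without this, spider_closed below is never called
        dispatcher.connect(self.spider_closed, signal=signals.spider_closed)

    # process_item stays exactly as in the original pipeline

    def spider_closed(self, spider):
        self.file.close()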

When I call my spider from the command line with scrapy crawl, it works fine, i.e. the JSON file is created.

Please help me. I am new to Scrapy.

Thank you all!! I have found the solution....
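
(For reference, the usual cause of this symptom, where the pipeline runs under scrapy crawl but not from a script, is that CrawlerSettings() with no argument is an empty settings object, so ITEM_PIPELINES from project.settings never takes effect. A sketch of that fix, assuming scrapy.utils.project.get_project_settings is available in the Scrapy version in use; it is not necessarily the exact solution mentioned above:)

from scrapy.crawler import Crawler
from scrapy.utils.project import get_project_settings

# pass the real project settings (located via SCRAPY_SETTINGS_MODULE) instead of
# an empty CrawlerSettings(), so the JSON pipeline is actually enabled
crawler = Crawler(get_project_settings())
crawler.configure()
crawler.crawl(spider)
crawler.start()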