I am defining an item exporter that pushes items to a message queue. Below is the code.
```python
from scrapy.contrib.exporter import JsonLinesItemExporter
from scrapy.utils.serialize import ScrapyJSONEncoder
from scrapy import log
from scrapy.conf import settings
from carrot.connection import BrokerConnection, Exchange
from carrot.messaging import Publisher

log.start()


class QueueItemExporter(JsonLinesItemExporter):

    def __init__(self, **kwargs):
        log.msg("Initialising queue exporter", level=log.DEBUG)
        self._configure(kwargs)

        host_name = settings.get('BROKER_HOST', 'localhost')
        port = settings.get('BROKER_PORT', 5672)
        userid = settings.get('BROKER_USERID', "guest")
        password = settings.get('BROKER_PASSWORD', "guest")
        virtual_host = settings.get('BROKER_VIRTUAL_HOST', "/")

        self.encoder = settings.get('MESSAGE_Q_SERIALIZER', ScrapyJSONEncoder)(**kwargs)

        log.msg("Connecting to broker", level=log.DEBUG)
        self.q_connection = BrokerConnection(hostname=host_name, port=port,
                                             userid=userid, password=password,
                                             virtual_host=virtual_host)
        self.exchange = Exchange("scrapers", type="topic")
        log.msg("Connected", level=log.DEBUG)

    def start_exporting(self):
        spider_name = "test"
        log.msg("Initialising publisher", level=log.DEBUG)
        self.publisher = Publisher(connection=self.q_connection,
                                   exchange=self.exchange,
                                   routing_key="scrapy.spider.%s" % spider_name)
        log.msg("done", level=log.DEBUG)

    def finish_exporting(self):
        self.publisher.close()

    def export_item(self, item):
        log.msg("In export item", level=log.DEBUG)
        itemdict = dict(self._get_serialized_fields(item))
        self.publisher.send({"scraped_data": self.encoder.encode(itemdict)})
        log.msg("sent to queue - scrapy.spider.naukri", level=log.DEBUG)
```
I'm having a few problems: the items are not being submitted to the queue. I've added the following to my settings:
```python
FEED_EXPORTERS = {
    "queue": 'scrapers.exporters.QueueItemExporter'
}
FEED_FORMAT = "queue"
LOG_STDOUT = True
```
The code does not raise any errors, but I can't see any of the logging messages either. I'm at my wits' end as to how to debug this.
Any help would be much appreciated.
I hit the same problem, although probably a few version updates ahead. The little detail that solved it for me was setting `FEED_URI = "something"` in `settings.py`. Without this, the entry in `FEED_EXPORTERS` wasn't respected at all.
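For reference, a minimal sketch of the relevant settings, assuming the exporter from the question (the `FEED_URI` value itself didn't seem to matter, it just has to be set):

```python
# settings.py -- sketch; the FEED_URI value is a placeholder, it only
# needs to be present so Scrapy activates the feed-export machinery
FEED_EXPORTERS = {
    "queue": "scrapers.exporters.QueueItemExporter",
}
FEED_FORMAT = "queue"
FEED_URI = "something"  # without this line, FEED_EXPORTERS was ignored
```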
"Feed exporters" are quick (and somewhat dirty) shortcuts to call some "standard" item exporters. Instead of setting up a feed exporter from settings, hard-wire your custom item exporter to your custom pipeline, as explained here: http://doc.scrapy.org/en/0.14/topics/exporters.html#using-item-exporters
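A minimal sketch of that wiring, using the `QueueItemExporter` from the question (the pipeline class name and module path are my own; this works here because the question's exporter overrides `__init__` to take no file argument):

```python
# scrapers/pipelines.py -- sketch: drive the custom exporter from an
# item pipeline instead of the feed-export machinery
from scrapers.exporters import QueueItemExporter


class QueueExportPipeline(object):

    def open_spider(self, spider):
        # instantiate and start the exporter by hand
        self.exporter = QueueItemExporter()
        self.exporter.start_exporting()

    def close_spider(self, spider):
        self.exporter.finish_exporting()

    def process_item(self, item, spider):
        self.exporter.export_item(item)
        return item
```

Then enable it in `settings.py`; on Scrapy 0.14 that means adding it to the `ITEM_PIPELINES` list, e.g. `ITEM_PIPELINES = ['scrapers.pipelines.QueueExportPipeline']` (newer versions use a dict with an order value).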
Tip: a good example to start from is "Write items to MongoDB" from the official docs.
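That docs example boils down to roughly the following pattern (paraphrased; `from_crawler` and pymongo's `insert_one` are from newer Scrapy/pymongo versions than the 0.14 docs linked above, so check the docs for your version):

```python
import pymongo


class MongoPipeline(object):
    collection_name = 'scrapy_items'

    def __init__(self, mongo_uri, mongo_db):
        self.mongo_uri = mongo_uri
        self.mongo_db = mongo_db

    @classmethod
    def from_crawler(cls, crawler):
        # pull connection parameters from settings.py
        return cls(
            mongo_uri=crawler.settings.get('MONGO_URI'),
            mongo_db=crawler.settings.get('MONGO_DATABASE', 'items'),
        )

    def open_spider(self, spider):
        self.client = pymongo.MongoClient(self.mongo_uri)
        self.db = self.client[self.mongo_db]

    def close_spider(self, spider):
        self.client.close()

    def process_item(self, item, spider):
        self.db[self.collection_name].insert_one(dict(item))
        return item
```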
I've done a similar thing: I created a pipeline that places each item into an S3-like service (I'm using Minio here, but you get the idea). It creates a new bucket for each spider and places each item into an object with a random name. The full source code can be found in my repo.
Starting with the simple quotes spider from the tutorial:
In `settings.py`:
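Something along these lines (a sketch; the pipeline class and the endpoint/credential setting names are my own, the full version is in the repo):

```python
# settings.py -- sketch
ITEM_PIPELINES = {
    'scrapy_quotes.pipelines.S3ExportPipeline': 300,
}

# hypothetical names for the Minio connection settings
S3_ENDPOINT = 'localhost:9000'
S3_ACCESS_KEY = 'minio_access_key'
S3_SECRET_KEY = 'minio_secret_key'
```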
In `scrapy_quotes/pipelines.py`, create a pipeline and an item exporter:
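A condensed sketch of the pipeline (see the repo for the full version; the `minio` client calls are the library's real API, while the class and setting names match the sketch above):

```python
# scrapy_quotes/pipelines.py -- sketch
import io
import uuid

from minio import Minio
from scrapy.exporters import JsonItemExporter


class S3ExportPipeline(object):

    def __init__(self, endpoint, access_key, secret_key):
        self.client = Minio(endpoint, access_key=access_key,
                            secret_key=secret_key, secure=False)

    @classmethod
    def from_crawler(cls, crawler):
        s = crawler.settings
        return cls(s.get('S3_ENDPOINT'),
                   s.get('S3_ACCESS_KEY'),
                   s.get('S3_SECRET_KEY'))

    def open_spider(self, spider):
        # one bucket per spider
        self.bucket = spider.name
        if not self.client.bucket_exists(self.bucket):
            self.client.make_bucket(self.bucket)

    def process_item(self, item, spider):
        # serialize the single item to JSON in memory...
        buf = io.BytesIO()
        exporter = JsonItemExporter(buf)
        exporter.start_exporting()
        exporter.export_item(item)
        exporter.finish_exporting()
        data = buf.getvalue()
        # ...and store it under a random object name
        self.client.put_object(self.bucket, '%s.json' % uuid.uuid4(),
                               io.BytesIO(data), len(data),
                               content_type='application/json')
        return item
```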