I'm trying to store more than one spider's results in a MySQL database

Posted 2019-02-15 16:52

Here is my pipelines.py. I have two spiders, one called bristol.py and one called bath.py. When I run 'scrapy crawl bristol', the results are automatically added to the table called 'Bristol' in my MySQL database. I want to run 'scrapy crawl bath' and have its results stored in the same database under the table name 'Bath'. I've tried adding the same line of code that works for the 'Bristol' table, but I receive an error. This is the code I've tried putting directly underneath the first self.cursor.execute:

self.cursor.execute("""INSERT INTO Bath(BathCountry, BathQualification) VALUES ('{0}', '{1}')""".format(item['BathCountry'], "".join([s.encode('utf8') for s in item['BathQualification']])))

When I try this I receive an error. Is there a way of doing this? This is the error:

 exceptions.KeyError: 'BathCountry'
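(A guess at the cause: `process_item` runs for items from every spider that uses the pipeline, so the Bath `INSERT` is also reached for Bristol items, which have no `BathCountry` key. A minimal illustration with made-up item data:)

```python
# Hypothetical item shaped like the Bristol spider's output (made-up data)
item = {'BristolCountry': 'United Kingdom', 'BristolQualification': ['BSc']}

try:
    item['BathCountry']   # Bristol items have no Bath fields...
    raised = False
except KeyError:
    raised = True         # ...so this lookup raises KeyError: 'BathCountry'
```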

Thanks for your help in advance.

import sys
import MySQLdb
import MySQLdb.cursors
import hashlib
from scrapy.exceptions import DropItem
from scrapy.http import Request

class TestPipeline(object):

    def __init__(self):
        self.conn = MySQLdb.connect(
            user='user',
            passwd='password',
            db='db',
            host='host',
            charset='utf8',
            use_unicode=True
            )
        self.cursor = self.conn.cursor()

    def process_item(self, item, spider):
        # Called for every item from every spider that uses this pipeline
        try:
            self.cursor.execute("""INSERT INTO Bristol(BristolCountry, BristolQualification) VALUES ('{0}', '{1}')""".format(item['BristolCountry'], "".join([s.encode('utf8') for s in item['BristolQualification']])))
            self.conn.commit()
            return item

        except MySQLdb.Error as e:
            print "Error %d: %s" % (e.args[0], e.args[1])
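One possible approach (a sketch, not a definitive fix: the table and column names below come from the question, but the `spider.name` dispatch and the spider names 'bristol'/'bath' are assumptions) is to choose the table from `spider.name` inside `process_item`, and to pass values as query parameters rather than formatting them into the SQL string:

```python
# Hypothetical helper: route each spider's items to its own table.
# Assumes spider names 'bristol' and 'bath' and the column names
# from the question; adjust to the real schema.
TABLES = {
    'bristol': ('Bristol', 'BristolCountry', 'BristolQualification'),
    'bath': ('Bath', 'BathCountry', 'BathQualification'),
}

def build_insert(spider_name, item):
    """Return (sql, params) for this spider's table, or None if unknown."""
    if spider_name not in TABLES:
        return None
    table, country_col, qual_col = TABLES[spider_name]
    # %s placeholders let the DB driver escape the values safely
    sql = "INSERT INTO %s(%s, %s) VALUES (%%s, %%s)" % (table, country_col, qual_col)
    params = (item[country_col], "".join(item[qual_col]))
    return sql, params
```

Inside the pipeline this would be used as `sql, params = build_insert(spider.name, item)` followed by `self.cursor.execute(sql, params)`, so each spider only ever touches its own table and the KeyError cannot occur.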
