I am importing the Scrapy item keys from items.py into pipelines.py. The problem is that the order of the imported keys is different from the order in which they were defined in the items.py file.
My items.py file:
from scrapy.item import Item, Field

class NewAdsItem(Item):
    AdId  = Field()
    DateR = Field()
    AdURL = Field()
In my pipelines.py:
from adbot.items import NewAdsItem
...
def open_spider(self, spider):
    self.ikeys = NewAdsItem.fields.keys()
    print("Keys in pipelines: \t%s" % ",".join(self.ikeys))
    #self.createDbTable(self.ikeys)
The output is:

Keys in pipelines: AdId,AdURL,DateR

instead of the expected: AdId,DateR,AdURL.
How can I ensure that the imported order remains the same?
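A likely explanation (an assumption based on reading Scrapy's source, not something stated above): Scrapy's ItemMeta metaclass collects Field attributes by iterating over dir() of the class, and dir() returns attribute names alphabetically sorted, which is exactly the AdId, AdURL, DateR order seen here. A minimal pure-Python sketch of that effect, with a stand-in Field class:

```python
# Sketch: mimic how a metaclass that scans dir() loses declaration order.
# (Assumption: Scrapy's ItemMeta iterates dir(cls) to find Field attributes.)

class Field(dict):
    """Stand-in for scrapy.Field."""

class DemoItem:
    AdId = Field()
    DateR = Field()
    AdURL = Field()

# dir() returns attribute names alphabetically sorted, so a fields dict
# built from it comes out in alphabetical, not declaration, order.
fields = {n: getattr(DemoItem, n) for n in dir(DemoItem)
          if isinstance(getattr(DemoItem, n), Field)}

print(list(fields.keys()))  # ['AdId', 'AdURL', 'DateR'] -- alphabetical
```

So the ordinary insertion-order guarantee of Python 3 dicts never gets a chance to help: the declaration order is already gone by the time Item.fields is built.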
Note: This might be related to How to get order of fields in Scrapy item, but it's not at all clear what's going on there, since the Python 3 docs state that dicts retain their insertion order. Also note that when using process_item() and calling item.keys(), the order is retained! But I need to access the keys in order before any items are scraped.
A simple fix is to define a keys() method in your Item class. The only way I could get this to work was to use that solution in the following manner.
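The code block for that keys() fix did not survive above, so here is a hedged sketch of the idea. Dict stand-ins replace the Scrapy imports so the snippet runs on its own; in a real items.py you would keep `from scrapy.item import Item, Field` and only add the keys() method:

```python
# Sketch of the keys() fix (stand-ins used so this runs without Scrapy).

class Field(dict):
    """Stand-in for scrapy.Field."""

class Item(dict):
    """Stand-in for scrapy.Item."""

class NewAdsItem(Item):
    AdId  = Field()
    DateR = Field()
    AdURL = Field()

    def keys(self):
        # Hand-maintained declaration order (assumption: you keep this
        # list in sync with the Field definitions above).
        return ['AdId', 'DateR', 'AdURL']

item = NewAdsItem()
print(list(item.keys()))  # ['AdId', 'DateR', 'AdURL']
```

The drawback is that the order lives in two places (the Field definitions and the keys() body), which is what pushed me toward the single `_field_order` tuple below.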
My items.py file now defines a _field_order tuple next to the item class. Then I import _field_order into my pipelines.py and use it wherever the keys are needed in their original order. I can now create new DB tables in the correct order of appearance, without worrying about the fields being sorted in unexpected ways.