I am looking for some example code of a SQLite pipeline in Scrapy. I know there is no built-in support for it, but I'm sure it has been done. Only actual code can help me, as I only know enough Python and Scrapy to complete my very limited task, and need the code as a starting point.
For anyone trying to solve a similar problem, I just came across a nice SQLite item exporter: https://github.com/RockyZ/Scrapy-sqlite-item-exporter.
After including it in your project settings, you can use it as a feed exporter.
It could also be adapted to be used as an Item Pipeline instead of Item Exporter.
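The original usage snippet did not survive in this copy. Based on how Scrapy registers custom feed exporters, the hookup would plausibly look like this (the module path `myproject.exporters` is an assumption; adjust it to wherever you place the exporter from the linked repository):

```python
# settings.py -- register the exporter under a format name
# (module path below is assumed, not taken from the linked repo)
FEED_EXPORTERS = {
    "sqlite": "myproject.exporters.SqliteItemExporter",
}
```

After that, a crawl can write straight to a database file with something like `scrapy crawl myspider -o items.sqlite -t sqlite`.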
The following uses Python's standard `sqlite3` module. If you want to use APSW, just replace `sqlite3` as noted in the comments.
Define your item in `items.py`, enable the pipeline in `settings.py`, and add the pipeline class in `pipelines.py`. You can then check your DB from the command line with the `sqlite3` shell.
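The answer's code blocks did not survive extraction; here is a minimal self-contained sketch of the pipeline piece using the stdlib `sqlite3` module (the table and field names are illustrative assumptions, not the original answer's code):

```python
# pipelines.py -- minimal SQLite pipeline sketch
import sqlite3
# For APSW, replace the import above with:  import apsw
# and use apsw.Connection(...) instead of sqlite3.connect(...)

class SQLitePipeline:
    def __init__(self, db_path="items.db"):
        self.db_path = db_path

    def open_spider(self, spider):
        # One connection per crawl, opened when the spider starts
        self.conn = sqlite3.connect(self.db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS items (title TEXT, url TEXT)"
        )

    def process_item(self, item, spider):
        # .get() works for both plain dicts and scrapy.Item instances
        self.conn.execute(
            "INSERT INTO items (title, url) VALUES (?, ?)",
            (item.get("title"), item.get("url")),
        )
        return item

    def close_spider(self, spider):
        self.conn.commit()
        self.conn.close()

# settings.py would enable it with something like:
# ITEM_PIPELINES = {"myproject.pipelines.SQLitePipeline": 300}
```

To inspect the result afterwards: `sqlite3 items.db "SELECT * FROM items;"`.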
I did something like this:
Here is a SQLite pipeline with SQLAlchemy. With SQLAlchemy you can easily change your database later if ever needed.
In `settings.py`, add the database configuration. Then add the pipeline to `pipelines.py`, and give `items.py` matching fields. You may also consider moving `class Book` into `items.py`.
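The settings, pipeline, and item code are missing from this copy. A compact sketch of the idea with SQLAlchemy follows; the `CONNECTION_STRING` setting name, the `Book` columns, and the item fields are assumptions for illustration:

```python
# pipelines.py -- SQLAlchemy-backed pipeline sketch (column names assumed)
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Book(Base):
    # The answer suggests this class could also live in items.py
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    title = Column(String)
    author = Column(String)

class SQLAlchemyPipeline:
    # settings.py would hold e.g.  CONNECTION_STRING = "sqlite:///books.db"
    # -- switching databases later only means changing this string
    def __init__(self, connection_string="sqlite:///books.db"):
        engine = create_engine(connection_string)
        Base.metadata.create_all(engine)
        self.Session = sessionmaker(bind=engine)

    def process_item(self, item, spider):
        session = self.Session()
        try:
            session.add(Book(title=item.get("title"),
                             author=item.get("author")))
            session.commit()
        except Exception:
            session.rollback()
            raise
        finally:
            session.close()
        return item
```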
If you feel comfortable with Twisted's adbapi, you can take this MySQL pipeline as a starting point: http://github.com/darkrho/scrapy-googledir-mysql/blob/master/googledir/pipelines.py, swapping in a SQLite connection pool in `__init__`.