I am trying to scrape concurrently with the selenium and multiprocessing modules. Below is roughly my approach:
- create a queue holding as many webdriver instances as there are workers
- create a pool of workers
- each worker pulls a webdriver instance from the queue
- when the function finishes, the webdriver instance is put back on the queue
Here is the code:
#!/usr/bin/env python
# encoding: utf-8
import time
import codecs

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from multiprocessing import Pool
from Queue import Queue


def download_and_save(link_tuple):
    link_id, link = link_tuple
    print link_id
    w = q.get()
    w.get(link)
    with codecs.open('%s.html' % link_id, 'w', encoding='utf-8') as f:
        f.write(w.page_source)
    time.sleep(10)
    q.put(w)


def main(num_processes):
    links = [
        'http://openjurist.org/743/f2d/273/zwiener-v-commissioner-of-internal-revenue',
        'http://www.oyez.org/advocates/z/l/lonny_f_zwiener',
        'http://www.texasbar.com/attorneys/member.cfm?id=21191',
        'https://www.courtlistener.com/opinion/441662/lonny-f-zwiener-and-ardith-e-zwiener-v-commissione/cited-by',
        'https://www.courtlistener.com/opinion/441662/lonny-f-zwiener-and-ardith-e-zwiener-v-commissione/authorities/',
        'http://www.myheritage.com/names/lonny_zwiene',
        'https://law.resource.org/pub/us/case/reporter/F2/743/743.F2d.273.84-4068.htm',
        'http://www.ancestry.com/1940-census/usa/Texas/Lonny-F-Zwiener_5bbff',
        'http://search.ancestry.com/cgi-bin/sse.dll?gl=34&rank=1&new=1&so=3&MSAV=0&msT=1&gss=ms_f-34&gl=bmd_death&rank=1&new=1&so=1&MSAV=0&msT=1&gss=ms_f-2_s&gsfn=Lonny&gsln=Zwiener&msypn__ftp=T',
        'http://www.mocavo.com/Lonny-F-Zwiener-Fredericksburg-Gillespie-Texas-1940-United-States-Census/0798164756456805432',
        'http://www.taftlaw.com/attorneys/635-mark-s-yuric'
    ]
    n = len(links)
    link_tuples = [(link_id, link) for link_id, link in zip(xrange(n), links)]
    pool = Pool(num_processes)
    pool.map(download_and_save, link_tuples)


if __name__ == '__main__':
    num_processes = 2
    q = Queue()
    dcap = dict(DesiredCapabilities.PHANTOMJS)
    dcap["phantomjs.page.settings.userAgent"] = (
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/42.0.2311.90 Safari/537.36"
    )
    for i in range(num_processes):
        w = webdriver.PhantomJS(desired_capabilities=dcap)
        q.put(w)
    main(num_processes)
This script runs, but the saved HTML files are either duplicated or missing.
Here is a different approach that I've had success with: keep the workers themselves in __main__, give each worker its own webdriver, and have the workers pull links from a task_q. (A plain Queue.Queue is not shared across processes; when the Pool forks, every worker gets its own copy of q and of the driver handles, which is why your pages come out duplicated or missing.)
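A minimal sketch of that pattern, assuming PhantomJS as in your script (the names worker and task_q and the example URLs are just placeholders): each Process creates its own webdriver, consumes (link_id, link) tuples from a multiprocessing.JoinableQueue, and exits when it receives a None sentinel.

#!/usr/bin/env python
# encoding: utf-8
import codecs
from multiprocessing import Process, JoinableQueue

from selenium import webdriver


def worker(task_q):
    # Each worker process owns exactly one webdriver for its whole lifetime,
    # so drivers are never shared or copied across processes.
    w = webdriver.PhantomJS()
    while True:
        task = task_q.get()
        if task is None:            # sentinel: no more work
            task_q.task_done()
            break
        link_id, link = task
        w.get(link)
        with codecs.open('%s.html' % link_id, 'w', encoding='utf-8') as f:
            f.write(w.page_source)
        task_q.task_done()
    w.quit()


if __name__ == '__main__':
    links = [
        'http://example.com/a',     # replace with your real link list
        'http://example.com/b',
    ]
    num_processes = 2

    task_q = JoinableQueue()
    workers = [Process(target=worker, args=(task_q,)) for _ in range(num_processes)]
    for p in workers:
        p.start()

    for link_id, link in enumerate(links):
        task_q.put((link_id, link))
    for _ in workers:
        task_q.put(None)            # one sentinel per worker

    task_q.join()                   # wait until every task has been marked done
    for p in workers:
        p.join()

Because each driver lives and dies inside a single process, two workers can never end up driving the same PhantomJS instance at the same time.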