My code for scraping data from the Alibaba website:
import scrapy


class IndiamartSpider(scrapy.Spider):
    name = 'alibot'
    allowed_domains = ['alibaba.com']
    start_urls = ['https://www.alibaba.com/showroom/acrylic-wine-box_4.html']

    def parse(self, response):
        # Extract the product fields from the showroom listing page
        Title = response.xpath('//*[@class="title three-line"]/a/@title').extract()
        Price = response.xpath('//div[@class="price"]/b/text()').extract()
        Min_order = response.xpath('//div[@class="min-order"]/b/text()').extract()
        Response_rate = response.xpath('//i[@class="ui2-icon ui2-icon-skip"]/text()').extract()

        # Combine the parallel lists into one item per product
        for item in zip(Title, Price, Min_order, Response_rate):
            scraped_info = {
                'Title': item[0],
                'Price': item[1],
                'Min_order': item[2],
                'Response_rate': item[3],
            }
            yield scraped_info
Notice the start URL: the spider only scrapes the single URL given there, but I want it to scrape all the URLs listed in my CSV file, which contains a large number of them. A sample of the data.csv file:
'https://www.alibaba.com/showroom/shock-absorber.html',
'https://www.alibaba.com/showroom/shock-wheel.html',
'https://www.alibaba.com/showroom/shoes-fastener.html',
'https://www.alibaba.com/showroom/shoes-women.html',
'https://www.alibaba.com/showroom/shoes.html',
'https://www.alibaba.com/showroom/shoulder-long-strip-bag.html',
'https://www.alibaba.com/showroom/shower-hair-band.html',
...........
How do I feed all the links from the CSV file into the spider at once?
You're almost there already. The only change is in start_urls, which you want to be "all the URLs in the *.csv file." To loop through a file correctly without loading all of it into memory, use generators: both file objects and Scrapy's start_requests method are generators.
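A minimal sketch of that idea (assuming the URLs live in a local data.csv, one URL per line, possibly quoted and comma-terminated as in the sample above) overrides start_requests so the file is read lazily:

import scrapy


class IndiamartSpider(scrapy.Spider):
    name = 'alibot'
    allowed_domains = ['alibaba.com']

    def start_requests(self):
        # Iterate over the file line by line instead of loading it all at once;
        # a file object is itself a generator of lines.
        with open('data.csv') as f:
            for line in f:
                # Strip whitespace, trailing commas and surrounding quotes
                url = line.strip().strip(',').strip("'\"")
                if url:
                    yield scrapy.Request(url, callback=self.parse)

    def parse(self, response):
        # Same parsing logic as in the question
        ...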
To explain further: the Scrapy engine uses start_requests to generate requests as it goes, and it will keep generating them until the concurrent-request limit is reached (see settings like CONCURRENT_REQUESTS). It is also worth noting that by default Scrapy crawls depth-first: newer requests take priority, so the start_requests loop will be the last thing to finish.
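If you want to change that behaviour, one option (a sketch, not part of the original answer) is to tune the relevant settings on the spider itself, for example raising the concurrency limit and switching the scheduler queues to FIFO so the crawl proceeds breadth-first:

class IndiamartSpider(scrapy.Spider):
    name = 'alibot'
    allowed_domains = ['alibaba.com']

    custom_settings = {
        # How many requests Scrapy keeps in flight at the same time (default is 16)
        'CONCURRENT_REQUESTS': 32,
        # Switch from the default LIFO (depth-first) queues to FIFO (breadth-first)
        'DEPTH_PRIORITY': 1,
        'SCHEDULER_DISK_QUEUE': 'scrapy.squeues.PickleFifoDiskQueue',
        'SCHEDULER_MEMORY_QUEUE': 'scrapy.squeues.FifoMemoryQueue',
    }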