Python Selenium screen capture not getting whole page

Posted 2019-02-07 13:54

Question:

I am trying to create a generic web crawler that will go to a site and take a screenshot. I am using Python, Selenium, and PhantomJS. The problem is that the screenshot is not capturing all the images on the page. For example, if I go to YouTube, it doesn't capture images below the main page image. (I don't have high enough rep to post a screenshot.) I think this may have something to do with dynamic content, but I have tried the wait mechanisms such as the implicitly_wait and set_page_load_timeout methods. Because this is a generic crawler, I can't wait for a specific event (I want to crawl hundreds of sites).

Is it possible to create a generic web crawler that can do the screen capture I am trying to do? The code I am using is:

from selenium import webdriver

phantom = webdriver.PhantomJS()
phantom.set_page_load_timeout(30)      # give the page up to 30s to load
phantom.get(response.url)
img = phantom.get_screenshot_as_png()  # binary PNG data
phantom.quit()
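
(For reference, the waits I mentioned look like this in Selenium's Python bindings; a minimal sketch with a stand-in URL. Neither helps here, because they only govern how long the page load and element lookups may take, not when lazy-loaded images render.)

from selenium import webdriver

driver = webdriver.PhantomJS()
driver.set_page_load_timeout(30)  # cap how long .get() may block
driver.implicitly_wait(10)        # poll up to 10s when locating elements
driver.get('http://youtube.com')  # stand-in URL
img = driver.get_screenshot_as_png()
driver.quit()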


Answer 1:

Your suggestion solved the problem. I used the following code (stolen in part from an answer to another question):

from selenium import webdriver

driver = webdriver.PhantomJS()
driver.maximize_window()
driver.get('http://youtube.com')
# Scroll through the page in small steps so lazy-loaded images render:
# as scheight grows, scrollHeight/scheight shrinks, so the browser jumps
# to the bottom first (out-of-range values clamp) and then works back up.
scheight = .1
while scheight < 9.9:
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight/%s);" % scheight)
    scheight += .01
driver.save_screenshot('screenshot.png')
driver.quit()
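
For the generic, hundreds-of-sites case, a variant of the same idea that avoids the fixed step count is to keep scrolling to the bottom until document.body.scrollHeight stops growing, then capture. A sketch of that approach; the function name, pause length, and round limit here are my own choices, not from the answer above:

import time
from selenium import webdriver

def capture_full_page(url, outfile, pause=1.0, max_rounds=20):
    """Scroll until the page height stabilizes, then take a screenshot."""
    driver = webdriver.PhantomJS()
    driver.set_page_load_timeout(30)
    try:
        driver.get(url)
        last_height = driver.execute_script("return document.body.scrollHeight")
        for _ in range(max_rounds):
            driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
            time.sleep(pause)  # give lazy-loaded images time to render
            new_height = driver.execute_script("return document.body.scrollHeight")
            if new_height == last_height:
                break          # height stopped growing: nothing left to load
            last_height = new_height
        driver.save_screenshot(outfile)
    finally:
        driver.quit()

capture_full_page('http://youtube.com', 'screenshot.png')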