Selenium driver.get(url) waits for full page load

The page I am scraping tries to load a dead JS script, so my Python script waits for it and does nothing for several minutes. This problem can occur on every page of the site.
from selenium import webdriver
driver = webdriver.Chrome()
driver.get('https://www.cortinadecor.com/productos/17/estores-enrollables-screen/estores-screen-corti-3000')
# The page tries to load: https://www.cetelem.es/eCommerceCalculadora/resources/js/eCalculadoraCetelemCombo.js
driver.find_element_by_name('ANCHO').send_keys("100")
How can I limit the wait time, block the AJAX load of that file, or is there another way?
Also, I am testing my script with webdriver.Chrome(), but I will use PhantomJS(), or possibly Firefox(). So if a method relies on changing browser settings, it must work across browsers.
When Selenium loads a page/url, by default it follows a default configuration with pageLoadStrategy set to normal. To make Selenium not wait for the full page load, we can configure the pageLoadStrategy. pageLoadStrategy supports 3 different values as follows:

normal (full page load)
eager (interactive)
none
Here is the code block to configure the pageLoadStrategy:

Firefox:
Chrome:
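Again the original snippet was lost; an equivalent sketch for Chrome under the same Selenium 4 Options API assumption:

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
# "none" would return immediately after the initial HTML is received;
# "eager" waits only until the DOM is interactive.
options.page_load_strategy = "eager"
driver = webdriver.Chrome(options=options)
driver.get("https://www.example.com")
```

PhantomJS is deprecated and not covered by this API, so for the asker's portability requirement Firefox and Chrome are the realistic targets.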
So a solution might be to set a fixed time to wait for the element and, if it is not found within that period, catch the exception, log the event (or do nothing), and proceed. The code sample has been taken from here.
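The referenced code sample was not preserved. A sketch of the idea using Selenium's explicit waits, reusing the ANCHO field and URL from the question and an assumed 10-second timeout:

```python
from selenium import webdriver
from selenium.common.exceptions import TimeoutException
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
driver.get('https://www.cortinadecor.com/productos/17/estores-enrollables-screen/estores-screen-corti-3000')

try:
    # Wait up to 10 seconds for the field; raises TimeoutException otherwise.
    element = WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.NAME, 'ANCHO'))
    )
    element.send_keys("100")
except TimeoutException:
    # Log the event (or do nothing) and carry on with the next page.
    print("ANCHO field not found within 10 seconds; skipping")
```

This limits how long the script can stall on any one element, regardless of whether the page's dead JS request ever completes.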