When I call Selenium's driver.get(url), it waits for the page to finish loading. The page I am scraping tries to load a dead JS script, so my Python script hangs for several minutes waiting for it. This problem can occur on every page of the site.
from selenium import webdriver
driver = webdriver.Chrome()
driver.get('https://www.cortinadecor.com/productos/17/estores-enrollables-screen/estores-screen-corti-3000')
driver.find_element_by_name('ANCHO').send_keys("100")
How can I limit the timeout, block the loading of that AJAX/JS resource, or work around this in some other way?
I am currently testing my script with webdriver.Chrome(), but I will also use webdriver.PhantomJS() and possibly webdriver.Firefox(). So if a method relies on browser-specific settings, it should be universal.