Request Error / URL Timeout in R

I am trying to get a csv file from a url, but the connection seems to drop after one minute. The csv file is generated during the request, so it takes a little more than a minute to produce. I tried to increase the timeout, but it did not work; the request still fails at the one-minute mark.

I use getURL and read.csv as follows:

    library(RCurl)

    # Start the timer
    ptm <- proc.time()

    urlCSV <- getURL("http://someurl.com/getcsv", timeout = 200)
    txtCSV <- textConnection(urlCSV)
    csvFile <- read.csv(txtCSV)
    close(txtCSV)

    # Stop the timer
    proc.time() - ptm

The console output:

    Error in open.connection(file, "rt") : cannot open the connection
    In addition: Warning message:
    In open.connection(file, "rt") :
      cannot open: HTTP status was '500 Internal Server Error'
       user  system elapsed
      0.225   0.353  60.445

It keeps failing right at the one-minute mark. What could be the problem, and how can I increase the timeout?

I tried the url in a browser and it works fine, but loading the csv takes more than a minute.
2 answers

libcurl has a CONNECTTIMEOUT option (http://curl.haxx.se/libcurl/c/CURLOPT_CONNECTTIMEOUT.html). You can set it in RCurl:

    library(RCurl)

    > getCurlOptionsConstants()[["connecttimeout"]]
    [1] 78

    myOpts <- curlOptions(connecttimeout = 200)
    urlCSV <- getURL("http://someurl.com/getcsv", .opts = myOpts)
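Note that connecttimeout only limits the phase in which the TCP connection is established; the limit on the whole transfer is libcurl's CURLOPT_TIMEOUT, which RCurl exposes as the timeout option. A minimal sketch combining both (the URL is the placeholder from the question, and the values are illustrative):

    library(RCurl)

    myOpts <- curlOptions(
        connecttimeout = 30,   # seconds allowed to establish the connection
        timeout        = 300   # seconds allowed for the entire transfer
    )
    urlCSV <- getURL("http://someurl.com/getcsv", .opts = myOpts)

If the server itself gives up after 60 seconds, though, no client-side setting will help.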

You are getting a 500 error from the server, which indicates the timeout is happening server-side and is therefore out of your control (unless you can request less data).


Source: https://habr.com/ru/post/980287/
