Preventing curl from causing a fatal error on large files

I use curl and PHP to find out information about a given URL (e.g. HTTP status code, MIME type, redirect location, page title, etc.):

  
 $ch = curl_init($url);
 $useragent = "Mozilla/5.0 (X11; U; Linux x86_64; ga-GB) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.307.11 Safari/532.9";
 curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        "Accept: application/rdf+xml;q=0.9, application/json;q=0.6, application/xml;q=0.5, application/xhtml+xml;q=0.3, text/html;q=0.2, */*;q=0.1"
    ));
 curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
 curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
 curl_setopt($ch, CURLOPT_USERAGENT, $useragent);
 curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
 $content = curl_exec($ch);
 $chinfo = curl_getinfo($ch);
 curl_close($ch);

It generally works well. However, if the URL points to a larger file, I get a fatal error:

Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 14421576 bytes)

Is there any way to prevent this? For example, can I make curl refuse the transfer if the file is too large, or catch the error?

As a workaround, I added

 curl_setopt($ch, CURLOPT_TIMEOUT, 3);

on the assumption that any file that takes more than 3 seconds to load would exhaust memory, but this is far from satisfactory.

2 answers

What about CURLOPT_FILE? Point it at /dev/null so the response body is written there instead of being buffered in memory.

Or use CURLOPT_WRITEFUNCTION. The callback is invoked for each chunk received, and returning a byte count different from the chunk's length makes curl abort the transfer.

Alternatively, raise PHP's memory limit (memory_limit) in php.ini.
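A minimal sketch of the CURLOPT_WRITEFUNCTION approach. The helper `make_capped_writer`, the 1 MiB cap, and the example URL are all illustrative, not part of the original answer: the callback accumulates chunks into a buffer and, once the cap is exceeded, returns -1 (any value other than the chunk length), which makes curl abort and `curl_exec()` return false.

```php
<?php
// Hypothetical helper: returns a write callback that appends each chunk
// to $buffer until $limit bytes have been seen, then aborts the transfer.
function make_capped_writer(int $limit, string &$buffer): callable
{
    $seen = 0;
    return function ($ch, string $data) use (&$seen, &$buffer, $limit) {
        $seen += strlen($data);
        if ($seen > $limit) {
            return -1;            // size cap hit: tell curl to abort
        }
        $buffer .= $data;
        return strlen($data);     // chunk consumed, keep transferring
    };
}

// Usage sketch; the URL and the 1 MiB cap are made-up example values.
$content = '';
$ch = curl_init('https://example.com/big.bin');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, make_capped_writer(1024 * 1024, $content));
if (curl_exec($ch) === false) {
    // Either a network error or the size cap was exceeded.
    $error = curl_error($ch);
}
curl_close($ch);
```

When the callback aborts, curl_exec() fails with CURLE_WRITE_ERROR ("Failed writing body"), so an oversized file is rejected early instead of exhausting memory.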


Why not issue a HEAD request first, so that only the headers are fetched? A 16 MiB memory limit is quite low anyway.

 curl_setopt($ch, CURLOPT_NOBODY, true);   // make it a HEAD request
 curl_setopt($ch, CURLOPT_HEADER, true);   // include headers in the output

file_get_contents() would run into the same memory limit, so it is no alternative.
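This idea can be sketched as follows. The helpers `remote_size` and `is_too_large`, the example URL, and the 1 MiB threshold are all illustrative: a HEAD request fetches only the headers, and `curl_getinfo()` with `CURLINFO_CONTENT_LENGTH_DOWNLOAD` then reports the advertised body size (-1 when the server sends no Content-Length).

```php
<?php
// Probe the advertised size of a remote file with a HEAD request.
function remote_size(string $url): float
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);           // HEAD request
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_exec($ch);
    $len = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);
    return $len;   // -1.0 when no Content-Length was sent
}

// Treat an unknown length (-1) as "too large" to stay on the safe side.
function is_too_large($length, int $limit): bool
{
    return $length < 0 || $length > $limit;
}

// Usage sketch with a made-up URL and a 1 MiB threshold:
if (is_too_large(remote_size('https://example.com/big.bin'), 1024 * 1024)) {
    // Skip the full download.
}
```

One caveat: servers are not required to send Content-Length (e.g. with chunked responses), so this check is a pre-filter, not a guarantee; combining it with a write-callback size cap covers both cases.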


Source: https://habr.com/ru/post/1759625/

