Using cURL to load large XML files

I work with PHP and need to parse some pretty large XML files (50-75 MB uncompressed). The problem, however, is that these XML files are stored remotely and must be downloaded before I can parse them.

After thinking it over, I believe that shelling out to cURL via PHP's system() call is probably the best way to avoid PHP's execution timeout and memory limit.

Has anyone done anything like this before? In particular, what options should I pass to cURL so that it downloads a remote file and saves it to a local folder of my choice?
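Something along these lines is what I was picturing, though I am not sure it is the right approach (the URL and paths below are just placeholders, and I use exec() rather than system() so I can check the exit code):

// Sketch: shell out to the curl binary so the transfer runs outside
// PHP's max_execution_time and memory_limit.
$url    = 'https://example.com/feeds/catalog.xml';  // placeholder remote file
$target = '/var/data/feeds/catalog.xml';            // placeholder local path

$cmd = sprintf(
    'curl --fail --silent --show-error -o %s %s',
    escapeshellarg($target),
    escapeshellarg($url)
);
exec($cmd, $output, $exitCode);   // -o writes the response body to $target

if ($exitCode !== 0) {
    error_log("curl exited with code $exitCode");
}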

1 answer

You do not necessarily need cURL for this. You can stream the remote file straight to disk with PHP's file functions, reading and writing it in small chunks so the whole document never has to fit in memory:

function download($src, $dst) {
    // Open the remote source and the local destination as binary streams
    $f = fopen($src, 'rb');
    $o = fopen($dst, 'wb');
    if ($f === false || $o === false) {
        return 1;
    }
    // Copy in small (2 KB) chunks so memory usage stays constant
    while (!feof($f)) {
        if (fwrite($o, fread($f, 2048)) === false) {
            fclose($f);
            fclose($o);
            return 1;
        }
    }
    fclose($f);
    fclose($o);
    return 0;
}

download($url, $target);
if (file_exists($target)) {
    # do your stuff
}
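
Once the file is on disk, you can keep memory flat while parsing it by using a streaming reader. A minimal sketch with PHP's built-in XMLReader (the element name 'item' is a placeholder for whatever your documents actually contain):

$reader = new XMLReader();
if ($reader->open($target)) {        // $target is the file written by download()
    while ($reader->read()) {
        // Handle each <item> element as it streams past, without
        // loading the whole 50-75 MB document into memory.
        if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
            $xml = $reader->readOuterXml();   // raw XML of just this element
            // ... process $xml ...
        }
    }
    $reader->close();
}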

Source: https://habr.com/ru/post/1733654/
