Download multiple images from a remote server using PHP (many images)

I am trying to download many files from an external server (about 3700 images). The images range from 30 KB to 200 KB each.

When I use the copy() function on a single image, it works. When I use it in a loop, all I get is 30 B files (empty images).

I tried using copy, cURL, wget and file_get_contents. Every time I get a lot of empty files or nothing at all.

Here is the code I tried:

Wget:

    // Quote the URL: otherwise the shell treats "&" as a command separator and truncates the request
    exec('wget "http://mediaserver.centris.ca/media.ashx?id=ADD4B9DD110633DDDB2C5A2D10&t=pi&f=I" -O SIA/8605283.jpg');

Copy:

    if (copy($donnees['PhotoURL'], $filetocheck)) {
        echo 'Photo '.$filetocheck.' updated<br/>';
    }

Curl:

    $ch = curl_init();
    $source = $data['PhotoURL']; // the array key must be quoted
    curl_setopt($ch, CURLOPT_URL, $source);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $contents = curl_exec($ch); // use a separate variable so $data is not overwritten
    curl_close($ch);

    $destination = $newfile;
    $file = fopen($destination, "w+");
    fputs($file, $contents);
    fclose($file);
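
For reference, here is the same cURL download with basic error checks added (the checks with curl_error() and the HTTP status code are additions for debugging, not part of my original attempt; they might show why the files come back empty):

    // Debugging variant (assumes $data['PhotoURL'] and $newfile as above)
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $data['PhotoURL']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); // follow redirects, if the server sends any
    $contents = curl_exec($ch);
    $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);

    if ($contents === false || $status != 200) {
        echo 'Download failed ('.$status.'): '.curl_error($ch).'<br/>';
    } else {
        file_put_contents($newfile, $contents);
    }
    curl_close($ch);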

Nothing seems to work. Unfortunately, I don't have much choice but to download all of these files at once, and I need a way to get it working as soon as possible.

Thanks a lot Antoine

2 answers

I used this function for this and it worked very well.

    function saveImage($urlImage, $title) {
        $fullpath = '../destination/'.$title;

        // Fetch the raw image data
        $ch = curl_init($urlImage);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
        $rawdata = curl_exec($ch);
        curl_close($ch);

        // Replace any existing file, then write the new one
        if (file_exists($fullpath)) {
            unlink($fullpath);
        }
        $fp = fopen($fullpath, 'x');
        $r = fwrite($fp, $rawdata);
        setMemoryLimit($fullpath);
        fclose($fp);

        return $r;
    }

In combination with this other function, to prevent memory overflow:

    function setMemoryLimit($filename) {
        set_time_limit(50);
        $maxMemoryUsage = 258; // cap, in MB

        // Estimate how much memory decoding this image needs
        // (width * height * 4 bytes per pixel * 1.5 overhead), in MB
        $width = 0;
        $height = 0;
        $size = (int) ini_get('memory_limit');
        list($width, $height) = getimagesize($filename);
        $size = $size + floor(($width * $height * 4 * 1.5 + 1048576) / 1048576);

        if ($size > $maxMemoryUsage) {
            $size = $maxMemoryUsage;
        }
        ini_set('memory_limit', $size.'M');
    }
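
A minimal sketch of how saveImage() might be driven in a loop (the $rows variable, its 'PhotoURL' and 'id' fields, and the file naming are assumptions for illustration, not part of the original code):

    // Hypothetical driver loop: assumes $rows is an array of records,
    // each with 'PhotoURL' and 'id' fields (names are assumptions)
    foreach ($rows as $row) {
        $written = saveImage($row['PhotoURL'], $row['id'].'.jpg');
        if ($written === false || $written === 0) {
            echo 'Failed to save photo for listing '.$row['id'].'<br/>';
        }
    }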

Getting them one at a time can be quite slow. Consider splitting them into batches of 20-50 images and fetching them in parallel. Here is some code to get you started:

    // $targets is the array of URLs for this batch, $tc its length
    $chs = array();
    $cmh = curl_multi_init();
    for ($t = 0; $t < $tc; $t++) {
        $chs[$t] = curl_init();
        curl_setopt($chs[$t], CURLOPT_URL, $targets[$t]);
        curl_setopt($chs[$t], CURLOPT_RETURNTRANSFER, 1);
        curl_multi_add_handle($cmh, $chs[$t]);
    }

    // Run all transfers until none are still active
    $running = null;
    do {
        curl_multi_exec($cmh, $running);
    } while ($running > 0);

    // Collect the results and clean up the handles
    for ($t = 0; $t < $tc; $t++) {
        $path_to_file = 'your logic for file path';
        file_put_contents($path_to_file, curl_multi_getcontent($chs[$t]));
        curl_multi_remove_handle($cmh, $chs[$t]);
        curl_close($chs[$t]);
    }
    curl_multi_close($cmh);

I recently used this approach to grab several million images, since fetching them one by one would have taken up to a month.

The number of pictures you fetch at the same time should depend on their expected size and on your memory limit.
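
A minimal sketch of the batching idea, wrapping the snippet above in array_chunk() (the batch size of 30, the SIA/ output directory and the counter-based file names are assumptions, not part of the original code):

    // Assumes $urls holds the full list of image URLs to download
    $i = 0;
    foreach (array_chunk($urls, 30) as $targets) { // batches of 20-50 images
        $chs = array();
        $cmh = curl_multi_init();
        foreach ($targets as $t => $url) {
            $chs[$t] = curl_init();
            curl_setopt($chs[$t], CURLOPT_URL, $url);
            curl_setopt($chs[$t], CURLOPT_RETURNTRANSFER, 1);
            curl_multi_add_handle($cmh, $chs[$t]);
        }

        $running = null;
        do {
            curl_multi_exec($cmh, $running);
        } while ($running > 0);

        foreach ($chs as $t => $ch) {
            file_put_contents('SIA/'.(++$i).'.jpg', curl_multi_getcontent($ch));
            curl_multi_remove_handle($cmh, $ch);
            curl_close($ch);
        }
        curl_multi_close($cmh);
    }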

