How to quickly copy 2000 images on a server using PHP?

I have a PHP script (a feature on my site) that lets the user import data from his account on another site.
I also need to copy a LOT of images from there every time he imports data.
For instance: 500 photos, 300-500 KB each, at a minimum. I expect this number to easily reach 2,000 images per user.

Steps for each image:

  • Get the image URL
  • Create an image from the URL (using imagecreatefromjpeg and friends)
  • Save it on my server (using functions like imagejpeg, imagepng, etc.)

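For reference, a minimal sketch of roughly what that per-image flow looks like (function and path names are mine, and it assumes allow_url_fopen is enabled so GD can read from a URL). The decode/re-encode in the middle is a big part of the cost:

    // Fetch, decode, re-encode and save one image (illustrative sketch).
    function import_image($url, $dest_path) {
        $img = imagecreatefromjpeg($url);      // fetch + decode from the URL
        if ($img === false) {
            return false;                      // download or decode failed
        }
        $ok = imagejpeg($img, $dest_path);     // re-encode and save locally
        imagedestroy($img);                    // free GD memory
        return $ok;
    }
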
It already takes VERY long to execute this code (more than 8 minutes).
I understand it is a lot of data, but is there a faster way to do this?
Perhaps it is possible to start the copying in the background, or to copy many photos at once?
I just want to know whether there is some technology designed specifically for this that I simply do not know about.
Or is there no way around outsourcing the image hosting to some other server and keeping only thumbnails myself?

Thanks.

+4
4 answers

There is not much information here. What OS are you using? How "remote" is the source site? What format are the images already in?

If the other site is remote (i.e. hosted by another company), the main bottleneck you will hit is the speed at which the source server can transfer data to your machine.

One big question, though, is: what format are the images in right now? If they are already JPEG, then fetching them and re-encoding them as JPEG will degrade the quality again (albeit slightly). It would be best to simply copy the image file directly, byte for byte; that also eliminates the time your PHP application spends re-encoding JPEGs. Ask yourself: do you really need to convert the images at all?
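A direct copy is a one-liner in PHP (a sketch, assuming allow_url_fopen is enabled; $url and $dest_path are illustrative names):

    // Copy the raw bytes; no decoding or re-encoding involved.
    if (!copy($url, $dest_path)) {
        // handle the failed transfer (log, retry, skip, ...)
    }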

Depending on which OS commands are available to you, it may be better to shell out to external applications that handle the transfer (for example, wget on Linux). I have used wget to pull files from a remote server to a local one, and it is not hard to drive from a script.
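A sketch of shelling out to wget from PHP (assumes wget is installed and on the PATH; variable names are illustrative):

    // -q suppresses wget's output, -O sets the destination file.
    $cmd = 'wget -q -O ' . escapeshellarg($dest_path) . ' ' . escapeshellarg($url);
    exec($cmd, $output, $status);
    if ($status !== 0) {
        // non-zero exit code: the transfer failed
    }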

Remember: the more steps there are in the transfer, the longer it will take. At the moment you have:

  • fetching
  • converting
  • writing

all handled by PHP (probably from slowest to fastest)

Does the source site offer an archive or bulk-export facility for its clients? If so, could that be used for a mass file transfer?
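If it did offer, say, a ZIP export, one download plus one extraction would replace thousands of individual requests. A sketch using PHP's ZipArchive (the paths are illustrative):

    // Extract a previously downloaded archive of images in one go.
    $zip = new ZipArchive();
    if ($zip->open('/tmp/export.zip') === TRUE) {
        $zip->extractTo('/var/www/images/');   // unpack all files at once
        $zip->close();
    }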

The more work you take out of PHP, the faster the process will be. Calling system tools (e.g. wget, ftp, ssh, imagemagick, etc.) moves the heavy lifting outside of PHP and Apache.

+3

Threading seems like the obvious answer ...

https://github.com/krakjoe/pthreads

I am one voice in a sea of what seems like hundreds of people saying that PHP has no way to spawn threads... I think the current trend of turning to curl to do your threading for you is just plain bad, and touting it as a solution is even worse; the overhead of that approach must be terrible...

PHP has always had the machinery for multithreading; it could not run on multithreaded web servers otherwise. It is just not part of the language design, and until now no external effort had been made to bring userland threads to PHP (a search on Google Code turns up next to nothing)... I needed PHP to have threads for an upcoming project, so I made it happen... and that, along with everything else, is my gift to the world/web. Enjoy :)
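A rough sketch of what that looks like with pthreads (this assumes the extension linked above is installed on a thread-safe (ZTS) CLI build of PHP; in practice you would start the threads in batches rather than 2,000 at once, and the names are illustrative):

    // One thread per download; each thread copies the raw bytes.
    class ImageDownload extends Thread {
        public function __construct($url, $dest) {
            $this->url  = $url;
            $this->dest = $dest;
        }
        public function run() {
            copy($this->url, $this->dest);     // runs in its own thread
        }
    }

    $threads = array();
    foreach ($urls as $i => $url) {
        $threads[$i] = new ImageDownload($url, "/tmp/img_$i.jpg");
        $threads[$i]->start();                 // begin the download
    }
    foreach ($threads as $t) {
        $t->join();                            // wait for all to finish
    }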

+1
  • Set up some kind of queue for the image-import jobs; that way the user does not have to wait, and the work runs asynchronously in the background.

  • Try parallel requests with curl_multi_init() (see the sketch below).

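A minimal curl_multi sketch along those lines (batch size, URL list, and destination directory are illustrative, and the filenames are derived naively from the URLs):

    // Download a list of image URLs in parallel batches.
    function download_parallel(array $urls, $dest_dir, $batch_size = 10) {
        foreach (array_chunk($urls, $batch_size) as $batch) {
            $mh = curl_multi_init();
            $handles = array();
            foreach ($batch as $url) {
                $fp = fopen($dest_dir . '/' . basename($url), 'wb');
                $ch = curl_init($url);
                curl_setopt($ch, CURLOPT_FILE, $fp);    // stream body to file
                curl_setopt($ch, CURLOPT_HEADER, 0);
                curl_multi_add_handle($mh, $ch);
                $handles[] = array($ch, $fp);
            }
            do {                                        // run the whole batch
                curl_multi_exec($mh, $running);
                curl_multi_select($mh);
            } while ($running > 0);
            foreach ($handles as $h) {
                curl_multi_remove_handle($mh, $h[0]);
                curl_close($h[0]);
                fclose($h[1]);
            }
            curl_multi_close($mh);
        }
    }
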
0

Here is a PHP function you can use to download a file from the Internet (the $url parameter) to a local file on your server (the $file_path parameter):

    function download_file($url, $file_path) {
        $out = fopen($file_path, 'wb');
        if ($out === FALSE) {
            print "File not opened<br>";
            exit;
        }
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_FILE, $out);   // write the response body straight to the file
        curl_setopt($ch, CURLOPT_HEADER, 0);    // do not include HTTP headers in the output
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_exec($ch);
        //echo "<br>Error is: " . curl_error($ch);
        curl_close($ch);
        fclose($out);
    }

You can call it like this:

    $url = 'http://upload.wikimedia.org/wikipedia/commons/thumb/1/1f/Iss030e015472_Edit.jpg/352px-Iss030e015472_Edit.jpg';
    download_file($url, '/var/www/www.mysite.com/public_html/images/image_user1.jpg');

Make sure the folder you save the file into is writable by your Apache user, and that the cURL PHP extension is installed.

This function should be much faster than your imagecreatefromjpeg approach. Try it, and if it is still too slow, you can improve it further by implementing a queue and running multiple requests in parallel with curl_multi_init, as Gabriel suggested.

0

Source: https://habr.com/ru/post/1433787/
