PHP: display image from web service

I am using an external web service that returns the URL of an image to be displayed on my website, for example:

    $url = get_from_web_service();
    echo '<img src="'.$url.'" />';

Everything works fine, except when I have 100 images to show: one web service call per image becomes too time- and resource-consuming.

    // the problem
    foreach ($items as $item) {
        $url = get_from_web_service($item);
        echo '<img src="'.$url.'" />';
    }

So now I am considering two options:

    // Option 1: proxy the image through PHP with file_get_contents():
    foreach ($items as $item) {
        echo '<img src="url_to_my_website/get_image.php?id='.$item->id.'" />';
    }

    // get_image.php:
    $url = get_from_web_service($id);
    header("Content-Type: image/png");
    echo file_get_contents($url);

    // Option 2: AJAX:
    echo '<img src="dummy_image_or_website_logo" data-id="123" />';
    // An AJAX call asks the web service for the URL of id=123,
    // then sets it as the src attribute of that image.

THOUGHTS

  • The first option seems more direct, but it puts my server in the middle of every image request, which may overload it.
  • With the second option, everything is done by the browser and the web service, so my server is not involved at all. However, each image now needs two calls: one AJAX call to get the image URL and a second request to fetch the image itself. Loading times may therefore suffer, and AJAX calls may fail when there are many of them.

Information

  • About 50 images will be displayed on this page.
  • This service will be used by approximately 100 users at a given time.
  • I have no control over the web service, so I cannot change its functionality, and it does not accept more than one image identifier per call.

My questions

  • Is there a better option I should consider?
  • If not, which of the two options should I follow? And most importantly, why?

Thanks.

+4
9 answers

Method 1: rendering in PHP

Pros:

  • Allows you to set response headers independently of the server software. If you serve something that is not normally cached (for example, a PHP file with a query string), or you are adding this to a package that needs to control headers regardless of the server software, this is a very good idea.

  • If you know how to use GD or Imagick, you can easily resize, crop, compress, index, etc. your images to reduce the image file size (sometimes dramatically) and make page loads much faster (a rough sketch follows this list).

  • If the width and height are passed as variables to the PHP file, the sizes can be set dynamically:

     <div id="gallery-images">
         <noscript>
             <!-- So that the thumbnail stays small for old mobile devices -->
             <img src="get-image.php?id=123&h=200&w=200" />
         </noscript>
     </div>
     <script type="text/javascript">
         /* Something to create an image element inside of the div.
          * In theory, the browser height and width can be pulled dynamically
          * on page load, which is useful for ensuring that images are no larger
          * than they need to be. Having a function to load the full image
          * if the browser becomes bigger isn't a bad idea though.
          */
     </script>

    This would be incredibly useful for mobile users on an image gallery page. It is also very helpful for users with limited bandwidth (for example, almost everyone in Alaska; I say this from personal experience).

  • Allows you to easily strip EXIF data from images if they are uploaded by users of the website. This is important both for user privacy and for making sure none of your JPGs carry malicious scripts.

  • It allows you to dynamically create a large sprite image and dramatically reduce your HTTP requests if they cause latency. It would be a lot of work, so it is not a strong pro, but it is still something you can do with this method that you cannot do with the second one.
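
As a rough sketch of the header, resizing, and dynamic-size bullets above (not a definitive implementation): it assumes GD is installed, allow_url_fopen is enabled, and the web service serves PNGs; get_from_web_service() is the helper from the question, everything else is illustrative.

    <?php
    // get_image.php?id=123&w=200&h=200 -- resize server-side, set headers in PHP
    $id = isset($_GET['id']) ? (int)$_GET['id'] : 0;
    $w  = isset($_GET['w'])  ? max(1, min(2000, (int)$_GET['w']))  : 200;
    $h  = isset($_GET['h'])  ? max(1, min(2000, (int)$_GET['h']))  : 200;

    $url = get_from_web_service($id);        // helper from the question
    $src = @imagecreatefrompng($url);        // assumes PNGs and allow_url_fopen
    if (!$src) {
        header("HTTP/1.0 404 Not Found");
        exit;
    }

    // downscale so small screens never download full-size images
    $dst = imagecreatetruecolor($w, $h);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $w, $h, imagesx($src), imagesy($src));

    // headers are set in PHP, independently of the web server software
    header("Content-Type: image/png");
    header("Cache-Control: public, max-age=86400"); // let browsers cache for a day
    imagepng($dst);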

Cons:

  • Depending on the number and size of the images, this can put a heavy load on your server. With browser caching, dynamic images are pulled from the cache instead of being regenerated; even so, it is still very easy for a bot to hit a dynamic image many times over.

  • This requires knowledge of HTTP headers, basic image manipulation skills, and an understanding of how to use PHP's image processing libraries efficiently.

Method 2: AJAX

Pros:

  • The page will finish loading before any of the images do. This matters if your content absolutely must load as quickly as possible and the images are not very important.

  • This is much simpler and significantly faster to implement than any dynamic PHP solution (a minimal sketch follows the cons list).

  • It staggers the HTTP requests, so the initial content loads faster (since image requests can be sent in response to browser actions, not just on page load).

Cons:

  • This does not reduce the number of HTTP requests, it merely staggers them. Also note that, on top of all these images, there will be at least one additional external JS file.

  • Shows nothing if the target device (for example, an old mobile device) does not support JavaScript. The only way to fix this is to load the images normally between <noscript> tags, for which PHP has to generate twice as much HTML.

  • You will need to add a loading.gif (one more HTTP request) or a "Please wait while these images load" text to the page. I personally find this annoying as a website user, because I want to see everything when the page is "done loading".
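
To make Method 2 concrete, here is a minimal sketch; the get_image_url.php endpoint and its JSON shape are assumptions (it merely wraps the get_from_web_service() call from the question), and jQuery is assumed to be loaded:

    <?php
    // get_image_url.php (hypothetical endpoint): takes an id, returns the URL as JSON
    header('Content-Type: application/json');
    echo json_encode(array('url' => get_from_web_service((int)$_GET['id'])));

    <!-- page side: placeholders rendered by PHP, real sources swapped in after load -->
    <img src="logo.png" class="lazy-img" data-id="123" />
    <script type="text/javascript">
    $('img.lazy-img').each(function () {
        var img = $(this);
        $.getJSON('get_image_url.php', { id: img.data('id') }, function (data) {
            img.attr('src', data.url);
        });
    });
    </script>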

Conclusion:

If you have the background knowledge or the time to learn how to use Method 1 effectively, it gives you far more options, because it lets you manipulate both the images themselves and the HTTP requests your page sends after loading.

Conversely, if you are looking for a simple way to stagger your HTTP requests, or want your content to load faster by deferring the extra image downloads, Method 2 is your answer.

Looking back at Methods 1 and 2, it seems that using both together may be the best answer: load two of your cached and compressed images with the page (one visible, the other a buffer so the user does not have to wait every time they click "Next"), and let the rest load one by one as the user gets to them.
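
A sketch of that buffering idea, assuming the images are served through the get_image.php proxy from the question and that PHP renders the list of ids into the page (everything here is illustrative):

    <script type="text/javascript">
    var ids = [101, 102, 103];              // image ids, assumed rendered by PHP
    var current = 0;

    // keep exactly one decoded image ahead of the one being viewed
    function preloadNext() {
        if (current + 1 < ids.length) {
            var buffer = new Image();       // loads straight into the browser cache
            buffer.src = 'get_image.php?id=' + ids[current + 1];
        }
    }

    function showNext() {
        if (current + 1 >= ids.length) return;
        current++;
        document.getElementById('slide').src = 'get_image.php?id=' + ids[current];
        preloadNext();                      // refill the one-image buffer
    }

    preloadNext();                          // buffer the second image at page load
    </script>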

In your particular situation, I believe Method 2 will be the most effective if your images can be displayed as a slide show. If you need to load all the images at once, try compressing them and applying browser caching with Method 1. If the sheer number of image requests on page load is destroying your speed, try image spriting.

+9

You are currently contacting the web service 100 times. You should change this so that the web service is contacted only once and returns an array of all 100 images, instead of fetching each image separately.

Then you can loop over this array, which will be very fast, since no further web transactions are needed.
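
A sketch of what that would look like, assuming the provider exposed a hypothetical batch variant of the call (the question notes the current service accepts only one identifier per call, so this would need a change on the provider's side):

    $ids = array_map(function ($item) { return $item->id; }, $items);

    // hypothetical batch call: one round trip instead of 100
    $urls = get_from_web_service_batch($ids);   // assumed to return array(id => url)

    foreach ($urls as $id => $url) {
        echo '<img src="'.htmlspecialchars($url).'" />';
    }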

+3

If the images you retrieve from the web service are not dynamic in nature, i.e. they do not change often, I would suggest setting up a scheduled task / cron job on your server that fetches the images from the web service and stores them locally (on your server). The web page would then serve the images from your own server only, avoiding a round trip to the third-party server every time the page is delivered to an end user.
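
A sketch of such a scheduled task, assuming PHP CLI and that the item list is available server-side (get_all_items(), the images/ directory, and the PNG format are all illustrative assumptions):

    <?php
    // sync_images.php -- run from cron, e.g.: 0 * * * * php /path/to/sync_images.php
    $items = get_all_items();                   // hypothetical: all item records
    foreach ($items as $item) {
        $local = __DIR__ . '/images/' . $item->id . '.png';
        if (file_exists($local)) {
            continue;                           // already mirrored locally
        }
        $url  = get_from_web_service($item);    // helper from the question
        $data = file_get_contents($url);
        if ($data !== false) {
            file_put_contents($local, $data, LOCK_EX);
        }
    }

The page then simply emits <img src="/images/123.png" /> and never touches the web service at request time.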

+2

Neither of your two options can solve the problem; they may even make it worse.

For option 1:

The step that costs the most time is get_from_web_service($item), and that code still runs; it is merely executed by another script (the get_image.php file, on the same server).

For option 2:

This makes the browser trigger the get-image-resource request, but your server still has to execute get_from_web_service($item) for each one.

To state it clearly: the problem is the performance of get_from_web_service, and the most direct suggestion is to make that function faster. Short of that, we can reduce the number of parallel connections. I have not thought this through fully; I have only a few suggestions:

  • Load asynchronously. The user does not view your entire page at once; they only see the top of it. If some images are not displayed above the fold, you can use the jquery.lazyload plugin, so that image resources in the invisible area are not requested from the server until they become visible (a usage sketch follows this list).

  • CSS sprites. An image sprite is a collection of images combined into a single image. If the images on your page do not change frequently, you can write code that combines them, say, once a day.

  • Cache the images. You can cache the images on your own server or on another server (better), and keep a small key -> value mapping, where the key identifies the $item and the value is the resource location (URL).
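
A minimal sketch of the lazy-loading suggestion above, using the classic jquery.lazyload plugin (the data-original attribute is what that plugin reads; check the exact markup against the plugin version you actually use):

    <img class="lazy" data-original="get_image.php?id=123" width="200" height="200" />

    <script src="jquery.min.js"></script>
    <script src="jquery.lazyload.js"></script>
    <script type="text/javascript">
        // images below the fold are requested only once scrolled into view
        $(function () {
            $("img.lazy").lazyload();
        });
    </script>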

I am not a native English speaker; I hope I have made this clear and that it helps you.

+1

I am not an expert, but I think that every echo takes time, and fetching 100 images should not be a problem by itself.

Besides, maybe get_from_web_service($item) should be able to accept an array?

    $counter = 1;
    $urls = array();
    foreach ($items as $item) {
        $urls[$counter] = get_from_web_service($item);
        $counter++;
    }

    // and then you can echo the information
    foreach ($urls as $url) {
        // echo each, or use a function to do it better
        // echo '<img src="url_to_my_website/get_image.php?id='.$url->id.'" />';
    }

    // get_image.php:
    $url = get_from_web_service($id);
    header("Content-Type: image/png");
    echo file_get_contents($url);

In the end, it would be nice if you could just call:

    get_from_web_service($itemArray); // take in the array and return the images
0

Option 3: cache web service requests
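
For example, a minimal sketch of caching the id -> URL lookup with APCu (assuming the extension is installed; Memcached, a database, or plain files would work the same way, and the function name is illustrative):

    <?php
    // returns the image URL for $id, hitting the web service at most once per hour
    function get_url_cached($id)
    {
        $key = 'img_url_' . $id;
        $url = apcu_fetch($key, $hit);
        if (!$hit) {
            $url = get_from_web_service($id);   // helper from the question
            apcu_store($key, $url, 3600);       // cache for one hour
        }
        return $url;
    }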

0

Option one is the best option. I would also make sure that the images are cached on your server, so that multiple trips to the originating web server are not needed for the same image.

If you're interested, this is the core of the code I use to cache images and the like (note that several things are missing, such as sending the appropriate content headers back to the client, etc.):

    <?php
    function error404() {
        header("HTTP/1.0 404 Not Found");
        echo "Page not found.";
        exit;
    }

    // map a hash to a nested cache path, e.g. "abc..." -> cache/a/ab/abc/abc...
    function hexString($hash, $hashLevels = 3) {
        $hexString = substr($hash, 0, $hashLevels);
        $folder = "";
        while (strlen($hexString) > 0) {
            $folder = "$hexString/$folder";
            $hexString = substr($hexString, 0, -1);
        }
        if (!file_exists('cache/' . $folder))
            mkdir('cache/' . $folder, 0777, true);
        return 'cache/' . $folder . $hash;
    }

    function getFile($url) {
        // true to enable caching, false to delete the cache if already cached
        $cache = true;

        $defaults = array(
            CURLOPT_HEADER         => FALSE,
            CURLOPT_RETURNTRANSFER => 1,
            CURLOPT_FOLLOWLOCATION => 1,
            CURLOPT_MAXCONNECTS    => 15,
            CURLOPT_CONNECTTIMEOUT => 30,
            CURLOPT_TIMEOUT        => 360,
            CURLOPT_USERAGENT      => 'Image Download'
        );
        $ch = curl_init();
        curl_setopt_array($ch, $defaults);
        curl_setopt($ch, CURLOPT_URL, $url);

        $key = hexString(sha1($url));

        if ($cache && file_exists($key)) {
            return file_get_contents($key);     // cache hit: no remote fetch
        } elseif (!$cache && file_exists($key)) {
            unlink($key);
        }

        $data = curl_exec($ch);
        $info = curl_getinfo($ch);

        if ($cache === true && $info['http_code'] == 200 && strlen($data) > 20)
            file_put_contents($key, $data);
        elseif ($info['http_code'] != 200)
            error404();

        return $data;
    }

    if (!isset($_GET['img']))
        error404();

    $content = getFile($_GET['img']);
    if ($content !== null && $content !== false) {
        // Success!
        header("Content-Type: image/png");
        echo $content;
    }
0

Neither of the two options will solve the problem of server resource usage. Of the two, however, I would recommend option 1. The second method delays page loading, which slows the website down and lowers your SEO rating.

The best option for you is something like:

    foreach ($items as $item) {
        echo '<img src="url_to_my_website/get_image.php?id='.$item->id.'" />';
    }

Then, in get_image.php, where the magic happens:

    $file = '/path_to_local_storage/image_'.$id.'.png';

    if (file_exists($file)) {
        $img = file_get_contents($file);
    } else {
        $url = get_from_web_service($id);
        $img = file_get_contents($url);
        file_put_contents($file, $img);
    }

    header("Content-Type: image/png");
    echo $img;

That way, you make the web service request only once per image and then save the result to your local storage. The next time the image is requested, you serve it from local storage and skip the web service request.

This assumes, of course, that the image identifiers are unique and persistent.

This is probably not the best solution, but it should work well for you.

0

Since, as we can see above, you put the URL returned by the image web service straight into the <img> tag's src attribute, we can safely assume that these URLs are not secret or confidential.

Knowing that, the following snippet in get_image.php will work at the least possible cost:

    $url = get_from_web_service($id);
    header("Location: $url");

If you receive many subsequent requests for the same id from a given client, you can reduce the number of requests somewhat by leaning on the browser's internal cache:

 header("Cache-Control: private, max-age=$seconds"); header("Expires: ".gmdate('r', time()+$seconds)); 

Otherwise, resort to server-side caching using Memcached, a database, or plain files:

    is_dir('cache') or mkdir('cache');
    $cachedDataFile = "cache/$id";
    $cacheExpiryDelay = 3600; // an hour

    if (is_file($cachedDataFile) && filesize($cachedDataFile)
        && filemtime($cachedDataFile) + $cacheExpiryDelay > time()
    ) {
        $url = file_get_contents($cachedDataFile);
    } else {
        $url = get_from_web_service($id);
        file_put_contents($cachedDataFile, $url, LOCK_EX);
    }

    header("Cache-Control: private, max-age=$cacheExpiryDelay");
    header("Expires: ".gmdate('r', time() + $cacheExpiryDelay));
    header("Location: $url");
0
