Asynchronous function call in PHP

I am working on a PHP web application, and I need to perform some network operations during the request, for example, fetching data from a remote server based on the user's request.

Is it possible to simulate asynchronous behavior in PHP, given that I need to pass some data to the function and also get output back from it?

My code looks like this:

    <?php
    $data1 = processGETandPOST();
    $data2 = processGETandPOST();
    $data3 = processGETandPOST();

    $response1 = makeNetworkCall($data1);
    $response2 = makeNetworkCall($data2);
    $response3 = makeNetworkCall($data3);

    processNetworkResponse($response1);
    processNetworkResponse($response2);
    processNetworkResponse($response3);

    /* HTML and other UI stuff here */

    exit;
    ?>

Each network operation takes about 5 seconds to complete, adding a total of 15 seconds to the response time of my application if I make 3 requests.

The makeNetworkCall() function just does an HTTP POST request.

The remote server is a third-party API, so I cannot control it.

PS: Please do not suggest AJAX or similar. I am currently looking into whether I can do this through PHP itself, maybe with a C++ extension or something like that.

+70
asynchronous php network-programming
Jan 09 '13 at 13:27
7 answers

I believe the answer lies here:

Asynchronous PHP calls?

Quick note: use threading
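
To sketch what that looks like in practice, here is a minimal example using the pthreads extension (note: it requires a thread-safe PHP build and is no longer maintained for modern PHP versions). The endpoint URL and payloads are invented for the example, not taken from the question:

    <?php
    // Minimal pthreads sketch: run the three blocking POSTs in parallel.
    class NetworkCall extends Thread {
        public $postBody;
        public $response;

        public function __construct($postBody) {
            // Pass a pre-encoded string; pthreads coerces array members,
            // so plain strings are the safer choice here
            $this->postBody = $postBody;
        }

        public function run() {
            $context = stream_context_create(array('http' => array(
                'method'  => 'POST',
                'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
                'content' => $this->postBody,
            )));
            // The ~5 s blocking call now happens off the main thread
            $this->response = file_get_contents('https://api.example.com/endpoint', false, $context);
        }
    }

    $threads = array();
    foreach (array(array('q' => 1), array('q' => 2), array('q' => 3)) as $data) {
        $t = new NetworkCall(http_build_query($data));
        $t->start();              // run() begins in its own thread
        $threads[] = $t;
    }
    foreach ($threads as $t) {
        $t->join();               // total wait ~ slowest call, not the sum
        var_dump($t->response);
    }
    ?>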

+25
May 20 '13 at 21:07

These days it is better to use queues rather than threads (for those who do not use Laravel, there are many other implementations like this one).

The basic idea is that your original PHP script pushes tasks or jobs onto a queue. Then you have queue workers running elsewhere, taking jobs off the queue and processing them independently of the original PHP script (a minimal sketch follows the list of benefits below).

Benefits:

  1. Scalability - you can simply add worker nodes to keep up with demand. This way, tasks are performed in parallel.
  2. Reliability - modern queue managers, such as RabbitMQ, ZeroMQ, Redis, etc., are designed to provide maximum reliability.
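
To make the idea concrete, here is a bare-bones producer/worker pair using the phpredis extension as the queue backend; the queue name jobs and the payload shape are made up for the example, and the Laravel link above covers a full-featured equivalent:

    <?php
    // producer.php - runs inside the web request and returns immediately
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    $job = json_encode(array('type' => 'network_call', 'data' => array('q' => 1)));
    $redis->lPush('jobs', $job); // enqueue; no 5-second wait in the request
    ?>

    <?php
    // worker.php - runs elsewhere (CLI, another machine); add more copies to scale
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    while (true) {
        // brPop blocks for up to 5 s waiting for a job, keeping the loop cheap
        $item = $redis->brPop(array('jobs'), 5);
        if ($item) { // $item = array(queueName, payload)
            $job = json_decode($item[1], true);
            // ... do the slow makeNetworkCall() work here ...
        }
    }
    ?>
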
+10
Jun 03 '17 at 4:23

I have no direct answer, but you can look at these things:

+5
May 05 '14 at 8:05

cURL is going to be your only real choice here (that, or using non-blocking sockets and some custom logic).

This link should send you in the right direction. PHP has no built-in asynchronous processing, but if you are trying to make multiple simultaneous web requests, cURL multi will take care of that for you.
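
For reference, a minimal cURL multi sketch shaped like the question's three POST calls; the endpoint URL and payloads are placeholders for whatever makeNetworkCall() actually posts:

    <?php
    $url      = 'https://api.example.com/endpoint'; // placeholder
    $payloads = array(array('q' => 1), array('q' => 2), array('q' => 3));

    $mh      = curl_multi_init();
    $handles = array();
    foreach ($payloads as $i => $payload) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($payload));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all three transfers at once: wall time is roughly the slowest
    // request (~5 s) instead of the sum (~15 s)
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // sleep until any socket has activity
    } while ($running > 0);

    $responses = array();
    foreach ($handles as $i => $ch) {
        $responses[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    ?>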

+2
Jan 09 '13 at 13:34

I think that if the HTML and other UI elements need the returned data, there is not going to be a way to make it asynchronous.

I believe the only ways to do this in PHP are to log the request in a database and have a cron job check it every minute, to use something like Gearman for queue processing, or maybe to exec() a command-line process (a rough sketch of that last option follows below).

In the meantime, the PHP page would have to generate some HTML or JS that makes it reload every few seconds to check on progress, which is not ideal.

To get around the problem: how many different requests do you expect? Could you fetch them automatically every hour or so and save them to a database?
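
To make the exec() route concrete, here is a rough sketch; worker.php and the /tmp result file are invented names, and the detach syntax is Unix-only:

    <?php
    // Kick off a background worker and return to the user immediately
    $jobId   = uniqid('job_', true);
    $payload = escapeshellarg(json_encode(array('user' => 42)));
    // "> /dev/null 2>&1 &" detaches the child so exec() does not wait for it
    exec('php worker.php ' . $payload . ' ' . escapeshellarg($jobId) . ' > /dev/null 2>&1 &');

    // Later, on the page the browser reloads every few seconds:
    $resultFile = '/tmp/' . $jobId . '.json';
    if (file_exists($resultFile)) {
        $result = json_decode(file_get_contents($resultFile), true);
        // render $result into the page
    } else {
        // still processing - emit the reload/progress markup mentioned above
    }
    ?>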

+1
Jan 09 '13 at 14:13

There is also the http extension v2, which is a cURL wrapper. It can be installed via PECL.

http://devel-m6w6.rhcloud.com/mdref/http/
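
A rough sketch of parallel requests with the http v2 extension's http\Client, going by its documented enqueue()/send() API (double-check the reference above before relying on the details):

    <?php
    $client = new http\Client;
    foreach (array('http://example.com/a', 'http://example.com/b') as $url) {
        $request = new http\Client\Request('GET', $url);
        $client->enqueue($request, function (http\Client\Response $response) {
            // Invoked as each transfer finishes, while the others still run
            echo $response->getTransferInfo('effective_url'), ' -> ',
                 $response->getResponseCode(), PHP_EOL;
            return true; // true dequeues the finished request
        });
    }
    $client->send(); // drives all enqueued requests concurrently
    ?>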

0
Jun 16 '14 at 22:38

I think some cURL solution code is needed here, so I will share mine (it was put together from several sources, such as the PHP Manual and its comments).

It makes several parallel HTTP requests (to the URLs in $aURLs) and prints each response as soon as it is ready (saving them in $done for other possible uses).

The code is longer than necessary because of the real-time printing part and the abundance of comments, but feel free to edit the answer to improve it:

    <?php
    /* Strategies to avoid output buffering; skip this block if you don't
       want to print the responses before every cURL request has completed */
    ini_set('output_buffering', 'off');        // Turn off output buffering
    ini_set('zlib.output_compression', false); // Turn off PHP output compression
    while (@ob_end_flush());                   // Flush (send) and close every output buffer
    if (function_exists('apache_setenv'))
        apache_setenv('no-gzip', true);        // Prevent Apache from buffering for deflate/gzip
    header('Content-type: text/plain');        // Remove to use HTML
    ini_set('implicit_flush', true);           // Implicitly flush the buffer(s)
    ob_implicit_flush(true);
    header('Cache-Control: no-cache');         // Recommended to prevent caching of event data
    $string = str_repeat(' ', 1000);
    output($string); // Safari and Internet Explorer have an internal 1K buffer

    // Here starts the program
    function output($string) {
        ob_start();
        echo $string;
        if (ob_get_level() > 0) ob_flush();
        ob_end_clean(); // Clears the buffer and closes buffering
        flush();
    }

    function multiprint($aCurlHandles, $print = true) {
        global $done;
        // Iterate through the handles and collect the content
        foreach ($aCurlHandles as $url => $ch) {
            if (!isset($done[$url])) { // Only check still-pending responses
                $html = curl_multi_getcontent($ch); // Get the content
                if ($html) {
                    $done[$url] = $html;
                    if ($print) output($html . PHP_EOL);
                }
            }
        }
    }

    function full_curl_multi_exec($mh, &$still_running) {
        do {
            $rv = curl_multi_exec($mh, $still_running); // Execute the handles
        } while ($rv == CURLM_CALL_MULTI_PERFORM); // CURLM_CALL_MULTI_PERFORM means curl_multi_exec() should be called again because there is still data to process
        return $rv;
    }

    set_time_limit(60); // Max execution time: 1 minute
    $aURLs = array("http://domain/script1.php", "http://domain/script2.php"); // Array of URLs
    $done = array(); // Responses of each URL

    // Initialization
    $aCurlHandles = array(); // Array for the individual curl handles
    $mh = curl_multi_init(); // Init curl multi; returns a new cURL multi handle
    foreach ($aURLs as $id => $url) { // Add a handle for each URL
        $ch = curl_init(); // Init curl, then set up the options
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // Return the result - very important
        curl_setopt($ch, CURLOPT_HEADER, 0); // No headers in the output
        $aCurlHandles[$url] = $ch;
        curl_multi_add_handle($mh, $ch);
    }

    // Process
    $active = null; // Number of individual handles currently being worked on
    $mrc = full_curl_multi_exec($mh, $active);
    // As long as there are active connections and everything looks OK...
    while ($active && $mrc == CURLM_OK) { // CURLM_OK means more data is available, but it hasn't arrived yet
        // Wait for activity on any curl connection (1 second timeout); note the
        // extra parentheses: assign first, then compare against -1
        if (($descriptions = curl_multi_select($mh, 1)) != -1) {
            usleep(500); // Adjust this wait to your needs
            // Process the data for as long as the system tells us to keep getting it
            $mrc = full_curl_multi_exec($mh, $active);
            //output("Still active processes: $active" . PHP_EOL);
            multiprint($aCurlHandles); // Print each response as soon as it is ready
        }
    }

    // To print all the responses at the end instead:
    //multiprint($aCurlHandles, false);

    // Finalize
    foreach ($aCurlHandles as $url => $ch) {
        curl_multi_remove_handle($mh, $ch); // Remove the handle (assuming you are done with it)
    }
    curl_multi_close($mh); // Close the curl multi handle
    ?>
0
Jun 29 '19 at 5:39


