Opening thousands of cURL handles without problems? (PHP)

I need to use cURL in PHP to make thousands of API requests. I am currently planning to do this in parallel with the curl_multi_*() functions, essentially executing all of the requests simultaneously.

I have heard that you can run into memory problems by opening too many handles at once, which can lead to fatal errors. How can I avoid this while still making my cURL requests as fast as possible?

If I need to limit the number of cURL requests made at a time, what is a good number to set the limit to?

Background: I am currently on shared hosting with GoDaddy, which handles cURL requests fine, although I have not tested it with thousands of concurrent requests. In the future I will be on a Rackspace Cloud Site, which should handle a modest load.

This huge batch of cURL requests happens once a year; it is not part of the site's daily operations.

+3
4 answers

This sounds like an architectural problem. Why do you need to make thousands of requests at once? Is that kind of parallelism actually going to do any good, or are you just going to accidentally DOS (denial-of-service) some poor unsuspecting web service/API?

Even assuming you are not hammering a single server, you still need to worry about how many connections your own machine can handle. There is a limit to how many outgoing sockets you can have open at once, and it is easy to hit if you open connections recklessly. Anyone who has cranked the concurrency too high in apachebench has seen what happens.

PHP is not a great tool for this kind of work, and I say that as someone who does 90% of his work in PHP. There is no threading, and it is memory-hungry. If you want 1000 PHP processes running in parallel, you will need more than one machine: a typical PHP process eats 10-20 MB of memory unless you tune it aggressively (probably at compile time).

You say this happens once a year, which makes me doubt you need that much parallelism at all. What if you ran only 24 or 36 processes in parallel? Would the job still finish in time?

That said, here is how I would probably approach it. PHP will most likely work fine, and if you run into its memory or efficiency problems you can swap out individual pieces later. You want a few roughly independent parts (a minimal worker sketch follows the list):

  • A "job queue" - probably a database table of the HTTP requests to be made, with a status flag on each row (see below).

  • A "worker" script that claims a job off the queue, makes the HTTP request, and stores the result.

  • A way to launch several workers at once (cron, a shell loop, a process manager). The parallelism comes from running many workers, not from one script juggling everything.

  • Something to catch "stuck" jobs - workers crash and APIs time out, so mark jobs as in-progress and re-queue any that stall. Tune the worker count until you find the level of parallelism your environment can sustain.

+5

Take a look at Rolling Curl (https://github.com/joshfraser/rolling-curl). It drives a pool of requests through curl_multi with a cap on the number of simultaneous connections, and hands each response to a callback as soon as it completes instead of waiting for the whole batch.
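A rough usage sketch based on the project's readme; the callback name, the 20-connection window, and the $urls array are placeholders:

    <?php
    require 'RollingCurl.php';

    $urls = array(/* ... your thousands of URLs ... */);

    // called for each request as soon as it completes, while the
    // rest of the window is still in flight
    function request_callback($response, $info, $request) {
        if ($info['http_code'] == 200) {
            // process $response here, then let it go out of scope
        }
    }

    $rc = new RollingCurl('request_callback');
    $rc->window_size = 20;   // never more than 20 handles open at once

    foreach ($urls as $url) {
        $rc->get($url);
    }
    $rc->execute();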

+1

Building on timdev's answer, take a look at Zebra cURL (https://github.com/stefangabos/Zebra_cURL). You pass it an array of URLs and it fetches them asynchronously in batches (10 at a time by default), invoking a callback for each completed request. From the GitHub readme:

    <?php
        function callback($result) {
            // remember, the "body" property of $result is run through
            // "htmlentities()", so you may need to "html_entity_decode" it
            // show everything
            print_r('<pre>');
            print_r($result->info);
        }
        require 'path/to/Zebra_cURL.php';
        // instantiate the Zebra_cURL class
        $curl = new Zebra_cURL();
        // cache results for 60 seconds
        $curl->cache('cache', 60);
        // get RSS feeds of some popular tech websites
        $curl->get(array(
            'http://rss1.smashingmagazine.com/feed/',
            'http://allthingsd.com/feed/',
            'http://feeds.feedburner.com/nettuts',
            'http://www.webmonkey.com/feed/',
            'http://feeds.feedburner.com/alistapart/main',
        ), 'callback');
    ?>
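The batch size is controlled by the class's threads property (10 by default), so capping concurrency - the "what limit" part of the question - is one line (30 here is just an example):

    $curl->threads = 30; // run at most 30 requests at a time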

+1

Actually, there is not enough information here. How much bandwidth will each connection use? Unless each transfer is a couple of bytes, opening that many sockets at once will saturate most connections. Even if your account were unlimited, your idea of 1000 sockets would become the bottleneck and make the whole thing pointless. Why not open 100 sockets and loop, starting a new request as soon as one completes? That is very fast.
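Here is a minimal sketch of that rolling window using plain curl_multi_*() calls; the 100-connection cap and the $urls array are placeholders:

    <?php
    // keep ~100 transfers in flight; start a new one whenever one finishes
    $urls = array(/* ... thousands of URLs ... */);
    $windowSize = 100;

    $mh = curl_multi_init();
    $open = 0;

    // start one transfer and add it to the multi handle
    function add_url($mh, $url, &$open) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $open++;
    }

    // prime the window
    while ($open < $windowSize && $urls) {
        add_url($mh, array_shift($urls), $open);
    }

    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);                 // wait for activity, don't spin

        // reap finished transfers and immediately refill the window
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            $body = curl_multi_getcontent($ch); // handle the response here
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
            $open--;
            if ($urls) {
                add_url($mh, array_shift($urls), $open);
            }
        }
    } while ($open > 0);

    curl_multi_close($mh);

Memory stays flat because at most 100 handles exist at any moment, no matter how many thousands of URLs go through.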

0
