Is cURL expensive compared to file_get_contents()?

I'm building a video link aggregator, and I have a script that checks whether a video has been removed from its host site. It does this by fetching the HTML of the link and checking it against target keywords.

I am currently using file_get_contents() to fetch the link's HTML. The problem is that some sites redirect to a different URL when the video has been removed.

Using cURL solves the problem, but will it use more server resources? The checker script runs every 10 minutes and checks 1000 links per run (there are 300,000 links in the database).

The code I want to use is as follows:

$Curl_Session = curl_init('http://www.domain.com');
curl_setopt($Curl_Session, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($Curl_Session, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($Curl_Session);
curl_close($Curl_Session);
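For the keyword check itself, here is a minimal sketch of how the fetched HTML might be tested; the keyword list, timeout value, and helper names are my own placeholders, not from the question:

```php
// Case-insensitive scan of the page body for "removed" phrases.
function containsRemovalKeyword($html, array $keywords) {
    foreach ($keywords as $kw) {
        if (stripos($html, $kw) !== false) {
            return true;
        }
    }
    return false;
}

// Fetch a link and decide whether the video looks removed.
function videoRemoved($url, array $keywords) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 5);    // avoid redirect loops
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);     // don't stall the 1000-link run
    $html = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($html === false || $code >= 400) {
        return true; // request failed or page is gone
    }
    return containsRemovalKeyword($html, $keywords);
}
```

Here $keywords would be phrases such as 'This video has been removed'.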
+3
3 answers

Not really. PHP's curl extension is a thin wrapper around libcurl, so its per-request overhead is barely higher than file_get_contents(); at 1000 requests every 10 minutes, network time will dominate either way.

One thing to watch out for: a long redirect chain can stall your script, so cap it with CURLOPT_MAXREDIRS.

Also, since you only need to know whether the link is still alive, why download the whole page? You can optimize with a headers-only request:

curl_setopt($Curl_Session, CURLOPT_HEADER, false);
curl_setopt($Curl_Session, CURLOPT_NOBODY, true);

$output = curl_exec($Curl_Session);
$info = curl_getinfo($Curl_Session);
echo $info['download_content_length'];
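Put together, a self-contained version of this headers-only check might look like the following; the URL and the linkLooksDead() helper are illustrative assumptions, not from the thread:

```php
// Treat an error status, or a redirect away from the original URL,
// as "video removed".
function linkLooksDead(array $info, $originalUrl) {
    return $info['http_code'] >= 400 || $info['url'] !== $originalUrl;
}

$url = 'http://www.domain.com/video/123';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);      // cap redirect chains
curl_setopt($ch, CURLOPT_NOBODY, true);      // HEAD request: no body download
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_exec($ch);
$info = curl_getinfo($ch);  // 'url' holds the effective URL after redirects
curl_close($ch);

var_dump(linkLooksDead($info, $url));
```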
+3

cURL does use slightly more resources than file_get_contents(), but the difference is small. For a checker like this, network latency, not the HTTP client, is the bottleneck.
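If total wall-clock time per run is the real concern, the batch of 1000 checks can also be issued in parallel with curl_multi; this is a sketch I'm adding, not something suggested in the thread, and the URLs and options are placeholders:

```php
// Fetch many URLs concurrently; returns HTTP code and body per URL.
function fetchAll(array $urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_MAXREDIRS, 5);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $results = array();
    foreach ($handles as $url => $ch) {
        $results[$url] = array(
            'code' => curl_getinfo($ch, CURLINFO_HTTP_CODE),
            'body' => curl_multi_getcontent($ch),
        );
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

In practice you would feed this batches of, say, 50 to 100 links at a time rather than all 1000 at once, to avoid hammering the remote sites.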

+1

Source: https://habr.com/ru/post/1718591/

