Nginx and PHP-cgi - cannot file_get_contents any website on the server

This is best explained by the code I think. From the web directory:

vi get.php 

Add this PHP to get.php:

 <?php echo file_get_contents("http://IPOFTHESERVER/"); ?>

IPOFTHESERVER is the IP address of the server running nginx and PHP.

 php get.php 

returns the contents of the default website hosted on this IP, BUT

http://IPOFTHESERVER/get.php

.. returns a 504 Gateway Timeout. The same happens with cURL, with exec(), and with the GET command. However, everything works fine when run directly from the command line.

I have reproduced this on two nginx servers. For some reason, nginx will not let PHP make an HTTP connection to the server it is running on (unless the script is run from the command line).
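To make the failure show up quickly instead of hanging until the gateway timeout, here is a diagnostic variant of get.php with an explicit request timeout; this is just a sketch for testing, and the 5-second value is an arbitrary assumption:

 <?php
 // Diagnostic variant of get.php: give the HTTP request a short timeout so it
 // fails fast instead of waiting for nginx's 504 Gateway Timeout.
 $context = stream_context_create([
     'http' => ['timeout' => 5], // seconds (arbitrary value for illustration)
 ]);
 $result = file_get_contents('http://IPOFTHESERVER/', false, $context);
 var_dump($result); // bool(false) here means the request never got a response
 ?>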

Anyone have any ideas why?

Thanks!

+2
2 answers

Make sure you are not running into worker exhaustion on the PHP side. That was the problem on my lab server, which was configured to save RAM.

Basically, I had forgotten that one worker is tied up serving the main page to the end user, and the file_get_contents() call then makes a separate HTTP request back to the same web server, so a single page load needs two workers.

Since the first page request had taken the last free worker, there was no worker left to serve the file_get_contents() request, so nginx eventually answered the first page with a 504 because the proxied request never got a response.
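If PHP runs under PHP-FPM (an assumption here; the question mentions php-cgi, but any fixed-size worker pool behaves the same way), the relevant pool settings look roughly like this sketch; the file path and values vary by system:

 ; example pool config, e.g. /etc/php-fpm.d/www.conf (path is an assumption)
 pm = static
 pm.max_children = 1   ; with a single worker the self-request deadlocks: the
                       ; only worker is busy with the outer page, so the inner
                       ; request queues until nginx gives up with a 504

 ; leaving at least one worker free for the internal request avoids this:
 ; pm.max_children = 5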

+7

Make sure allow_url_fopen is set to true in your php.ini.
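A quick way to confirm the effective value from the web context (a minimal check using only standard PHP):

 <?php
 // Shows the effective allow_url_fopen setting; an empty or "0" result means
 // it is disabled and file_get_contents() cannot open http:// URLs.
 var_dump(ini_get('allow_url_fopen'));
 ?>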

0
