Browser shows timeout while server process is still running

I have the following problem:

  • I run a BIG memory-heavy process, but I split the memory load into smaller pieces, so there is no problem with CPU time or timeouts on the server.
  • On the server I create XML files of about 100 KB each, and roughly 100+ of them get created per run.
  • The main problem is that the browser times out, and IE (in the status bar at the bottom) shows a prompt to download the .php file.
  • While this happens, the server side is still executing and keeps creating the .xml files in incremental order, so there is no problem there.

I have the following php.ini configuration.

 max_execution_time = 10000   ; Maximum execution time of each script, in seconds
 max_input_time = 10000       ; Maximum amount of time each script may spend parsing request data
 memory_limit = 2000M         ; Maximum amount of memory a script may consume (128MB)
 ; Maximum allowed size for uploaded files.
 upload_max_filesize = 2000M

I view the site in IE, and the server runs Zend Server CE (ZSCE) with PHP 5.3.

Can someone point me in the right direction on this?

Edit:

Here is a screenshot of the timeout and of why IE prompts to download the .php file.

[screenshot: browser timeout and IE prompt to download the .php file]


Edit 2:

I will briefly explain my flow of execution:

  • I have one PHP file with a hierarchy of class objects; it starts by calling Function1() on each class in the hierarchy.
  • I have one class file.
  • First, Function1() is executed; it contains the logic for creating the XML files in pieces.
  • Second, Function2() is executed; it displays the output generated by Function1().

Everything runs in the order of the class hierarchy, so I cannot output anything in the middle of Function1() until it has finished executing; only after that is Function2() called.

Edit 3:

This is specifically for @hakre.

You asked a few probing questions, and I agree with some of your points, but let me describe the issue in more detail.

  • At first I was loading XML files of more than 100 MB at once; as a result the memory on my local installation got exhausted, everything on the machine hung, and CPU time was heavily consumed.

  • Then I split the large XML files into small ones (now I load one XML file at a time and release it after use). This saved me from the memory overload and CPU problems on my local setup.

  • Now there are no CPU or memory problems in my backend process; the problem is the browser timeout. I even tried cURL, but it does not seem to fit my current structure because of my class hierarchy: all the classes in the hierarchy run their process functions first, and only then do they run their output functions. So until the process functions have finished, the output functions produce nothing, and the browser shows a timeout.

  • I also followed the instructions suggested by @vortex and had a little success, but not what I am looking for. The reason I could not make cURL work is that my process function creates all the required XML files in one go, so it takes too long before anything can be output to the browser. Since the process function takes so much time, no output can be sent to the client until it is complete.

cURL output:

 URL....: myurl
 Code...: 200 (0 redirect(s) in 0 secs)
 Content: text/html
 Size: -1 (Own: 433) Filetime: -1
 Time...: 60.437 Start @ 60.437 (DNS: 0 Connect: 0.016 Request: 0.016)
 Speed..: Down: 7 (avg.) Up: 0 (avg.)
 Curl...: v7.20.0

The contents of the test.txt file

 * About to connect() to mylocalhost port 80 (#0)
 *   Trying 127.0.0.1...
 * connected
 * Connected to mylocalhost (127.0.0.1) port 80 (#0)
 > GET myurl HTTP/1.1
 Host: mylocalhost
 Accept: */*

 < HTTP/1.1 200 OK
 < Date: Tue, 06 Aug 2013 10:01:36 GMT
 < Server: Apache/2.2.21 (Win32) mod_ssl/2.2.21 OpenSSL/0.9.8o
 < X-Powered-By: PHP/5.3.9-ZS5.6.0 ZendServer
 < Set-Cookie: ZDEDebuggerPresent=php,phtml,php3; path=/
 < Cache-Control: private
 < Transfer-Encoding: chunked
 < Content-Type: text/html
 <
 * Connection #0 to host mylocalhost left intact
 * Closing connection #0

Disclaimer: the accepted answer was selected based on the first small success I had with it. The solution from @hakre is also applicable to this type of question. So far no one has answered my question completely, only partially; @hakre's answer is the more detailed one if someone wants more in-depth information about this type of problem.

+4
8 answers

Assuming that you have already made all the server-side modifications to avoid the server timing out (I saw almost everything relevant described above), then to avoid the browser timeout it is important that you do something like this:

 <?php
 set_time_limit(0);        // remove PHP's execution time limit
 error_reporting(E_ALL);   // surface notices and warnings while debugging
 ob_implicit_flush(TRUE);  // flush automatically after every output call
 ob_end_flush();           // close the default output buffer so content reaches the client

I can tell you from experience that Internet Explorer has no problem as long as you keep sending it some content. I run a 30 GB database update daily (it takes about 2-4 hours), and Opera seems to be the only browser that ignores the output. If you did not set ob_implicit_flush, you need to call ob_flush() after each piece of content.

References

If you are not using ob_implicit_flush at the top of your script as described above, you need to do something like this:

 <?php
 echo 'dummy text or execution stats';
 ob_flush(); // push the buffered output towards the browser

inside your processing loop.
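
For illustration only (this sketch is not part of the original answer): the flushing pattern above could be wrapped around the XML-generation loop roughly like this. generate_xml_chunk() and the chunk count are hypothetical placeholders.

 <?php
 // Sketch: keep the connection alive by emitting progress while the chunks are processed.
 set_time_limit(0);
 ob_implicit_flush(true);
 ob_end_flush();

 $totalChunks = 100; // hypothetical number of XML files to create
 for ($i = 1; $i <= $totalChunks; $i++) {
     generate_xml_chunk($i);                        // hypothetical: writes chunk $i to an .xml file
     echo "Created chunk $i of $totalChunks<br>\n"; // any output keeps the browser from timing out
     flush();                                       // make sure the web server forwards it immediately
 }
 echo 'All XML files created.';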

+10

1. I run the BIG memory process, but split the memory load into smaller pieces, so there is no problem with the CPU timeout.

Now that is a wild guess. How did you find out that the problem was primarily with the CPU? Did you even test it? If so, what did your test show? If not, how do you verify now that this is not a timeout problem?

Even though you state that there is no problem in that specific area, you do not verify it, and many questions remain open. That suggests guessing, which is counterproductive for troubleshooting (which is what you are doing here).

What you write here only means that you wrote the code to chunk the memory usage; that is not a test for CPU or timeout problems. One part is the code, the other part is the test. Do not mix them, and do not make wild guesses: test for the problems, otherwise you cannot say they are gone.

First of all, this is just to point out that when troubleshooting you should look for facts (monitor, test, profile, step through with a debugger) rather than make assumptions. This is important, otherwise you end up looking in the wrong places and asking the wrong questions.


From what you describe of how the client (browser) behaves, this is purely a matter of time: the delay between the response headers and the response body is too long for your browser's taste. One browser runs into its timeout (because a limit value was reached, which looks like the more correct behaviour to me), and the other browser assumes something is still coming, so why not offer to save it.

So you simply have a timing problem. Look in your browser's (HTTP client's) settings for the configuration values you can change to alter this behaviour. For instance, measure with a curl request on the command line how long the request really takes, then set your browser's timeout for connections to this server to more than the time you just measured. For example, if you use Internet Explorer: http://www.ehow.com/how_6186601_change-internet-timeout-options.html or if you use Mozilla Firefox: http://forums.mozillazine.org/viewtopic.php?f=7&t=102322&start=0

Since you did not show any server-side code, I assume you want to solve this with client settings. curl helps you measure how many seconds such a request takes. For detailed request information, use the -v (verbose) switch.

In case you do not want to fix this on the client, curl will still help you measure the important data and easily reproduce any timing behaviour related to the server. So in any case you should reach for curl on the command line, especially since looking at the response headers may reveal what triggers (yet again) the esoteric behaviour of Internet Explorer. Again, the -v switch shows you both request and response headers.

If you would like to automate such tests with a PHP script, that is also possible with the PHP cURL extension. This is described in:
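
As a rough, hypothetical illustration (not from the original answer), measuring the total request time with the PHP cURL extension could look like this; the URL is a placeholder:

 <?php
 // Sketch: time a request to the long-running page using the cURL extension.
 $ch = curl_init('http://mylocalhost/my-long-running-page.php'); // placeholder URL
 curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
 curl_setopt($ch, CURLOPT_HEADER, true);         // include response headers in the result
 curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // no client-side timeout for this test

 $response = curl_exec($ch);
 $seconds  = curl_getinfo($ch, CURLINFO_TOTAL_TIME); // how long the request really took
 curl_close($ch);

 echo 'Request took ' . $seconds . " seconds\n";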

+3

The problem is with your web server, not the browser.

If you are using Apache, you need to configure the Timeout directive in httpd.conf or in your virtual host configuration.
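
For example (a sketch; the value is arbitrary), in httpd.conf:

 # Give long-running requests up to 10 minutes before Apache aborts them (example value)
 Timeout 600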

+1

You have 3 pages:

  • A process page that creates the XML files and then updates a value in the database saying that the process is complete

  • A PHP page that returns {true} or {false} depending on the process-complete value in the database

  • An AJAX front end that polls page 2 every few seconds to check whether the process has finished or is still running (a minimal sketch of page 2 follows below)

Long polling is another option.
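
A minimal sketch of page 2 (the status check), assuming a hypothetical progress table with a done flag; table, column, and credentials are made up for illustration:

 <?php
 // status.php (hypothetical name): returns "true" once the XML-generation process has finished.
 $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass'); // placeholder credentials
 $done = $pdo->query('SELECT done FROM progress WHERE job_id = 1')->fetchColumn();

 header('Content-Type: text/plain');
 echo $done ? 'true' : 'false'; // the AJAX poller on page 3 reads this value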

+1

Is it possible to send any output to the browser from your script while it is still processing, even a single space? If so, do that; it should reset the timeout counter.

If this is not possible, you need to increase the IE timeout in the registry:

 HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings 

You need the ReceiveTimeout value; if it does not exist, create it as a DWORD and set its value in milliseconds.
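
As an illustration only (the value is arbitrary and corresponds to a 10-minute timeout), this could be applied as a .reg file:

 Windows Registry Editor Version 5.00

 [HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings]
 ; 0x000927c0 = 600000 ms = 10 minutes (example value)
 "ReceiveTimeout"=dword:000927c0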

0

I had this problem several times when reading a large CSV file and putting it into a database. I solved it by splitting the reading and the database inserts into smaller parts. I also created an extra table to log how much data had already been read and inserted, so that the next time the page reloads it continues from that position. You could do the same here: create one XML file per attempt, then reload the page and run the next one (see the sketch below). That way the memory in use is released between runs. Hope this helps.
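
A minimal sketch of that resume-from-last-position idea, using a hypothetical progress file instead of a database table; all names are placeholders:

 <?php
 // Sketch: process one chunk per request and remember how far we got.
 $progressFile = 'progress.txt';                  // hypothetical progress store
 $next = file_exists($progressFile) ? (int) file_get_contents($progressFile) : 1;
 $totalChunks = 100;                              // hypothetical total number of XML files

 if ($next <= $totalChunks) {
     generate_xml_chunk($next);                   // hypothetical: creates XML file number $next
     file_put_contents($progressFile, $next + 1); // remember where to continue next time
     header('Refresh: 1; url=' . $_SERVER['PHP_SELF']); // reload the page to process the next chunk
     echo "Created chunk $next of $totalChunks, continuing...";
 } else {
     echo 'All chunks created.';
 }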

0

What exactly is a "CPU timeout" supposed to be?

The correct way to solve the problem is to run the heavy work asynchronously, in a separate process group (and not inside the web server's process tree).
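
One common way to do that on a Unix-like server (a sketch only; the worker script name and paths are placeholders, and on this question's Windows setup a scheduled task or similar would be needed instead):

 <?php
 // Sketch: start the heavy XML generation detached from the web request.
 $cmd = 'nohup php /path/to/generate_xml.php > /tmp/generate_xml.log 2>&1 &';
 exec($cmd); // returns immediately; the worker keeps running outside the request

 echo 'XML generation started in the background; check back later for the result.';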

0

Try adding set_time_limit(0); at the top of your PHP script.

The following links may help you.

http://php.net/manual/en/function.set-time-limit.php

http://php.net/manual/en/function.ignore-user-abort.php
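
A minimal sketch combining the two functions from the links above:

 <?php
 set_time_limit(0);       // no PHP execution time limit
 ignore_user_abort(true); // keep running even if the browser gives up and disconnects

 // ... long-running XML generation continues here ...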

0

Source: https://habr.com/ru/post/1493372/

