I am having a strange problem, and most likely I am misunderstanding something. I have a website that uses AJAX requests to load different pages, so that it feels more like an application.
I noticed that the page load time was quite high (1-3 seconds), and I wanted to find out where the time was going. I used the Network tab in the Firefox Developer Tools: when I clicked a link that loads a page via an AJAX request, the developer tools showed the following timings for that request:
Blocked 334ms, Connect 162ms, TLS 170ms, Wait 1183ms, Total 1860ms
As far as I understand it: Blocked means the request waited 334 ms in the browser's queue because of other simultaneous requests, Connect means it took 162 ms to establish the connection to the server, TLS means another 170 ms went into the handshake, and Wait means 1183 ms passed before the server's response started arriving. Are these assumptions correct?
Then I added a small timer to my PHP script, putting the following code at the very beginning of the executed .php file and at the very end:
// at the very beginning of the script
$start = microtime(true);

// at the very end: elapsed time in seconds
$end = microtime(true) - $start;
Next, I echoed the variable $end to see how many seconds my code ran, and the result was 0.356235252352.
So, if I read this correctly, my script runs in roughly 0.4 seconds. But if that is true, where does the rest of the wait time, about 1.1 s, go?
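For reference, here is a sketch of how I could measure from the moment the server handed the request to PHP rather than from the first line of my script, using `$_SERVER['REQUEST_TIME_FLOAT']` (available since PHP 5.4). The variable names are just illustrative:

```php
<?php
// $_SERVER['REQUEST_TIME_FLOAT'] is set by PHP when the request starts,
// so it also covers time spent before the first line of this script runs
// (e.g. loading and compiling the script itself).
$elapsedFromRequest = microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'];

// ... page generation would happen here ...

// Log the timing instead of echoing it, so it does not alter the page output.
error_log(sprintf('Generated in %.3f s',
    microtime(true) - $_SERVER['REQUEST_TIME_FLOAT']));
```

If the number measured this way is much closer to the 1.1 s Wait time than my 0.4 s, that would suggest the extra time is spent before my own code starts running.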
EDIT: The site runs PHP on Apache. Requesting the .php script takes 700-800 ms, while requesting a static HTML file takes only 60-70 ms. :(