JavaScript Timer Accuracy

This question follows up on my previous one: "How to remember the user's progress through the application?".

After reading the answers, I realized that my mistake was using JavaScript to decide whether the user won the game. So I moved the logic to the controller: I save a UNIX timestamp in the session when the user starts the level and calculate the difference when the user finishes it. If the user took no more than X seconds, I allow them to access the next level, and so on.
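For illustration, the server-side rule can be sketched as a pure function (written here in JavaScript for readability; the real check lives in the PHP controller, and the function and parameter names are mine):

```javascript
// Sketch of the controller's win check: the user wins if the level took
// no more than maxSeconds between the two UNIX timestamps stored in the
// session. (Illustrative only; the actual logic is implemented in PHP.)
function hasWon(startTimestamp, finishTimestamp, maxSeconds) {
  return (finishTimestamp - startTimestamp) <= maxSeconds;
}
```

The key property is that both timestamps come from the server clock, so the client can neither cheat nor be penalized by its own clock.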

Everything worked fine until I decided to add a client-side timer written in JavaScript. The timer is not as accurate as PHP's, because it relies on the user's machine rather than the server, so the two timers are not synchronized. The game still works when the time window is large (for example, the user must complete the level in between 4 and 10 seconds), but it does not work properly when the window is small (1 or 2 seconds): even when the user clicks at a winning time, PHP tells them they did not win.

I made several attempts, and here are my results:

|===========================|
| # |  PHP  |   JS  | Ratio |
|===========================|
| 1 |  6.20 |  5.29 |  1.17 |
| 2 |  7.32 |  6.27 |  1.16 |
| 3 |  2.89 |  2.24 |  1.29 |
| 4 | 22.56 | 20.13 |  1.12 |
| 5 | 50.15 | 45.24 |  1.10 |
|===========================|

(Yes, I know, this is the worst ASCII table ever made.)

And this is the timer I created using jQuery (I am not really experienced with it, so it is probably also the worst JavaScript timer ever made):

    $(document).ready(function() {
        // Requires the jQuery Timers plugin for everyTime().
        var time = { cseconds: 0, dseconds: 0, seconds: 0, minutes: 0, hours: 0 };

        $("#timer").everyTime(10, function() {
            time.cseconds += 1;
            if (time.cseconds > 9) { time.cseconds = 0; time.dseconds += 1; }
            if (time.dseconds > 9) { time.dseconds = 0; time.seconds += 1; }
            if (time.seconds > 59) { time.seconds = 0; time.minutes += 1; } // was "> 60": off-by-one
            if (time.minutes > 59) { time.minutes = 0; time.hours += 1; }   // was "> 60": off-by-one

            var seconds = String(time.seconds);
            var minutes = String(time.minutes);
            var hours = String(time.hours);
            if (seconds.length < 2) { seconds = "0" + seconds; }
            if (minutes.length < 2) { minutes = "0" + minutes; }
            if (hours.length < 2) { hours = "0" + hours; }

            $("#timer").text(hours + ":" + minutes + ":" + seconds + "." + time.dseconds + time.cseconds);
        });
    });

I tried different solutions, such as AJAX. At first the timer relied entirely on AJAX requests (which generated thousands of HTTP requests), and it was still inaccurate. Then I modified it to resynchronize with PHP every X seconds, but that did not work either.

Does anyone have any hints?

+4
3 answers

Why is the AJAX solution not working? That was the second solution that came to my mind. It should work, because the time comes from the server...

The first solution is quite simple, and I think the most accurate. Instead of relying on the periodic execution of a JavaScript function, rely on the system time. Use the periodic function only to update the time on the screen, but compute that time from the system date/time, not by doing seconds += 1! Some pseudocode:

    var startTime = new Date();

    // Every 1/2 second or less...
    var elapsedTimeInMs = new Date().getTime() - startTime.getTime();

In any case, I suggest you read the documentation for Date objects.
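A minimal sketch of that approach, reusing the asker's `#timer` element and display format (a plain `setInterval` is assumed instead of the jQuery Timers plugin, and `formatElapsed` is my own helper name):

```javascript
// Format a millisecond count as HH:MM:SS.cc (cc = hundredths of a second).
function formatElapsed(ms) {
  var totalCs = Math.floor(ms / 10);
  var cs = totalCs % 100;
  var totalSeconds = Math.floor(totalCs / 100);
  var seconds = totalSeconds % 60;
  var minutes = Math.floor(totalSeconds / 60) % 60;
  var hours = Math.floor(totalSeconds / 3600);
  function pad(n) { return (n < 10 ? "0" : "") + n; }
  return pad(hours) + ":" + pad(minutes) + ":" + pad(seconds) + "." + pad(cs);
}

// Call once when the level starts. The interval only refreshes the
// display; the elapsed value is recomputed from the system clock on
// every tick, so a late or skipped tick cannot accumulate drift.
function startTimer() {
  var startTime = new Date();
  setInterval(function () {
    var elapsedMs = new Date().getTime() - startTime.getTime();
    document.getElementById("timer").textContent = formatElapsed(elapsedMs);
  }, 50);
}
```

The display may briefly lag when the browser delays a tick, but the value shown is always correct, because it is derived from the clock rather than from a tick count.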

+1

If you capture new Date() both at the beginning and at the end of the event and take the difference, you will get a very accurate elapsed time in JavaScript. The resolution of the system clock in modern browsers is often a few milliseconds, and in older browsers no worse than about 15 ms.

Your code looks like it uses an interval timer and counts intervals. That will not be very accurate: since JavaScript is single-threaded, the exact moment a timer callback fires is imprecise, so counting ticks cannot give an exact event time. In other words, it is much more accurate to measure elapsed system time than to count timer intervals.

+3

You will need AJAX anyway, but only when the user completes the game, to get the time from the server...
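A sketch of that idea, assuming a hypothetical `/level/finish` endpoint whose JSON response carries the server's verdict (the URL and the `won`/`elapsed` field names are illustrative):

```javascript
// Ask the server for the verdict when the level ends. The client clock
// is never trusted: PHP compares the session timestamps and replies.
// The endpoint URL and response fields are illustrative assumptions.
function checkResultOnServer(callback) {
  $.getJSON("/level/finish", function (data) {
    callback(data.won, data.elapsed);
  });
}
```

This keeps the request count at one per level, instead of one per timer tick.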

0

Source: https://habr.com/ru/post/1393412/
