What do statements about clock accuracy and precision actually mean?

I have seen a lot of discussion of system clocks saying, for example, that a standard PC clock under Windows is only accurate to +/- 10 ms, while under a real-time OS the system clock has sub-millisecond accuracy. But what do these statements mean? Whether that variability matters depends entirely on the interval over which the clock error accumulates. If two consecutive clock calls returned timestamps that differed by 10 ms, that would be a disaster (and fortunately it is not the case); but if the clock only loses or gains 10 ms over a month, that is essentially perfect timekeeping for any practical purpose. To put the question another way: if I make two clock calls separated by 1 second, how much inaccuracy should I expect on, say, a standard Windows PC, a real-time PC (e.g. QNX on hardware that supports it), and a Mac?
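As a concrete illustration of the "two calls separated by 1 second" experiment, here is a minimal Java sketch (the 1000 ms sleep and the loop count are arbitrary choices for illustration, not from the question). It compares the interval reported by the wall-clock call System.currentTimeMillis() with the monotonic System.nanoTime() across a nominal one-second sleep:

public class ClockIntervalCheck {
    public static void main(String[] args) throws InterruptedException {
        // Repeat a few times to see how much the measured interval jitters.
        for (int i = 0; i < 5; i++) {
            long wallStart = System.currentTimeMillis(); // wall-clock time, coarse granularity
            long monoStart = System.nanoTime();          // monotonic timer, fine granularity

            Thread.sleep(1000); // nominal one-second gap between the two clock calls

            long wallElapsedMs = System.currentTimeMillis() - wallStart;
            double monoElapsedMs = (System.nanoTime() - monoStart) / 1_000_000.0;

            System.out.printf("wall clock: %d ms, monotonic: %.3f ms%n",
                              wallElapsedMs, monoElapsedMs);
        }
    }
}

The difference between the two printed values gives a rough feel for how coarse the wall-clock reading is on a given platform; the sleep itself also adds scheduler jitter, so this only bounds the error, it does not isolate it.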

2 answers

Your question could lead to a much broader discussion. When you talk about how much a clock gains or loses over a measurement interval, I believe that is called drift. If the timestamps from two consecutive clock calls differ by 10 ms, maybe the call really did take that long, maybe an interrupt occurred in between, maybe the clock really is drifting, maybe the reporting granularity is 10 ms, maybe there is a rounding error, and so on. The reporting resolution of the system clock depends on its clock speed (e.g. 1 GHz corresponds to 1 ns), on hardware support, and on OS support. Sorry, I don't know how Windows compares with the Mac.

Some concrete data points, measured on Windows using Java:

System.currentTimeMillis() has a granularity of about 15 ms on Windows XP. That is, successive calls to System.currentTimeMillis() return values that change only in steps of roughly 15 ms. So if you time something that actually takes, say, 8 ms this way, the result comes out as either 0 ms or 15 ms.
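A minimal sketch of how such a granularity measurement can be made (the loop bound of 10 observations is an arbitrary choice, not from the answer): call System.currentTimeMillis() in a tight loop and record how large the jumps between distinct returned values are.

public class MillisGranularity {
    public static void main(String[] args) {
        long last = System.currentTimeMillis();
        int observed = 0;

        // Spin until we have seen several distinct clock values, printing the
        // size of each jump (roughly 15-16 ms on older Windows, often 1 ms elsewhere).
        while (observed < 10) {
            long now = System.currentTimeMillis();
            if (now != last) {
                System.out.println("clock advanced by " + (now - last) + " ms");
                last = now;
                observed++;
            }
        }
    }
}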

Java also has System.nanoTime, which has much finer granularity (i.e. successive calls return different values); note that it measures elapsed time rather than wall-clock time, so it is meant for timing the interval between two points in the code.
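A small sketch of using System.nanoTime for elapsed-time measurement (the workload below is just an illustrative placeholder, not something from the answer):

public class NanoTimeExample {
    public static void main(String[] args) {
        long start = System.nanoTime(); // monotonic start marker, not wall-clock time

        // Placeholder workload: sum some numbers so there is something to time.
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += i;
        }

        long elapsedNanos = System.nanoTime() - start;
        System.out.printf("work took %.3f ms (sum=%d)%n", elapsedNanos / 1_000_000.0, sum);
    }
}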

So in practice, the answer depends on the platform and on which clock API you use.

Source: https://habr.com/ru/post/1795774/

