I've seen a lot of discussion of the system clock saying, for example, that a standard PC clock (e.g., under Windows) is only accurate to +/- 10 ms, while a real-time system clock has sub-millisecond accuracy. But what do these statements actually mean? How much that variability matters depends entirely on the interval over which the clock's error is measured. If two consecutive clock calls could return timestamps that differ by 10 ms, that would be a disaster, and fortunately it isn't the case; but if the clock merely loses or gains 10 ms over a month, that's nearly perfect timekeeping for any practical purpose.

To put the question another way: if I make two clock calls separated by 1 second, what degree of inaccuracy can I expect on, say, a standard Windows PC, a real-time PC OS (for example, QNX, perhaps on hardware that supports it), and a Mac?
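For concreteness, here is a minimal sketch (not part of the original question) of the kind of measurement I mean: it takes two wall-clock readings roughly 1 second apart and compares the elapsed wall-clock interval against a high-resolution monotonic timer. Any jitter it shows mixes OS scheduling latency with clock granularity, so treat it only as an illustration of the question, not as a proper benchmark.

```python
import time

samples = []
for _ in range(10):
    wall_start = time.time()          # system (wall-clock) time, may be adjusted by NTP
    mono_start = time.perf_counter()  # high-resolution monotonic reference
    time.sleep(1.0)                   # the ~1 second gap between the two clock calls
    wall_elapsed = time.time() - wall_start
    mono_elapsed = time.perf_counter() - mono_start
    # Difference between the wall-clock interval and the monotonic interval
    # hints at the wall clock's granularity/adjustments over a 1-second span.
    samples.append((wall_elapsed - mono_elapsed) * 1000.0)  # milliseconds

print("wall-clock minus monotonic interval (ms):",
      [f"{d:+.3f}" for d in samples])
```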