Worst-case operating system clock accuracy?

I'm not sure this question belongs on StackOverflow, but here it is.

I need to create a timestamp in C# for data that is transferred from one party to another, and I need to know the worst-case system clock accuracy across operating systems (Windows, Linux, and Unix). I need this so that any of those operating systems can validate the timestamp.

For example, the system clock resolution on Windows Vista is approximately 10-15 milliseconds.
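As a minimal sketch of one way to produce such a timestamp in C#, assuming UTC Unix-epoch milliseconds are an acceptable interchange format for both sides (class and variable names are illustrative only):

```csharp
using System;

class TimestampExample
{
    static void Main()
    {
        // Take the current time in UTC so the value does not depend on the
        // local time zone of whichever machine produced it.
        DateTimeOffset now = DateTimeOffset.UtcNow;

        // Milliseconds since the Unix epoch are easy for any OS or language
        // on the receiving side to parse back.
        long unixMillis = now.ToUnixTimeMilliseconds();
        Console.WriteLine($"Timestamp to send: {unixMillis}");

        // The receiving side reconstructs the same instant like this.
        DateTimeOffset received = DateTimeOffset.FromUnixTimeMilliseconds(unixMillis);
        Console.WriteLine($"Parsed back:       {received:O}");
    }
}
```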

2 answers

As a rule of thumb, don't count on better than about 10 ms (0.01 s) from the system clock.

File timestamps on Linux have a resolution of one second (see man utime). File timestamps on Windows NT/Win2K/XP/etc. (on NTFS) have a resolution of 0.0000001 s, i.e. 100 nanoseconds.
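To see what your own filesystem actually stores, a small C# sketch like the one below (the file and values are purely illustrative) writes a timestamp with a sub-millisecond component and reads it back; whatever the filesystem cannot represent is rounded away:

```csharp
using System;
using System.IO;

class FileTimestampResolution
{
    static void Main()
    {
        string path = Path.GetTempFileName();   // throwaway test file

        // A timestamp with a deliberate sub-millisecond component; DateTime
        // ticks are 100 ns units, matching the NTFS granularity above.
        DateTime written = new DateTime(2024, 1, 2, 3, 4, 5, DateTimeKind.Utc)
                               .AddTicks(1234567);

        File.SetLastWriteTimeUtc(path, written);
        DateTime readBack = File.GetLastWriteTimeUtc(path);

        // Whatever precision the filesystem cannot store is lost here:
        // NTFS keeps the 100 ns ticks, older Unix filesystems may round
        // down to whole seconds.
        Console.WriteLine($"Written:    {written.Ticks}");
        Console.WriteLine($"Read back:  {readBack.Ticks}");
        Console.WriteLine($"Difference: {written.Ticks - readBack.Ticks}");

        File.Delete(path);
    }
}
```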

Even a GPS time source, which is accurate to roughly 100 ns, doesn't change this much in practice: by the time the time reaches your application you are still looking at errors on the order of 10 ms. And GPS hardware is rarely available anyway.
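If you want to see what the system clock on a given machine actually delivers, a rough C# sketch like the following samples DateTime.UtcNow in a loop and reports the smallest step it observes (the loop count is arbitrary; results vary by OS, OS version, and hardware):

```csharp
using System;

class ClockGranularity
{
    static void Main()
    {
        // Sample the system clock in a tight loop and record the smallest
        // step it actually advances by. On older Windows versions this is
        // often in the 1-15 ms range; on Linux it is usually much finer.
        long previous = DateTime.UtcNow.Ticks;
        long smallestStep = long.MaxValue;

        for (int i = 0; i < 1_000_000; i++)
        {
            long current = DateTime.UtcNow.Ticks;
            long step = current - previous;
            if (step > 0 && step < smallestStep)
                smallestStep = step;
            previous = current;
        }

        // Ticks are 100 ns units, so divide by 10,000 to get milliseconds.
        Console.WriteLine($"Smallest observed clock step: {smallestStep / 10_000.0} ms");
    }
}
```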


Why not just use a Unix timestamp? What accuracy do you actually need for this check?

The "worst" accuracy is not really a property of the operating system alone; it also depends on the hardware and on whether the machine's clock is synchronized at all (an unsynchronized clock can drift by far more than the OS tick resolution).

If the sender and receiver have to validate each other's timestamps, keep their clocks synchronized and allow a tolerance window for clock skew (a la Kerberos), rather than requiring exact agreement.
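A sketch of that idea in C#, assuming the timestamp travels as Unix-epoch milliseconds and using an arbitrary 30-second tolerance chosen only for illustration (Kerberos itself defaults to a few minutes):

```csharp
using System;

class TimestampValidation
{
    // Arbitrary tolerance chosen for illustration; Kerberos, by comparison,
    // allows a few minutes of clock skew by default.
    static readonly TimeSpan AllowedSkew = TimeSpan.FromSeconds(30);

    // Accept a received Unix-millisecond timestamp only if it is within the
    // allowed skew of this machine's own clock, in either direction.
    static bool IsTimestampAcceptable(long receivedUnixMillis)
    {
        DateTimeOffset received = DateTimeOffset.FromUnixTimeMilliseconds(receivedUnixMillis);
        TimeSpan skew = DateTimeOffset.UtcNow - received;
        return skew.Duration() <= AllowedSkew;
    }

    static void Main()
    {
        long fresh = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
        long stale = DateTimeOffset.UtcNow.AddMinutes(-10).ToUnixTimeMilliseconds();

        Console.WriteLine(IsTimestampAcceptable(fresh)); // True
        Console.WriteLine(IsTimestampAcceptable(stale)); // False
    }
}
```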


Source: https://habr.com/ru/post/1726109/

