I'm not sure this question belongs on Stack Overflow, but here it is.
I need to create a timestamp in C# for some data that is transferred from one machine to another, and I need to know the worst-case system clock resolution across operating systems (Windows, Linux, and Unix). I need this figure so that every operating system involved can validate the timestamp.
As an example, the clock resolution on Windows Vista is reportedly around 10-15 milliseconds.
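Rather than assuming a fixed figure, one option is to measure the resolution empirically on each machine. This is a minimal sketch (not a definitive answer) that polls `DateTime.UtcNow` until the reported time changes, which gives the observed tick granularity of the wall clock on that system:

```csharp
using System;

class ClockResolution
{
    static void Main()
    {
        // Sample DateTime.UtcNow until the reported time changes,
        // to estimate the system clock's update granularity.
        long first = DateTime.UtcNow.Ticks;
        long next = first;
        while (next == first)
            next = DateTime.UtcNow.Ticks;

        // One tick is 100 ns; report the observed step in milliseconds.
        double stepMs = (next - first) / (double)TimeSpan.TicksPerMillisecond;
        Console.WriteLine("Observed clock step: " + stepMs + " ms");
    }
}
```

Note this measures the resolution of the wall-clock timer only; high-resolution interval timing would use `System.Diagnostics.Stopwatch` instead, but that is not suitable for cross-machine timestamps.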