Windows UTC timestamp

I have a buffer with a UTC timestamp in C, and I transmit this buffer every 10 seconds. The problem is that the time difference between two packets is not consistent: after 5-10 iterations the difference becomes 9, then 11, and then 10 again. Please help me sort this out.

I use <time.h> to get the UTC time.

+1
2 answers

A thread that sleeps for X milliseconds is not guaranteed to sleep for exactly that many milliseconds. I assume you have a loop that looks something like this:

while(1) {
  ...
  sleep(10); // Sleep for 10 seconds.
  // fetch timestamp and send
}

A sleep call only guarantees a minimum sleep time; the scheduler is free to wake the thread late (being 20 ms late is not unusual), so each iteration runs slightly longer than 10 seconds, and the error accumulates.

On top of that, the work between the sleeps takes time of its own, i.e., the loop is really sleep(10) → fetch → send → sleep(10), so the interval between two timestamps is 10 seconds plus the fetch-and-send time; calling sleep(10) by itself will never give you an exact 10-second period.

One way around this is to measure the send overhead and subtract it from the next sleep cycle (forgive the rough C):

#include <stdbool.h>
#include <time.h>
#include <unistd.h>

bool expired = false;
double last, current;
double t1, t2;
double difference = 0;

while(1) {
   ...
   last = (double)clock();
   while(!expired) {
      usleep(20000); // sleep for 20 milliseconds (usleep takes microseconds)
      current = (double)clock();
      if(((current - last) / (double)CLOCKS_PER_SEC) >= (10.0 - difference))
        expired = true;
   }
   t1 = (double)clock();
   // Set and send the timestamp.
   t2 = (double)clock();
   //
   // Measure how long it took to send the stamp
   // and take that away from the next sleep cycle.
   //
   difference = (t2 - t1) / (double)CLOCKS_PER_SEC;
   expired = false;
 }
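
A caveat worth noting: clock() measures wall-clock time with the Microsoft C runtime, but CPU time on POSIX systems, where a sleeping thread does not advance it; on Linux, clock_gettime(CLOCK_MONOTONIC, ...) would be the equivalent wall-clock source.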

If you are not restricted to plain C, a much more accurate option on Windows is QueryPerformanceFrequency/QueryPerformanceCounter:

#include <windows.h>

LARGE_INTEGER freq;
LARGE_INTEGER t2, t1;
//
// Get the resolution of the timer.
//
QueryPerformanceFrequency(&freq);

// Start Task.
QueryPerformanceCounter(&t1);

... Do something ....

QueryPerformanceCounter(&t2);

// Very accurate duration in seconds.
double duration = (double)(t2.QuadPart - t1.QuadPart) / (double)freq.QuadPart;
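
For completeness, here is a minimal sketch of applying this to a fixed 10-second cadence, assuming Windows and the Win32 Sleep() call; send_timestamp() is a hypothetical placeholder for the poster's own fetch-and-send code:

#include <windows.h>

void send_timestamp(void) { /* placeholder: fetch and send the stamp */ }

void run_loop(void)
{
    LARGE_INTEGER freq, now, next;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&next);

    for (;;) {
        // Schedule each deadline exactly 10 s after the previous one,
        // so send overhead and sleep jitter cannot accumulate.
        next.QuadPart += 10 * freq.QuadPart;
        for (;;) {
            QueryPerformanceCounter(&now);
            if (now.QuadPart >= next.QuadPart)
                break;
            // Sleep most of the remaining time, then re-check.
            DWORD ms = (DWORD)((next.QuadPart - now.QuadPart) * 1000 / freq.QuadPart);
            Sleep(ms > 20 ? ms - 10 : 1);
        }
        send_timestamp();
    }
}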
+1

If your timestamp has 1-second resolution, then any difference computed from two such stamps carries an error of +/- 1 second (so a real 10-second interval can show up as 9 or 11).

To expand on that: with 1-second resolution the stamps are quantized. A stamp of t can correspond to any real time in t..t+0.9999. So for two instants stamped t0 and t1, the error of t1-t0 lies anywhere in -0.999..+0.999, i.e., +/- 1 second. That is exactly the 9..11 spread you are seeing.
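
A tiny sketch that makes the arithmetic concrete (the instants below are illustrative, not taken from the question): a 1-second stamp is effectively the floor() of the real time, so real gaps within a few hundredths of a second of 10.0 can stamp as 9, 10 or 11.

#include <math.h>
#include <stdio.h>

int main(void)
{
    double pairs[][2] = {
        { 0.01,  9.99 },  /* real gap 9.98  -> stamps 0 and 9  -> diff 9  */
        { 0.50, 10.50 },  /* real gap 10.00 -> stamps 0 and 10 -> diff 10 */
        { 0.99, 11.01 },  /* real gap 10.02 -> stamps 0 and 11 -> diff 11 */
    };
    for (int i = 0; i < 3; i++) {
        long s0 = (long)floor(pairs[i][0]);  /* what a 1 s clock reports */
        long s1 = (long)floor(pairs[i][1]);
        printf("real gap %.2f -> stamped diff %ld\n",
               pairs[i][1] - pairs[i][0], s1 - s0);
    }
    return 0;
}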

+3

Source: https://habr.com/ru/post/1746154/

