The difference between latency and jitter in operating systems

In discussions of criteria for operating systems I keep coming across the terms interrupt latency and OS jitter, and now I'm asking myself: what is the difference between the two?

My understanding is that interrupt latency is the delay from the occurrence of an interrupt until the interrupt service routine (ISR) is entered. Jitter, on the other hand, is how that moment of entry into the ISR varies over time.

Is that your understanding as well?

+6
2 answers

Your understanding is basically correct.

Latency = the delay between an event happening in the real world and the code responding to that event.

Jitter = the variation in latency between two or more events.
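To make the relationship concrete, here is a minimal sketch (my own illustration, not part of the answer) using hypothetical latency samples; jitter is reported as the peak-to-peak spread between them:

```c
#include <stdio.h>

int main(void) {
    /* hypothetical latency samples in microseconds, one per event */
    double latency_us[] = { 12.1, 11.8, 15.3, 12.0, 13.7 };
    int n = sizeof latency_us / sizeof latency_us[0];

    double min = latency_us[0], max = latency_us[0];
    for (int i = 1; i < n; i++) {
        if (latency_us[i] < min) min = latency_us[i];
        if (latency_us[i] > max) max = latency_us[i];
    }

    /* peak-to-peak jitter: the spread of latencies across events */
    printf("latency: min %.1f us, max %.1f us, jitter %.1f us\n",
           min, max, max - min);
    return 0;
}
```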

+13

In cluster computing, especially with large-scale solutions, there are cases where work distributed across many systems (and many, many processor cores) must complete within a reasonably predictable time frame. The operating system and the software stack can introduce some variability in the execution time of these "chunks" of work. This variability is often called "OS jitter". [link]
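A common way to observe this kind of OS jitter is to time a fixed chunk of work repeatedly and look at how much the iteration times vary. The following is a sketch under my own assumptions (the workload size and iteration count are arbitrary), not code from the linked article:

```c
/* OS-jitter microbenchmark sketch: time a fixed unit of work
 * repeatedly; variation between iterations is introduced by the
 * OS and software stack (scheduling, interrupts, daemons, ...). */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

#define ITERATIONS 1000

static volatile unsigned long sink;  /* keeps the work from being optimized out */

static void fixed_work(void) {
    for (unsigned long i = 0; i < 100000UL; i++)
        sink += i;
}

int main(void) {
    double min_us = 1e18, max_us = 0.0;

    for (int i = 0; i < ITERATIONS; i++) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        fixed_work();
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double us = (t1.tv_sec - t0.tv_sec) * 1e6 +
                    (t1.tv_nsec - t0.tv_nsec) / 1e3;
        if (us < min_us) min_us = us;
        if (us > max_us) max_us = us;
    }

    /* the spread between fastest and slowest run is a rough
     * measure of OS jitter for this workload */
    printf("fastest %.1f us, slowest %.1f us, jitter %.1f us\n",
           min_us, max_us, max_us - min_us);
    return 0;
}
```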

Interrupt latency, as you said, is the time between the interrupt signal and entry into the interrupt handler.

The two concepts are orthogonal to each other; in practice, however, more interrupts typically mean more OS jitter.
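For completeness, here is a sketch of how such latency (and, from many samples, jitter) is commonly measured in practice, in the style of tools like cyclictest; this is my own illustration, not part of the answer. A thread asks to be woken at an absolute time, and the difference between the requested and actual wakeup time approximates timer-interrupt plus scheduling latency:

```c
/* Wakeup-latency sketch: request a wakeup at an absolute time,
 * then measure how late the wakeup actually was.  Each sample
 * approximates timer/scheduling latency; their spread is jitter. */
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

#define SAMPLES     1000
#define INTERVAL_NS 1000000L   /* wake up every 1 ms */

int main(void) {
    struct timespec next, now;
    double min_us = 1e18, max_us = 0.0;

    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int i = 0; i < SAMPLES; i++) {
        /* advance the target wakeup time by one interval */
        next.tv_nsec += INTERVAL_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec  += 1;
        }

        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        clock_gettime(CLOCK_MONOTONIC, &now);

        /* lateness = actual wakeup time - requested wakeup time */
        double us = (now.tv_sec - next.tv_sec) * 1e6 +
                    (now.tv_nsec - next.tv_nsec) / 1e3;
        if (us < min_us) min_us = us;
        if (us > max_us) max_us = us;
    }

    printf("wakeup latency: min %.1f us, max %.1f us, jitter %.1f us\n",
           min_us, max_us, max_us - min_us);
    return 0;
}
```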

0

Source: https://habr.com/ru/post/912828/

