I am trying to measure the time taken by a set of statements. Below is the pseudocode. The code is implemented in C++ on a Xilinx chipset with a custom RTOS, so the traditional C++ clock functions do not work here.
I don't need help with the actual measurement of the time, but rather with the math for calculating the actual runtime.
one = clock.getTime();
/* statements 1 to 10 */
two = clock.getTime();
fTime = two - one;
Now I know the time spent by the statements. Is it correct that this time also includes the time spent in getTime() itself?
one = clock.getTime();
clock.getTime();
two = clock.getTime();
cTime = two - one; // Measured repeatedly; the minimum value I get is 300 microseconds.
This block gives me the time spent in getTime() itself.
Finally, my question is: what is the actual time spent by the statements?
- fTime - cTime
- fTime - (2 * cTime)
- Another equation?