I am running a C program built with GCC through a DSP cross-compiler, inside a simulator, to test some functionality. I use the following code to measure the execution time of a specific part of my program:
clock_t start, end;
double duration;

printf("DECODING DATA:\n");
start = clock();
conv3_dec(encoded, decoded, 3 * length, 0);
end = clock();
duration = (double)(end - start) / CLOCKS_PER_SEC;
printf("DECODING TIME = %f\n", duration);
where conv3_dec() is a function defined in my program, and I want to find its execution time.
Now, when my program runs, conv3_dec() takes almost 2 hours to finish, but the printf("DECODING TIME = %f\n", duration) output claims the function completed in just half a second (DECODING TIME = 0.455443). This is very confusing to me.
I have used the clock_t technique to measure the execution time of programs before, but the difference has never been this huge. Is this caused by the cross-compiler? As a note, the simulator models a DSP processor running at just 500 MHz. Could the difference in clock speed between the simulated DSP processor and my host processor be causing the error in the CLOCKS_PER_SEC measurement?