Can the resolution of std::clock() be measured? Or is this a case where observing the clock without influencing the result is impossible?
I wrote the following naive benchmark:
#include <ctime>
#include <iostream>

int main() {
    std::clock_t initial = std::clock();
    std::clock_t current;

    // Busy-wait until clock() reports a different value.
    while (initial == (current = std::clock()))
        ;

    std::cout << "Initial: " << initial << std::endl;
    std::cout << "Current: " << current << std::endl;
    std::cout << "Precision: "
              << (static_cast<double>(current - initial) / CLOCKS_PER_SEC)
              << "s" << std::endl;
}
I ran it a few hundred times and it always outputs 0.01 s.
My questions:
- Is the code above a good way to measure the resolution of clock()?
- If not, what's wrong with it, and how can it be improved? (A sketch of one variation I'm considering follows below.)
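For context, here is a minimal sketch of one possible refinement, assuming the busy-wait idea itself is sound: it samples several consecutive tick transitions and reports the smallest observed step, so a single scheduling hiccup doesn't skew the result. The sample count is an arbitrary choice.

#include <ctime>
#include <iostream>

int main() {
    const int samples = 10;            // arbitrary number of tick transitions to observe
    std::clock_t smallest_step = 0;
    bool have_step = false;

    std::clock_t prev = std::clock();
    for (int i = 0; i < samples; ++i) {
        std::clock_t now;
        // Busy-wait for the next change in clock()'s reported value.
        while ((now = std::clock()) == prev)
            ;
        std::clock_t step = now - prev;
        if (!have_step || step < smallest_step) {
            smallest_step = step;
            have_step = true;
        }
        prev = now;
    }

    std::cout << "Smallest observed step: "
              << static_cast<double>(smallest_step) / CLOCKS_PER_SEC
              << " s" << std::endl;
}

I'm not sure whether taking the minimum over several transitions is actually any more meaningful than a single measurement, which is part of what I'm asking.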