Using the code from this answer:
```cpp
#include <chrono>
#include <cstdio>   // needed for std::printf
#include <ctime>

template <typename Duration>
void print_time(tm t, Duration fraction)
{
    using namespace std::chrono;
    std::printf("[%04u-%02u-%02u %02u:%02u:%02u.%03u]\n",
                t.tm_year + 1900, t.tm_mon + 1, t.tm_mday,
                t.tm_hour, t.tm_min, t.tm_sec,
                static_cast<unsigned>(fraction / milliseconds(1)));
    // The VS2013 library has a bug which may require you to replace
    // "fraction / milliseconds(1)" with
    // "duration_cast<milliseconds>(fraction).count()"
}

int main()
{
    using namespace std;
    using namespace std::chrono;

    system_clock::time_point now = system_clock::now();
    system_clock::duration tp = now.time_since_epoch();
    tp -= duration_cast<seconds>(tp);   // keep only the sub-second part

    time_t tt = system_clock::to_time_t(now);
    print_time(*gmtime(&tt), tp);
    print_time(*localtime(&tt), tp);
}
```
Bear in mind that a timer returning sub-millisecond values does not necessarily mean the timer has sub-millisecond resolution. I believe the Windows implementation may finally have been fixed in VS2015, but the timer backing their chrono implementation up to that point has been sensitive to the OS timeBeginPeriod() setting, so it displays varying resolution, with a default of 16 milliseconds.
Also, the above code assumes that neither UTC nor your local time zone is offset from the epoch of std::chrono::system_clock by a fractional-second value.
An example of using Howard Hinnant's date library to avoid ctime: http://coliru.stacked-crooked.com/a/98db840b238d3ce7