I have a strange problem with my game timer. Milliseconds work just fine. However, when I switch to std::chrono::seconds, I unexpectedly get 0.000000 after casting to float.
My timer is as follows:
#include <iostream>
#include <time.h>
#include <chrono>

class Timer
{
public:
    typedef std::chrono::high_resolution_clock Time;
    typedef std::chrono::milliseconds ms; // <-- If changed to seconds, I get 0.00000
    typedef std::chrono::duration<float> fsec;

    std::chrono::high_resolution_clock::time_point m_timestamp;
    float currentElapsed;

    Timer()
    {
        m_timestamp = Time::now();
    }

    float getTimeElapsed()
    {
        return currentElapsed;
    }

    void Tick()
    {
        currentElapsed = std::chrono::duration_cast<ms>(Time::now() - m_timestamp).count();
        m_timestamp = Time::now();
    }

public: // Singleton stuff
    static Timer* Instance();
    static void Create();
};
The timer is ticked once per frame, so I usually get about 33 ms per frame. 33 ms / 1000 = 0.033 s, so there should be plenty of room in a float to store this value.
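For reference, here is a minimal standalone sketch (hypothetical, not the original class) that reproduces the arithmetic above with a fixed 33 ms frame time:

#include <chrono>
#include <iostream>

int main()
{
    std::chrono::milliseconds frame(33);

    // duration_cast to an integer-based duration truncates toward zero,
    // so 33 ms becomes 0 whole seconds.
    auto wholeSeconds = std::chrono::duration_cast<std::chrono::seconds>(frame);
    std::cout << "integer seconds: " << wholeSeconds.count() << "\n"; // prints 0

    // A float-based duration (like the fsec typedef already in the class)
    // converts without truncation, so 33 ms becomes 0.033 s.
    std::chrono::duration<float> fsec = frame;
    std::cout << "float seconds:   " << fsec.count() << "\n"; // prints 0.033
}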
Any ideas on what could be going wrong?
Any help is much appreciated!
EDIT: Sorry, I meant seconds, not milliseconds.