I am having an interesting yet strange issue with my game timer. The milliseconds version works just fine. However, when I try to apply the std::chrono::seconds cast I suddenly get 0.000000 when casting to a float.
My timer is as follows:
#include <iostream>
#include <time.h>
#include <chrono>

class Timer
{
public:
    typedef std::chrono::high_resolution_clock Time;
    typedef std::chrono::milliseconds ms; // <-- If changed to seconds, I get 0.00000
    typedef std::chrono::duration<float> fsec;

    std::chrono::high_resolution_clock::time_point m_timestamp;
    float currentElapsed;

    Timer()
    {
        m_timestamp = Time::now();
    }

    float getTimeElapsed()
    {
        return currentElapsed;
    }

    void Tick()
    {
        currentElapsed = std::chrono::duration_cast<ms>(Time::now() - m_timestamp).count();
        m_timestamp = Time::now();
    }

public:
    // Singleton stuff
    static Timer* Instance();
    static void Create();
};
The timer gets ticked once per frame, so I normally get about 33 ms per frame. 33 ms / 1000 = 0.033 s, so there should be plenty of precision in a float to hold that.
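For reference, here is a minimal standalone snippet of the conversion I expect, outside of my Timer class (the 33 ms value is just my typical frame time, hard-coded for illustration):

#include <chrono>
#include <iostream>

int main()
{
    std::chrono::milliseconds frame(33); // typical frame time, hard-coded for illustration

    // Converting through a float-based duration keeps the fractional part.
    std::chrono::duration<float> asSeconds = frame;
    std::cout << asSeconds.count() << std::endl; // prints 0.033

    // Casting to integral seconds is what happens when I swap the ms typedef for seconds.
    std::chrono::seconds whole = std::chrono::duration_cast<std::chrono::seconds>(frame);
    std::cout << whole.count() << std::endl; // prints 0

    return 0;
}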
Any ideas on what may be going on?
Any help is greatly appreciated!
EDIT: Sorry, I meant seconds, not milliseconds.