Linux and Windows millisecond time

Posted 2019-09-06 20:58

I want to get the system time in milliseconds (it doesn't have to be the real wall-clock time; I just want it to be as accurate as possible). Is this a good way to do it?

#ifdef WIN32
#include <windows.h>

// Must be set once at startup: QueryPerformanceFrequency((LARGE_INTEGER *)&freq);
unsigned long long freq;

unsigned long long get_ms_time() {
    LARGE_INTEGER t;
    QueryPerformanceCounter(&t);
    return t.QuadPart * 1000ULL / freq; // ticks * 1000 / (ticks per second) = milliseconds
}
#else
#include <time.h>

unsigned long long get_ms_time() {
    struct timespec t;
    clock_gettime(CLOCK_MONOTONIC, &t);
    return (unsigned long long)t.tv_sec * 1000 + t.tv_nsec / 1000000;
}
#endif

How can I wrap this value into a signed int? When I tried, I got negative values like these (on Linux; I haven't tested Windows):

~ start
-2083002438
~ 15 seconds after..
-2082987440
~ 15 seconds after..
-2082972441

I would like something like this:

~ start
X
~ 15 seconds after..
X + 14998
~ 15 seconds after..
X + 29997

where X is a positive number. (I want the output to be positive and increasing.)
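
One way to get that behavior is to capture the time once on the first call and return the elapsed difference, which stays small enough for a signed int. This is a rough sketch on top of get_ms_time() above, relying on C++ static-local initialization; adapt as needed:

int get_ms_elapsed() {
    // Baseline captured once, on the first call (C++ runs this initializer once).
    static unsigned long long baseline = get_ms_time();
    // Elapsed milliseconds since the first call; fits in a signed int
    // for about 24.8 days (INT_MAX milliseconds) before overflowing.
    return (int)(get_ms_time() - baseline);
}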

1 Answer
啃猪蹄的小仙女
#2 · 2019-09-06 21:43

I do something like this in my code...

#include <ctime>
#include <iostream>

timespec specStart, specStop;

// Get start time in seconds (monotonic clock, not a UNIX timestamp)...
clock_gettime( CLOCK_MONOTONIC_RAW, &specStart );

int startTimeInt = specStart.tv_sec;
std::cout << "start time : " << startTimeInt << std::endl;

// ... do the work you want to measure ...

// Get stop time in seconds...
clock_gettime( CLOCK_MONOTONIC_RAW, &specStop );

int stopTimeInt = specStop.tv_sec;
std::cout << "stop time : " << stopTimeInt << std::endl;

// Get time diff from stop time to start time
unsigned long long timeStart = specStart.tv_sec * 1000000000ULL + specStart.tv_nsec;
unsigned long long timeStop = specStop.tv_sec * 1000000000ULL + specStop.tv_nsec;
unsigned long long timeDelta = timeStop - timeStart; // Time diff in nanoseconds.

int microSec = timeDelta / 1000;
int mSec = timeDelta / 1000000;
int sec = timeDelta / 1000000000;

std::cout << "time diff : " << std::endl
    << sec << " s" << std::endl
    << mSec << " ms" << std::endl
    << microSec << " µs" << std::endl
    << timeDelta << " ns" << std::endl;
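
For the Windows side of the question, a rough equivalent sketch using the performance counter (untested, offered as a sketch rather than part of the code above):

#include <windows.h>
#include <iostream>

LARGE_INTEGER freq, start, stop;
QueryPerformanceFrequency( &freq );   // ticks per second, fixed at boot

QueryPerformanceCounter( &start );
// ... do the work you want to measure ...
QueryPerformanceCounter( &stop );

// ticks * 1000 / (ticks per second) = milliseconds
unsigned long long timeDelta = ( stop.QuadPart - start.QuadPart ) * 1000ULL / freq.QuadPart;
std::cout << "time diff : " << timeDelta << " ms" << std::endl;

Both CLOCK_MONOTONIC_RAW and the performance counter tick independently of wall-clock adjustments (NTP, manual changes), which is why they are the right tools for measuring intervals.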