I am trying to read a struct from a binary byte buffer using a pointer cast and #pragma pack.
I wanted to track the worst-case read time from the in-memory buffer, so I timed each read with a chrono high_resolution_clock timer in nanoseconds and printed the value whenever it exceeded the previous maximum. That gave me a worst case of about 20 microseconds, which is huge considering the size of the struct.
When I measured the average time taken, it came out to be ~20 nanoseconds. Then I measured how often I was breaching 50 nanoseconds: out of ~20 million reads, I breached 50 nanoseconds only about 500 times (roughly 0.0025% of reads).
My question is: what can possibly cause this performance fluctuation, an average of 20 ns but a worst case of 20,000 ns?
Secondly, how can I ensure constant-time performance? I am compiling with -O3 and C++11.
// new approach: overlay a packed struct directly onto the byte buffer
#pragma pack(push, 1)   // disable padding so the layout matches the buffer byte-for-byte
typedef struct {
    char a;
    long b, c;
    char d, name[10];
    int  e, f;
    char g, h;
    int  i, j;          // note: second member renamed from a duplicate 'h' so the struct compiles
} myStruct;
#pragma pack(pop)
// in the function where I am using it
auto am1 = chrono::high_resolution_clock::now();
myStruct* tmp = (myStruct*)cTemp;   // reinterpret the raw buffer as the packed struct
tmp->name[9] = 0;                   // terminate the name field; index 9 is the last valid element of name[10]
auto am2 = chrono::high_resolution_clock::now();
chrono::duration<long, nano> arM = chrono::duration_cast<chrono::nanoseconds>(am2 - am1);
if (arM.count() > maxMPO.count())   // maxMPO holds the worst case seen so far
{
    cout << "myStruct read time increased: " << arM.count() << "\n";
    maxMPO = arM;
}
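For reference, the average and the 50 ns breach count quoted above can be accumulated with a loop along these lines. This is a simplified sketch, not the exact production code; the function name measure and the fixed 50 ns threshold are just illustrative, and it reuses the myStruct definition from above:

#include <chrono>
#include <cstddef>
#include <iostream>

// Sketch: time the same cast-and-terminate step many times and
// accumulate average, worst case, and the count of samples over 50 ns.
void measure(char* cTemp, std::size_t iterations)
{
    using namespace std::chrono;
    long long total = 0, worst = 0, breaches = 0;   // all in nanoseconds

    for (std::size_t n = 0; n < iterations; ++n)
    {
        auto t0 = high_resolution_clock::now();
        myStruct* tmp = (myStruct*)cTemp;           // same cast as in the snippet above
        tmp->name[9] = 0;
        auto t1 = high_resolution_clock::now();

        long long ns = duration_cast<nanoseconds>(t1 - t0).count();
        total += ns;
        if (ns > worst) worst = ns;
        if (ns > 50)    ++breaches;                 // the 50 ns threshold from the question
    }

    std::cout << "avg: " << total / (long long)iterations
              << " ns, worst: " << worst
              << " ns, breaches over 50 ns: " << breaches << "\n";
}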
I am using g++ 4.8 with C++11 on an Ubuntu server.
On a PC (or Mac, or any desktop), there are Ethernet interrupts, timers, memory refresh, and dozens of other things going on over which you have no (or very little) control.
You might consider changing the target. If you use a single-board computer (SBC) with only static RAM, a network connection you can turn off and disconnect, and timers, clocks, and every other kind of interrupt under your software control, you might achieve an acceptable result.
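If changing the target is not an option, the closest approximation on a stock Linux box is to take as much scheduling as possible out of the kernel's hands. The sketch below assumes a Linux target and sufficient privileges (CAP_SYS_NICE), neither of which is stated in the question; it pins the measuring thread to one core and gives it a real-time priority. This only reduces the noise sources listed above, it does not eliminate them:

#include <pthread.h>
#include <sched.h>

// Sketch: pin the calling thread to one CPU and switch it to SCHED_FIFO.
// Returns true if both calls succeeded.
bool isolateThread()
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(2, &set);   // CPU number 2 is an arbitrary example
    if (pthread_setaffinity_np(pthread_self(), sizeof(set), &set) != 0)
        return false;

    sched_param sp;
    sp.sched_priority = 80;   // SCHED_FIFO priorities run 1..99
    return pthread_setschedparam(pthread_self(), SCHED_FIFO, &sp) == 0;
}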
I once worked with a gal who wrote software for an 8085 SBC. When we hooked up a scope and saw the waveform stability of a software controlled bit, I thought she must have added logic chips. It was amazing.
You simply cannot achieve 'jitter'-free behaviour on a desktop.