Consider the following piece of code:
#include "stdio.h"

typedef struct CustomStruct
{
    short Element1[10];
} CustomStruct;

void F2(char* Y)
{
    *Y = 0x00;
    Y++;
    *Y = 0x1F;
}

void F1(CustomStruct* X)
{
    F2((char *)X);
    printf("s = %x\n", (*X).Element1[0]);
}

int main(void)
{
    CustomStruct s;
    F1(&s);
    return 0;
}
At run time, by the end of the call to F1, I get different results with different compilers: (*X).Element1[0] = 0x1f00 with one compiler and (*X).Element1[0] = 0x001f with another.
It's clear to me that this is an endianness issue.
Is there any compiler option or workaround I can use so that I get (*X).Element1[0] = 0x001f regardless of the compiler used?
Endianness is not a compiler issue, nor even an operating system issue, but a platform issue. There are no compiler options or "workarounds" for endianness. There are however conversion routines so that you can normalize the endianness of stored data.
The ntoh routines documented here will reorder the bytes pointed to from network order (big-endian) to host order (either big- or little-endian, depending on the host). There are also hton functions that go in the opposite direction, from host order to network order. If you want to normalize the bytes stored in your data structure, you need to do it yourself, either when you store the data or when you read it back.
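For example, F2 happens to store the two bytes in big-endian (network) order, 0x00 first and 0x1F second, so reading the value back through ntohs yields 0x001f on both kinds of host. A minimal sketch, assuming a POSIX system where ntohs comes from <arpa/inet.h> (on Windows it lives in <winsock2.h>); read_first_element is just an illustrative helper name:

#include <arpa/inet.h>  /* ntohs(): network (big-endian) order to host order */

/* Uses the CustomStruct type from the question. The stored bytes are
   0x00 then 0x1F, i.e. big-endian, so ntohs() returns 0x001f on both
   big-endian hosts (where it is a no-op) and little-endian hosts
   (where it swaps the bytes). */
unsigned short read_first_element(const CustomStruct* X)
{
    return ntohs((unsigned short)X->Element1[0]);
}

Called in place of the direct access in F1, this should print s = 1f with either compiler.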
Here are function templates I wrote for ntohx and htonx that are generalized on the type of the data store, be it a 2-byte, 4-byte or 8-byte type:
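A sketch of what such templates can look like, assuming a C++ compiler, unsigned integer types, and run-time detection of the host byte order (an illustration of the technique rather than a definitive implementation):

#include <algorithm>  // std::reverse
#include <cstdint>    // std::uint16_t
#include <cstring>    // std::memcpy

// Detect the host byte order by inspecting the first byte of a known value.
inline bool host_is_little_endian()
{
    const std::uint16_t probe = 0x0001;
    unsigned char first_byte;
    std::memcpy(&first_byte, &probe, 1);
    return first_byte == 0x01;
}

// Convert a 2-, 4- or 8-byte value from network (big-endian) order to
// host order by reversing its bytes on little-endian hosts.
template <typename T>
T ntohx(T value)
{
    if (host_is_little_endian())
    {
        unsigned char bytes[sizeof(T)];
        std::memcpy(bytes, &value, sizeof(T));
        std::reverse(bytes, bytes + sizeof(T));
        std::memcpy(&value, bytes, sizeof(T));
    }
    return value;
}

// Host order to network order is the same byte reversal.
template <typename T>
T htonx(T value)
{
    return ntohx(value);
}

Applied to a std::uint16_t, ntohx behaves like ntohs, and the same template handles 32-bit and 64-bit values.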
If F2() is receiving a char *, then it must be doing something pretty strange in order to cause endian-related problems. These only happen when accessing more than one char at a time, unless it's manually doing such accesses while being broken. Is it casting its argument back to short * or something? In short, show more code.