I've found an article that presents the following template and macro for getting the number of elements in an array:

```cpp
template <typename Type, size_t Size>
char (&ArraySizeHelper(Type (&Array)[Size]))[Size];
#define _countof(Array) sizeof(ArraySizeHelper(Array))
```

I find one part totally unclear: `sizeof` is applied to a function declaration. I'd expect the result to be the size of a function pointer. Why does it yield the size of the return value instead?
It's not `sizeof ArraySizeHelper` (which would be illegal: you can't take `sizeof` a function), nor `sizeof &ArraySizeHelper` (not even implicitly, since the implicit conversion from function to pointer-to-function is explicitly disallowed by the Standard; for C++0x, see 5.3.3). Rather, it's `sizeof ArraySizeHelper(Array)`, which is equivalent to `sizeof` applied to the value that the function call returns, i.e. `sizeof(char[Size])`, hence `Size`.
`sizeof` is applied to the result of a function call, not to a declaration. It therefore gives the size of the return value, which in this case is a reference to an array of `char`.

The template causes the array in the return type to have the same number of elements as the argument array, which is fed to the function by the macro.

`sizeof` applied to a reference to this `char` array is the same as `sizeof` applied to the array type itself. Since `sizeof(char) == 1`, this gives the number of elements in the array.
`ArraySizeHelper` is a function template which returns (a reference to) a `char` array of size `Size`. The template takes two parameters: one is a type parameter (`Type`), and the other is a non-type value parameter (`Size`).

So when you pass an object of type, say, `A[100]` to the function, the compiler deduces both template arguments: `Type` becomes `A`, and `Size` becomes `100`.
So the instantiated function's return type becomes `char[100]` (by reference). Since the operand of `sizeof` is never evaluated, the function need not have a definition; `sizeof` only needs to know the return type of the function, which is `char[100]`. That makes the expression equivalent to `sizeof(char[100])`, which yields 100, the number of elements in the array.
Another interesting point to note is that `sizeof(char)` is not compiler-dependent, unlike the other primitive types (other than the variants of `char`¹). It is ALWAYS 1. So `sizeof(char[100])` is guaranteed to be 100.

1. The size of every variant of `char` is ONE, be it `char`, `signed char`, or `unsigned char`, according to the Standard.