While messing around with vector types and the ObjC runtime, I came across a very perplexing problem.
Neither clang nor GCC will give the 'proper' type encoding for any SIMD vector type, as far as I can tell.
#import <Foundation/Foundation.h>

int main(void) {
    // GCC-style SIMD vectors: four ints / four floats packed into 16 bytes
    typedef int   int4   __attribute__((vector_size(16)));
    typedef float float4 __attribute__((vector_size(16)));

    NSLog(@"Int4: %s", @encode(int4));
    NSLog(@"Float4: %s", @encode(float4));
    return 0;
}
When compiled & run with either GCC or clang, I get the following output:
2014-04-09 06:21:01.102 test[1707:507] Int4:
2014-04-09 06:21:01.103 test[1707:507] Float4:
Instead of what I expect, which would be something like this:
2014-04-09 06:21:01.102 test[1707:507] Int4: ![16,16i]
2014-04-09 06:21:01.103 test[1707:507] Float4: ![16,16f]
As per the documentation here:
vectors ‘![’ followed by the vector_size (the number of bytes composing the vector) followed by a comma, followed by the alignment (in bytes) of the vector, followed by the type of the elements followed by ‘]’
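In other words, the expected string maps directly onto sizeof and alignof. As a purely illustrative sketch (mine, not anything the compiler emits), the encoding I'd expect for int4 could be assembled by hand:

// Inside main() from the example above:
char expected[32];
snprintf(expected, sizeof expected, "![%zu,%zu%s]",
         sizeof(int4),      // vector_size: 16 bytes
         __alignof__(int4), // alignment:   16 bytes
         @encode(int));     // element type: "i"
// expected now holds "![16,16i]"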
Which becomes a problem when attempting to return these types from an ObjC method. Instead of getting something logical like ![16,16i]@: for the encoding of the following method:

- (int4)foo;

I only get the string @:, which causes both NSMethodSignature and NSInvocation to essentially crap their pants and segfault.
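For the record, here is roughly how I'm hitting this (a minimal sketch; Foo and -foo are made-up names, and the crash point is as described above):

#import <Foundation/Foundation.h>
#import <objc/runtime.h>

typedef int int4 __attribute__((vector_size(16)));

@interface Foo : NSObject
- (int4)foo;
@end

@implementation Foo
- (int4)foo {
    int4 v = {1, 2, 3, 4};
    return v;
}
@end

int main(void) {
    Method m = class_getInstanceMethod([Foo class], @selector(foo));
    NSLog(@"encoding: %s", method_getTypeEncoding(m)); // prints "@:"

    // With the return type missing, "@" (self) gets parsed as the return
    // value and ":" (_cmd) as the lone argument, so the argument count is
    // wrong, and NSInvocation built on top of it segfaults:
    NSMethodSignature *sig =
        [Foo instanceMethodSignatureForSelector:@selector(foo)];
    NSInvocation *inv = [NSInvocation invocationWithMethodSignature:sig];
    (void)inv;
    return 0;
}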
Is there some compiler option I can enable to restore the proper encoding of vector types? Or is the only solution here 'don't return vectors from an ObjC method'?
Note: Returning (or passing) a pointer to the vector doesn't help either, as the encoding for a pointer-to-vector is simply ^, which then greedily matches the next argument in the list, causing my argument count to be off once again. I suppose I could pass the vector through a void * instead, but that really is an ugly hack at that point.
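For reference, the void * hack would look roughly like this (again a sketch, with made-up names):

// @encode(void *) is "^v", which parses cleanly, so the argument
// count stays correct -- at the cost of all type safety.
- (void)setVector:(void *)vp {
    int4 v = *(int4 *)vp; // recover the vector on the callee side
    // ... use v ...
}

// Caller passes the vector's address:
int4 vec = {1, 2, 3, 4};
[obj setVector:&vec];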