It seems that Adobe Alchemy isn't running global constructors. Here's some simple test code:
#include <stdio.h>

class TestClass {
public:
    TestClass(const char message[]) {
        printf("hello %s \n", message);
    }
};

TestClass global("global");

int main(int argc, char **argv) {
    TestClass local("local");
    printf("in main\n");
    return 0;
}
When compiled with native gcc it outputs:
hello global
hello local
in main
When compiled with alchemy gcc it outputs:
hello local
in main
This problem breaks lots of code, notably UnitTest++ (which depends on globals being initialized to make its automatic test-list registration work).
I'd really like to get to the bottom of this. Is it a bug, or a feature that just didn't get implemented in time for the release? Is it possible to work around?
EDIT: A relevant post on the Adobe Forums is here.
I've run into the same problem. As far as I could tell, this seems to be the case:
Every static and global variable of class type will silently fail to be initialized if even a single class attempts dynamic allocation at any point during its initialization. Presumably this is because the ByteBuffer used for dynamic memory isn't yet available. I wish Alchemy were clearer with its error messages, because at the moment it's like a strand of old-timey Christmas lights, where one dead bulb shuts off the entire strand.
For a workaround, once you've discovered the offending object, you'll need to somehow defer its initialization to runtime. The three techniques that come to mind are pointers, lazy evaluation functions, or references to buffers initialized with placement new.
Pointers
// `global` is now a pointer
TestClass *global;

// all global variable initialization is gathered here now
void init_globals() {
    global = new TestClass("global");
}

int main(int argc, char **argv) {
    // this must be called explicitly at the start of the Alchemy app
    init_globals();
    // ...
}
You'll then need to refactor your code, changing every occurrence of `global` to `(*global)`.
Function
// `global` is now a function
TestClass& global() {
    // static locals are initialized the first time their function is called
    static TestClass global_("global");
    return global_;
}
Now you need to replace every occurrence of `global` with `global()`. Notably, this is the only one of the three techniques that doesn't require an explicit `init_globals` call. I recommend this approach unless renaming `global` to `global()` is troublesome for some reason... in which case:
Placement new
#include <new>  // required for placement new

// a raw memory buffer large enough to hold a TestClass object
// (note: strictly speaking, it should also be aligned for TestClass)
unsigned char global_mem[sizeof(TestClass)];

// `global` is now a reference into that buffer
TestClass& global = *(TestClass*)(void*)global_mem;

void init_globals() {
    // this constructs a TestClass object inside our memory buffer
    new (global_mem) TestClass("global");
}

int main(int argc, char **argv) {
    init_globals();
    // ...
}
The advantage of this approach is that you don't need to change any other code, as `global` is still just called `global`. Unfortunately, maintaining an `init_globals` function can be troublesome.
Edit:
As discovered in a later question, in addition to dynamic allocation, functions containing static locals also cannot be called during Alchemy's initialization.