Are globals broken in Alchemy?

It seems that Adobe Alchemy does not handle global constructors. Here is some simple test code:

#include <stdio.h>

class TestClass {
public:
    TestClass(const char message[]) {
        printf("hello %s \n", message);
    }
};

TestClass global("global");

int main(int argc, char **argv) {
    TestClass local("local");
    printf("in main\n");
    return 0;
}

When compiling with native gcc, it produces:

hello global
hello local
in main

When compiling with the Alchemy gcc, it produces:

hello local
in main

This problem breaks a lot of code, in particular UnitTest++, which relies on global variables being initialized for its automatic test registration to work.
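
To see why, here is a minimal sketch (not UnitTest++'s actual code; the names are made up) of the self-registration pattern such frameworks rely on: a global object whose constructor adds a test to a registry before main runs, so if global constructors never execute, the registry stays empty.

#include <cstddef>
#include <cstdio>
#include <vector>

typedef void (*TestFunc)();

// global registry of tests, filled in by global constructors
std::vector<TestFunc>& registry() {
    static std::vector<TestFunc> tests;
    return tests;
}

// a global instance of this struct registers one test at static-init time
struct TestRegistrar {
    TestRegistrar(TestFunc f) { registry().push_back(f); }
};

void my_test() { printf("running my_test\n"); }
TestRegistrar my_test_registrar(my_test);  // this constructor never runs under Alchemy

int main() {
    // if global constructors were skipped, registry() is empty and no tests run
    for (std::size_t i = 0; i < registry().size(); ++i)
        registry()[i]();
    return 0;
}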

I would really like to get to the bottom of this. Is this a bug, or a feature that didn't make it in time for the release? Is there a way to work around it?

EDIT: Relevant post on the Adobe forums here.

1 answer:

Yes, global constructors are not run under Alchemy. I don't know the exact reason; if I remember correctly, it has something to do with how Alchemy keeps the C/C++ memory (and with it the globals) in a ByteBuffer, and the static initialization code simply never gets executed before main.

In any case, there are ways to work around it. The usual one is to turn the globals into pointers and initialize them yourself in a function that you call explicitly at the start of main:

// `global` is now a pointer
TestClass *global;

// all global variable initialization goes here now
void init_globals() {
  global = new TestClass("global");
}

int main(int argc, char **argv) {
  // under Alchemy this has to be called explicitly at the start of main
  init_globals();
  // ... rest of main, using the global through the pointer ...
  return 0;
}

Then, wherever you would have used global, you write (*global) instead.

Another option is to replace the global object with a function that returns a reference to a function-local static:

// `global` is now a function
TestClass& global() {
  // static locals are initialized when their functions are first called
  static TestClass global_("global");
  return global_;
}

Every use of global then becomes global(). The nice thing about this version is that you don't have to remember to call an init_globals function; the object is constructed the first time global() is called. If writing global() everywhere bothers you, you can get the plain name back by combining a static buffer with placement new, at the cost of needing init_globals again:

#include <new>  // needed for placement new

// a memory buffer large enough to hold a TestClass object
// (this assumes the buffer is suitably aligned for TestClass)
unsigned char global_mem[sizeof(TestClass)];
// `global` is now a reference into that buffer
TestClass& global = *(TestClass*)(void*)global_mem;

void init_globals() {
  // this constructs a TestClass object inside our memory buffer
  new (global_mem) TestClass("global");
}

int main(int argc, char **argv) {
  init_globals();
  // ... rest of main, using global directly ...
  return 0;
}
With this, all the code that uses global can keep referring to it simply as global, with no dereference and no parentheses. The downside is that, as with the first approach, you have to remember to call init_globals at the start of main.
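
A caveat worth keeping in mind with all of these approaches: the objects are never destroyed automatically, so if the destructor matters you have to run it yourself. A minimal sketch for the placement-new version (destroy_globals is just an illustrative name), assuming the global_mem / global declarations above:

void destroy_globals() {
  // objects created with placement new must be destroyed explicitly
  global.~TestClass();
}

int main(int argc, char **argv) {
  init_globals();
  // ... rest of main ...
  destroy_globals();
  return 0;
}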




Source: https://habr.com/ru/post/1774631/

