The problem here is false pointers. D's garbage collector is conservative, meaning it does not always know what is a pointer and what is not. It sometimes has to assume that a bit pattern which would point into GC-allocated memory, if interpreted as a pointer, is in fact a pointer. This is mostly a problem for large allocations, since big blocks are a bigger target for false pointers.
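As a rough illustration (not code from the question), a conservative collector cannot distinguish an ordinary integer from a genuine pointer when its bit pattern happens to land inside a live-looking block:

```d
import core.memory;

void main()
{
    // A large GC allocation. Under conservative scanning, any word in
    // scanned memory whose bit pattern lands inside this block pins it.
    auto big = new ubyte[](48 * 1024 * 1024);

    // An ordinary integer that happens to equal an address inside `big`.
    // The collector must treat it as a possible pointer, so the whole
    // 48 MB block stays alive while this value is reachable.
    size_t lookalike = cast(size_t) big.ptr + 4096;

    GC.collect();            // `big` survives: `lookalike` pins it
    assert(lookalike != 0);  // keep the value live past the collection
}
```

On a 32-bit address space, a random 4-byte word has a non-trivial chance of falling inside a 48 MB block, which is why large blocks are disproportionately affected.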
You allocate about 48 MB each time you call testA(). In my experience, that is enough to practically guarantee a false pointer into the block on a 32-bit system. You will probably get better results if you compile your code in 64-bit mode (supported on Linux, OSX and FreeBSD, but not on Windows), since the 64-bit address space is much sparser.
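The question's testA() is not shown here; a minimal sketch of the pattern being described, assuming the allocation is a plain dynamic array, might look like:

```d
// Hypothetical reconstruction of the pattern described above:
// each call allocates roughly 48 MB that only the GC can reclaim.
void testA()
{
    // 12 million ints at 4 bytes each is about 48 MB per call.
    auto data = new int[](12_000_000);
    // ... use data ...
    // On return, `data` is garbage, but a single false pointer into the
    // block keeps all of it alive, especially on a 32-bit address space.
}
```

Compiling with `dmd -m64` on the platforms that support it makes a random word far less likely to fall inside any given block.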
As for my GC optimizations (I'm David Simcha, whom CyberShadow mentions), there were two batches of them. One landed more than six months ago and has caused no problems. The other is still a pending pull request and is not in the main druntime tree yet. Neither is likely to be your problem.
In the short term, the solution is to manually free these huge blocks. In the long term, we need to add precise scanning, at least for the heap. (Precisely scanning the stack is a much harder problem.) I wrote a patch to do this a couple of years ago, but it was rejected because it relied on templates and compile-time function evaluation to generate pointer-offset information for each data type. Hopefully this information will eventually be generated directly by the compiler, and I will be able to re-create my precise heap-scanning patch.
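A sketch of the short-term workaround using `core.memory` (the `GC.free` call is the real druntime API; the allocation itself is an assumed stand-in for the question's code). Note that any use of the slice after freeing is undefined behavior:

```d
import core.memory;

void testA()
{
    auto data = new int[](12_000_000);   // roughly 48 MB
    scope (exit)
    {
        // Return the block to the GC immediately instead of waiting for
        // a collection that a false pointer might prevent forever.
        GC.free(data.ptr);
        data = null;                     // don't keep a dangling slice
    }
    // ... use data ...
}
```

`scope (exit)` ensures the block is released on every path out of the function, which is the idiomatic way in D to pair a manual cleanup with an allocation.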