I have been reading about buffer overflows, and I noticed something strange about how memory is allocated for local variables on the stack:
void f1(void) { char string1[12]; char string2[4]; }
Here the allocation takes place on the stack.
Now in GCC, string2 gets exactly 4 bytes, but if I declare a size that is not a power of 2 (up to 16), the compiler allocates 16 bytes. That is, if I declare string2 with 3, 5, 6, 7, ..., or 15 bytes, the compiler allocates 16 bytes, whereas for a power of 2 such as 1, 2, 4, or 8 it allocates exactly the declared size. If I go above 16 bytes (and the size is not a power of 2), it allocates 32 bytes (I assume this holds up to 32 bytes).
In Visual Studio, on the other hand, the compiler allocates 9 bytes for a 1-byte array, 12 bytes for 2-4 bytes, and 16 bytes for 5-8 bytes.
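As a rough way to look at this (my own sketch, not taken from any compiler documentation: the exact numbers depend on compiler, flags and platform, and the address gap between two locals is only a hint, since their relative layout is not guaranteed), one can print the addresses of adjacent locals, or better, check the frame setup in the generated assembly with gcc -S -O0:

    #include <stdio.h>
    #include <stdint.h>

    void f1(void)
    {
        char string1[12];
        char string2[4];

        /* The gap between the two arrays hints at padding the compiler
           inserted; the total reserved space is visible in the assembly
           (e.g. the "sub ..., %rsp" in the prologue on x86-64). */
        uintptr_t a = (uintptr_t)(void *)string1;
        uintptr_t b = (uintptr_t)(void *)string2;
        printf("string1 at %p, string2 at %p, gap = %lu bytes\n",
               (void *)string1, (void *)string2,
               (unsigned long)(a > b ? a - b : b - a));
    }

    int main(void)
    {
        f1();
        return 0;
    }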
Does anyone know why the compilers allocate memory this way?
At least in Visual Studio I get a debug error when a buffer overflow occurs, but nothing happens in GCC. GCC only gives a segmentation fault when the overflow is large enough.
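To make the difference concrete (the flags mentioned are the usual ones as far as I know: MSVC's /RTC1 and /GS checks in a debug build versus GCC's -fstack-protector family, which is not necessarily enabled by default), here is a minimal, deliberately broken sketch; writing past the buffer is undefined behavior, so any particular outcome is just what the toolchain happens to do:

    #include <string.h>

    void f2(void)
    {
        char string2[4];
        /* 20 bytes written into a 4-byte buffer: undefined behavior.
           MSVC with /RTC1 or /GS typically reports the corruption at
           runtime; GCC by default stays silent unless the write reaches
           an unmapped page (segfault). Building with
               gcc -fstack-protector-all overflow.c
           adds a canary, so GCC will usually abort with a
           stack-smashing report when f2 returns. */
        memset(string2, 'A', 20);
    }

    int main(void)
    {
        f2();
        return 0;
    }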
Arpit