I understand the concept of static local variables pretty well: global lifetime, local scope. Likewise, I understand that automatic variables are allocated and deallocated automatically as program flow enters and leaves the variable's scope.
    #include <stdio.h>

    void test_var(void)
    {
        static unsigned foo = 0;
        unsigned bar = 0;
        printf(" %u %u\n", foo++, bar++);
    }

    int main(void)
    {
        printf("Foo Bar\n");
        printf("--- ---\n");
        for (unsigned x = 0; x < 10; x++) {
            test_var();
        }
        return 0;
    }
Thus, the previous example behaves as expected and outputs the following result:
    Foo Bar
    --- ---
      0   0
      1   0
      2   0
      3   0
      4   0
      5   0
      6   0
      7   0
      8   0
      9   0
What confuses me is how the variables behave when not initialized:
    #include <stdio.h>

    void test_var(void)
    {
        static unsigned foo;  /* not initialized */
        unsigned bar;         /* not initialized */
        printf(" %u %u\n", foo++, bar++);
    }

    int main(void)
    {
        printf("Foo Bar\n");
        printf("--- ---\n");
        for (unsigned x = 0; x < 10; x++) {
            test_var();
        }
        return 0;
    }
Output:
    Foo Bar
    --- ---
      0   2
      1   3
      2   4
      3   5
      4   6
      5   7
      6   8
      7   9
      8  10
      9  11
So the static variable behaves as expected: it gets the default value of 0 and keeps its value across function calls. But the automatic variable also appears to be preserved: although it starts with a garbage value, it increases with every call.
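To probe whether bar is really being preserved, here is a small experiment I could run. This is only a sketch: clobber_stack is a hypothetical helper I made up to disturb the stack between calls, and since bar is never initialized, the standard does not guarantee any particular output here.

    #include <stdio.h>

    /* Hypothetical helper: it uses its own automatic variables, so it may
       reuse and overwrite the stack slot that test_var's uninitialized bar
       happened to occupy on the previous call. */
    void clobber_stack(void)
    {
        volatile unsigned scratch[4] = {100, 200, 300, 400};
        (void)scratch;
    }

    void test_var(void)
    {
        static unsigned foo;  /* guaranteed to start at 0 */
        unsigned bar;         /* indeterminate; not guaranteed to persist */
        printf(" %u %u\n", foo++, bar++);
    }

    int main(void)
    {
        printf("Foo Bar\n");
        printf("--- ---\n");
        for (unsigned x = 0; x < 10; x++) {
            test_var();
            clobber_stack();  /* disturb the stack between calls */
        }
        return 0;
    }

As far as I can tell, whether bar still appears to "count up" after this change is not something the standard promises either way, which is exactly what I am unsure about.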
Is this because the behavior is undefined by the C standard, or is there a set of rules in the standard that explains it?