Just a general programming question: when you define a value in C (or any language, I suppose), how does the compiler know how to handle the value? For instance:
#define CountCycle 100000
My guess would be that CountCycle gets a long integer type, but that's just an assumption. I suppose it could also be a float or a double (and not an int, since an int might only go up to ~32k), etc.
How does the compiler choose the data type for a #define'd value? I don't have a specific application that needs this answered; I'm just curious.
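To illustrate what I mean, here is a small sketch (assuming a C11 compiler; the _Generic probe is just one way I imagine you could ask the compiler what type it actually picked for the constant):

#include <stdio.h>

#define CountCycle 100000

int main(void) {
    /* After preprocessing, CountCycle is just the literal 100000,
       so whatever type the compiler gives that literal is what I'm
       asking about. _Generic (C11) selects a string based on the
       type of the expression, which lets us print it. */
    const char *type_name = _Generic(CountCycle,
        int:           "int",
        long:          "long",
        long long:     "long long",
        unsigned int:  "unsigned int",
        unsigned long: "unsigned long",
        double:        "double",
        default:       "something else");

    printf("CountCycle has type: %s\n", type_name);
    return 0;
}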