#define vs. variable

I do not understand the difference between:

#define WIDTH 10 

and

 int width = 10; 

What are the benefits of using the first or the second?

+6
5 answers

Well, there is a big difference. You can change the value of width, you can take its address, you can ask for its size, and so on. WIDTH, on the other hand, is simply replaced by the constant 10 everywhere, so the expression ++WIDTH makes no sense. Also, you can declare an array with WIDTH elements, while you cannot declare an array with width elements (at least not before C99 variable-length arrays).

To summarize: the value of WIDTH is known at compile time and cannot be changed, and the compiler does not allocate any memory for WIDTH. Conversely, width is a variable with an initial value of 10; its later values are not known at compile time, and the compiler allocates memory for it.
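A minimal sketch of these points, assuming a C99 compiler (the layout and names are only illustrative):

 #include <stdio.h>

 #define WIDTH 10

 int main(void)
 {
     int width = 10;

     width++;                          /* fine: width is an object and can be modified */
     printf("%p\n", (void *)&width);   /* fine: width has an address */
     printf("%zu\n", sizeof width);    /* fine: width has a size */

     int row[WIDTH];                   /* fine: WIDTH is just the constant 10 */
     (void)row;                        /* silence the unused-variable warning */

     /* ++WIDTH;  would expand to ++10 and not compile */
     /* &WIDTH;   would expand to &10  and not compile */
     return 0;
 }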

+4

What is the difference between the two?

The first is a macro, and the second is a variable declaration.

#define WIDTH 10 is a preprocessor directive that lets you give a name (WIDTH) some replacement text (10). The preprocessor scans the source file, and every occurrence of the name is replaced by the text associated with it. The compiler never sees the macro name at all; what it sees is the replacement text.
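A rough sketch of that replacement step (the file name rect.c is just for illustration):

 /* rect.c */
 #define WIDTH 10

 int area(int height)
 {
     return WIDTH * height;   /* the compiler never sees the name WIDTH here */
 }

Running only the preprocessor with gcc -E rect.c prints the translated source, in which the body of area already reads return 10 * height;.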

A variable declaration, on the other hand, is handled by the compiler itself. It tells the compiler to create a variable named width of type int and to initialize it with the value 10.
The compiler knows this variable by its name width.

Which one should you prefer, and why?

It is generally recommended to prefer compile-time constants over #define. So your variable declaration should be:

 const int width = 10; 

There are several reasons for choosing compile-time constants over #define, namely:

Scope-based mechanism:

The scope of a #define is limited to the file in which it is defined, so #defines created in one source file are NOT available in a different source file. In short, #define does not respect scope. const variables, on the other hand, can be scoped; they obey all the usual scoping rules (see the sketch after this list).


Avoiding weird magic numbers in compilation errors:

If you use #define, it is replaced by the preprocessor before compilation. So if you get an error at compile time, the message will be confusing, because it refers not to the macro name but to its value, which appears seemingly out of nowhere, and you could spend a lot of time tracking it down in the code.


Ease of debugging:

Also, for the same reasons mentioned in point 2, a #define is of no help while debugging, since the debugger only sees the substituted value.
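A short sketch of the scoping point from the list above (all names are illustrative):

 #define WIDTH 10           /* visible from this line to the end of the file, */
                            /* regardless of functions or braces              */

 void draw(void)
 {
     const int width = 10;  /* visible only inside draw() */
     (void)width;
 }

 void paint(void)
 {
     /* width is not visible here; referring to it is a compile-time error, */
     /* while WIDTH still expands to 10                                     */
 }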

+7

WIDTH is a macro that will be replaced with the value 10 by the preprocessor, while width is a variable.

When you #define a macro (WIDTH here), the preprocessor simply performs text replacement before handing the program to the compiler, i.e. wherever you use WIDTH in your code, it is simply replaced by 10.

But when you write int width = 10;, the variable is alive: it occupies memory and has an address.
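A small sketch of what "alive" means here (illustrative only): because width exists in memory and has an address, its value can even be changed at run time, which is impossible for WIDTH.

 #include <stdio.h>

 #define WIDTH 10

 int main(void)
 {
     int width = 10;

     if (scanf("%d", &width) == 1)            /* the variable can be written at run time */
         printf("width is now %d\n", width);

     printf("WIDTH is always %d\n", WIDTH);   /* the macro is fixed text: 10 */
     return 0;
 }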

+2

First, briefly: before the C file gets compiled, it is preprocessed. The preprocessor handles, among other things, the #include and #define directives.

In your case, the #define tells the preprocessor to replace every occurrence of WIDTH in the source code with the text 10. When the file is compiled in the next step, every place where WIDTH stood will actually contain 10. Now the difference between

 #define WIDTH 10 

and

 int width = 10; 

is that the first can be treated as a constant value, while the second is a normal variable whose value can be changed.
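A tiny sketch of exactly that difference (illustrative):

 #define WIDTH 10

 int main(void)
 {
     int width = 10;

     width = 20;        /* fine: a variable's value can be changed       */
     /* WIDTH = 20; */  /* would expand to 10 = 20; and fail to compile  */

     return width;
 }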

0

The #define is handled by the preprocessor: wherever it finds WIDTH in the source code, it replaces it with 10; all it does is plain text substitution. The other, int width = 10;, is handled by the compiler: it creates an entry in its symbol table and generates code that reserves enough memory (on the stack or elsewhere, depending on where the variable is defined) and copies the value 10 into that memory location.

So the first is nothing but a label for a constant, while the other is a variable that exists at run time.

Using the preprocessor can give slightly faster execution, since no variable has to be allocated on the stack for a value that never changes at run time.

Usually you use the preprocessor for things that do not need to change at run time, but be careful: macros can be tricky to debug, because they rewrite the source code before it reaches the compiler, which can lead to very subtle errors that may or may not be obvious when reading the source.
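One classic example of the kind of subtle error meant here (purely illustrative, not taken from the question):

 #define WIDTH 8 + 2      /* looks like 10 when read in the source...              */

 int half = WIDTH / 2;    /* expands to 8 + 2 / 2, which is 9, not the expected 5  */
                          /* defining it as (8 + 2) would avoid the surprise       */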

0

Source: https://habr.com/ru/post/918338/

