Should I use constants over definitions?

In C, should I prefer constants over #defines? I've been reading a lot of code lately, and all the examples seem to use #define heavily.

+42
c c-preprocessor const
Feb 22 '10
10 answers

No, in general you should not use const objects in C to create named constants. To create a named constant in C, you must use either a macro ( #define ) or an enumeration. In fact, the C language has no constants in the sense that you seem to mean. (C in this respect is significantly different from C++.)

In C, the concepts of constant and constant expression are defined very differently than in C++. A constant in C means a literal value, such as 123 . Here are some examples of constants in C:

 123 34.58 'x' 

Constants in C can be used to build constant expressions. However, since const objects of any type are not constants in C, they cannot be used in constant expressions, and therefore you cannot use const objects where constant expressions are required.

For example, the following is not a constant

 const int C = 123; /* C is not a constant!!! */ 

and since C above is not a constant, it cannot be used to declare an array type at file scope:

 typedef int TArray[C]; /* ERROR: constant expression required */ 

It cannot be used as a case label

 switch (i) { case C: ; /* ERROR: constant expression required */ } 

It cannot be used as a bit field width.

 struct S { int f : C; /* ERROR: constant expression required */ }; 

It cannot be used as an initializer for an object with a static storage duration.

 static int i = C; /* ERROR: constant expression required */ 

It cannot be used as an enum initializer.

 enum { E = C /* ERROR: constant expression required */ }; 

i.e. it cannot be used anywhere a constant expression is required.

This may seem counter-intuitive, but that is how C is defined.

That's why you see so many #defines in the code you are working with. Again, in C an object declared const has very limited use; const objects are basically useless as "constants", so in C you essentially have to use #define or enumerations to declare true constants.

Of course, in situations where a const object works for you, i.e. it does what you want, it does surpass macros in many ways, since it is scoped and typed. You should prefer const objects where applicable, but in general you have to keep the above limitations in mind.

+92
Feb 22 '10

Constants should be preferred over #defines. There are several advantages:

  • Type safety. While C is a weakly typed language, using #define loses all type safety; a const keeps its type, which lets the compiler pick up problems for you.

  • Ease of debugging. You can inspect and change the value of a const through a debugger, while a #define is replaced in the code by the preprocessor with the actual value, which means that if you want to change the value for testing/debugging purposes you have to re-compile.

+9
Feb 22 '10

I may be using them incorrectly, but at least in gcc, you cannot use const variables as case labels:

 const int A=12; switch (argc) { case A: break; } 
+7
Feb 22 '10 at 1:08

Many people here give you "C++ style" advice. Some even say the C++ arguments apply to C. That may be a fair point. (Whether it is or not feels subjective.) People who point out that const means something different in the two languages are also correct.

But these are mostly minor points, and personally I think there are relatively minor consequences either way. It's a question of style, and I think different groups of people will give you different answers.

In terms of general usage, historical usage, and the most common style in C, it is more usual to see #define . Using C++-isms in C code can look just as weird to some narrow segment of C coders. (Including me, so my biases lie there.)

But I am surprised that no one has proposed the middle-ground solution that "feels right" in both languages: if it fits into a group of integer constants, use enum .

+5
Feb 22 '10 at 2:11

Although this question is specific to C, I think it's good to know this:

 #include <stdio.h> int main(void) { const int CON = 123; int *A = &CON; /* constraint violation: C compilers typically only warn, C++ rejects it */ (*A)++; printf("%d\n", CON); /* may print 124 in C, though this is undefined behavior */ return 0; } 

This compiles in C (possibly with only a warning) but not in C++; note that modifying a const object is undefined behavior in both languages.

One of the reasons to use #define is to avoid such things messing up your code, especially in a mix of C and C++.

+5
May 12 '10 at 4:47 p.m.

#define can be used for many purposes (it is very loose), so it should be avoided whenever you can replace it with const , which defines a variable, and you can do a lot more with a variable.

#define is still the right tool in cases such as the following:

  • conditional-compilation switches
  • source-text (string) substitution
  • code macros

An example of when you should use #define over const is when you have, say, version number 3 and want version 4 to include some functions that are not available in version 3:

 #define VERSION 4 ... #if VERSION==4 ................ #endif 
+3
Feb 22 '10 at 1:08

#defines have been part of the language longer than constants, so a lot of old code will use them, because #define was the only way to get the job done when the code was written. For more recent code, it may simply be a matter of the programmer's habit.

Constants have a type as well as a value, so they would be preferred when it makes sense for your value to have a type, but not when it is untyped (or polymorphic).

+2
Feb 22 '10 at 1:01

If the value is not determined programmatically, I use #define . For example, if I want all the objects in my UI to have the same spacing between them, I might use #define kGUISpace 20 .

+1
Feb 22 '10 at 0:59

Besides the excellent reasons AndreyT gives for using #defines rather than constants in C code, there is another pragmatic reason to use #defines.

#defines are easy to spot and to use from header files (.h), which is where any experienced C coder will look for constants. Defining consts in header files is not so easy: you need extra code to avoid duplicate definitions, etc.

Also, the "type safety" arguments are debatable. Most compilers will catch blatant errors, such as mixing a string and an int, and "do the right thing" with a slight mismatch, such as assigning an integer value to a float.

+1
Feb 22 '10

Macros ( #defines ) can be used in preprocessor directives, while const variables cannot, since they do not exist until compile time.

You can perform compile-time checks to make sure a macro is in a valid range (and fail the build with #error if it is not). You can give macros default values if they are not already defined. You can use a macro as an array size.

The compiler can optimize with macros better than it can with constants:

 const int SIZE_A = 15; #define SIZE_B 15 for (i = 0; i < SIZE_A + 1; ++i); // if not optimized, may load SIZE_A and add 1 on each pass for (i = 0; i < SIZE_B + 1; ++i); // the compiler replaces "SIZE_B + 1" with 16 

Most of my work is with embedded processors that do not have terrific optimizing compilers. Perhaps gcc will treat SIZE_A as a constant at some optimization level.

+1
Feb 22 '10 at 6:18


