Invalid value generated from math equation using preprocessor directive

I have this preprocessor directive:

#define INDEXES_PER_SECTOR BYTES_PER_SECTOR / 4

where BYTES_PER_SECTOR is declared in another header file as:

#define BYTES_PER_SECTOR    64

I have this simple calculation which, after execution, triggers an assertion failure, because the value assigned to iTotalSingleIndexes is incorrect.

int iTotalSingleIndexes = (iDataBlocks - 29) / INDEXES_PER_SECTOR;

Now I believe that this is because of the INDEXES_PER_SECTOR preprocessor directive. When executing my equation, iDataBlocks is 285, which is correct. I confirmed this with gdb. The problem is that the value assigned to iTotalSingleIndexes is 1 when it should be 16. I really don't know why this is happening.

When I do something like:

int iIndexesInASector = INDEXES_PER_SECTOR;
int iTotalSingleIndexes = (iDataBlocks - 29) / iIndexesInASector;

the correct value is assigned to iTotalSingleIndexes.

Macros are not variables: the preprocessor performs plain textual substitution, with no evaluation and no implicit parentheses along the way. So when you write:

int iTotalSingleIndexes = (iDataBlocks - 29) / INDEXES_PER_SECTOR;

the preprocessor expands it, token for token, into:

int iTotalSingleIndexes = ( iDataBlocks - 29 ) / 64 / 4 ;

Since division is left-associative, that expression is parsed as:

int iTotalSingleIndexes = ((iDataBlocks - 29) / 64) / 4;

...which, with iDataBlocks = 285, evaluates to (256 / 64) / 4 = 1. As leppie said, parenthesize the macro body:

#define INDEXES_PER_SECTOR (BYTES_PER_SECTOR / 4)

so that INDEXES_PER_SECTOR always expands to a single value, 16.


Remember that the preprocessor performs simple textual substitution, so without the parentheses you get two successive divisions: (285 - 29) / 64 / 4 = 256 / 64 / 4 = 1, exactly the value you observed.

More generally, because macro expansion is purely textual, a function-like macro should parenthesize both its body and every use of its parameters. Consider:

#define X_PLUS_4(X)    X + 4

foo = 1;
y = 3 * X_PLUS_4(foo + 2) * 4; // naively expect y to be 84

This actually expands to:

y = 3 * foo + 2 + 4 * 4;  // with foo = 1, y is 3 + 2 + 16 = 21

With both the macro body and the parameter parenthesized, as they should be:

#define X_PLUS_4(X)    ((X) + 4)

the same expression expands to:

y = 3 * ((foo + 2) + 4) * 4;  // y is 3 * 7 * 4 = 84, as naively expected

Finally, unless you are doing heavy preprocessor metaprogramming (in the style of Boost), prefer const variables over #define for constants:

const int BytesPerSector = 64;
const int IndexesPerSector = BytesPerSector / 4;

These follow normal scope and type rules, and compilers constant-fold const ints, so there is no runtime cost.

