Even though x and &x evaluate to the same pointer value, they have different types. The type of x after it decays to a pointer is int*, while the type of &x is int (*)[4].
sizeof(x) is sizeof(int)*4.
Therefore, the numerical difference between &x and &x + 1 is sizeof(int)*4.
It can be better visualized using a 2D array. Say you have:
int array[2][4];
Memory layout for array :
array
|
+---+---+---+---+---+---+---+---+
|   |   |   |   |   |   |   |   |
+---+---+---+---+---+---+---+---+

array[0]        array[1]
|               |
+---+---+---+---+---+---+---+---+
|   |   |   |   |   |   |   |   |
+---+---+---+---+---+---+---+---+
If you use a pointer to such an array,
int (*ptr)[4] = array;
and look at the memory through a pointer, it looks like this:
ptr             ptr+1
|               |
+---+---+---+---+---+---+---+---+
|   |   |   |   |   |   |   |   |
+---+---+---+---+---+---+---+---+
As you can see, the difference between ptr and ptr+1 is equal to sizeof(int)*4. This analogy applies to the difference between &x and &x + 1 in your code.
R Sahu Nov 25 '15 at 5:45