Is the first bit called bit 1 or bit 0?

In this example, I am creating an integer:

int example = 0; 

Now let's say I want to refer to the first bit of this integer. I know it is the bit at position 0, but would I call it bit 1 or bit 0? The reason I'm asking is that I saw documentation where the first bit of an integer value was labeled bit 0 in one place and bit 1 in another. I assume that is a mistake on their end; I'm just curious what the correct convention is.

2 answers

In most cases, the least significant bit is called bit 0.

However, it really depends on the context in which you ask. I have worked on two different (interconnected) systems where one set of documentation called the low-order bit bit 1 and the other called it bit 0. Talk about confusion! The important thing is to always state which convention you are using when you document something.

This is usually called "zero-indexed". So if the low-order bit is called "bit zero", the bit numbering is zero-indexed.

Personally, I always refer to the low-order bit as bit zero. With this convention, you can shift 1 left by n places to set the nth bit:

 x = 1 << 0;  /* 00000001b (bit 0 is on) */
 x = 1 << 4;  /* 00010000b (bit 4 is on) */

If you think in powers of two, 2**0 equals 1, so the lowest bit carries the value 2**0. That is a strong argument for numbering bits starting at 0.


Source: https://habr.com/ru/post/906523/
