This is a subjective call, but I think it worked out well. In my experience, when you write a literal in hex or binary you care about a specific bit pattern, and you usually want it unsigned; when you care only about a numerical value, you write it in decimal, because that is the notation we are most familiar with. Also, with hex or binary the number of digits you type is usually significant, while in decimal it is not. That is how literals work in Julia: a decimal literal gives you a signed integer of a type large enough to hold the value, while hex and binary literals give you an unsigned value whose storage size is determined by the number of digits.
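A quick sketch of the behavior at the REPL (the `Int64` results assume a 64-bit system, where `Int` is `Int64`):

```julia
# Decimal literals: signed, platform-word sized (Int64 on a 64-bit system).
typeof(10)        # Int64

# Hex literals: unsigned, sized by the number of digits written.
typeof(0x1)       # UInt8  (1-2 hex digits)
typeof(0x001)     # UInt16 (3-4 hex digits; leading zeros count)
typeof(0x00000001) # UInt32 (5-8 hex digits)

# Binary literals follow the same rule, counting bits.
typeof(0b101)        # UInt8  (up to 8 binary digits)
typeof(0b100000000)  # UInt16 (9 binary digits)
```

Note that leading zeros are significant for the type: `0x1` and `0x001` have the same value but different storage sizes, which is exactly the "number of digits matters" intuition described above.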