Why is the typeof of hex or binary literals Uint64, but decimal Int64?

julia> typeof(-0b111)
Uint64

julia> typeof(-0x7)
Uint64

julia> typeof(-7)
Int64

I find this result a bit surprising. Why does the number base determine whether the value is signed or unsigned?

+5
2 answers

This seems to be the expected behavior:

This behavior is based on the observation that when one uses unsigned hex literals for integer values, one is typically using them to represent a fixed numeric byte sequence, rather than just an integer value.

http://docs.julialang.org/en/latest/manual/integers-and-floating-point-numbers/#integers
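The "fixed byte sequence" framing in the quoted docs has a concrete consequence: the width of the literal, including leading zeros, determines its storage size. A minimal sketch (using the Julia 1.x spelling `UInt*`; the older releases in the question above printed `Uint64`):

```julia
# A hex literal is read as a fixed-width bit pattern, so even leading
# zeros count toward the chosen storage size.
println(typeof(0x1))        # UInt8  -- 1 hex digit
println(typeof(0x0001))     # UInt16 -- same value, 4-digit literal
println(typeof(0x00000001)) # UInt32 -- same value, 8-digit literal
```

The value is identical in all three cases; only the literal's digit count changes the type.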

...which still seems a slightly strange choice to me.

+4

This is a subjective call, but I think it worked out very well. In my experience, when you use hex or binary, you are interested in a specific bit pattern, and you usually want it to be unsigned. When you are interested only in the numeric value, you use decimal, because that is what we are most familiar with.

Also, when you use hex or binary, the number of digits you type is usually significant, while in decimal it is not. So that is how literals work in Julia: decimal gives you a signed integer of a type the value fits in, while hex and binary give you an unsigned value whose storage size is determined by the number of digits.
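The rules described above can be checked directly at the REPL. A hedged sketch, assuming Julia 1.x on a 64-bit machine (where the unsigned types are spelled `UInt*` rather than the older `Uint*` shown in the question):

```julia
# Decimal literals give a signed Int (Int64 on a 64-bit system).
println(typeof(7))      # Int64

# Hex and binary literals give an unsigned type sized by digit count.
println(typeof(0x7))    # UInt8  (up to 2 hex digits)
println(typeof(0x7777)) # UInt16 (4 hex digits)
println(typeof(0b111))  # UInt8

# Negating an unsigned literal stays unsigned: the result is the
# two's-complement wraparound, not a signed value.
println(-0x7)           # 0xf9
```

This also explains the question's surprising output: `-0b111` and `-0x7` keep their unsigned type, so negation wraps around instead of producing a negative number.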

+4

Source: https://habr.com/ru/post/1208544/
