Why can't primitives be null?

I was wondering why primitives can't be null.

Today I read this:

The null keyword is a literal that represents a null reference, one that does not refer to any object. null is the default value of reference-type variables. Ordinary value types cannot be null. However, nullable value types were introduced in C# 2.0. See Nullable Types (C# Programming Guide).

So I know there are nullable "primitives", such as int? in C# or Integer in Java, but why is there no int or bool, etc. that can be null directly?

This question is not aimed at any particular programming language, but are there differences between languages here? Why do they generally not allow primitives to be null?

+4
6 answers

I'll try to explain this with C++:

In C++, an integer, boolean, etc. is stored directly in memory, so an integer variable uses just four bytes for its value. If you want an integer variable that can be null, you have to create a pointer to the memory where the value is stored: if the pointer is null (0x0), it points to nothing; if it is non-null, it points to the actual integer value.

If you want to add two integers, you can do it with a single assembly instruction, because the values can be handed directly to the processor.
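To make the indirection concrete, here is a minimal C# sketch of the same idea (C#, since that is what the question quotes; in C#/Java the indirection is boxing rather than a raw pointer): a boxed int is reached through a reference that can either be null or point to a heap-allocated value.

using System;

// A plain int lives directly in its variable: four bytes, always holding a value.
int direct = 42;

// A boxed int is reached through a reference, like the C++ pointer above:
// the reference can be null ("no value") or point to an int on the heap.
object boxed = null;            // no value yet
boxed = 42;                     // boxing: the reference now points to a heap int

if (boxed != null)
{
    int unboxed = (int)boxed;   // follow the reference to recover the value
    Console.WriteLine(unboxed); // prints 42
}

Console.WriteLine(direct + 1);  // plain values go straight into arithmetic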

- The question was about C#/Java, though. C#/Java code runs on a virtual machine, so values are not accessed "directly" in quite the same way.

+4

null is a reference (pointer) value. In Java, everything that inherits from Object is a reference type and can therefore be null (for example, Integer). Primitives, on the other hand, are stored as plain values rather than references, so they cannot be null.

Java keeps primitives this way for performance. If you need a nullable value, use the corresponding wrapper class.

+6

null is a reference value: a reference that points to no object. A primitive variable is not a reference; it holds its value directly.

Because of that, a primitive variable always contains some value of its type; there is simply no representation for null in it.
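A minimal C# illustration of this distinction (the CS0037 error code is what current C# compilers report):

using System;

string s = null;    // fine: a reference that points to no object
int i = 0;          // fine: the variable itself holds the value

// int j = null;    // does not compile: error CS0037,
//                  // "Cannot convert null to 'int' because it is a non-nullable value type"

Console.WriteLine(s == null); // True
Console.WriteLine(i);         // 0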

+4

Not exactly an answer to your question, but a related curiosity: the default value of char is '\u0000', the null character, so in a sense a char can be "null"... :) though that is not the same thing as a null reference :)
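The same is true in C#, where the default value of char is also '\u0000'. A quick check (a minimal sketch):

using System;

char c = default;                 // '\u0000', the NUL character
Console.WriteLine(c == '\u0000'); // True
Console.WriteLine((int)c);        // 0 - a real character value, not a null reference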

0

It comes down to memory. A primitive is mapped directly onto a fixed number of bytes; integer types in C#, for example, take 1, 2, 4, or 8 bytes, and every bit pattern of those bytes encodes a valid value, so no pattern is left over to mean null.

A nullable type therefore has to carry extra information alongside the value, which costs memory and a little speed.

So it is a trade-off: primitives stay fast and compact, and you pay extra for nullability :)
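One way to see the cost in C# (a sketch; sizeof(int) is fixed at 4 by the language, while the size of int? is a runtime detail, typically 8 bytes on current .NET):

using System;
using System.Runtime.CompilerServices;

Console.WriteLine(sizeof(int));           // 4: every bit encodes the value
Console.WriteLine(Unsafe.SizeOf<int?>()); // typically 8: the value plus a HasValue flag, padded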

0

As others have said, primitive types such as int, bool, etc. hold their values directly, so there is no way for them to be null.

In C#, if you declare an int (a value type):

int someInt;

its value defaults to 0. It can never be null.

Whereas with reference types:

string someString;
SomeClass someObject;

both default to null.
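A short sketch tying this together, including the nullable value types added in C# 2.0 (int? is shorthand for Nullable<int>):

using System;

Console.WriteLine(default(int));             // 0     - value type, always a value
Console.WriteLine(default(string) == null);  // True  - reference type, defaults to null

int? someNullableInt = null;                 // a value type with an explicit null state
Console.WriteLine(someNullableInt.HasValue); // False
someNullableInt = 5;
Console.WriteLine(someNullableInt.Value);    // 5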

0

Source: https://habr.com/ru/post/1611570/

