Why are the rules that govern casting between signed and unsigned integer types so inconsistent?

In C#, whether a cast between signed and unsigned integer types succeeds is affected by:

  • Whether a scalar or an array type is being converted.
  • Whether the variable is declared as object.

Consider the following code example:

// If the variable is declared as a byte array, casting it to sbyte[] results in a
// compile-time error.
byte[] byteArray = new byte[2];
var c = (sbyte[])byteArray; // Compilation error

// But if the variable is declared as object, we get neither a compile-time nor a
// run-time error
object byteArrayObject = new byte[2];
var a = (sbyte[])byteArrayObject;

// With an explicitly typed scalar, the byte -> sbyte type conversion succeeds with no errors
byte scalarByte = 255;
var b = (sbyte)scalarByte;

// But if the scalar is declared as an object, an InvalidCastException is thrown at run-time
object byteObject = (byte)4;
var e = (sbyte)byteObject; // InvalidCastException

Summarizing:

  • Array declared as byte[]: compile-time error
  • Array declared as object: succeeds
  • Scalar declared as byte: succeeds
  • Scalar declared as object: run-time exception

Although this example only uses byte and sbyte, the same pattern appears to hold for the other integer types. Can anyone explain why these results are so inconsistent?


The scalar cases come down to boxing: you can only unbox to the exact type that was boxed, not to some other type it is convertible to. Consider:

object i = 1;
var l = (long)i; //Runtime exception: unboxing an int to a long
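
A minimal sketch of the usual workaround: unbox to the exact boxed type first, then apply the numeric conversion.

object i = 1;
var l = (long)(int)i; // Unbox to int (the exact boxed type), then convert to long: no exception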

The array cases expose a difference between the C# language and the CLR; each has its own conversion rules, and they do not always agree. C# refuses to convert byte[] to sbyte[], but the CLR treats arrays of same-size integral types as interchangeable. Once the variable is declared as object, the compiler can no longer apply its rules, so the cast is checked by the CLR at run time and succeeds.
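
As a quick sketch of the CLR's point of view: once the static type is object, a run-time type test against sbyte[] succeeds for a byte[] instance, even though the object's actual type is still byte[].

object o = new byte[2];
Console.WriteLine(o is sbyte[]); // True: the CLR considers byte[] and sbyte[] compatible
Console.WriteLine(o.GetType()); // Prints System.Byte[]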

Arrays come up again in another place where the language is deliberately lax. C# permits array covariance for reference types, which is a known hole in the type system:

var strs = new string[1];
var objs = (object[])strs; //Compiles just fine.

That assignment compiles and runs, but the covariance is unsound, which is why every write into the array has to be checked at run time:

objs[0] = new object(); //Runtime ArrayTypeMismatchException: an object is not a string. Ouch!

That unsoundness is part of why generic type variance in C# (the in/out modifiers added in C# 4) is restricted to interfaces and delegates, where the compiler can prove it is safe. In short, the results look contradictory because two different rule sets are in play: when the static types are known, the C# conversion rules decide; when everything is hidden behind object, the CLR's rules decide.
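
For contrast, a short sketch of the variance C# does allow safely: IEnumerable&lt;out T&gt; is covariant, so a sequence of strings can be viewed as a sequence of objects, and because the interface only hands elements out, there is no write that could violate it.

using System.Collections.Generic;

IEnumerable<string> strings = new[] { "a", "b" };
IEnumerable<object> objects = strings; // Allowed: IEnumerable<out T> is covariant (read-only)
// Unlike the array case above, there is no way to write an object back through this view.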


Source: https://habr.com/ru/post/1662835/

