In C#, the ability to cast between signed and unsigned integer types depends on two factors:
- Whether a scalar or an array is being converted.
- Whether the variable is declared as its concrete type or as object.
Consider the following code example:
byte[] byteArray = new byte[2];
var c = (sbyte[])byteArray;        // compile-time error CS0030

object byteArrayObject = new byte[2];
var a = (sbyte[])byteArrayObject;  // succeeds at runtime

byte scalarByte = 255;
var b = (sbyte)scalarByte;         // succeeds; b == -1

object byteObject = (byte)4;
var e = (sbyte)byteObject;         // throws InvalidCastException at runtime
Summarizing:
- Array declared as byte[]: compile-time error
- Array declared as object: cast succeeds
- Scalar declared as byte: cast succeeds
- Scalar declared as object: InvalidCastException at runtime
Although this example only considers byte and sbyte, the same pattern appears to hold for the other integer types (e.g. int/uint). Can anyone explain why these results are so inconsistent?