Why use hex for array indices

I recently used .NET Reflector to look at some DLL files, and I noticed that it uses hexadecimal rather than decimal for array indices.

    public Random(int Seed) {
        this.SeedArray = new int[0x38];
        int num2 = 0x9a4ec86 - Math.Abs(Seed);
        this.SeedArray[0x37] = num2;
        int num3 = 1;
        for (int i = 1; i < 0x37; i++) {
            int index = (0x15 * i) % 0x37;
            this.SeedArray[index] = num3;
            num3 = num2 - num3;
            if (num3 < 0) {
                num3 += 0x7fffffff;
            }
            num2 = this.SeedArray[index];
        }
        for (int j = 1; j < 5; j++) {
            for (int k = 1; k < 0x38; k++) {
                this.SeedArray[k] -= this.SeedArray[1 + ((k + 30) % 0x37)];
                if (this.SeedArray[k] < 0) {
                    this.SeedArray[k] += 0x7fffffff;
                }
            }
        }
        this.inext = 0;
        this.inextp = 0x15;
        Seed = 1;
    }

Why did the author use hex instead of decimal?
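
For reference, a minimal sketch (not part of the original post; the class name is just for illustration) that prints the decimal values of the hex literals above:

    using System;

    class HexEquivalents
    {
        static void Main()
        {
            // Decimal values of the hex literals in the decompiled constructor.
            Console.WriteLine(0x38);        // 56 (array length)
            Console.WriteLine(0x37);        // 55 (last index)
            Console.WriteLine(0x15);        // 21
            Console.WriteLine(0x9a4ec86);   // 161803398
            Console.WriteLine(0x7fffffff);  // 2147483647, i.e. int.MaxValue
        }
    }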

1 answer

Whether an integer literal was originally written in the source code as hexadecimal or decimal (or, say, ternary, if the programming language supports it) is not recorded in the IL. What you see is a guess made by the decompiler.
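
As a minimal sketch of why the decompiler has to guess (a small standalone C# program; the names are just for illustration), two literals written in different bases compile to exactly the same IL:

    using System;

    class LiteralBases
    {
        static void Main()
        {
            // Both locals compile to the same load instruction
            // (ldc.i4.s 55 in an unoptimized build); nothing in the
            // assembly records that one was written in hex.
            int writtenInDecimal = 55;
            int writtenInHex = 0x37;

            Console.WriteLine(writtenInDecimal == writtenInHex); // True
        }
    }

The decompiler therefore has to pick a base for display, and the option below simply changes that choice.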

In fact, you can see all these literals in decimal form by setting:

 View -> Options -> Disassembler -> Number Format : Decimal 

Source: https://habr.com/ru/post/1338132/

