I need to generate 512-bit BigInts, but I'm not sure which of the two interpretations below is correct:
Does 512-bit mean 512 binary digits, like 1010101010...001010, which are then interpreted as the decimal number they represent?
Or does it mean 512 decimal digits (0-9), i.e. a 512-digit base-10 number, something like 12414124124 .... 54543 with 512 digits?
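To show what I mean by the first interpretation, here is a rough sketch in Node/TypeScript (the use of crypto.randomBytes and the byte-to-BigInt loop are just my own illustration, not necessarily how it should be done):

```ts
import { randomBytes } from "crypto";

// 512 bits = 64 bytes of random data
const bytes = randomBytes(64);

// Interpret the 64 bytes as one unsigned BigInt
let n = 0n;
for (const b of bytes) {
  n = (n << 8n) | BigInt(b);
}

console.log(n.toString(2).length);  // at most 512 binary digits
console.log(n.toString(10).length); // only about 154-155 decimal digits
```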