Is there a way to work efficiently with very large integers (millions or billions of digits)? The operations I need to perform are simply +, -, *, and possibly /.
I am looking for algorithms to perform the above operations in a reasonable amount of time (for example, up to 1 hour on a modern PC). I am not opposed to using any type of representation for numbers, but if I need a different representation for each operation, then the conversion between different representations should also be done in a reasonable amount of time.
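For reference, here is the kind of operation I mean, as a minimal sketch using Python's built-in arbitrary-precision integers (my own illustration; the specific numbers are just placeholders):

```python
import time

# Two million-digit integers (all nines, minus a small offset).
a = 10**1_000_000 - 1
b = 10**1_000_000 - 3

t0 = time.perf_counter()
c = a * b  # the multiplication I want to scale to billions of digits
elapsed = time.perf_counter() - t0

# Report the size in bits rather than converting to a decimal string,
# since the string conversion itself is expensive at these sizes.
print(c.bit_length(), f"bits in {elapsed:.2f}s")
```

At a million digits this finishes quickly on my machine, but I do not see how to push the same approach to billions of digits.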
All the big-number libraries I have looked at break down completely at these sizes. Is this a sign that no such algorithms exist, or are these libraries' representations/implementations simply not optimized for numbers this large?
EDIT: A limit of 1 hour is probably not achievable. I chose that number because a simple loop of a billion iterations should take less time, and I was hoping for an algorithm that runs in O(n) time. Is a 24-hour limit more reasonable?