For Python, I recommend gmpy, the Python wrapper for GMP, the GNU Multiple Precision Arithmetic Library. While Python's built-in integers are arbitrarily large (limited only by memory), and that works fine for numbers of a few thousand digits, it is not well suited to working with millions of digits: CPython does not use modern algorithms for fast multiplication, base conversion, and other standard operations. gmpy, by contrast, is designed to handle numbers of this size.
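A minimal sketch of what using it looks like, assuming the gmpy2 package (the maintained successor of the original gmpy) is installed via pip install gmpy2:

```python
# Assumes gmpy2 is installed: pip install gmpy2
from gmpy2 import mpz

# mpz behaves like Python's int (same operators, mixes freely with int),
# but arithmetic and string conversion are backed by GMP's fast algorithms.
n = mpz(10) ** 100 + 1       # a 101-digit number

print(n % 7)                 # the usual integer operators all work
print(len(str(n)))           # decimal conversion; 10**100 + 1 has 101 digits
```

The key point is that the code is a drop-in replacement: wrap the initial values in mpz and the rest of the integer arithmetic is unchanged, while the underlying algorithms scale much better for huge operands.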
The following sample timings show that even at several thousand digits, gmpy is significantly faster than Python's built-in longs:
$ python -m timeit -s "from gmpy import mpz" "str(mpz(10)**10000)"
1000 loops, best of 3: 575 usec per loop
$ python -m timeit "str(10**10000)"
100 loops, best of 3: 11.8 msec per loop
Aside: at one point, one of the CPython core developers tried to replace Python's long integer implementation with one that used GMP directly. It turned out that this actually slowed Python down for everyday, non-huge integer use cases. See http://bugs.python.org/issue1814 for more details.