Summing the first 100,000,000 positive integers with the following code:

import numpy as np
np.arange(1,100000001).sum()

returns 987459712, which does not match the formula N(N+1)/2 for N = 100000000. Namely, the formula gives 5000000050000000.
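For reference, a quick check with plain Python integers (which have arbitrary precision and so cannot overflow) confirms the value I expect from the formula:

N = 100000000
# Plain Python ints never overflow, so this is the exact closed-form value
print(N * (N + 1) // 2)  # 5000000050000000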
Before posting, I wrote the following, which returns True:

np.arange(1,65536).sum() == ((65535+1) * 65535)/2
However, the number 65536 seems to be a critical point, since

np.arange(1,65537).sum() == ((65536+1) * 65536)/2

returns False.
For integers above 65536, the code returns False, while for integers below this threshold it returns True.
Can someone explain what I did wrong in the calculation, or what is happening in the code?