Wrong sum of consecutive integers in numpy

Summing the first 100,000,000 positive integers using the following:

import numpy as np
np.arange(1,100000001).sum()

This returns 987459712, which does not match the formula N(N+1)/2 for N = 100000000. Namely, the formula returns 5000000050000000.

Before posting, I wrote the following, which returns True:

np.arange(1,65536).sum() == ((65535+1) * 65535)/2

However, the number 65536 seems to be a critical point, since

np.arange(1,65537).sum() == ((65536+1) * 65536)/2

returns False.

For endpoints above 65536 the comparison returns False, while endpoints below this threshold return True.

Can someone explain what I did wrong in calculating the sum, or what is happening in this code?


I can reproduce this; numpy is silently overflowing a fixed-width integer type.

On Win 10 64-bit, Python 3.4.4, numpy 1.13.1:

>>> np.arange(1, 100000001).sum()
987459712
>>> np.arange(1, 100000001).dtype
dtype('int32')
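
In fact, 987459712 is exactly the true sum wrapped around modulo 2**32, which is the signature of 32-bit overflow; a quick check in plain Python:

>>> 5000000050000000 % 2**32
987459712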

"" numpy, :

>>> np.arange(1, 100000001, dtype=np.int64).sum()
5000000050000000

As you can see, without an explicit dtype the sums were overflowing the 32-bit integer type.


numpy's default integer type is the same as the C long, as noted in the numpy docs:

int_: Default integer type (same as C long; normally either int64 or int32)

Since C longs are 32-bit even on 64-bit Windows, the array you create defaults to int32.
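
You can check what your own platform defaults to; for example (output shown for a 32-bit-long platform such as Windows; most 64-bit Unix systems print int64):

>>> np.arange(3).dtype
dtype('int32')
>>> np.iinfo(np.int32).max
2147483647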

As DeepSpace suggested, explicitly setting the dtype to int64 solves the problem. You can pass it either to arange or to sum, as shown below.
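
A minimal sketch of both options; either one makes the accumulator 64-bit, so the sum no longer wraps:

>>> np.arange(1, 100000001, dtype=np.int64).sum()
5000000050000000
>>> np.arange(1, 100000001).sum(dtype=np.int64)
5000000050000000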


As for the threshold you found: you noted that the following returns True:

np.arange(1,65536).sum() == ((65535+1) * 65535)/2

while 65536 is a critical point, since

np.arange(1,65537).sum() == ((65536+1) * 65536)/2

returns False.

This is because the first sum still fits below the int32 maximum, while the second exceeds it:

>>> np.arange(1,65536).sum() < np.iinfo(np.int32).max
True
>>> np.arange(1,65537).sum() < np.iinfo(np.int32).max
False
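
The exact values make the boundary obvious (computed here with plain Python integers via the closed-form formula N(N+1)/2):

>>> 65535 * 65536 // 2    # sum of 1..65535, just under the int32 ceiling
2147450880
>>> 65536 * 65537 // 2    # sum of 1..65536, past the ceiling
2147516416
>>> np.iinfo(np.int32).max
2147483647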

The right-hand sides of your comparisons never overflow, because they are computed with plain Python 3 ints, which have arbitrary precision.
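
That is also why a pure-Python version of the whole computation returns the correct value, if more slowly:

>>> sum(range(1, 100000001))
5000000050000000
>>> 100000000 * (100000000 + 1) // 2
5000000050000000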


Edit: on Unix-like 64-bit systems the default is int64 (because the C long is 64 bits there), so these ints do not overflow at the sizes discussed here.


Source: https://habr.com/ru/post/1689074/
