Assigning a float as a dictionary key changes its precision (Python)

I have a list of floats (actually it is a pandas Series object, if that matters) that looks like this:

mySeries:

...
22      16.0
23      14.0
24      12.0
25      10.0
26       3.1
...

(So the elements of the Series are on the right and the indices on the left.) Then I try to use the elements of this Series as dictionary keys and the indices as values, for example:

{ mySeries[i]: i for i in mySeries.index }

and I get what I wanted, except for one thing ...

{ 6400.0: 0, 66.0: 13, 3.1000000000000001: 23, 133.0: 10, ... }

Why did 3.1 suddenly change to 3.1000000000000001? I guess this has to do with how floating-point numbers are represented, but why is it happening only now, and how do I avoid / fix it?

EDIT: Please feel free to suggest a better title for this question if mine is inaccurate.

EDIT2: To be more specific about what goes wrong: when I take the value mySeries[26] and use it to look up the dictionary I built above:

myDict[mySeries[26]]

I get a KeyError. Why?


3.1 cannot be stored exactly as a binary float, so the value in mySeries[26] was the long one all along; pandas just rounds it for display.

You can see this by raising the display precision:

pd.set_option('display.precision', 20)

mySeries now prints as:

0    16.00000000000000000000
1    14.00000000000000000000
2    12.00000000000000000000
3    10.00000000000000000000
4     3.10000000000000008882
dtype: float64

As you can see, the value was never exactly 3.1 to begin with; only the default display rounds it.
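For example, the decimal module shows the exact double that the literal 3.1 turns into (the digits below are the full expansion of that double):

>>> from decimal import Decimal
>>> Decimal(3.1)
Decimal('3.100000000000000088817841970012523233890533447265625')

That is the same value the 20-digit display above is rounding.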

As for the KeyError, I cannot reproduce it:

>>> import pandas as pd
>>> x = pd.Series([16, 14, 12, 10, 3.1])
>>> a = {x[i]: i for i in x.index}
>>> a[x[4]]
4
>>> a.keys()                # Python 2 session: keys() returns a list here
[16.0, 10.0, 3.1000000000000001, 12.0, 14.0]
>>> hash(x[4])
2093862195
>>> hash(a.keys()[2])       # the 3.1... key hashes identically
2093862195
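The hash equality is the whole point: a numpy float64 and a plain Python float holding the same double compare equal and hash equally, which is all a dict lookup needs. A standalone sketch (the variable names are just illustrative):

>>> import numpy as np
>>> a = np.float64(3.1)    # the scalar type a pandas Series hands back
>>> b = 3.1                # a plain Python float
>>> bool(a == b)
True
>>> hash(a) == hash(b)
True
>>> {a: 'found'}[b]
'found'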

This is easy to reproduce:

>>> x = pd.Series([16,14,12,10,3.1])
>>> x
0    16.0
1    14.0
2    12.0
3    10.0
4     3.1
dtype: float64
>>> x.iloc[4]
3.1000000000000001

The same thing happens with numpy directly:

>>> import numpy as np
>>> np.float64(3.1)
3.1000000000000001

It is only the printed representation that is longer; the underlying value is exactly the double Python stores for 3.1.
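That long form is simply the same double printed with 17 significant digits, which is presumably where the representation above comes from. For instance:

>>> format(3.1, '.17g')
'3.1000000000000001'
>>> 3.1 == float('3.1000000000000001')
True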

As for the KeyError, I cannot reproduce it either; the lookup works here:

>>> d = {x[i]:i for i in x.index}
>>> d
{16.0: 0, 10.0: 3, 12.0: 2, 14.0: 1, 3.1000000000000001: 4}
>>> x[4]
3.1000000000000001
>>> d[x[4]]
4

Are you sure the KeyError comes from the dictionary lookup and not from the Series itself, i.e. from evaluating mySeries[26]?
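For example, with a Series rebuilt to look like the excerpt in the question (only the labels 22-26 are assumed here), the dictionary lookup works, but asking the Series for a label it does not have raises the KeyError before any dictionary is consulted:

>>> import pandas as pd
>>> s = pd.Series([16.0, 14.0, 12.0, 10.0, 3.1], index=[22, 23, 24, 25, 26])
>>> d = {s[i]: i for i in s.index}
>>> d[s[26]]
26
>>> s[27]
Traceback (most recent call last):
  ...
KeyError: 27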


