I am running into what is probably a bug in pandas (v0.22 on Windows, Python 3.6.3), or rather in its interaction with NumPy (v1.14), but I am wondering if I am missing something deeper.
Here's the problem: if I have two DatetimeIndex objects of the same length and I apply np.maximum to them, the output is as expected:
import pandas as pd
import numpy as np
v1 = pd.DatetimeIndex(['2016-01-01', '2018-01-02', '2018-01-03'])
v2 = pd.DatetimeIndex(['2017-01-01', '2017-01-02', '2019-01-03'])
np.maximum(v1, v2)
it returns the element-wise maximum:
DatetimeIndex(['2017-01-01', '2018-01-02', '2019-01-03'], dtype='datetime64[ns]', freq=None)
However, if I use a single element of one of them instead, I get an error:
np.maximum(v1, v2[0])
pandas/_libs/tslib.pyx in pandas._libs.tslib._Timestamp.__richcmp__()

TypeError: Cannot compare type 'Timestamp' with type 'int'
Curiously, though, both a length-1 slice and a conversion to pydatetime work fine:
np.maximum(v1, v2[:1])
DatetimeIndex(['2017-01-01', '2018-01-02', '2018-01-03'], dtype='datetime64[ns]', freq=None)
np.maximum(v1.to_pydatetime(), v2[0].to_pydatetime())
array([datetime.datetime(2017, 1, 1, 0, 0), datetime.datetime(2018, 1, 2, 0, 0), datetime.datetime(2018, 1, 3, 0, 0)], dtype=object)
The same happens with the arguments swapped: np.maximum(v2, v1[0]) fails, while np.maximum(v2, v1[:1]) works (so the behavior is symmetric).
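For what it's worth, the workaround I have settled on for now is to drop down to the underlying datetime64 arrays, where scalar broadcasting behaves as expected (a sketch, reusing v1 and v2 from above):

```python
import numpy as np
import pandas as pd

v1 = pd.DatetimeIndex(['2016-01-01', '2018-01-02', '2018-01-03'])
v2 = pd.DatetimeIndex(['2017-01-01', '2017-01-02', '2019-01-03'])

# .values yields a plain numpy datetime64[ns] array, so the comparison
# never goes through the pandas Timestamp rich-comparison code path;
# a single element of that array is an np.datetime64 scalar, which
# broadcasts against the array without error.
result = pd.DatetimeIndex(np.maximum(v1.values, v2.values[0]))
```

This avoids the error at the cost of an extra round-trip through NumPy and back to a DatetimeIndex.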