How do I convert a datetime from string format to datetime format in PySpark?

I created a DataFrame using sqlContext, but I have a problem with the datetime format: it is identified as a string.

df2 = sqlContext.createDataFrame(i[1])
df2.show()  # show is a method; calling it without () only prints the bound method
df2.printSchema()

Result:

2016-07-05T17:42:55.238544+0900
2016-07-05T17:17:38.842567+0900
2016-06-16T19:54:09.546626+0900
2016-07-05T17:27:29.227750+0900
2016-07-05T18:44:12.319332+0900

string (nullable = true)

Since the column's schema type is string, I want to cast it to a datetime type as follows:

df3 =  df2.withColumn('_1', df2['_1'].cast(datetime()))

Here I get the error message: TypeError: Required argument 'year' (pos 1) not found

What should I do to solve this problem?

1 answer

Try the following. `cast` expects a Spark `DataType` instance, not Python's `datetime` class; calling `datetime()` with no arguments is exactly what raises the `TypeError: Required argument 'year'` you saw. Cast to `DateType` instead:

from pyspark.sql.types import DateType
ndf = df2.withColumn('_1', df2['_1'].cast(DateType()))

Note that `DateType` keeps only the date portion; if you need the time of day as well, cast to `TimestampType` (also in `pyspark.sql.types`) instead.
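For reference (this is not part of the original answer), the same timestamp string can be parsed with Python's standard library, which makes the format's components explicit: date, `T` separator, time with six-digit microseconds, and a `+0900` UTC offset:

```python
from datetime import datetime

s = "2016-07-05T17:42:55.238544+0900"
# %f matches the 6-digit fractional seconds, %z the +0900 UTC offset
dt = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%f%z")
print(dt.date(), dt.time(), dt.utcoffset())
```

This is only a sketch of what the cast has to parse; inside Spark the conversion happens on the executors, not via `strptime`.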

Source: https://habr.com/ru/post/1652787/
