Spark SQL: date difference in seconds

I have the following code:

table.select(datediff(table.col("Start Time"), table.col("End Time"))).show() 

Date format: 2016-05-19 09:23:28 (yyyy-MM-dd HH:mm:ss)

The datediff function calculates the difference in days, but I would like the difference in seconds.

1 answer

You can use unix_timestamp() to convert the timestamps to seconds and take the difference.

    import org.apache.spark.sql.functions._

    // For $-notation on columns (Spark 2.0)
    import spark.implicits._

    table.withColumn("date_diff",
      unix_timestamp($"Start Time") - unix_timestamp($"End Time")
    ).show()
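
For reference, here is a minimal self-contained sketch of the same expression; the SparkSession setup, sample values, and rows are made up for illustration:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    val spark = SparkSession.builder().master("local[*]").appName("diff-in-seconds").getOrCreate()
    import spark.implicits._

    // Hypothetical sample row in the question's yyyy-MM-dd HH:mm:ss format
    val table = Seq(
      ("2016-05-19 09:23:28", "2016-05-19 09:25:28")
    ).toDF("Start Time", "End Time")

    // unix_timestamp parses yyyy-MM-dd HH:mm:ss by default and returns seconds since the epoch
    table.withColumn("date_diff",
      unix_timestamp($"Start Time") - unix_timestamp($"End Time")
    ).show()
    // date_diff is -120 for this row; swap the operands if you want a positive duration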

Edit (based on the comments):

A UDF to convert the seconds difference to HH:mm:ss:

    // Register a UDF that formats a number of seconds as HH:mm:ss
    val sec_to_time = sqlContext.udf.register("sec_to_time",
      (s: Long) => "%02d:%02d:%02d".format(s / 3600L, (s % 3600L) / 60L, s % 60L)
    )

    // Use the registered UDF
    table.withColumn("date_diff",
      sec_to_time(unix_timestamp($"Start Time") - unix_timestamp($"End Time"))
    ).show()
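
If the function is only needed from the DataFrame API (not from SQL strings), an equivalent sketch builds it with udf() directly instead of registering it; the secToTime name here is just illustrative:

    import org.apache.spark.sql.functions.udf

    // Same formatting logic, exposed as a Column-level function
    val secToTime = udf((s: Long) => "%02d:%02d:%02d".format(s / 3600, (s % 3600) / 60, s % 60))

    table.withColumn("date_diff",
      secToTime(unix_timestamp($"Start Time") - unix_timestamp($"End Time"))
    ).show()

Either variant assumes both timestamps parse successfully; unix_timestamp returns null when the string does not match the format, so you may need to handle nulls first.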
