You can use unix_timestamp() to convert each timestamp to Unix epoch seconds and then subtract the two values.
import org.apache.spark.sql.functions._ // for the $ column notation
import spark.implicits._                // Spark 2.0; on 1.x use sqlContext.implicits._

// Subtract start from end so the difference (in seconds) comes out positive.
table.withColumn("date_diff",
  unix_timestamp($"End Time") - unix_timestamp($"Start Time")
).show()
Edit (per the comments):

A UDF to format the difference in seconds as HH:mm:ss:
// Capture the returned UDF so it can be used directly in the DataFrame API;
// register() also makes it callable from SQL as "sec_to_time".
// Note: minutes are (s % 3600) / 60, not s / 60.
val sec_to_time = sqlContext.udf.register("sec_to_time", (s: Long) =>
  "%02d:%02d:%02d".format(s / 3600L, (s % 3600L) / 60L, s % 60L)
)

// Use the registered UDF
table.withColumn("date_diff",
  sec_to_time(unix_timestamp($"End Time") - unix_timestamp($"Start Time"))
).show()
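For completeness, here is a minimal, self-contained sketch of the whole approach. The sample rows, the local SparkSession setup, and the "Start Time" / "End Time" values are illustrative assumptions, not from the question; it assumes Spark 2.x and timestamp strings in the default "yyyy-MM-dd HH:mm:ss" format that unix_timestamp() parses.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[*]").appName("date-diff").getOrCreate()
import spark.implicits._

// Hypothetical sample data standing in for the question's table.
val table = Seq(
  ("2017-01-01 10:00:00", "2017-01-01 12:30:15"),
  ("2017-01-01 09:15:00", "2017-01-01 09:45:30")
).toDF("Start Time", "End Time")

// register() returns the UDF, so it can be used in withColumn directly.
val sec_to_time = spark.udf.register("sec_to_time", (s: Long) =>
  "%02d:%02d:%02d".format(s / 3600L, (s % 3600L) / 60L, s % 60L))

table.withColumn("date_diff",
  sec_to_time(unix_timestamp($"End Time") - unix_timestamp($"Start Time"))
).show()
// date_diff for the two sample rows: 02:30:15 and 00:30:30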