How to get day of the week in SparkSQL?

I am trying to select all records created on a Sunday using SparkSQL. I have made the following attempts, but without success.

SELECT * FROM mytable WHERE DATEPART(WEEKDAY, create_time) = 0

SELECT * FROM mytable WHERE strftime("%w", create_time) = 0

How to get day of the week in SparkSQL?

2 answers

Spark 1.5.0 has a date_format function that takes a format string as an argument. The following format returns the weekday name from a timestamp:

select date_format(my_timestamp, 'EEEE') from ....

Result: e.g. 'Tuesday'
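Applied to the question, a minimal sketch of the Sunday filter, assuming create_time is a timestamp column as in the question:

-- keep only rows whose create_time falls on a Sunday
select * from mytable where date_format(create_time, 'EEEE') = 'Sunday'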

If the creation time is stored as a Unix timestamp in UTC, you can use the following to filter for specific days in SparkSQL (I used Spark 1.6.1):

select id, date_format(from_unixtime(created_utc), 'EEEE') from testTable where date_format(from_unixtime(created_utc), 'EEEE') = 'Wednesday'

If you specify "EEEE", the day of the week is written out in full. You can use "E" for the abbreviated version, e.g. 'Wed'. You can find more information here: http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame and http://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html
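The question's own attempts compared against a weekday number rather than a name. A hedged sketch of that style, assuming a Spark version that formats with java.text.SimpleDateFormat (as in the 1.5/1.6 era of these answers), where the 'u' pattern yields the ISO day number (1 = Monday ... 7 = Sunday); note that date_format returns a string:

-- assumption: SimpleDateFormat-based formatting, where 'u' = ISO day of week and 7 = Sunday
select * from mytable where date_format(create_time, 'u') = '7'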
