Spark Request with time

Published 2019-09-22 01:44

Question:

I want to write a query that finds the busiest time of day on average, in 1-hour intervals.

My dataframe has a date column in the format "%d/%b/%Y:%H:%M:%S".
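That format string corresponds to Apache-style log timestamps and can be checked with plain Python's `datetime.strptime` (the sample value below is hypothetical); truncating to the hour gives the 1-hour bucket:

```python
from datetime import datetime

# Parse a timestamp written in the log format "%d/%b/%Y:%H:%M:%S"
raw = "01/Dec/2015:21:04:00"  # hypothetical sample value
ts = datetime.strptime(raw, "%d/%b/%Y:%H:%M:%S")

# Truncate to the start of the hour to get the 1-hour bucket
bucket = ts.replace(minute=0, second=0)
print(bucket)  # 2015-12-01 21:00:00
```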

I started like this:

mostBusyTimeDF = logDF.groupBy("date") ...

Example input:

               date
 2015-12-01 21:04:00
 2015-12-01 10:04:00
 2015-12-01 21:07:00
 2015-12-01 21:34:00

Expected output:

               date         count(1 hour interval)
 2015-12-01 21:04:00                          3
 2015-12-01 10:04:00                          1
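(In plain Python terms, the counts above come from bucketing each timestamp to its hour, which can be sketched with `collections.Counter` on the sample rows:)

```python
from collections import Counter
from datetime import datetime

dates = ["2015-12-01 21:04:00", "2015-12-01 10:04:00",
         "2015-12-01 21:07:00", "2015-12-01 21:34:00"]

# Count rows per 1-hour bucket: truncate each timestamp to the hour
per_hour = Counter(
    datetime.strptime(d, "%Y-%m-%d %H:%M:%S").replace(minute=0, second=0)
    for d in dates
)
print(per_hour[datetime(2015, 12, 1, 21, 0)])  # 3
```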

I don't know how to proceed from there.

Can you help me?

Thanks a lot

Answer 1:

You can use built-in Spark date functions:

from pyspark.sql.functions import year, month, dayofmonth, hour, count

logDF = sqlContext.createDataFrame([
    ("2015-12-01 21:04:00", 1),
    ("2015-12-01 10:04:00", 2),
    ("2015-12-01 21:07:00", 9),
    ("2015-12-01 21:34:00", 1)], ['somedate', 'someother'])

busyTimeDF = logDF.groupBy(year("somedate").alias("cnt_year"),
        month("somedate").alias("cnt_month"),
        dayofmonth("somedate").alias("cnt_day"),
        hour("somedate").alias("cnt_hour")) \
    .agg(count("*").alias("cntHour"))

cond = [busyTimeDF.cnt_year == year(logDF.somedate), \
    busyTimeDF.cnt_month == month(logDF.somedate), \
    busyTimeDF.cnt_day == dayofmonth(logDF.somedate), \
    busyTimeDF.cnt_hour == hour(logDF.somedate)]

busyTimeDF.join(logDF, cond).select('somedate', 'cntHour').show()
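As a plain-Python sanity check of the same logic (no Spark needed; the rows are the sample data above), the group-by-(year, month, day, hour) plus the join back onto each row can be sketched with a dict:

```python
from datetime import datetime

rows = [("2015-12-01 21:04:00", 1), ("2015-12-01 10:04:00", 2),
        ("2015-12-01 21:07:00", 9), ("2015-12-01 21:34:00", 1)]

def key(s):
    # Same grouping key as year()/month()/dayofmonth()/hour() in the Spark code
    t = datetime.strptime(s, "%Y-%m-%d %H:%M:%S")
    return (t.year, t.month, t.day, t.hour)

# groupBy + count: rows per 1-hour bucket
cnt_hour = {}
for somedate, _ in rows:
    k = key(somedate)
    cnt_hour[k] = cnt_hour.get(k, 0) + 1

# Join the counts back onto every row, like busyTimeDF.join(logDF, cond)
result = [(somedate, cnt_hour[key(somedate)]) for somedate, _ in rows]
print(result[0])  # ('2015-12-01 21:04:00', 3)
```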