Question:
How to get current_date - 1 day in Spark SQL, the same as curdate() - 1 in MySQL?
Answer 1:
The arithmetic functions allow you to perform arithmetic operations on columns containing dates. For example, you can calculate the difference between two dates, add days to a date, or subtract days from a date. The built-in date arithmetic functions include datediff, date_add, date_sub, add_months, last_day, next_day, and months_between.
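As a quick illustration (not part of the original answer), here is a sketch showing a few of those functions in one query, written for spark-shell where the SparkSession is already available as spark; the literal dates and column aliases are arbitrary:

spark.sql("""
  SELECT
    datediff('2016-12-13', '2016-12-01')        AS days_between,     -- 12
    date_add('2016-12-13', 7)                   AS one_week_later,
    date_sub('2016-12-13', 1)                   AS one_day_earlier,
    add_months('2016-12-13', 1)                 AS next_month,
    last_day('2016-12-13')                      AS month_end,
    next_day('2016-12-13', 'Sunday')            AS coming_sunday,
    months_between('2017-02-13', '2016-12-13')  AS months_apart      -- 2.0
""").show()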
Out of the above, what we need is date_sub(timestamp startdate, int days). Purpose: subtracts a specified number of days from a TIMESTAMP value. The first argument can be a string, which is automatically cast to TIMESTAMP if it uses the recognized format, as described in TIMESTAMP Data Type. Return type: timestamp.
And we have current_timestamp(). Purpose: alias for the now() function. Return type: timestamp.
So you can do:
select date_sub(CAST(current_timestamp() as DATE), 1)
See https://spark.apache.org/docs/1.6.2/api/java/org/apache/spark/sql/functions.html
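A minimal sketch of running that query from Scala (assuming a spark-shell session, where the SparkSession is already bound to spark; the alias yesterday is just illustrative):

// current_timestamp() is cast to DATE, then one day is subtracted.
spark.sql("SELECT date_sub(CAST(current_timestamp() AS DATE), 1) AS yesterday").show()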
Answer 2:
You can try:
date_add(current_date(), -1)
I don't know Spark either, but I found it on Google. You can also use this link for reference.
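For what it's worth, a small sketch of that expression in a Spark SQL statement (again assuming a spark-shell session with spark available; the alias is illustrative):

// A negative offset to date_add moves the date backwards by one day.
spark.sql("SELECT date_add(current_date(), -1) AS yesterday").show()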
Answer 3:
You can easily perform this task; there are many methods related to dates, and what you can use here is date_sub.
Example on Spark-REPL:
scala> spark.sql("select date_sub(current_timestamp(), 1)").show
+----------------------------------------------+
|date_sub(CAST(current_timestamp() AS DATE), 1)|
+----------------------------------------------+
| 2016-12-12|
+----------------------------------------------+
Answer 4:
Yes, the date_sub() function is the right one for the question. However, there is an error in the selected answer:
Return type: timestamp
The return type should be date instead: the date_sub() function trims any hh:mm:ss part of the timestamp and returns only a date.
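A quick way to check this, sketched for spark-shell (assuming the SparkSession is available as spark; the alias yesterday is illustrative): printing the schema should report date rather than timestamp.

val df = spark.sql("SELECT date_sub(current_timestamp(), 1) AS yesterday")
df.printSchema()   // expected: yesterday: date, not timestamp
df.show()          // a plain yyyy-MM-dd value, with no hh:mm:ss part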