apache-spark - Spark SQL between timestamps in a where clause?

Tags: apache-spark apache-spark-sql

I am trying to return the rows that fall between two timestamps using the DataFrame API.

Sample code:

val df = Seq(
    ("red", "2016-11-29 07:10:10.234"),
    ("green", "2016-11-29 07:10:10.234"),
    ("blue", "2016-11-29 07:10:10.234")).toDF("color", "date")

  df.where(
      unix_timestamp($"date", "yyyy-MM-dd HH:mm:ss.S")
        .cast("timestamp")
        .between(LocalDateTime.now(), LocalDateTime.now().minusHours(1)))
    .show()

But it throws an Unsupported literal type class java.time.LocalDateTime error:

Exception in thread "main" java.lang.RuntimeException: Unsupported literal type class java.time.LocalDateTime 2016-11-29T07:32:12.084
    at org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:57)
    at org.apache.spark.sql.functions$.lit(functions.scala:101)
    at org.apache.spark.sql.Column.$greater$eq(Column.scala:438)
    at org.apache.spark.sql.Column.between(Column.scala:542)
    at com.sankar.SparkSQLTimestampDifference$.delayedEndpoint$com$sankar$SparkSQLTimestampDifference$1(SparkSQLTimestampDifference.scala:23)
    at com.sankar.SparkSQLTimestampDifference$delayedInit$body.apply(SparkSQLTimestampDifference.scala:7)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
    at scala.App$class.main(App.scala:76)
    at com.sankar.SparkSQLTimestampDifference$.main(SparkSQLTimestampDifference.scala:7)
    at com.sankar.SparkSQLTimestampDifference.main(SparkSQLTimestampDifference.scala)

Best answer

When you compare against a Timestamp in the where clause, you need to convert the LocalDateTime to a java.sql.Timestamp: Spark turns each bound of between into a literal, and java.time.LocalDateTime is not a supported literal type, which is exactly what the exception says. Also note that the first argument of between is the lowerBound, so in your case LocalDateTime.now().minusHours(1) should come before LocalDateTime.now(). Then you can do:

import java.time.LocalDateTime
import java.sql.Timestamp

// java.sql.Timestamp is a supported literal type, so convert the
// java.time values before passing them to between (lower bound first).
df.where(
     unix_timestamp($"date", "yyyy-MM-dd HH:mm:ss.S")
       .cast("timestamp")
       .between(
          Timestamp.valueOf(LocalDateTime.now().minusHours(1)),
          Timestamp.valueOf(LocalDateTime.now())
       ))
  .show()

and you will get the filtered DataFrame:

+-----+--------------------+
|color|                date|
+-----+--------------------+
|  red|2016-11-29 10:58:...|
+-----+--------------------+
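
For completeness, here is a minimal self-contained sketch of the corrected program. The SparkSession setup, the local master setting, and the object name are illustrative assumptions added here, not part of the original question or answer:

import java.sql.Timestamp
import java.time.LocalDateTime

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.unix_timestamp

// Hypothetical object name, for illustration only.
object TimestampBetweenExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("timestamp-between")
      .getOrCreate()
    import spark.implicits._

    val df = Seq(
      ("red", "2016-11-29 07:10:10.234"),
      ("green", "2016-11-29 07:10:10.234"),
      ("blue", "2016-11-29 07:10:10.234")).toDF("color", "date")

    // Parse the string column into a timestamp, then keep only the rows
    // inside the [now - 1 hour, now] window; between takes
    // (lowerBound, upperBound) in that order.
    df.where(
        unix_timestamp($"date", "yyyy-MM-dd HH:mm:ss.S")
          .cast("timestamp")
          .between(
            Timestamp.valueOf(LocalDateTime.now().minusHours(1)),
            Timestamp.valueOf(LocalDateTime.now())))
      .show()

    spark.stop()
  }
}

Since the window is anchored at the current time, the 2016 sample rows only pass the filter when the program is run close to that date, which is why the output above reflects a run on 2016-11-29. As a side note, newer Spark releases (3.0 and later) also accept java.time.Instant as a timestamp literal, so the explicit java.sql.Timestamp conversion can usually be avoided there.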

Regarding apache-spark - Spark SQL between timestamps in a where clause?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40856663/
