I have the following JSON objects:
{
  "user_id": "123",
  "data": {
    "city": "New York"
  },
  "timestamp": "1563188698.31",
  "session_id": "6a793439-6535-4162-b333-647a6761636b"
}
{
  "user_id": "123",
  "data": {
    "name": "some_name",
    "age": "23",
    "occupation": "teacher"
  },
  "timestamp": "1563188698.31",
  "session_id": "6a793439-6535-4162-b333-647a6761636b"
}
I am reading the file into a DataFrame with val df = sqlContext.read.json("json")
and it merges all the data attributes into a single struct, like this:
root
|-- data: struct (nullable = true)
| |-- age: string (nullable = true)
| |-- city: string (nullable = true)
| |-- name: string (nullable = true)
| |-- occupation: string (nullable = true)
|-- session_id: string (nullable = true)
|-- timestamp: string (nullable = true)
|-- user_id: string (nullable = true)
Is it possible to cast the data field to the MAP[String, String] data type, so that each row keeps only the attributes that were in its original JSON?
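To make the goal concrete, here is a plain-Scala sketch (the values are copied from the JSON above; the variable names are hypothetical) of how the two records would look as Map[String, String] — each map keeps only its own keys, and looking up an absent key returns None rather than null:

```scala
// Hypothetical in-memory view of the two records after conversion:
// each map holds only the keys present in that record's original JSON.
val record1: Map[String, String] = Map("city" -> "New York")
val record2: Map[String, String] = Map(
  "name" -> "some_name",
  "age" -> "23",
  "occupation" -> "teacher"
)

// Unlike the merged struct (where missing fields show up as null),
// a map simply has no entry for an absent attribute.
val city = record1.get("city")             // Some("New York")
val occupation = record1.get("occupation") // None -- key absent, not null
```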
Best answer
Yes, you can achieve this by deriving a Map[String, String] from the JSON data, as follows:
import org.apache.spark.sql.types.{MapType, StringType}
import org.apache.spark.sql.functions.{to_json, from_json}

val jsonStr = """{
  "user_id": "123",
  "data": {
    "name": "some_name",
    "age": "23",
    "occupation": "teacher"
  },
  "timestamp": "1563188698.31",
  "session_id": "6a793439-6535-4162-b333-647a6761636b"
}"""

val df = spark.read.json(Seq(jsonStr).toDS)
val mappingSchema = MapType(StringType, StringType)

df.select(from_json(to_json($"data"), mappingSchema).as("map_data")).show(false)
//Output
// +-----------------------------------------------------+
// |map_data |
// +-----------------------------------------------------+
// |[age -> 23, name -> some_name, occupation -> teacher]|
// +-----------------------------------------------------+
First we serialize the contents of the data field into a JSON string with to_json($"data"), then we parse that string and extract the Map with from_json(to_json($"data"), mappingSchema).
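If you want to keep the other columns and replace data in place, the same expression works with withColumn. This is a sketch, assuming the df and mappingSchema from above and spark.implicits._ in scope (as in spark-shell); individual entries can then be read with element_at, which yields null for absent keys:

```scala
import org.apache.spark.sql.functions.{to_json, from_json, element_at}

// Replace the struct column with its map form, keeping all other columns.
val mapped = df
  .withColumn("map_data", from_json(to_json($"data"), mappingSchema))
  .drop("data")

// Look up individual entries by key; rows whose map lacks the key get null.
mapped.select($"user_id", element_at($"map_data", "age").as("age")).show(false)
```

Note that element_at requires Spark 2.4 or later; on older versions, $"map_data"("age") gives the same per-key lookup.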
Regarding json - how to convert nested JSON to a Map object in Scala, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/57044746/