I have a record in my input CSV file:
"2017-11-01","2017-10-29","2017-11-04","4532491","","","","Natural States: "The Environmental Imagination" in Maine, Oregon, and the Nation","1000","Richard W. Judd"
When I read this CSV in PySpark, the field `Natural States: "The Environmental Imagination" in Maine, Oregon, and the Nation`
gets split into separate columns.
>>> df = spark.read.csv('file.csv')
>>> df.show(truncate=False)
+----------+----------+----------+----------+----+----+----+---------------------------------------------------------+-------+----------------+----+---------------+
|_c0 |_c1 |_c2 |_c3 |_c4 |_c5 |_c6 |_c7 |_c8 |_c9 |_c10|_c11 |
+----------+----------+----------+----------+----+----+----+---------------------------------------------------------+-------+----------------+----+---------------+
|2017-11-01|2017-10-29|2017-11-04| 4532491 |null|null|null|Natural States: "The Environmental Imagination" in Maine | Oregon| and the Nation |1000|Richard W. Judd|
+----------+----------+----------+----------+----+----+----+---------------------------------------------------------+-------+----------------+----+---------------+
Is there any workaround other than changing the delimiter in the input file? We cannot modify the input file.
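(For context: the record is not standard CSV. Per the usual quoting convention, a double quote embedded in a quoted field must be doubled, which is why the default parser mis-splits it. A quick check with Python's `csv` module, just as an illustration, shows what a conforming writer would emit for that title field:)

```python
import csv
import io

# A conforming CSV writer doubles any embedded quotes inside a quoted field.
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(
    ['Natural States: "The Environmental Imagination" in Maine, Oregon, and the Nation']
)
print(buf.getvalue().strip())
# The embedded quotes come out doubled: ""The Environmental Imagination""
```

The input file instead contains bare single double quotes inside the field, so a quote-aware parser cannot tell where the field ends.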
Best Answer
You can read the file with `sparkContext`, split each line on the multi-character delimiter `","`, and then convert the `rdd` to a `dataframe` as follows:
rdd = sc.textFile("file.csv")

def replaceFunc(words):
    # Split on the quoted delimiter `","`, then strip the remaining quotes
    result = []
    for word in words.split("\",\""):
        result.append(word.replace("\"", ""))
    return result

rdd.map(replaceFunc).toDF().show(1, False)
You should get the following output:
+----------+----------+----------+-------+---+---+---+------------------------------------------------------------------------------+----+---------------+
|_1 |_2 |_3 |_4 |_5 |_6 |_7 |_8 |_9 |_10 |
+----------+----------+----------+-------+---+---+---+------------------------------------------------------------------------------+----+---------------+
|2017-11-01|2017-10-29|2017-11-04|4532491| | | |Natural States: The Environmental Imagination in Maine, Oregon, and the Nation|1000|Richard W. Judd|
+----------+----------+----------+-------+---+---+---+------------------------------------------------------------------------------+----+---------------+
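Since the splitting logic is plain Python, you can sanity-check `replaceFunc` locally on the sample record without a Spark cluster (a quick sketch, not part of the answer itself):

```python
# The sample record from the question, as one raw line of the file
line = ('"2017-11-01","2017-10-29","2017-11-04","4532491","","","",'
        '"Natural States: "The Environmental Imagination" in Maine, '
        'Oregon, and the Nation","1000","Richard W. Judd"')

def replaceFunc(words):
    # Split on the quoted delimiter `","`, then drop any remaining quotes
    return [word.replace('"', '') for word in words.split('","')]

fields = replaceFunc(line)
print(len(fields))   # 10 fields, matching the expected output
print(fields[7])     # the title survives as a single field
```

Note that the commas inside the title are preserved because only the exact three-character sequence `","` is treated as a separator.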
Regarding python - Reading a CSV field containing commas and quotes, where comma is the delimiter - pyspark, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48435479/