I have the following folders in HDFS:
hdfs://x.x.x.x:8020/Air/BOOK/AE/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/AE/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/BH/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/IN/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/IN/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/KW/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/KW/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/ME/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/OM/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/Others/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/QA/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/QA/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/SA/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/SA/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/AE/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/AE/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/BH/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/BH/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/IN/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/IN/INT/20171001/2017100101
Each folder contains close to 50 files. My goal is to merge all the files in each folder so that I end up with a single file per folder when copying from HDFS to S3. The problem I'm running into is the regular expression with capture groups passed via the --groupBy option. I tried this, but it doesn't seem to work:
s3-dist-cp --src hdfs:///Air/ --dest s3a://HadoopSplit/Air-merged/ --groupBy '.*/(\w+)/(\w+)/(\w+)/.*' --outputCodec lzo
The command itself runs without errors, but the files in each folder are not merged into a single file, which leads me to believe the problem is in my regular expression.
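A quick way to see what goes wrong is to run the pattern against one of the file paths with Python's `re` module (the `part-00000` filename below is a made-up example; s3-dist-cp applies --groupBy to the full path of each file). The leading greedy `.*` consumes as much of the path as it can, so the three capture groups land on the last three matchable directory levels rather than on the BOOK/AE/DOM part:

```python
import re

# Hypothetical file inside one of the folders (the filename is made up;
# --groupBy is matched against the full file path).
path = "hdfs://x.x.x.x:8020/Air/BOOK/AE/DOM/20171001/2017100101/part-00000"

# The original pattern: the greedy leading .* pushes the captures to the
# rightmost three directory segments that can still match.
m = re.match(r'.*/(\w+)/(\w+)/(\w+)/.*', path)
print(m.groups())  # ('DOM', '20171001', '2017100101')
```

So the grouping key built from the captures omits the BOOK/SEARCH and country-code levels entirely, which is not the per-folder grouping intended here.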
Best answer
I just figured this out... The correct regex is
.*/Air/(\w+)/(\w+)/(\w+)/.*/.*/.*
and the command to merge and copy is:
s3-dist-cp --src hdfs:///Air/ --dest s3a://HadoopSplit/Air-merged/ --groupBy '.*/Air/(\w+)/(\w+)/(\w+)/.*/.*/.*' --outputCodec lzo
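Anchoring the pattern on the literal `/Air/` segment pins the three capture groups to the BOOK/SEARCH, country, and DOM/INT levels, and the trailing `.*/.*/.*` accounts for the date, hour, and filename. s3-dist-cp concatenates the capture groups to form the name of the merged output file, so all files under one folder share a key and are merged together. A small sketch with Python's `re` module (filenames are hypothetical):

```python
import re

GROUP_BY = r'.*/Air/(\w+)/(\w+)/(\w+)/.*/.*/.*'

# Two made-up files in the same folder, and one file from a different folder.
same_a = "hdfs://x.x.x.x:8020/Air/BOOK/AE/DOM/20171001/2017100101/part-00000"
same_b = "hdfs://x.x.x.x:8020/Air/BOOK/AE/DOM/20171001/2017100101/part-00001"
other  = "hdfs://x.x.x.x:8020/Air/SEARCH/IN/INT/20171001/2017100101/part-00000"

def group_key(path):
    # The concatenated capture groups are what s3-dist-cp uses to decide
    # which files get merged into one output file.
    return ''.join(re.match(GROUP_BY, path).groups())

print(group_key(same_a))  # BOOKAEDOM
print(group_key(same_b))  # BOOKAEDOM   -> merged with same_a
print(group_key(other))   # SEARCHININT -> written to its own output file
```

Note that files from the 20171001/2017100101 hour of different dates would also land in the same group, since the date and hour levels are not captured; add more groups to the pattern if you want one output file per date or per hour.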
On "hadoop - Merge files within a folder using groupBy when copying from HDFS to S3", see the similar question on Stack Overflow: https://stackoverflow.com/questions/46833387/