hadoop - PutHDFS error on Apache NiFi

Tags: hadoop hdfs yarn apache-nifi

When trying to use the PutHDFS processor on Apache NiFi 1.2.1 with the following configuration:

Hadoop Configuration Resources: /usr/local/hadoop-2.7.0/etc/hadoop/core-site.xml, /usr/local/hadoop-2.7.0/etc/hadoop/hdfs-site.xml
Directory: /mydir
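
For context, PutHDFS only needs these two files so it can locate the NameNode. A minimal core-site.xml for a single-node install typically looks like the snippet below; the hostname and port are illustrative assumptions, not values taken from the question.

<!-- core-site.xml: minimal single-node example; host/port are assumed -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>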

I get the following error:
Caused by: org.apache.hadoop.ipc.RemoteException: File /tweets/.381623121831518.json could only be replicated to 0 nodes instead of minReplication (=1).  There are 0 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1550)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3067)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:722)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
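
The important part of the message is "There are 0 datanode(s) running": the NameNode accepted the request, but no DataNode has registered with it, so HDFS cannot place even a single replica of the block. This can be confirmed from the shell before changing anything; the commands below are standard Hadoop CLI calls, and the log path follows the default Hadoop layout, which may differ on your install.

# Ask the NameNode how many live DataNodes it sees
$ hdfs dfsadmin -report

# Check the end of the DataNode log for the reason it failed to start or register
$ tail -n 50 $HADOOP_HOME/logs/hadoop-*-datanode-*.log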

Best answer

Resolution

I followed the steps below to correct the problem:

Stop all Hadoop services

$ cd $HADOOP_HOME
$ sbin/stop-all.sh 

Delete the namenode and datanode directories specified in hdfs-site.xml
$ rm -rf <datanode directory>
$ rm -rf <namenode directory>
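
The directories to remove are whatever dfs.namenode.name.dir and dfs.datanode.data.dir point to in hdfs-site.xml. The snippet below only illustrates what those entries usually look like; the paths are assumptions, so substitute the values from your own hdfs-site.xml.

<!-- hdfs-site.xml: properties naming the directories to delete; paths are assumed -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///usr/local/hadoop-2.7.0/data/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///usr/local/hadoop-2.7.0/data/datanode</value>
</property>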

Format the namenode
hadoop namenode -format
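
Note that on Hadoop 2.x the hadoop namenode entry point is deprecated (it still works but prints a warning); the same step can be written with the hdfs command instead:

$ hdfs namenode -format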

Start all Hadoop services

$ sbin/start-all.sh
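
start-all.sh is likewise deprecated in Hadoop 2.x; starting HDFS and YARN separately is equivalent and makes it easier to see which daemon fails to come up:

$ sbin/start-dfs.sh
$ sbin/start-yarn.sh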

Verification:

Check that all expected services are running
bash-3.2# jps
61488 ResourceManager
57128 RunNiFi
61160 NameNode
61256 DataNode
57129 NiFi
61609 Jps
61371 SecondaryNameNode
61582 NodeManager

Check the files transferred into /mydir, as specified in the PutHDFS Processor -> Destination Directory. The transferred files should now be present in this directory:
$ hdfs dfs -ls /mydir
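
Listing the directory only shows that it is reachable. As an optional end-to-end check, a small file can be written and read back through HDFS itself before re-running the NiFi flow; the file name below is arbitrary.

# Write a test file from stdin, read it back, then clean up
$ echo "hdfs write test" | hdfs dfs -put - /mydir/put-test.txt
$ hdfs dfs -cat /mydir/put-test.txt
$ hdfs dfs -rm /mydir/put-test.txt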

Regarding this hadoop - PutHDFS error on Apache NiFi, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/43591011/
