hadoop - Mapreduce returns an error when accessing the datanode on the master machine

Tags: hadoop mapreduce hive

I set up a Hadoop 2.4.0 cluster with three machines. One master machine runs the namenode, resource manager, datanode and node manager; the other two worker machines each run a datanode and a node manager. When I run a Hive query, the job fails with this error:

2014-06-11 13:40:13,364 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.net.ConnectException: Call From master/127.0.0.1 to master:43607 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:231)
    at com.sun.proxy.$Proxy9.getTask(Unknown Source)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:136)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    ... 4 more

If I disable the datanode on the master machine, everything works fine. I'm wondering whether it is allowed to deploy a datanode on the master machine at all. Thanks in advance for your help.

By the way, /etc/hosts is identical on all three machines:

   127.0.0.1 localhost
   10.1.154.231 master
   10.1.153.220 slave1
   10.1.153.133 slave2

Best Answer

Please set up passwordless SSH from your master machine to itself.

You can do this with:

  cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys2

Make sure the permissions are correct:

   chmod 0600 ~/.ssh/authorized_keys2
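
If no key pair exists on the master yet, a minimal end-to-end sketch looks like the following. It assumes the default RSA key location ~/.ssh/id_rsa and the default authorized_keys file name; adjust the file name if your sshd_config reads authorized_keys2 as in the commands above:

   # Generate an RSA key pair with an empty passphrase (only if no key exists yet)
   ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
   # Authorize the key for logins to this same machine
   cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
   # sshd ignores keys whose directory or file permissions are too open
   chmod 0700 ~/.ssh
   chmod 0600 ~/.ssh/authorized_keys
   # Verify: this should open a session on the master without a password prompt
   ssh master exit

Once `ssh master` works without prompting for a password, restarting the Hadoop daemons on the master and rerunning the Hive query should show whether the connection error is gone.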

Regarding "hadoop - Mapreduce returns an error when accessing the datanode on the master machine", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/24170877/
