Hadoop shutdown: how do I stop Hadoop in pseudo-distributed mode?

Tags: hadoop

I tried to stop the Hadoop processes with stop-dfs.sh, but it failed. Do I really have to resort to brute force and kill -9?

[root@trdstorm sbin]# sudo ./stop-dfs.sh
16/07/26 10:19:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [localhost]
root@localhost's password:
localhost: Permission denied, please try again.
root@localhost's password:
localhost: no namenode to stop
root@localhost's password:
localhost: no datanode to stop
Stopping secondary namenodes [0.0.0.0]
root@0.0.0.0's password:
0.0.0.0: no secondarynamenode to stop  --> it claims there is no secondarynamenode to stop, but the grep below shows one is still running.
16/07/26 10:20:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[root@trdstorm sbin]# ps -aef | grep secondary
hadoop    7406     1  0 Jul15 ?        00:09:38 /usr/java/jdk1.8.0_60/jre/bin/java -Dproc_secondarynamenode -Xmx1000m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop-hadoop-secondarynamenode-trdstorm.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,RFA -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode

root     26651 26238  0 10:20 pts/2    00:00:00 grep secondary
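
The password prompts and "Permission denied" messages above suggest the stop scripts cannot ssh into localhost non-interactively, which is why the next step was passwordless ssh. A minimal sketch of enabling it for the user that owns the daemons (here hadoop); the key type and paths are assumptions, adjust them for your environment:

# run as the hadoop user, not via sudo
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa         # generate a key with an empty passphrase
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys  # authorize it for logins to localhost
chmod 600 ~/.ssh/authorized_keys                 # sshd requires strict permissions
ssh localhost                                    # should now log in without a password prompt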

+++++++++++++++++++++ After setting up passwordless ssh +++++++++++++++++++
+++++++++++++++++++++ Commands used to stop the Hadoop processes ++++++++++++++++++
[hadoop@trdstorm ~]$ ssh localhost
Last login: Tue Jul 26 17:29:00 2016 from localhost
[hadoop@trdstorm ~]$ cd /usr/local/hadoop/sbin/
[hadoop@trdstorm sbin]$ sudo ./stop-all.sh
[sudo] password for hadoop:
This script is Deprecated. Instead use stop-dfs.sh and stop-yarn.sh
16/07/26 17:32:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [localhost]
root@localhost's password:
localhost: no namenode to stop
root@localhost's password:
localhost: no datanode to stop
Stopping secondary namenodes [0.0.0.0]
root@0.0.0.0's password:
0.0.0.0: no secondarynamenode to stop
16/07/26 17:33:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
stopping yarn daemons
no resourcemanager to stop
root@localhost's password:
localhost: no nodemanager to stop
no proxyserver to stop
[hadoop@trdstorm sbin]$ ps -aef | grep second
hadoop    7406     1  0 Jul15 ?        00:09:53 /usr/java/jdk1.8.0_60/jre/bin/java -Dproc_secondarynamenode -Xmx1000m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop-hadoop-secondarynamenode-trdstorm.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,RFA -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
hadoop   29825 29295  0 17:33 pts/3    00:00:00 grep second
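
The stop scripts locate daemons through pid files (by default hadoop-<user>-<daemon>.pid under /tmp, or $HADOOP_PID_DIR if set). One likely reason for the output above is that the daemons were started as user hadoop, while the scripts are being run through sudo as root, so they look for root's pid files, find none, and print "no ... to stop" even though the SecondaryNameNode (pid 7406) keeps running. A hedged sketch of stopping the leftover daemon gracefully, as the owning user, before reaching for kill -9:

# as the hadoop user, not via sudo
/usr/local/hadoop/sbin/hadoop-daemon.sh stop secondarynamenode
# if the pid file is already gone, send a plain SIGTERM to the pid shown by ps/jps
jps          # lists the running Hadoop JVMs with their pids
kill 7406    # graceful shutdown; only fall back to kill -9 if the process ignores this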

Best answer

Stop all services:

sudo ./stop-all.sh
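
If stop-all.sh (deprecated in favor of stop-dfs.sh and stop-yarn.sh, as the script itself notes) still prints "no ... to stop", a hedged follow-up is to run the current scripts as the same user that started the daemons rather than through sudo, then verify with jps that nothing is left:

# as the hadoop user that started the daemons
/usr/local/hadoop/sbin/stop-dfs.sh
/usr/local/hadoop/sbin/stop-yarn.sh
jps    # only "Jps" itself should remain once everything has stopped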

Regarding "Hadoop shutdown: how do I stop Hadoop in pseudo-distributed mode?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/38580161/
