ubuntu - Cannot open hadoop localhost:50070

Tags: ubuntu hadoop localhost

Hi, I'm trying to install Hadoop (single node) on Ubuntu.
I can't open localhost:50070.

When I run jps, I get this:

6674 NodeManager
6825 Jps
6359 ResourceManager

I'm new to Ubuntu, so please explain as much as possible. Thanks a lot.
daniele@daniele-S551LB:/usr/local/hadoop-2.6.0/sbin$ ./start-dfs.sh
Starting namenodes on [localhost]
daniele@localhost's password: 
localhost: starting namenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-daniele-namenode-daniele-S551LB.out
daniele@localhost's password: 
localhost: starting datanode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-daniele-datanode-daniele-S551LB.out
Starting secondary namenodes [0.0.0.0]
daniele@0.0.0.0's password: 
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-daniele-secondarynamenode-daniele-S551LB.out
daniele@daniele-S551LB:/usr/local/hadoop-2.6.0/sbin$ jps
2935 Jps
daniele@daniele-S551LB:/usr/local/hadoop-2.6.0/sbin$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
daniele@localhost's password: 
localhost: starting namenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-daniele-namenode-daniele-S551LB.out
daniele@localhost's password: 
localhost: starting datanode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-daniele-datanode-daniele-S551LB.out
Starting secondary namenodes [0.0.0.0]
daniele@0.0.0.0's password: 
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-daniele-secondarynamenode-daniele-S551LB.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop-2.6.0/logs/yarn-daniele-resourcemanager-daniele-S551LB.out
daniele@localhost's password: 
localhost: starting nodemanager, logging to /usr/local/hadoop-2.6.0/logs/yarn-daniele-nodemanager-daniele-S551LB.out
daniele@daniele-S551LB:/usr/local/hadoop-2.6.0/sbin$ jps
3931 Jps
3846 NodeManager
3529 ResourceManager

Best Answer

Check the logs to see whether the tmp directory is in a consistent state.

Log file location: $HADOOP_HOME/logs/hadoop-*-namenode-*.log
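To surface the relevant lines quickly, you can grep the NameNode log for errors. A minimal sketch, assuming the install path shown in the session above (adjust HADOOP_HOME if yours differs):

```shell
# Path taken from the question's session; change it to match your install.
HADOOP_HOME=/usr/local/hadoop-2.6.0

# Show the most recent ERROR/FATAL lines from the NameNode log(s):
grep -E 'ERROR|FATAL' "$HADOOP_HOME"/logs/hadoop-*-namenode-*.log | tail -n 20
```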

You may see something like this in the log file:

2016-12-10 00:59:55,718 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /tmp/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.

Declare the tmp directory in core-site.xml, then format the namenode.
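As a sketch, the hadoop.tmp.dir entry in core-site.xml might look like this (the /usr/local/hadoop-2.6.0/tmp path is an example, not a requirement; pick any directory your user can write to):

```xml
<!-- core-site.xml: point Hadoop's working directory away from /tmp,
     which may be wiped on reboot (the path below is an example). -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop-2.6.0/tmp</value>
  </property>
</configuration>
```

After creating that directory (mkdir -p, owned by your user), run hdfs namenode -format and then start-dfs.sh again. Note that formatting erases any existing HDFS data.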

I hope this helps.

Regarding "ubuntu - Cannot open hadoop localhost:50070", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/33317166/
