ubuntu - Unable to access the Spark Web UI

Tags: ubuntu apache-spark ssh cluster-computing apache-spark-standalone

I have installed Spark 2.0.0 on 12 nodes (in standalone cluster mode). When I start it, I get this:

./sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.master.Master-1-ibnb25.out
localhost192.17.0.17: ssh: Could not resolve hostname localhost192.17.0.17: Name or service not known
192.17.0.20: starting org.apache.spark.deploy.worker.Worker, logging to /home/mbala/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb28.out
192.17.0.21: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb29.out
192.17.0.19: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb27.out
192.17.0.18: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb26.out
192.17.0.24: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb32.out
192.17.0.22: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb30.out
192.17.0.25: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb33.out
192.17.0.28: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb36.out
192.17.0.27: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb35.out
192.17.0.17: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb25.out
192.17.0.26: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb34.out
192.17.0.23: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb31.out

I set the master port to 8081; its IP is 192.17.0.17, which corresponds to HOSTNAME=ibnb25, and I started the cluster from that host.
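
For reference, in standalone mode these settings usually live in conf/spark-env.sh. A minimal sketch using the values from the question (note that by default the master web UI listens on 8080 and each worker UI on 8081, so the values below are assumptions about the intended setup):

# conf/spark-env.sh -- a minimal sketch, values taken from the question
export SPARK_MASTER_HOST=192.17.0.17       # address the master binds to (Spark 2.0+)
export SPARK_MASTER_PORT=7077              # master RPC port (default)
export SPARK_MASTER_WEBUI_PORT=8081        # master web UI port (the default is 8080)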

From my local machine, I use this command to access the cluster:

 ssh mName@xx.xx.xx.xx 

When I want to access the web UI from my local machine, I used the master host's IP address (host ibnb25):

192.17.0.17:8081

but it would not load, so I tried the address I use to reach the cluster:

xx.xx.xx.xx:8081

But nothing shows up in my browser... What is wrong? Please help.

Best answer

It seems your /etc/hosts file is not set up properly; the error in your startup log, "ssh: Could not resolve hostname localhost192.17.0.17", suggests that "localhost" and an IP have been run together without whitespace somewhere.

You can get the hostname and IP with the following commands:

hostname
hostname -i
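
These should print the node's hostname and its IP on separate lines. For the master described in the question, the expected output would look like this (illustrative, based on the question's values):

ibnb25
192.17.0.17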

Make sure the hostname and the IP are separated by whitespace.

A sample /etc/hosts file looks like this:

192.17.0.17  <hostname>
192.17.0.17  localhost
<Other IP1>  <other hostname1>
.
.
.
<Other IP-n>  <other hostname-n>

Make sure the /etc/hosts file on every node in the cluster has IP-hostname entries for all the nodes.
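
A quick way to sanity-check this from the master is to loop over the nodes and compare what each one reports; a rough sketch, using the hostnames and user from the question (adjust to your cluster):

for h in ibnb25 ibnb26 ibnb27 ibnb28 ibnb29 ibnb30 ibnb31 ibnb32 ibnb33 ibnb34 ibnb35 ibnb36; do
  echo "== $h =="
  ssh mName@"$h" 'hostname; hostname -i'   # each node should report its own name and IP
done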

For FQDN, read this.
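
Separately, if the 192.17.0.x addresses are only reachable from inside the cluster (which would explain why neither URL loads from your local machine), one common workaround is to tunnel the web UI through the login host. A sketch, assuming xx.xx.xx.xx is the host you already ssh into:

 ssh -N -L 8081:192.17.0.17:8081 mName@xx.xx.xx.xx

With the tunnel open, browse to http://localhost:8081 on your local machine; -L forwards your local port 8081 to the master's UI, and -N keeps the session open without running a remote command.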

Regarding "ubuntu - Unable to access the Spark Web UI", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/39226544/
