hadoop - Installing the HDFS client on the server fails

Tags: hadoop hdfs hortonworks-data-platform ambari

I get the following error when installing the HDFS client through Ambari. I have reset the server several times but still cannot get past it. Any idea how to fix this?

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 120, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 36, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 41, in configure
    hdfs()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
    group=params.user_group
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist

Best answer

/usr/hdp/current/hadoop-client/conf is a soft link (symlink) to /etc/hadoop/conf.

I had run

python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users

which, as part of its cleanup, deleted /etc/hadoop/conf.

However, reinstalling the client does not recreate it.

So you may have to recreate all the conf files yourself. Hopefully someone patches this.
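As a workaround, the missing directory and symlink can be restored by hand before retrying the install. A minimal sketch, assuming the standard HDP 2.x layout described above (/etc/hadoop/conf is the real directory, /usr/hdp/current/hadoop-client/conf is a symlink to it); the ROOT prefix is only there so you can dry-run it against a scratch directory first — on the affected host, run as root with ROOT set to empty:

```shell
#!/bin/sh
# Dry-run against a scratch root; on the real host use ROOT="" and run as root.
ROOT=$(mktemp -d)

# Recreate the real config directory that HostCleanup.py removed.
mkdir -p "$ROOT/etc/hadoop/conf"

# Recreate the symlink the Ambari agent writes hadoop-policy.xml etc. through.
mkdir -p "$ROOT/usr/hdp/current/hadoop-client"
ln -sfn "$ROOT/etc/hadoop/conf" "$ROOT/usr/hdp/current/hadoop-client/conf"

# Verify the link points where expected.
readlink "$ROOT/usr/hdp/current/hadoop-client/conf"
```

Once the directory and link exist again, retrying the HDFS client install from Ambari should at least get past the "parent directory doesn't exist" failure; the conf files themselves may still need to be copied from a working node.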

For more on "hadoop - installing the HDFS client on the server fails", see the similar question on Stack Overflow: https://stackoverflow.com/questions/33159169/
