hadoop - Exception while reading a file from HDFS (Hortonworks Sandbox) using the Java API

Tags: hadoop hdfs sandbox java file-read

I am running into a problem while trying to read a file from HDFS (Hortonworks Sandbox) using the Java API. Here is my code -

    System.setProperty("hadoop.home.dir", "/");
    URI uri = URI.create("hdfs://localhost:8020/user/maria_dev/test.txt");

    Path path = new Path(uri);

    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://localhost:8020");
    conf.set("dfs.client.use.datanode.hostname", "true");
    conf.set("dfs.datanode.use.datanode.hostname", "true");
    conf.set("dfs.client.use.legacy.blockreader", "true");

    byte[] btbuffer = new byte[5];
    String s;
    try (FileSystem fs = FileSystem.get(uri, conf)) {
        try (FSDataInputStream fileIn = fs.open(path)) {
            //s = fileIn.readUTF();
            // read at most btbuffer.length (5) bytes; asking for 20 would overflow the buffer
            fileIn.read(btbuffer, 0, btbuffer.length);
            s = new String(btbuffer, Charset.forName("UTF-8"));
        } catch (Exception e) {
            e.printStackTrace();
        }
    } catch (Exception err) {
        err.printStackTrace();
    }

Below is the exception I get -

10:39:51.803 [main] WARN org.apache.hadoop.hdfs.BlockReaderFactory - I/O error constructing remote block reader. java.net.ConnectException: Connection refused.
10:39:51.803 [main] WARN org.apache.hadoop.hdfs.DFSClient - Failed to connect to sandbox.hortonworks.com/172.17.0.2:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused.
10:39:51.804 [main] INFO org.apache.hadoop.hdfs.DFSClient - Could not obtain BP-1464254149-172.17.0.2-1477381671113:blk_1073742576_1752 from any node: java.io.IOException: No live nodes contain block BP-1464254149-172.17.0.2-1477381671113:blk_1073742576_1752 after checking nodes = [172.17.0.2:50010], ignoredNodes = null No live nodes contain current block Block locations: 172.17.0.2:50010 Dead nodes:  172.17.0.2:50010.

I could not find any workable solution. Any help is appreciated.

EDIT: Below are the /etc/hosts entries on my host system (from where I run the job) -

127.0.0.1       localhost    
255.255.255.255 broadcasthost    
::1             localhost    
172.17.0.2 sandbox.hortonworks.com localhost    

Best Answer

This is the URI you mentioned: URI uri = URI.create ("hdfs://localhost:8020/user/maria_dev/test.txt");

But the client is actually trying to reach sandbox.hortonworks.com

Try changing the URI.
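For illustration, a minimal sketch of that change, assuming the Sandbox is reachable under the hostname already mapped in /etc/hosts (sandbox.hortonworks.com); the hostname and port here are taken from the question, not spelled out in the answer itself:

    // Hypothetical adjustment: point the URI at the mapped hostname instead of localhost.
    URI uri = URI.create("hdfs://sandbox.hortonworks.com:8020/user/maria_dev/test.txt");

    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020");
    // Keep resolving DataNodes by hostname so the client does not dial the
    // container-internal address 172.17.0.2:50010 directly.
    conf.set("dfs.client.use.datanode.hostname", "true");

    try (FileSystem fs = FileSystem.get(uri, conf);
         FSDataInputStream fileIn = fs.open(new Path(uri))) {
        byte[] btbuffer = new byte[5];
        int n = fileIn.read(btbuffer, 0, btbuffer.length);
        System.out.println(new String(btbuffer, 0, Math.max(n, 0), StandardCharsets.UTF_8));
    }

Note that if the DataNode port (50010) is not forwarded out of the Sandbox container, resolving the hostname alone may not be enough; the "Connection refused" in the log indicates that port is unreachable from the client.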

Regarding "hadoop - Exception while reading a file from HDFS (Hortonworks Sandbox) using the Java API", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/42221388/
