java - HBase: "Failed to identify the fs of dir hdfs://test/apps/hbase/data/lib, ignored" java.io.IOException

Tags: java maven hadoop jdbc hbase

I am able to connect to HBase and insert data from Java code without any errors, but after packaging it with its Maven dependencies I get the following error:

org.apache.hadoop.hbase.util.DynamicClassLoader - Failed to identify the fs of dir hdfs://test/apps/hbase/data/lib, ignored java.io.IOException: No FileSystem for scheme: hdfs

I am using this jar, built with Maven, in another Spring project.

The complete log is below.

        09:16:56,920 [http-nio-8080-exec-4] WARN  org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
        09:16:56,982 [http-nio-8080-exec-4] ERROR org.apache.hadoop.util.Shell  - Failed to locate the winutils binary in the hadoop binary path
        java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
                at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
                at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
                at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
                at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
                at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
                at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
                at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
                at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:257)
                at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:234)
                at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:749)
                at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)...........................
.....at java.lang.Thread.run(Thread.java:748)
        09:16:58,733 [http-nio-8080-exec-4] WARN  org.apache.hadoop.hbase.util.DynamicClassLoader  - Failed to identify the fs of dir hdfs://test/apps/hbase/data/lib, ignored
        java.io.IOException: No FileSystem for scheme: hdfs
                at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2579)
                at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2586)
                at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
                at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
                at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
                at org.apache.hadoop.hbase.util.DynamicClassLoader.initTempDir(DynamicClassLoader.java:118)
                at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:98)
                at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:241)
                at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
                at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
                at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
                at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879)
                at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:635)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
                at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
                at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
                at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
                at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
                at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)...................
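Note that the first ERROR in the log (winutils.exe) is a separate, Windows-only issue: Hadoop's Shell utility looks for %HADOOP_HOME%\bin\winutils.exe and the environment variable is not set here. A common workaround, sketched below under the assumption that winutils.exe has been placed in C:\hadoop\bin, is to point the hadoop.home.dir system property at that directory before any Hadoop class is loaded:

// Assumption: winutils.exe lives in C:\hadoop\bin; adjust the path as needed.
// This must run before the first Hadoop class (e.g. HBaseConfiguration) is touched.
System.setProperty("hadoop.home.dir", "C:\\hadoop");

The DynamicClassLoader warning that follows it is the actual "No FileSystem for scheme: hdfs" problem discussed below.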

The pom.xml configuration is as follows.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.trinity.test</groupId>
  <artifactId>Hbaseinsertproject</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.drill.exec</groupId>
            <artifactId>drill-jdbc-all</artifactId>
            <version>1.9.0</version>
        </dependency>
        <dependency>
            <groupId>org.activiti</groupId>
            <artifactId>activiti-engine</artifactId>
            <version>6.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.activiti</groupId>
            <artifactId>activiti5-engine</artifactId>
            <version>6.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.kie</groupId>
            <artifactId>kie-internal</artifactId>
            <version>7.15.0.Final</version>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <!-- <artifactId>kafka_2.9.2</artifactId> <version>0.8.1.1</version> -->
            <!-- <artifactId>kafka_2.9.1</artifactId> <version>0.8.2.1</version> -->
            <artifactId>kafka_2.11</artifactId>
            <version>0.9.0.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.zookeeper</groupId>
                    <artifactId>zookeeper</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-mapper-asl</artifactId>
            <version>1.5.0</version>
        </dependency>


        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>16.0.1</version>
        </dependency>

        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>


        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase</artifactId>
            <version>1.1.2</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
            </exclusions>

            <type>pom</type>
        </dependency>
        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-core</artifactId>
            <version>4.7.0-HBase-1.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.1.2</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
            </exclusions>

        </dependency>
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8.0_144</version>
            <scope>system</scope>
            <systemPath>C:/Program Files/Java/jdk1.8.0_144/lib/tools.jar</systemPath>
        </dependency>
        <dependency>
            <groupId>org.json</groupId>
            <artifactId>json</artifactId>
            <version>20180813</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <!-- any other plugins -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
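One thing worth knowing about this build setup: the jar-with-dependencies assembly descriptor does not merge duplicate files, so when several Hadoop jars each ship a META-INF/services/org.apache.hadoop.fs.FileSystem file, only one copy survives in the fat jar and the hdfs scheme can disappear from the ServiceLoader registry. This is a classic cause of "No FileSystem for scheme: hdfs". A commonly used alternative, not part of the original post, is the maven-shade-plugin with its ServicesResourceTransformer; a sketch (the plugin version is illustrative):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Concatenates duplicate META-INF/services files, e.g.
                         org.apache.hadoop.fs.FileSystem, instead of letting
                         one jar's copy overwrite another's. -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>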

The Java code is:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Table;

public class HbaseConnectionHolder {

    public static Connection connection=null;
    public static Configuration conf=null;
    public static Table table=null;
    static {
        System.out.println("------------HBaseConfiguration.create()");
        conf = HBaseConfiguration.create();
        System.out.println("------------configuration");
        conf.set("hbase.zookeeper.quorum", "<test1.cloud>:2080,<test2.cloud>:2181,<test3.cloud>:2181");
        conf.set("hbase.zookeeper.property.clientPort", "2080");
        conf.set("hbase.cluster.distributed", "true");
        conf.set("zookeeper.znode.parent","/hbase-unsecure");
        try {
            System.out.println("------------connection");
            connection = ConnectionFactory.createConnection(conf);
            System.out.println("------------table");
            table = connection.getTable(TableName.valueOf("test"));
        } catch (IOException e) {
            e.printStackTrace();
        }

    }

    public static Connection getHbaseConnection()
    {
        return connection;
    }

    public static Table getHbaseTableInstance()
    {
        return table;
    }

}
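A frequently cited workaround for "No FileSystem for scheme: hdfs", not part of the original post, is to map the filesystem schemes to their implementation classes directly on the Configuration, so the client no longer depends on the META-INF/services entries surviving the jar assembly. A minimal sketch follows; the helper class name is made up for illustration:

import org.apache.hadoop.conf.Configuration;

public final class FileSystemImplFix {

    private FileSystemImplFix() {
    }

    // Call this on conf before ConnectionFactory.createConnection(conf).
    public static void register(Configuration conf) {
        // FileSystem.getFileSystemClass() consults "fs.<scheme>.impl" before
        // falling back to the ServiceLoader lookup that fails in a fat jar.
        conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        conf.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
    }
}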

Below is the code that calls the execute method.

import java.io.IOException;

import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class Test implements JavaDelegate {

    public void execute(DelegateExecution execution) {
        try {
            Put put = new Put(Bytes.toBytes("basic_id/123420"));

            put.add(Bytes.toBytes("det"), Bytes.toBytes("name"), Bytes.toBytes(""));
            HbaseConnectionHolder.getHbaseTableInstance().put(put);
        } catch (IOException e) {
            e.printStackTrace();
        }

    }

}
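As a side note, Put.add(byte[], byte[], byte[]) is deprecated in the HBase 1.x client API; the drop-in replacement for the put.add(...) call above would be:

put.addColumn(Bytes.toBytes("det"), Bytes.toBytes("name"), Bytes.toBytes(""));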

I have added the following configuration files:

core-site.xml
hadoop-env.sh
hbase-env.sh
hbase-policy.xml
hdfs-site.xml
hbase-site.xml

With all of these files added, the Java code above works fine (without any errors), but when I build it as a Maven jar it throws the exception mentioned above. Am I missing any configuration in Maven or in the resource files listed above?

Best Answer

I added the Hadoop jars, and then the problem was solved.
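The answer does not name the exact artifacts. A plausible reading, stated here as an assumption, is that hadoop-common and hadoop-hdfs were added to the pom: the client implementation behind the hdfs:// scheme (DistributedFileSystem) lives in hadoop-hdfs, which hbase-client does not pull in transitively. The versions below are illustrative and should match the cluster (HBase 1.1.x builds against Hadoop 2.x):

<!-- Assumed artifacts; the answer only says "the hadoop jar". -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.3</version>
</dependency>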

A similar question can be found on Stack Overflow: https://stackoverflow.com/questions/54042600/
