java - ClassNotFoundException: org.apache.hadoop.conf.Configuration when starting the Flink SQL client

Tags: java hadoop hive apache-flink

I am trying to set up the Hive integration with Flink as shown here. I have configured everything as described, and all services (Hive, MySQL, Kafka) are running correctly. However, when I start the Flink SQL client on a standalone local Flink cluster with the following command:

./bin/sql-client.sh embedded

I get a ClassNotFoundException: org.apache.hadoop.conf.Configuration...

Here is the detailed log file with the exception trace:

2020-04-01 11:27:31,458 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.rpc.address, localhost
2020-04-01 11:27:31,459 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.rpc.port, 6123
2020-04-01 11:27:31,459 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.heap.size, 1024m
2020-04-01 11:27:31,460 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: taskmanager.memory.process.size, 1568m
2020-04-01 11:27:31,460 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2020-04-01 11:27:31,460 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: parallelism.default, 1
2020-04-01 11:27:31,460 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.execution.failover-strategy, region
2020-04-01 11:27:31,507 INFO  org.apache.flink.core.fs.FileSystem                           - Hadoop is not in the classpath/dependencies. The extended set of supported File Systems via Hadoop is not available.
2020-04-01 11:27:31,527 WARN  org.apache.flink.client.cli.CliFrontend                       - Could not load CLI class org.apache.flink.yarn.cli.FlinkYarnSessionCli.
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.flink.client.cli.CliFrontend.loadCustomCommandLine(CliFrontend.java:1076)
at org.apache.flink.client.cli.CliFrontend.loadCustomCommandLines(CliFrontend.java:1030)
at org.apache.flink.table.client.gateway.local.LocalExecutor.<init>(LocalExecutor.java:135)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:85)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnException
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
... 7 more
2020-04-01 11:27:31,543 INFO  org.apache.flink.table.client.gateway.local.LocalExecutor     - Using default environment file: file:/home/rudip7/flink/conf/sql-client-defaults.yaml
2020-04-01 11:27:31,861 INFO  org.apache.flink.table.client.config.entries.ExecutionEntry   - Property 'execution.restart-strategy.type' not specified. Using default value: fallback
2020-04-01 11:27:32,776 ERROR org.apache.flink.table.client.SqlClient                       - SQL Client must stop. Unexpected exception. This is a bug. Please consider filing an issue.
org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)
at java.util.HashMap.forEach(HashMap.java:1289)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)
... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
... 14 more

Do you have any hint about what I am missing? I have all the Hadoop libraries on my classpath, but I don't understand why this class cannot be found...

I am using Flink 1.10 and Java 8.

Also, this line in the log file makes me wonder whether the error is really on my side:

2020-04-01 11:27:32,776 ERROR org.apache.flink.table.client.SqlClient                       - SQL Client must stop. Unexpected exception. This is a bug. Please consider filing an issue.

Thanks in advance!

Best answer

If you have not set HADOOP_HOME in your environment, you can export HADOOP_CLASSPATH before running ./bin/sql-client.sh embedded:

export HADOOP_CLASSPATH=`hadoop classpath`
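A minimal sketch of the full fix, assuming the `hadoop` binary is on your PATH (the jar paths shown in the sanity check are examples and will differ on your machine). `hadoop classpath` prints the classpath Hadoop itself uses, and Flink's launcher scripts append `HADOOP_CLASSPATH` to their own classpath, which makes `org.apache.hadoop.conf.Configuration` visible to the SQL client:

```shell
# Export the Hadoop classpath so Flink's scripts can pick it up.
export HADOOP_CLASSPATH=$(hadoop classpath)

# Sanity check: the classpath should contain hadoop-common, the jar
# that provides org.apache.hadoop.conf.Configuration.
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | grep -i hadoop-common

# Start the SQL client in the SAME shell session, so the variable
# is still set when the script runs.
./bin/sql-client.sh embedded
```

Note that the export only affects the current shell session; if you start the client from a new terminal, you have to export it again (or add the line to your shell profile). For Flink 1.10, placing a flink-shaded-hadoop uber jar into Flink's lib/ directory is an alternative way to provide the Hadoop classes.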

Regarding "java - ClassNotFoundException: org.apache.hadoop.conf.Configuration when starting the Flink SQL client", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/60968006/
