Since yesterday, after upgrading the JDK, I have been getting a strange classpath error:
*NoClassDefFoundError*: org/apache/hadoop/hive/ql/session/SessionState when creating Hive client using classpath: file:/Users/geoheil/project/build/classes/java/test, file:/Users/geoheil/project/build/classes/scala/test/
Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
However, everything works fine in IntelliJ.
The relevant configuration:
dependencies {
    // Spark and Scala are provided by the runtime environment, so compile-only
    compileOnly deps.sparkCore
    compileOnly deps.sparkSql
    compileOnly deps.sparkHive
    compileOnly deps.sparkMllib
    compileOnly deps.scalaLib
    // test-only dependencies
    testImplementation deps.scalaT
    testRuntime deps.pgdown
    testImplementation deps.scalaC
    testImplementation deps.sparkTestingB
}
configurations {
    // make the compileOnly dependencies visible on the test classpath
    testCompile.extendsFrom compileOnly
}
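Since the tests pass in IntelliJ but fail under Gradle, it can help to dump the classpath Gradle actually resolves for the test run and compare it with IntelliJ's. A hypothetical debugging task (not part of the original build.gradle) might look like this:

```groovy
// build.gradle — hypothetical helper task to print the resolved test
// runtime classpath, one entry per line, for comparison with IntelliJ.
task printTestClasspath {
    doLast {
        sourceSets.test.runtimeClasspath.each { println it }
    }
}
```

Run it with `gradle printTestClasspath` and check whether the Hive/Hadoop jars show up.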
The full configuration is at https://github.com/geoHeil/classpath-gradle-test-failures/blob/master/build.gradle. In short:
Java version:
java version "1.8.0_192"
Java(TM) SE Runtime Environment (build 1.8.0_192-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.192-b12, mixed mode)
Best answer
The warnings during the project's compile step hint that Java 9+ is being used. See this SO answer for details on the new module system.
With Java 8 set, your demo project runs fine, while Java 10 fails with exactly the error you describe. I would expect the same failure on any JDK 9+ release.
Oracle JDK 8 (1.8.0_181):
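To confirm which JVM the code is actually running on (the Gradle daemon and the test workers may differ from the `java` on your PATH), a minimal check of the `java.version` system property can be dropped into a test or main class. This is an illustrative helper, not part of the original project:

```java
// Prints the version string of the JVM executing this code,
// e.g. a value starting with "1.8" on Java 8 or "10" on Java 10.
public class JvmVersion {
    public static void main(String[] args) {
        System.out.println(System.getProperty("java.version"));
    }
}
```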
$ gradle clean test --console=plain -Dorg.gradle.java.home=/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home
> Task :clean
> Task :compileJava NO-SOURCE
> Task :compileScala NO-SOURCE
> Task :processResources NO-SOURCE
> Task :classes UP-TO-DATE
> Task :compileTestJava NO-SOURCE
> Task :compileTestScala
Pruning sources from previous analysis, due to incompatible CompileSetup.
> Task :processTestResources NO-SOURCE
> Task :testClasses
> Task :test
Discovery starting.
Discovery completed in 117 milliseconds.
Run starting. Expected test count is: 2
18/10/19 13:24:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/10/19 13:24:30 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
Test2:
foo2
18/10/19 13:24:37 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
18/10/19 13:24:37 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
18/10/19 13:24:37 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
root
|-- value: integer (nullable = false)
+-----+
|value|
+-----+
| 1|
| 2|
| 3|
| 4|
+-----+
- should test fine (7 seconds, 859 milliseconds)
hello
Test1:
foo
- should test fine (1 millisecond)
Run completed in 10 seconds, 497 milliseconds.
Total number of tests run: 2
Suites: completed 3, aborted 0
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
BUILD SUCCESSFUL in 18s
3 actionable tasks: 3 executed
JDK 10 (10.0.2):
$ gradle clean test --console=plain -Dorg.gradle.java.home=/Library/Java/JavaVirtualMachines/jdk-10.0.2.jdk/Contents/Home
...
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.parboiled.transform.AsmUtils (file:/Users/bountin/.gradle/caches/modules-2/files-2.1/org.parboiled/parboiled-java/1.1.7/2298c64ce8ee8e2fb37e97e16d7be52f0c7cf61f/parboiled-java-1.1.7.jar) to method java.lang.ClassLoader.findLoadedClass(java.lang.String)
WARNING: Please consider reporting this to the maintainers of org.parboiled.transform.AsmUtils
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
...
- should test fine *** FAILED *** (2 seconds, 210 milliseconds)
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1075)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:142)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:141)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:141)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:138)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:171)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:62)
at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:471)
at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:377)
...
Cause: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/session/SessionState when creating Hive client using classpath: file:/Users/bountin/classpath-gradle-test-failures/build/classes/java/test, file:/Users/bountin/classpath-gradle-test-failures/build/classes/scala/test/
Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
...
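Rather than passing `-Dorg.gradle.java.home=...` on every invocation, the JDK can be pinned persistently. A sketch, assuming the same macOS JDK 8 install path used in the commands above (adjust for your machine):

```properties
# gradle.properties — pin the JVM that Gradle (and its test workers) run on.
org.gradle.java.home=/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home
```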
Regarding "scala - gradle test dependencies not loaded after JDK upgrade NoClassDefFoundError", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52887332/