I'm using sbt assembly to create a fat jar that can run on Spark. It depends on grpc-netty. The Guava version on Spark is older than the one grpc-netty requires, and I run into this error: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument. I was able to work around it by setting userClassPathFirst to true on Spark, but that leads to other errors.
Correct me if I'm wrong, but as far as I understand it, if I do the shading correctly I shouldn't have to set userClassPathFirst to true. Here is how I'm doing the shading now:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.guava.**" -> "my_conf.@1")
    .inLibrary("com.google.guava" % "guava" % "20.0")
    .inLibrary("io.grpc" % "grpc-netty" % "1.1.2")
)
libraryDependencies ++= Seq(
  "org.scalaj" %% "scalaj-http" % "2.3.0",
  "org.json4s" %% "json4s-native" % "3.2.11",
  "org.json4s" %% "json4s-jackson" % "3.2.11",
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided",
  "org.clapper" %% "argot" % "1.0.3",
  "com.typesafe" % "config" % "1.3.1",
  "com.databricks" %% "spark-csv" % "1.5.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.2.0" % "provided",
  "io.grpc" % "grpc-netty" % "1.1.2",
  "com.google.guava" % "guava" % "20.0"
)
What am I doing wrong here, and how do I fix it?
Best answer
You are almost there. What ShadeRule.rename does is rename class names, not library names:
The main ShadeRule.rename rule is used to rename classes. All references to the renamed classes will also be updated.
In fact, com.google.guava:guava contains no classes in a com.google.guava package:
$ jar tf ~/Downloads/guava-20.0.jar | sed -e 's:/[^/]*$::' | sort | uniq
META-INF
META-INF/maven
META-INF/maven/com.google.guava
META-INF/maven/com.google.guava/guava
com
com/google
com/google/common
com/google/common/annotations
com/google/common/base
com/google/common/base/internal
com/google/common/cache
com/google/common/collect
com/google/common/escape
com/google/common/eventbus
com/google/common/graph
com/google/common/hash
com/google/common/html
com/google/common/io
com/google/common/math
com/google/common/net
com/google/common/primitives
com/google/common/reflect
com/google/common/util
com/google/common/util/concurrent
com/google/common/xml
com/google/thirdparty
com/google/thirdparty/publicsuffix
Change your shading rule to:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "my_conf.@1")
    .inLibrary("com.google.guava" % "guava" % "20.0")
    .inLibrary("io.grpc" % "grpc-netty" % "1.1.2")
)
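To see why this fixes the error, it helps to picture what the rename rule does under the hood: sbt-assembly rewrites the internal class names (and every reference to them) in the bytecode, so the relocated Guava lives in a package that Spark's own Guava never touches. Here is a toy sketch of the renaming logic; the string rewrite stands in for the real constant-pool rewriting, and `ShadeDemo` is just an illustrative name:

```scala
// Toy model of ShadeRule.rename("com.google.common.**" -> "my_conf.@1"):
// internal names under com/google/common are relocated to my_conf,
// everything else (e.g. grpc's own classes) is left untouched.
object ShadeDemo {
  def relocate(internalName: String): String =
    if (internalName.startsWith("com/google/common/"))
      "my_conf/" + internalName.stripPrefix("com/google/common/")
    else internalName

  def main(args: Array[String]): Unit = {
    println(relocate("com/google/common/base/Preconditions")) // my_conf/base/Preconditions
    println(relocate("io/grpc/netty/NettyChannelBuilder"))    // unchanged
  }
}
```

After assembly, grpc-netty's call sites reference my_conf.base.Preconditions, which ships in your jar with the method signatures grpc expects, while Spark keeps resolving com.google.common.base.Preconditions from its own (older) Guava.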
This way you don't need to change userClassPathFirst. Additionally, you can simplify the shading rule like this:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "my_conf.@1").inAll
)
Since the org.apache.spark dependencies are provided, they will not be included in your jar and will not be shaded (so Spark will use its own, unshaded version of Guava on the cluster).
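When debugging a classpath conflict like this, it can also help to ask the JVM which jar a class was actually loaded from, for example inside spark-shell. A small helper sketch (the name `WhereFrom` is mine, not from the question):

```scala
// Prints the code source (jar or directory) a class was loaded from.
// Bootstrap/JDK classes may carry no code source, hence the Option handling.
object WhereFrom {
  def locationOf(c: Class[_]): String =
    Option(c.getProtectionDomain.getCodeSource)
      .flatMap(cs => Option(cs.getLocation))
      .map(_.toString)
      .getOrElse("<bootstrap/no code source>")

  def main(args: Array[String]): Unit =
    println(locationOf(getClass)) // wherever WhereFrom itself was loaded from
}
```

Calling `WhereFrom.locationOf(classOf[com.google.common.base.Preconditions])` in spark-shell would point at Spark's own Guava jar, confirming that the shaded copy under my_conf in your assembly is the one grpc-netty uses.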
Regarding "apache-spark - sbt assembly shading to create a fat jar to run on Spark", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45989052/