Scala sbt assembly jar does not work (class implementation not found) but the code works when run through IntelliJ

Tags: scala intellij-idea sbt sbt-assembly livy

When I launch my code with scala -cp assembly.jar class.A --config-path confFile I get java.lang.IllegalStateException: No LivyClientFactory implementation was found, but when it is launched through IntelliJ it works fine. I also checked my assembly jar, and the .class file for LivyClientFactory is in it.
I suspect an error in my build.sbt. Does anyone know why the implementation class cannot be found?
I tried playing with the assembly merge strategy, but without success.

ThisBuild / scalaVersion := "2.12.10"
crossPaths := true
crossScalaVersions := Seq("2.12.10")
def resolveVersion(scalaV: String, versionsResolver: Map[String, String]): String = versionsResolver(scalaV.slice(0, 4))
val sparkVersions = Map("2.11" -> "2.4.3", "2.12" -> "3.0.1")
val scalaTestVersions = Map("2.11" -> "3.2.1", "2.12" -> "3.2.5")
val livyVersions = Map("2.11" -> "0.7.0-incubating", "2.12" -> "0.8.0-incubating")

// dependencies
val commonDependencies = settingKey[Seq[ModuleID]]("List of common dependencies across sub-projects.")
ThisBuild / commonDependencies := Seq(
  "org.apache.livy" % "livy-client-http" % resolveVersion(scalaVersion.value, livyVersions),
  "org.scalatest" %% "scalatest" % resolveVersion(scalaVersion.value,
                                                  scalaTestVersions
  ) % Test excludeAll excludeJbossNetty
)
libraryDependencies ++= commonDependencies.value ++ Seq(
  "com.typesafe.play" %% "play-json" % resolveVersion(scalaVersion.value, typeSafeVersions),
  "org.apache.httpcomponents" % "httpclient" % "4.5.12" % Test,
  "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2",
  "info.picocli" % "picocli" % "4.1.1",
  "io.sentry" % "sentry-logback" % "1.7.16"
)
lazy val toBuild = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .aggregate(submissions)
  .dependsOn(submissions)

lazy val submissions = project.settings(
  commonSettings,
  assemblySettings,
  libraryDependencies ++= commonDependencies.value ++ Seq(
    "org.apache.spark" %% "spark-core" % resolveVersion(scalaVersion.value, sparkVersions) % "provided",
    "org.apache.spark" %% "spark-sql" % resolveVersion(scalaVersion.value, sparkVersions) % "provided",
    "org.apache.spark" %% "spark-streaming" % resolveVersion(scalaVersion.value, sparkVersions) % "provided"
  )
)

lazy val commonResolvers = List()

lazy val assemblySettings = Seq(
  assemblyMergeStrategy in assembly := {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case PathList(ps @ _*) if ps.last == "module-info.class" => MergeStrategy.discard
    case "module-info.class"           => MergeStrategy.discard
    case x =>
      val oldStrategy = (assemblyMergeStrategy in assembly).value
      oldStrategy(x)
  }
)
// Append Scala versions to the generated artifacts
crossPaths := true

// This forbids including Scala related libraries into the dependency
autoScalaLibrary := false

// Tests are executed sequentially
parallelExecution in Test := false
// path to scala doc with all modules
siteSubdirName in ScalaUnidoc := "api"
addMappingsToSiteDir(mappings in (ScalaUnidoc, packageDoc), siteSubdirName in ScalaUnidoc)

val excludeJbossNetty = ExclusionRule(organization = "org", "jboss.netty")

scalacOptions ++= Seq(
  "-encoding",
  "utf8",
  "-deprecation",
  "-feature",
  "-language:higherKinds",
  "-Ywarn-unused-import",
  "-Xfatal-warnings"
)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case PathList(ps @ _*) if ps.last == "module-info.class" => MergeStrategy.discard
  case "module-info.class"           => MergeStrategy.discard
  case PathList("org", "aopalliance", xs @ _*)      => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*)         => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*)        => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*)     => MergeStrategy.last
  case PathList("org", "apache", xs @ _*)           => MergeStrategy.last
  case PathList("com", "google", xs @ _*)           => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*)         => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*)           => MergeStrategy.last
  case "about.html"                                 => MergeStrategy.rename
  case "META-INF/DISCLAIMER"                        => MergeStrategy.last
  case "META-INF/ECLIPSEF.RSA"                      => MergeStrategy.last
  case "META-INF/mailcap"                           => MergeStrategy.last
  case "META-INF/mimetypes.default"                 => MergeStrategy.last
  case "plugin.properties"                          => MergeStrategy.last
  case "log4j.properties"                           => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
artifact in (Compile, assembly) := {
  val art = (artifact in (Compile, assembly)).value
  art.withClassifier(Some("assembly"))
}
addArtifact(artifact in (Compile, assembly), assembly)
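
For reference, a quick way to check which ServiceLoader registration files actually made it into the assembly (a sketch; the jar path is an example and has to be adjusted to the real assembly output):

import java.util.jar.JarFile
import scala.collection.JavaConverters._

// Lists every ServiceLoader registration file bundled in the assembly jar.
// If META-INF/services/org.apache.livy.LivyClientFactory is not printed,
// the factory cannot be discovered at runtime even though its .class is present.
object ListServiceFiles extends App {
  val jar = new JarFile("target/scala-2.12/assembly.jar") // example path
  try {
    jar.entries().asScala
      .map(_.getName)
      .filter(_.startsWith("META-INF/services/"))
      .foreach(println)
  } finally jar.close()
}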

Best answer

The jar is missing the service files under META-INF. Add a case that keeps META-INF/services before the rule that discards the rest of META-INF:

  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.first
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case PathList(ps @ _*) if ps.last == "module-info.class" => MergeStrategy.discard
  case "module-info.class"           => MergeStrategy.discard

Regarding "Scala sbt assembly jar does not work (class implementation not found) but the code works when run through IntelliJ", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/66904241/
