I'm trying to consume a Kafka 0.8 topic with Spark Streaming 2.0.0, and I'm trying to determine the required dependencies. I tried the following in my build.sbt file:
libraryDependencies += "org.apache.spark" %% "spark-streaming_2.11" % "2.0.0"
When I run `sbt package`, I get unresolved-dependency errors for all three of these jars,
but the jars do exist:
https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-8_2.11/2.0.0
Please help me debug this issue. I'm new to Scala, so please let me know if I'm doing something wrong.
Best Answer
The problem is that you are specifying the Scala version in the artifact name (the `_2.11` suffix) while also using `%%`, which tries to infer the Scala version and append that suffix for you, producing an artifact name like `spark-streaming_2.11_2.11` that does not exist. Either drop one `%`:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % "2.0.0"
Or drop the Scala version suffix:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
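For context, a minimal build.sbt sketch using the second variant (the project name and Scala version 2.11.8 here are assumptions; with `scalaVersion` set, `%%` appends the matching `_2.11` suffix to each artifact name automatically):

```scala
// build.sbt — minimal sketch; project name and Scala version are assumptions
name := "spark-streaming-kafka-demo"

scalaVersion := "2.11.8"  // %% appends _2.11 to each artifact name below

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                % "2.0.0",
  "org.apache.spark" %% "spark-streaming"           % "2.0.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
)
```

Using `%%` is generally preferred, since changing `scalaVersion` later updates all three artifact suffixes at once.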
Regarding "apache-spark - How to define the Kafka (data source) dependency for Spark Streaming?", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/39516992/