java - Submitting a Spark Streaming app through code

Tags: java maven hadoop apache-spark spark-streaming

I am trying to submit a Spark Streaming application through code:

SparkConf sparkConf = new SparkConf();
JavaStreamingContext jssc = new JavaStreamingContext(master, appName, new Duration(60*1000), sparkHome, sparkJar);

Absolute paths are given for sparkJar and sparkHome; the master is spark://xyz:7077.
I tried submitting a batch job the same way and it works, but it does not work for streaming.
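For reference, here is a minimal sketch of the same setup done through SparkConf rather than the multi-argument constructor; the class name, app name, Spark home, and jar path are placeholders, not taken from the original post:

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SubmitFromCode {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setMaster("spark://xyz:7077")    // standalone master, as above
                .setAppName("StreamingFromCode")  // placeholder app name
                .setSparkHome("/opt/spark")       // placeholder Spark home
                // ship the assembly jar to the executors
                .setJars(new String[] { "/path/to/app-jar-with-dependencies.jar" });
        JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(60 * 1000));
        // ... define input DStreams and output operations here ...
        jssc.start();
        jssc.awaitTermination();
    }
}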
I get the following error:
14/11/26 17:42:25 INFO spark.HttpFileServer: HTTP File server directory is /var/folders/3j/9hjkw0890sx_qg9yvzlvg64cf5626b/T/spark-cd7b30cd-cf95-4e52-8eb4-1c1dccc2d58f
14/11/26 17:42:25 INFO spark.HttpServer: Starting HTTP Server
14/11/26 17:42:25 INFO server.Server: jetty-8.1.14.v20131031
14/11/26 17:42:25 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:50016
14/11/26 17:42:25 INFO server.Server: jetty-8.1.14.v20131031
14/11/26 17:42:25 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
14/11/26 17:42:25 INFO ui.SparkUI: Started SparkUI at http://xxx.xx.xxx.xx:4040
14/11/26 17:42:30 INFO spark.SparkContext: Added JAR /Volumes/Official/workspace/ZBI/target/ZBI-0.0.1-SNAPSHOT-jar-with-dependencies.jar at http://xxx.xx.xxx.xx:50016/jars/ZBI-0.0.1-SNAPSHOT-jar-with-dependencies.jar with timestamp 1417003949988
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/ui/SparkUITab
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)

I am using Maven; the following is my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>BetaTestTool</groupId>
    <artifactId>TestTool</artifactId>
    <packaging>jar</packaging>
    <version>0.0.1-SNAPSHOT</version>
    <description></description>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.5</source>
                    <target>1.5</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-war-plugin</artifactId>
                <version>2.0.1</version>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.3</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id> <!-- this is used for inheritance merges -->
                        <phase>package</phase> <!-- bind to the packaging phase -->
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
            <version>2.5</version>
        </dependency>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.0.2</version>
        </dependency>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.0.2</version>
        </dependency>
        <!-- <dependency> Spark dependency
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.0.2</version>
        </dependency> -->
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.5</version>
        </dependency>
    </dependencies>
</project>

I was also getting the following exception:
14/11/27 10:43:13 INFO spark.HttpFileServer: HTTP File server directory is /var/folders/3j/9hjkw0890sx_qg9yvzlvg64cf5626b/T/spark-b162a8c1-0d77-48db-b559-2b242449db3e
14/11/27 10:43:13 INFO spark.HttpServer: Starting HTTP Server
14/11/27 10:43:13 INFO server.Server: jetty-8.1.14.v20131031
14/11/27 10:43:13 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:62675
Exception in thread "main" java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

I then commented out the javax.servlet dependency, and after that I got the first error mentioned above. Please also suggest how to exclude this transitive dependency; I tried setting its scope to provided, but that did not help.

Any help is much appreciated.

My POM dependency tree is as follows:
--- maven-dependency-plugin:2.8:tree (default-cli) @ ZBI ---
[INFO] BetaBI:ZBI:jar:0.0.1-SNAPSHOT
[INFO] \- org.apache.spark:spark-core_2.10:jar:1.0.2:compile
[INFO]    \- org.apache.hadoop:hadoop-client:jar:1.0.4:compile
[INFO]       \- org.apache.hadoop:hadoop-core:jar:1.0.4:compile
[INFO]          \- commons-configuration:commons-configuration:jar:1.6:compile
[INFO]             \- commons-collections:commons-collections:jar:3.2.1:compile
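To see exactly where a given artifact enters the graph, the tree goal can be filtered (assuming maven-dependency-plugin 2.8, as in the output above):

mvn dependency:tree -Dincludes=javax.servlet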

How do I exclude javax.servlet from the Hadoop dependencies of spark-core?

Best Answer

It looks like the Spark Streaming dependency is missing from the pom.xml:

<dependency> <!-- Spark streaming dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.0.2</version>
</dependency>
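As for the follow-up question about excluding javax.servlet, a Maven <exclusions> block on the dependency that pulls it in is the usual tool. A sketch, assuming the conflicting classes arrive through spark-core's hadoop-client; note that the exclusion suppresses the artifact everywhere beneath spark-core, so it does not need to be repeated per transitive path:

<dependency> <!-- Spark dependency, with the servlet artifact excluded -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.2</version>
  <exclusions>
    <exclusion>
      <groupId>javax.servlet</groupId>
      <artifactId>servlet-api</artifactId>
    </exclusion>
  </exclusions>
</dependency>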

Regarding "java - Submitting a Spark Streaming app through code", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/27163574/
