java - Exception when launching a YARN job due to the kafka log4jappender

Tags: java apache-kafka log4j hadoop-yarn slf4j

When I run a YARN job with this log4j.properties as the configuration, it fails with the exception below. If I remove KAFKA from the rootLogger, the job starts normally.

This is the same problem reported here: https://github.com/wso2/product-ei/issues/2786

However, I have not yet found a solution.

Environment: CDH 6.3.3

Here is my log4j.properties file.

log4j.rootLogger=DEBUG,stdout,KAFKA
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Threshold=DEBUG
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%-5p %L %X{taskId} %X{stsId}     %d{yyyy-MM-dd HH:mm:ss}     %c     %t     %m%n  

log4j.appender.alog=org.apache.log4j.RollingFileAppender
log4j.appender.alog.maxFileSize=10MB
log4j.appender.alog.maxBackupIndex=5
log4j.appender.alog.file=../logs/serverx.log
log4j.appender.alog.append=false
log4j.appender.alog.layout=org.apache.log4j.PatternLayout
log4j.appender.alog.layout.conversionPattern=%-5p %X{taskId} %X{stsId}  %d{yyyy-MM-dd HH:mm:ss}     %c     %t     %m%n
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.KAFKA.layout.conversionPattern=%d%C{1}%t%5p %-4p%X{taskId} %X{stsId} %d{yyyy-MM-dd HH:mm:ss} %c %t %m%n%throwable

log4j.appender.KAFKA.topic=cdhuser_rocplus_roclog
log4j.appender.KAFKA.securityProtocol=PLAINTEXT
log4j.appender.KAFKA.ignoreExceptions=false

Exception:

Unexpected problem occured during version sanity check
Reported exception:
java.lang.NullPointerException
    at org.slf4j.LoggerFactory.versionSanityCheck(LoggerFactory.java:267)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:126)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
    at org.apache.kafka.clients.CommonClientConfigs.<clinit>(CommonClientConfigs.java:32)
    at org.apache.kafka.clients.producer.ProducerConfig.<clinit>(ProducerConfig.java:333)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:327)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:299)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.getKafkaProducer(KafkaLog4jAppender.java:279)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.activateOptions(KafkaLog4jAppender.java:273)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
    at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
    at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
    at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:122)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:111)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:49)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.log(ApplicationMaster.scala:771)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:786)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.kafka.clients.producer.ProducerConfig.<clinit>(ProducerConfig.java:333)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:327)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:299)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.getKafkaProducer(KafkaLog4jAppender.java:279)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.activateOptions(KafkaLog4jAppender.java:273)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
    at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
    at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
    at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:122)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:111)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:49)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.log(ApplicationMaster.scala:771)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:786)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.lang.NullPointerException
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:418)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
    at org.apache.kafka.clients.CommonClientConfigs.<clinit>(CommonClientConfigs.java:32)
    ... 28 more

Best Answer

We hit the same problem on CDH 6.3.3.

It appears the libraries kafka-log4j-appender:2.2.1-cdh6.3.3 and slf4j-api:1.7.25 do not work together.

The workaround we used was to pin different versions of the slf4j-api and slf4j-log4j12 libraries (the 1.7 releases did not work for us; we had to use a 1.8 version).

The steps to resolve the problem are:

  • First, declare these libraries as dependencies in your pom.xml file...

      <properties>
          <slf4j.version>1.8.0-beta4</slf4j.version>
      </properties>
      <dependency>
          <groupId>org.slf4j</groupId>
          <artifactId>slf4j-api</artifactId>
          <version>${slf4j.version}</version>
      </dependency>
    
      <dependency>
          <groupId>org.slf4j</groupId>
          <artifactId>slf4j-log4j12</artifactId>
          <version>${slf4j.version}</version>
      </dependency>
    
  • Copy these dependencies into the target directory with the maven-dependency-plugin...

          <plugin>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-dependency-plugin</artifactId>
                  <version>3.1.1</version>
                  <executions>
                      <execution>
                          <phase>package</phase>
                          <goals>
                              <goal>copy-dependencies</goal>
                          </goals>
                      </execution>
                  </executions>
                  <configuration>
                      <includeScope>provided</includeScope>
                      <outputDirectory>target</outputDirectory>
                      <includeArtifactIds>slf4j-api,slf4j-log4j12</includeArtifactIds>
                      <stripVersion>true</stripVersion>
                  </configuration>
              </plugin>
    
  • Include the following properties in Spark:

spark.driver.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar
spark.executor.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar

  • Finally, pass these jars to spark-submit with this parameter:

--jars slf4j-api.jar,slf4j-log4j12.jar
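
Putting the steps above together, the final submission might look like the following sketch. The application jar, main class, and master settings are placeholders (not from the original answer), and it assumes the slf4j jars were copied into target/ by the maven-dependency-plugin configuration shown earlier:

```shell
# Sketch of a full spark-submit invocation combining the workaround steps.
# com.example.MyJob and target/my-job.jar are placeholders -- substitute
# your own main class, application jar, and deploy settings.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --jars target/slf4j-api.jar,target/slf4j-log4j12.jar \
  --conf spark.driver.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar \
  --conf spark.executor.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar \
  --files log4j.properties \
  --class com.example.MyJob \
  target/my-job.jar
```

With extraClassPath, the 1.8.x slf4j jars are placed ahead of the cluster-provided 1.7.25 copies on both the driver and executor classpaths, which is what makes the appender initialize cleanly.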

A similar question about "java - Exception when launching a YARN job due to the kafka log4jappender" can be found on Stack Overflow: https://stackoverflow.com/questions/64098750/
