FluentD: forwarding logs from Kafka to another FluentD

Tags: fluentd, efk

I need to send my application logs to a FluentD instance that is part of an EFK service, so I am trying to configure another FluentD to do this.

my-fluent.conf:

<source>
  @type kafka_group
  consumer_group cgrp
  brokers "#{ENV['KAFKA_BROKERS']}"
  scram_mechanism sha512
  username "#{ENV['KAFKA_USERNAME']}"
  password "#{ENV['KAFKA_PASSWORD']}"
  ssl_ca_certs_from_system true
  topics "#{ENV['KAFKA_TOPICS']}"
  format json
</source>
<filter TOPIC>
  @type parser
  key_name log
  reserve_data false
  <parse>
    @type json
  </parse>
</filter>
<match TOPIC>
  @type copy
  <store>
    @type stdout
  </store>
  <store>
    @type forward
    <server>
      host "#{ENV['FLUENTD_HOST']}"
      port "#{ENV['FLUENTD_PORT']}"
      shared_key "#{ENV['FLUENTD_SHARED_KEY']}"
    </server>
  </store>
</match>

I can see the output on stdout correctly:

2021-07-06 07:36:54.376459650 +0000 TOPIC: {"foo":"bar", ...}

However, I don't see the logs in Kibana. After tracing, I found that the second fluentd throws an error when it receives the data:

{"time":"2021-07-05 11:21:41 +0000","level":"error","message":"unexpected error on reading data host="X.X.X.X" port=58548 error_class=MessagePack::MalformedFormatError error="invalid byte"","worker_id":0} {"time":"2021-07-05 11:21:41 +0000","level":"error","worker_id":0,"message":"/usr/lib/ruby/gems/2.7.0/gems/fluentd-1.12.2/lib/fluent/plugin/in_forward.rb:262:in feed_each'\n/usr/lib/ruby/gems/2.7.0/gems/fluentd-1.12.2/lib/fluent/plugin/in_forward.rb:262:in block (2 levels) in read_messages'\n/usr/lib/ruby/gems/2.7.0/gems/fluentd-1.12.2/lib/fluent/plugin/in_forward.rb:271:in block in read_messages'\n/usr/lib/ruby/gems/2.7.0/gems/fluentd-1.12.2/lib/fluent/plugin_helper/server.rb:613:in on_read_without_connection'\n/usr/lib/ruby/gems/2.7.0/gems/cool.io-1.7.1/lib/cool.io/io.rb:123:in on_readable'\n/usr/lib/ruby/gems/2.7.0/gems/cool.io-1.7.1/lib/cool.io/io.rb:186:in on_readable'\n/usr/lib/ruby/gems/2.7.0/gems/cool.io-1.7.1/lib/cool.io/loop.rb:88:in run_once'\n/usr/lib/ruby/gems/2.7.0/gems/cool.io-1.7.1/lib/cool.io/loop.rb:88:in run'\n/usr/lib/ruby/gems/2.7.0/gems/fluentd-1.12.2/lib/fluent/plugin_helper/event_loop.rb:93:in block in start'\n/usr/lib/ruby/gems/2.7.0/gems/fluentd-1.12.2/lib/fluent/plugin_helper/thread.rb:78:in block in thread_create'"}

Best Answer

The problem is the missing <security> section in the first fluentd's forward output. A shared_key placed only inside <server> does not enable the shared-key handshake; the forward output needs a <security> section with self_hostname and shared_key that matches the receiving side:

<match TOPIC>
  @type copy
  <store>
    @type stdout
  </store>
  <store>
    @type forward
    <server>
      host "#{ENV['FLUENTD_HOST']}"
      port "#{ENV['FLUENTD_PORT']}"
      shared_key "#{ENV['FLUENTD_SHARED_KEY']}"
    </server>
    <security>
      self_hostname HOSTNAME
      shared_key "#{ENV['FLUENTD_SHARED_KEY']}"
    </security>
  </store>
</match>
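
For completeness, shared-key authentication has to be enabled on both ends. Below is a minimal sketch of the matching input on the receiving (EFK-side) FluentD, assuming it exposes a standard forward input; the port and RECEIVER_HOSTNAME are placeholders, not values from the original question:

<source>
  @type forward
  # default forward port; adjust to match FLUENTD_PORT on the sender
  port 24224
  <security>
    # hostname of this (receiving) node, used during the handshake
    self_hostname RECEIVER_HOSTNAME
    shared_key "#{ENV['FLUENTD_SHARED_KEY']}"
  </security>
</source>

The shared_key value must be identical on the sender and the receiver; self_hostname only identifies each node during the handshake.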

This Q&A on forwarding logs from Kafka to another FluentD is based on a similar question on Stack Overflow: https://stackoverflow.com/questions/68266800/
