elasticsearch - How to create a grok-based filter in Logstash

Tags: elasticsearch logstash logstash-grok grok

I am trying to insert these entries into Elasticsearch using Logstash:

2016-05-18 00:14:30,915 DEBUG http-bio-/158.134.18.57-8200-exec-1, HTTPReport - Saved report job 1000 for report
2016-05-18 00:14:30,937 DEBUG http-bio-/158.134.18.57-8200-exec-1, JavaReport - 
************************************************************************************************
Report Job information
Job ID : 12000
Job name : 101
Job priority : 1
Job group : BACKGROUND
Report : Month End
2016-05-18 00:17:38,868 DEBUG JobsMaintenanceScheduler_Worker-1, DailyReport - System information: available processors = 12; memory status : 2638 MB of 4096 MB

I have this configuration in my Logstash conf file:

input {
  file {
    path => "/data/*.log"
    type => "app_log"
    start_position => "beginning"
  }
}

filter {

  multiline {
    pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-"
    negate => true
    what => "previous"
  }

  if [type] == "app_log" {
    grok {
      patterns_dir => ["/pattern"]
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp},%{NUMBER:Num_field} %{WORD:error_level} %{GREEDYDATA:origin}, %{WORD:logger} - %{GREEDYDATA:event}" }
    }
  }

  mutate { add_field => { "type" => "app_log" } }
  mutate { add_field => { "machine_name" => "server101" } }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "app_log-%{+YYYY.MM.dd}"
    manage_template => false
  }
}

I am getting this error:
translation missing: en.logstash.runner.configuration.file-not-found {:level=>:error}

The insert fails. Any ideas what might be wrong?

Accepted answer

Upgrade to the latest version of Logstash (2.3.2 at the time of writing), fix the grok filter as shown below, and it will work:

 grok {
   add_field => { "machine_name" => "server010" }
   match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:error_level} %{DATA:origin}, %{DATA:logger_name} - %{GREEDYDATA:EVENT}" }
 }
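For reference, a full pipeline built around that grok filter might look like the sketch below. This is an assumption-laden example, not the answerer's exact config: it assumes Logstash 2.x with the multiline codec (which replaces the standalone multiline filter used in the question) and relies only on built-in grok patterns, so no `patterns_dir` is needed. The multiline pattern treats any line that does not start with an ISO8601 timestamp as a continuation of the previous event, which is what joins the multi-line report block into one message.

```conf
input {
  file {
    path => "/data/*.log"
    type => "app_log"
    start_position => "beginning"
    # Join lines that do not start with a timestamp onto the previous event
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
}

filter {
  if [type] == "app_log" {
    grok {
      add_field => { "machine_name" => "server010" }
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:error_level} %{DATA:origin}, %{DATA:logger_name} - %{GREEDYDATA:EVENT}" }
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "app_log-%{+YYYY.MM.dd}"
    manage_template => false
  }
}
```

Note that `%{TIMESTAMP_ISO8601}` already matches the comma-separated milliseconds in `2016-05-18 00:14:30,915`, so the separate `%{NUMBER:Num_field}` from the question's pattern is unnecessary.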


Regarding "elasticsearch - How to create a grok-based filter in Logstash", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/37487020/
