I am trying to insert this entry into Elasticsearch using Logstash:
2016-05-18 00:14:30,915 DEBUG http-bio-/158.134.18.57-8200-exec-1, HTTPReport - Saved report job 1000 for report
2016-05-18 00:14:30,937 DEBUG http-bio-/158.134.18.57-8200-exec-1, JavaReport -
************************************************************************************************
Report Job information
Job ID : 12000
Job name : 101
Job priority : 1
Job group : BACKGROUND
Report : Month End
2016-05-18 00:17:38,868 DEBUG JobsMaintenanceScheduler_Worker-1, DailyReport - System information: available processors = 12; memory status : 2638 MB of 4096 MB
I have this filter in my Logstash conf file:
input {
  file {
    path => "/data/*.log"
    type => "app_log"
    start_position => "beginning"
  }
}

filter {
  multiline {
    pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-"
    negate => true
    what => "previous"
  }
  if [type] == "app_log" {
    grok {
      patterns_dir => ["/pattern"]
      match => {"message" => "%{TIMESTAMP_ISO8601:timestamp},%{NUMBER:Num_field} %{WORD:error_level} %{GREEDYDATA:origin}, %{WORD:logger} - %{GREEDYDATA:event%}"}
    }
  }
  mutate { add_field => {"type" => "app_log"} }
  mutate { add_field => {"machine_name" => "server101"} }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "app_log-%{+YYYY.MM.dd}"
    manage_template => false
  }
}
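As a side note, the multiline pattern can be sanity-checked outside Logstash with an ordinary regex engine. Below is a minimal Python sketch (the pattern is deliberately left unanchored to mirror Logstash's substring-match semantics); with `negate => true` and `what => "previous"`, any line that does *not* match the pattern is appended to the previous event:

```python
import re

# Same pattern as in the multiline filter: matches lines that
# contain a 20xx- year prefix, optionally preceded by whitespace.
starts_new_event = re.compile(r"(([\s]+)20[0-9]{2}-)|20[0-9]{2}-")

lines = [
    "2016-05-18 00:14:30,937 DEBUG http-bio-/158.134.18.57-8200-exec-1, JavaReport -",
    "Report Job information",
    "Job ID : 12000",
]

for line in lines:
    # A match means "this line starts a new event"; no match means the
    # line is merged into the previous event (negate => true, what => previous).
    print(bool(starts_new_event.search(line)), line)
```

Only the first line matches, so the report-body lines are folded into the timestamped event that precedes them, which is the intended multiline behavior.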
I get this error:
translation missing: en.logstash.runner.configuration.file-not-found {:level=>:error}
The insert fails. Any ideas what might be wrong?
Best Answer
Upgrade to the latest version of Logstash (2.3.2 at the time of writing) and fix the grok filter as shown below, and it will work. Your original pattern has a stray `%` in `%{GREEDYDATA:event%}`, which makes it invalid; also, `%{TIMESTAMP_ISO8601}` already matches the `,915` millisecond suffix, so the literal comma and `%{NUMBER:Num_field}` that follow it can never match:
grok {
  add_field => {"machine_name" => "server010"}
  match => {"message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:error_level} %{DATA:origin}, %{DATA:logger_name} - %{GREEDYDATA:EVENT}"}
}
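For a quick check, this corrected grok expression can be approximated with a plain regex. The sketch below is a hypothetical Python equivalent, under the assumption that `DATA` maps to a non-greedy `.*?`, `GREEDYDATA` to `.*`, and the timestamp sub-pattern is a simplification of `TIMESTAMP_ISO8601` for this log format:

```python
import re

# Rough regex equivalent of the corrected grok pattern; the timestamp
# part is a simplification of TIMESTAMP_ISO8601 for "date time,millis".
LOG_RE = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<error_level>\w+) "
    r"(?P<origin>.*?), "
    r"(?P<logger_name>.*?) - "
    r"(?P<EVENT>.*)$"
)

line = ("2016-05-18 00:14:30,915 DEBUG http-bio-/158.134.18.57-8200-exec-1, "
        "HTTPReport - Saved report job 1000 for report")
m = LOG_RE.match(line)
print(m.groupdict())
```

Running this against the sample line splits it into the same fields the grok filter produces (`timestamp`, `error_level`, `origin`, `logger_name`, `EVENT`), which confirms the pattern shape before deploying it.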
Update

Regarding "elasticsearch - how to create a filter based on grok in Logstash", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/37487020/