I am trying to parse nginx logs with Logstash, and everything looks fine except that I get a _grokparsefailure tag on lines that contain an Nginx $remote_user. When $remote_user is '-' (the default when no $remote_user is specified), Logstash parses the line, but with a real $remote_user such as user@gmail.com it fails and adds a _grokparsefailure tag:
127.0.0.1 - - [17/Feb/2017:23:14:08 +0100] "GET /favicon.ico HTTP/1.1" 302 169 "http://training-hub.tn/trainer/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36"
=====> parsed fine
127.0.0.1 - jemlifathi@gmail.com [17/Feb/2017:23:14:07 +0100] "GET /trainer/templates/home.tmpl.html HTTP/1.1" 304 0 "http://training-hub.tn/trainer/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36"
=====> _grokparsefailure tag, and the log line cannot be parsed. I am using this configuration file:
input {
  file {
    path => "/home/dev/node/training-hub/logs/access_log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    ignore_older => 0
    type => "logs"
  }
}
filter {
  if [type] == "logs" {
    mutate {
      gsub => ["message", "::ffff:", ""]
    }
    grok {
      match => [
        "message", "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}",
        "message", "%{COMMONAPACHELOG}+%{GREEDYDATA:extra_fields}"
      ]
      overwrite => [ "message" ]
    }
    mutate {
      convert => ["response", "integer"]
      convert => ["bytes", "integer"]
      convert => ["responsetime", "float"]
    }
    geoip {
      source => "clientip"
      target => "geoip"
      database => "/etc/logstash/GeoLite2-City.mmdb"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float"]
    }
    date {
      match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
      remove_field => [ "timestamp" ]
    }
    useragent {
      source => "agent"
    }
  }
}
output { elasticsearch { hosts => "localhost:9200" } }
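The failure can be reproduced outside Logstash. In the classic grok-patterns definitions, the auth field of COMBINEDAPACHELOG is matched by %{USER}, which resolves to [a-zA-Z0-9._-]+ and so cannot contain an '@'. A minimal Python sketch of just the start of that pattern (an approximation for illustration, not the full grok expression) shows why the second sample line fails:

```python
import re

# Approximation of the leading part of COMBINEDAPACHELOG, assuming the
# classic grok-patterns definitions where auth is %{USER} = [a-zA-Z0-9._-]+
prefix = re.compile(r'^(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>[a-zA-Z0-9._-]+) \[')

ok = '127.0.0.1 - - [17/Feb/2017:23:14:08 +0100] "GET /favicon.ico HTTP/1.1" 302 169'
bad = '127.0.0.1 - jemlifathi@gmail.com [17/Feb/2017:23:14:07 +0100] "GET /x HTTP/1.1" 304 0'

print(prefix.match(ok) is not None)   # True: '-' is a valid USER
print(prefix.match(bad) is not None)  # False: auth stops at '@', ' [' never matches
```

The auth sub-pattern consumes "jemlifathi", then expects " [" but finds "@", so the whole match (and hence the grok) fails.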
Best answer
After testing the output with several values, I realized that Logstash cannot parse log lines containing such a $remote_user because it is not a valid username but an email address. So I added a mutate gsub filter to remove the '@' and the rest of the mail address, leaving a valid $remote_user:
gsub => ["message", "@(?:(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?|\[(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[a-z0-9-]*[a-z0-9]:(?:[\x01-\x08\x0b\x0c\x0e-\x1f\x21-\x5a\x53-\x7f]|\\[\x01-\x09\x0b\x0c\x0e-\x7f])+)\]) \[", " ["]
Now it works fine.
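The effect of the gsub can be sketched in Python with a deliberately simplified stand-in for the answer's full RFC-5322-style domain regex (this short pattern is an assumption for illustration only; it just drops everything from the '@' up to the ' [' that opens the timestamp):

```python
import re

line = ('127.0.0.1 - jemlifathi@gmail.com [17/Feb/2017:23:14:07 +0100] '
        '"GET /trainer/templates/home.tmpl.html HTTP/1.1" 304 0')

# Simplified stand-in for the mutate gsub: strip '@' plus the domain,
# keeping only the local part of the address as the $remote_user
cleaned = re.sub(r'@\S+ \[', ' [', line)
print(cleaned)
# 127.0.0.1 - jemlifathi [17/Feb/2017:23:14:07 +0100] "GET ..."
```

After this rewrite the auth field is a plain username again, so COMBINEDAPACHELOG matches and the _grokparsefailure tag disappears.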
Regarding "nginx - Logstash _grokparsefailure when parsing Nginx logs", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/42308769/