elasticsearch - Logstash does not create separate indices

Tags: elasticsearch logstash filebeat

I have two filebeat inputs, each with tags and fields.
In my pipeline.conf I filter the logs by their tags.
But when Logstash creates the index, it uses the literal string %{[fields][log_type]}-2020-10-07 as the index name.
How can I fix this? Can I create two separate indices?

Here are my files.
filebeat.yml

- type: log 
  enabled: true
  paths:
    - D:\Git\gbase.API\Logs\*.log
  tags: ["gbaseapi"]    
  fields: {log_type: gbase}

- type: log 
  enabled: true
  paths:
    - D:\Git\finance.api\FinanceAPI\logs\*.log
  tags: ["financeapi"]
  fields: {log_type: finance}

multiline.pattern: '^[[:space:]]'
multiline.negate: false
multiline.match: after

mypipeline.conf

input {
 beats {
    type => "mytest"
    port => 5044
  }
} 
filter{
    if "gbase" in [tags]
    {
       if [level] in [ "Error", "Fatal" ] 
        {
            grok { match=> ["message","%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{URI:requestUrl} %{USER:method}  %{GREEDYDATA:message}"] }
        }
        else
        {
            grok { match=> ["message","%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{GREEDYDATA:message}" ] }
        }
         mutate { gsub => ["message", "\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}.\d{4} ",""]} 
         mutate { gsub => ["message", "%{level}",""]}
         mutate { gsub => ["message", "%{logger}",""]}
         mutate { gsub => ["message", "%{clientIp}",""]}
    }
    if "finance" in [tags]
    {
       if [level] in [ "Error", "Fatal" ] 
        {
            grok { match=> ["message","%{DATESTAMP:time} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{URI:requestUrl} %{USER:method} %{GREEDYDATA:message}"]}
        }
        else
        {
            grok { match=> ["message","%{DATESTAMP:time} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{GREEDYDATA:message}" ]}
        }
        mutate { gsub => ["message", "\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}.\d{4} ",""]}
        mutate { gsub => ["message", "%{level}",""]}
        mutate { gsub => ["message", "%{logger}",""]}
        mutate { gsub => ["message", "%{clientIp}",""]}
    }
   date {
        match => [ "time" , "dd/MMM/yyyy:HH:mm:ss Z" ]
        target=> "@time"
    }
}
output {
        elasticsearch 
        {
            hosts => ["http://localhost:9200"]
            index => "%{[fields][log_type]}-%{+YYYY.MM.dd}"
            user => "something"
            password => "something"     
        }
  stdout { codec => rubydebug }
 }
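As an alternative (a sketch not taken from the original question, reusing the same tags and credentials placeholders), the output section can route each tag to its own hard-coded index with conditionals, which sidesteps the unresolved `%{[fields][log_type]}` reference entirely:

```
output {
  # Route gbase events to their own index, assuming the "gbaseapi" tag from filebeat.yml
  if "gbaseapi" in [tags] {
    elasticsearch {
      hosts    => ["http://localhost:9200"]
      index    => "gbase-%{+YYYY.MM.dd}"
      user     => "something"
      password => "something"
    }
  }
  # Route finance events to their own index, assuming the "financeapi" tag
  if "financeapi" in [tags] {
    elasticsearch {
      hosts    => ["http://localhost:9200"]
      index    => "finance-%{+YYYY.MM.dd}"
      user     => "something"
      password => "something"
    }
  }
  stdout { codec => rubydebug }
}
```

With this layout the index names are fixed per tag, so a missing `[fields][log_type]` field can no longer leak into the index name.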

Best answer

You should specify fields like this:

- type: log 
  enabled: true
  paths:
    - D:\Git\gbase.API\Logs\*.log
  tags: ["gbaseapi"]    
  fields:
    log_type: gbase                      <--- change this
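Applied to both inputs, the corrected filebeat.yml would look like this (same paths and tags as in the question):

```
- type: log
  enabled: true
  paths:
    - D:\Git\gbase.API\Logs\*.log
  tags: ["gbaseapi"]
  fields:
    log_type: gbase

- type: log
  enabled: true
  paths:
    - D:\Git\finance.api\FinanceAPI\logs\*.log
  tags: ["financeapi"]
  fields:
    log_type: finance
```

After restarting Filebeat, each event should carry a nested `fields.log_type` value, so the output index pattern `%{[fields][log_type]}-%{+YYYY.MM.dd}` can resolve to something like gbase-2020.10.07 or finance-2020.10.07 instead of the literal placeholder string.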

Regarding elasticsearch - Logstash does not create separate indices, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/64244257/
