json - Unable to import JSON data into Elastic via Logstash

Tags: json elasticsearch kibana elastic-stack

I'm new to the ELK stack and getting frustrated trying to import a single-line JSON file into Elasticsearch with Logstash.
Nothing shows up in Elasticsearch (10.10.20.13:9200/monitor/_search?q=*) or in Kibana.

My JSON looks like:

{"host":"*********","cpu":"2.1","disk":"0.628242","memory":"0.324597","createAt":"2017-10-03T00:18:01"}
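As a sanity check, that line is itself valid JSON. A quick parse in Python (an illustration, not part of the original post) shows the fields that a json codec or filter would extract from it:

```python
import json

# The single-line monitoring document (host value redacted in the post).
line = ('{"host":"*********","cpu":"2.1","disk":"0.628242",'
        '"memory":"0.324597","createAt":"2017-10-03T00:18:01"}')

doc = json.loads(line)  # raises json.JSONDecodeError if the line were malformed
print(sorted(doc.keys()))  # the field names Logstash would add to the event
```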

My config file (after hours of searching I also tried adding the json codec & filter, but nothing changed):
input {
  file {
    path => "/usr/share/logstash/log/monitor-sys-1506979201881.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["10.10.20.13:9200"]
    index => "monitor"
  }
  stdout {
    codec => rubydebug
  }
}
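For reference, the "json codec" variant mentioned above would look like this: setting a json codec on the file input makes Logstash parse each line as JSON directly, instead of leaving the raw text in the message field. This is a sketch of that configuration, not something verified against this setup:

```
input {
  file {
    path => "/usr/share/logstash/log/monitor-sys-1506979201881.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => "json"
  }
}
```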

Another configuration I tried, also without success:
input {
  file {
    path => "/usr/share/logstash/log/monitor-sys-1506979201881.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    type => "json"
  }
}

filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["10.10.20.13:9200"]
    index => "monitor"
  }
  stdout {
    codec => rubydebug
  }
}

The command I'm running:
/usr/share/logstash/bin/logstash -f /opt/*****/sys-monit/logstash-sys-monitor.conf --path.settings /etc/logstash --verbose --debug

The debug run produces the following output:
[2017-10-16T15:56:29,118][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-10-16T15:56:29,122][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x4f2ed590 @kibana_version_parts=["5", "6", "0"], @module_name="fb_apache", @directory="/usr/share/logstash/modules/fb_apache/configuration">}
[2017-10-16T15:56:29,123][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-10-16T15:56:29,124][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0xb732fcc @kibana_version_parts=["5", "6", "0"], @module_name="netflow", @directory="/usr/share/logstash/modules/netflow/configuration">}
[2017-10-16T15:56:29,292][DEBUG][logstash.agent           ] Agent: Configuring metric collection
[2017-10-16T15:56:29,295][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-10-16T15:56:29,420][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-10-16T15:56:29,507][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-10-16T15:56:29,508][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-10-16T15:56:29,529][DEBUG][logstash.agent           ] Reading config file {:config_file=>"/opt/experis-cyber/sys-monitor/logstash-sys-monitor.conf"}
[2017-10-16T15:56:29,709][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"input", :class=>LogStash::Inputs::File}
[2017-10-16T15:56:29,733][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2017-10-16T15:56:29,750][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_2c3eb918-337a-4935-bf78-bfe3ab709129"
[2017-10-16T15:56:29,750][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2017-10-16T15:56:29,750][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@path = ["/usr/share/logstash/log/monitor-sys-1506979201881.json"]
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_path = "/dev/null"
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@start_position = "beginning"
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@id = "9e9162561d919c7b40b4a16e9f4e8e6e81267f8d-1"
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@enable_metric = true
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain id=>"plain_2c3eb918-337a-4935-bf78-bfe3ab709129", enable_metric=>true, charset=>"UTF-8">
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@add_field = {}
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@stat_interval = 1
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@discover_interval = 15
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_write_interval = 15
[2017-10-16T15:56:29,754][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@delimiter = "\n"
[2017-10-16T15:56:29,754][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@close_older = 3600
[2017-10-16T15:56:29,933][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[2017-10-16T15:56:29,968][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_ddd82ced-4bda-414a-a9c2-d70ea27bde23"
[2017-10-16T15:56:29,969][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2017-10-16T15:56:29,969][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//10.10.20.13:9200]
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "monitor"
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "9e9162561d919c7b40b4a16e9f4e8e6e81267f8d-2"
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_ddd82ced-4bda-414a-a9c2-d70ea27bde23", enable_metric=>true, charset=>"UTF-8">
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2017-10-16T15:56:30,018][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2017-10-16T15:56:30,044][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2017-10-16T15:56:30,118][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2017-10-16T15:56:30,122][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_9c1ee8e6-9fa4-4553-96d1-803214216fd9"
[2017-10-16T15:56:30,122][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2017-10-16T15:56:30,122][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_9c1ee8e6-9fa4-4553-96d1-803214216fd9", enable_metric=>true, metadata=>false>
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "9e9162561d919c7b40b4a16e9f4e8e6e81267f8d-3"
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2017-10-16T15:56:30,364][DEBUG][logstash.agent           ] starting agent
[2017-10-16T15:56:30,367][DEBUG][logstash.agent           ] starting pipeline {:id=>"main"}
[2017-10-16T15:56:30,384][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2017-10-16T15:56:31,479][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.10.20.13:9200/]}}
[2017-10-16T15:56:31,480][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://10.10.20.13:9200/, :path=>"/"}
[2017-10-16T15:56:31,945][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.10.20.13:9200/"}
[2017-10-16T15:56:31,968][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-10-16T15:56:32,144][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-10-16T15:56:32,188][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2017-10-16T15:56:32,189][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.10.20.13:9200"]}
[2017-10-16T15:56:32,192][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-10-16T15:56:33,141][INFO ][logstash.pipeline        ] Pipeline main started
[2017-10-16T15:56:33,202][DEBUG][logstash.inputs.file     ] _globbed_files: /usr/share/logstash/log/monitor-sys-1506979201881.json: glob is: ["/usr/share/logstash/log/monitor-sys-1506979201881.json"]
[2017-10-16T15:56:33,203][DEBUG][logstash.inputs.file     ] _discover_file: /usr/share/logstash/log/monitor-sys-1506979201881.json: new: /usr/share/logstash/log/monitor-sys-1506979201881.json (exclude is [])
[2017-10-16T15:56:33,204][DEBUG][logstash.inputs.file     ] _open_file: /usr/share/logstash/log/monitor-sys-1506979201881.json: opening
[2017-10-16T15:56:33,205][DEBUG][logstash.inputs.file     ] /usr/share/logstash/log/monitor-sys-1506979201881.json: initial create, no sincedb, seeking to beginning of file
[2017-10-16T15:56:33,206][DEBUG][logstash.inputs.file     ] writing sincedb (delta since last write = 1508158593)
[2017-10-16T15:56:33,207][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:33,208][DEBUG][logstash.agent           ] Starting puma
[2017-10-16T15:56:33,211][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2017-10-16T15:56:33,212][DEBUG][logstash.api.service     ] [api-service] start
[2017-10-16T15:56:33,321][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-16T15:56:34,211][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:35,214][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:36,223][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:37,227][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:38,158][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-10-16T15:56:38,230][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:39,233][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:40,238][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:41,240][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:42,243][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:43,159][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-10-16T15:56:43,245][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:44,247][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:45,250][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:46,252][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:47,255][DEBUG][logstash.inputs.file     ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115
[2017-10-16T15:56:47,257][DEBUG][logstash.inputs.file     ] _globbed_files: /usr/share/logstash/log/monitor-sys-1506979201881.json: glob is: ["/usr/share/logstash/log/monitor-sys-1506979201881.json"]

Any help would be greatly appreciated.

Accepted answer

I'm using Logstash 5.5.0.
A sample of my Logstash configuration:

input {
  file {
    path => "/opt/logs/json.log"
    start_position => "beginning"
    type => "logstash"
  }
}
filter {
    json {
        source => "message"
        skip_on_invalid_json => true
        # add_field => { "testfield" => "test_static_value" }
        # add_tag => [ "test_tag" ]
        # target => "test_target"
    }
}
output {
    if [type] == "logstash" {
        elasticsearch { 
            hosts => ["elasticsearch:9200"] 
            index => "logstash"
        }
    }
    stdout { codec => rubydebug }
}
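Conceptually, the json filter above takes the raw line from the event's message field, parses it, and merges the resulting keys into the event as top-level fields. This Python sketch is only a rough model of that behavior (an illustration, not Logstash internals):

```python
import json

def apply_json_filter(event, source="message", skip_on_invalid_json=True):
    """Rough model of Logstash's json filter: parse event[source] and
    merge the parsed keys into the event itself."""
    try:
        parsed = json.loads(event[source])
    except (KeyError, ValueError):
        if skip_on_invalid_json:
            return event  # leave the event untouched, as skip_on_invalid_json does
        raise
    if isinstance(parsed, dict):
        event.update(parsed)  # keys like "cpu" become top-level event fields
    return event

# hypothetical event as the file input would produce it
event = {"message": '{"host":"web-01","cpu":"2.1"}', "type": "logstash"}
apply_json_filter(event)
print(event["cpu"])  # the parsed field is now addressable as [cpu] in the pipeline
```

Once the fields are top level, conditionals like `if [type] == "logstash"` in the output section route the event to the right index.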

This question ("Unable to import JSON data into Elastic via Logstash") comes from a similar question on Stack Overflow: https://stackoverflow.com/questions/46771001/
