elasticsearch - Problem storing coordinates as geo_point

Tags: elasticsearch logstash logstash-configuration

I am facing a problem with storing coordinates as a geo_point. My goal is to add documents to Elasticsearch through a Logstash pipeline from JSON that I receive. Omitting most of the fields, the JSON looks like this:

{
    "text": "hello",
    "location": {
    "lat": "42.5064128",
    "lon": "1.52069438894224"
    }
}

Since I am not familiar with the ELK stack, I created a minimal, complete, and verifiable example:

/etc/logstash/conf.d/mycompany-tweet-demo.conf:
input {
    tcp {
        port => 10000
        codec => "json"
    }
}

filter {    
    mutate {
        # this "works", however the location in no way matches the coordinates when displayed via tile map in Kibana
        #add_field => { "mylocation" => 52 }
        #add_field => { "mylocation" => 8 }

        # this yields "illegal latitude value [268.76953125] for mylocation"
        add_field => { "mylocation" => 51.9 }
        add_field => { "mylocation" => 7.9 }

        # this yields "illegal latitude value [269.1322946548462] for mylocation" for the demo JSON
        #add_field => { "mylocation" => "%{[location][lat]}" }
        #add_field => { "mylocation" => "%{[location][lon]}" }
    }
}

output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "mycompany-tweet-demo"
        document_type => "tweet"

        template => "/etc/logstash/templates/mycompany-demo-template.json"
        template_overwrite => true
    }
}
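
As a sanity check I also validated the pipeline configuration before sending any data (a sketch; this assumes Logstash 5.x and that the logstash binary is on the PATH, which depends on how Logstash was installed):

    logstash --config.test_and_exit -f /etc/logstash/conf.d/mycompany-tweet-demo.conf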

/etc/logstash/templates/mycompany-demo-template.json:
{
    "template": "mycompany-*",
    "mappings": {
        "_default_": {
            "properties": {
                "mylocation" : {
                    "type" : "geo_point"
                }
            }
        }
    }
}
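
To see whether the template is actually installed and what mapping the index ends up with, both can be inspected (a sketch; host and index name taken from the output section above):

    curl 'http://localhost:9200/_template?pretty'
    curl 'http://localhost:9200/mycompany-tweet-demo/_mapping?pretty'

The goal is for mylocation to show up with "type": "geo_point".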

~/one.json:
{"text":"hello","location":{"lat":"42.5064128","lon":"1.52069438894224"}}

cat ~/one.json | nc localhost 10000
[2017-01-13T17:41:38,504][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"mycompany-tweet-demo", :_type=>"tweet", :_routing=>nil}, 2017-01-13T16:41:38.458Z 0:0:0:0:0:0:0:1 %{message}], :response=>{"index"=>{"_index"=>"mycompany-tweet-demo", "_type"=>"tweet", "_id"=>"AVmYtLvEovM5deO5CNpA", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"illegal latitude value [269.1322946548462] for mylocation"}}}}}

Based on the geo_point documentation and several Stack Overflow posts I have read, I would have expected my example to work with values between 0 and 90 (see the formats I collected below). Can anyone point out what I am doing wrong?
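
For reference, my reading of the geo_point documentation is that a geo_point field can be supplied in several shapes, e.g. as an object with lat/lon keys, as a "lat,lon" string, or as a [lon, lat] array (field name used only for illustration):

    { "mylocation": { "lat": 42.5064128, "lon": 1.52069438894224 } }
    { "mylocation": "42.5064128,1.52069438894224" }
    { "mylocation": [1.52069438894224, 42.5064128] }

Note the reversed [lon, lat] order in the array form.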

Best Answer

You're almost there; all you need to do is:

    add_field => { "[mylocation][lat]" => "%{[location][lat]}" }
    add_field => { "[mylocation][lon]" => "%{[location][lon]}" }

Regarding this problem of storing coordinates as geo_point in elasticsearch, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/41640098/
