elasticsearch - Programmatically generating sample logs to feed to logstash?

Tags: elasticsearch logstash kibana elastic-stack log-analysis

I have written a small Java program that generates some dummy logs (basically writing lines to a txt file). Now I want to feed this data into the ELK stack: logstash should read the data from the txt file, and I want to visualize the changes in kibana, just to get a feel for it.

Essentially, what I want to do is vary the speed at which my program writes the dummy logs to the txt file, so that I can watch the changes show up in kibana.

I have only just started exploring the ELK stack, and this may be entirely the wrong way to do this kind of analysis. Please suggest better approaches if there are any (keeping in mind that I have no real logs to work with right now).

Edit: @Val suggested the following:

input {
    generator {
        message => '83.149.9.216 - - [17/May/2015:10:05:03 +0000] "GET /presentations/logstash-monitorama-2013/images/kibana-search.png HTTP/1.1" 200 203023 "http://semicomplete.com/presentations/logstash-monitorama-2013/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.77 Safari/537.36"'
        count => 10
    }
}

So here is my logstash.conf:
input {
  stdin { }
}


filter {
  grok {
    match => {
      "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response:int} (?:-|%{NUMBER:bytes:int}) %{QS:referrer} %{QS:agent}'
    }
  }

  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => "en"
  }

  geoip {
    source => "clientip"
  }

  useragent {
    source => "agent"
    target => "useragent"
  }
}
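For reference, the grok pattern above extracts fields roughly the way the following stdlib-only Java sketch does (simplified for illustration — the real `IPORHOST`, `QS`, etc. grok patterns are more permissive than these capture groups):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ApacheLogParser {

    // Rough equivalent of the grok pattern above. Capture groups:
    // 1=clientip 2=ident 3=auth 4=timestamp 5=verb 6=request 7=httpversion
    // 8=response 9=bytes 10=referrer 11=agent
    static final Pattern APACHE = Pattern.compile(
        "(\\S+) (\\S+) (\\S+) \\[([^\\]]+)\\] \"(\\S+) (\\S+) HTTP/(\\S+)\" "
        + "(\\d+) (\\d+|-) \"([^\"]*)\" \"([^\"]*)\"");

    // Returns the n-th capture group, or null if the line does not match.
    static String field(String line, int group) {
        Matcher m = APACHE.matcher(line);
        return m.matches() ? m.group(group) : null;
    }

    public static void main(String[] args) {
        String line = "83.149.9.216 - - [17/May/2015:10:05:03 +0000] "
                + "\"GET /presentations/logstash-monitorama-2013/images/kibana-search.png HTTP/1.1\" "
                + "200 203023 \"http://semicomplete.com/presentations/logstash-monitorama-2013/\" "
                + "\"Mozilla/5.0\"";
        System.out.println("clientip = " + field(line, 1));
        System.out.println("response = " + field(line, 8));
        System.out.println("bytes    = " + field(line, 9));
    }
}
```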

output {
  stdout {
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "apache_elk_example"
    template => "./apache_template.json"
    template_name => "apache_elk_example"
    template_overwrite => true
  }
}

Now, after starting elasticsearch and kibana, I ran:
cat apache_logs | /usr/local/opt/logstash/bin/logstash -f apache_logs

where apache_logs is produced by my Java program:
public static void main(String[] args) {
    try {
        PrintStream out = new PrintStream(new FileOutputStream("/Users/username/Desktop/user/apache_logs"));
        System.setOut(out);
    } catch (FileNotFoundException ex) {
        System.out.print("Exception");
    }
    while (true)
    //for (int i = 0; i < 5; ++i)
    {
        System.out.println(generateRandomIPs() /* + other log stuff */);
        try {
            Thread.sleep(1000);                 // 1000 milliseconds is one second.
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
        }
    }
}
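The `generateRandomIPs()` helper isn't shown in the question; a minimal sketch of what it might look like (hypothetical — the actual implementation may differ) is:

```java
import java.util.concurrent.ThreadLocalRandom;

public class LogGenerator {

    // Hypothetical sketch of the helper referenced above: returns a random
    // dotted-quad IPv4 address such as "83.149.9.216".
    static String generateRandomIPs() {
        ThreadLocalRandom r = ThreadLocalRandom.current();
        return r.nextInt(1, 256) + "." + r.nextInt(256) + "."
                + r.nextInt(256) + "." + r.nextInt(256);
    }

    public static void main(String[] args) {
        System.out.println(generateRandomIPs());
    }
}
```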

So here is the problem:

Kibana does not show me real-time visualizations, i.e. it does not display data as my Java program writes it into the apache_log file. It only shows the data that had already been written to "apache_log" by the time I ran:
cat apache_logs | /usr/local/opt/logstash/bin/logstash -f apache_logs

Best Answer

This may be a bit late, but I wrote up a small sample of what I meant.

I modified your java program to add a timestamp, like this:

import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintStream;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.HashMap;
import java.util.Map;

import com.google.gson.Gson;

public class LogWriter {

    public static Gson gson = new Gson();

    public static void main(String[] args) {

        try {
            PrintStream out = new PrintStream(new FileOutputStream("/var/logstash/input/test2.log"));
            System.setOut(out);
        } catch (FileNotFoundException ex) {
            System.out.print("Exception");
        }

        Map<String, String> timestamper = new HashMap<>();

        while(true)
        {

            String format = LocalDateTime.now().format(DateTimeFormatter.ISO_DATE_TIME);

            timestamper.put("myTimestamp", format);
            System.out.println(gson.toJson(timestamper));
            try {
                Thread.sleep(1000);                 //1000 milliseconds is one second.
            } catch(InterruptedException ex) {
                Thread.currentThread().interrupt();
            }
        }

    }
}

which now writes json like this:
{"myTimestamp":"2016-06-10T10:42:16.299"}
{"myTimestamp":"2016-06-10T10:42:17.3"}
{"myTimestamp":"2016-06-10T10:42:18.301"}

Then I set up logstash to read that file, parse it, and output it to stdout:
input {
  file {
     path => "/var/logstash/input/*.log"
     start_position => "beginning"
     ignore_older => 0
     sincedb_path => "/dev/null"
  }   
}

filter {
   json {
      source => "message"
   }
}

output {
    file {
           path => "/var/logstash/out.log"
    }
    stdout { codec => rubydebug }
}

So it picks up my log, which knows when it was created, parses it, and creates a new timestamp that represents when the log was seen:
{
        "message" => "{\"myTimestamp\":\"2016-06-10T10:42:17.3\"}",
       "@version" => "1",
     "@timestamp" => "2016-06-10T09:42:17.687Z",
           "path" => "/var/logstash/input/test2.log",
           "host" => "pandaadb",
    "myTimestamp" => "2016-06-10T10:42:17.3"
}
{
        "message" => "{\"myTimestamp\":\"2016-06-10T10:42:18.301\"}",
       "@version" => "1",
     "@timestamp" => "2016-06-10T09:42:18.691Z",
           "path" => "/var/logstash/input/test2.log",
           "host" => "pandaadb",
    "myTimestamp" => "2016-06-10T10:42:18.301"
}

Here you can now see how long it takes for a log line to be processed. That's roughly 300-400 milliseconds in this sample; also, I believe your java writer is an asynchronous writer and will not flush right away.
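That lag is just the difference between `myTimestamp` (written by the program, local time) and `@timestamp` (stamped by Logstash, UTC). A minimal sketch of the arithmetic, assuming the writer's local zone was UTC+1 (which the sample output above suggests):

```java
import java.time.Duration;
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class IngestLag {

    // Milliseconds between the moment the event was written (myTimestamp,
    // local time, assumed UTC+1 here) and the moment Logstash saw it
    // (@timestamp, UTC).
    static long lagMillis(String myTimestamp, String atTimestamp) {
        Instant written = LocalDateTime.parse(myTimestamp).toInstant(ZoneOffset.ofHours(1));
        Instant seen = Instant.parse(atTimestamp);
        return Duration.between(written, seen).toMillis();
    }

    public static void main(String[] args) {
        // Values taken from the second event above.
        System.out.println(lagMillis("2016-06-10T10:42:17.3", "2016-06-10T09:42:17.687Z") + " ms");
    }
}
```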

You can even make this "cooler" by using the elapsed plugin, which will calculate the difference between those timestamps for you.

I hope this helps with your testing :) It is probably not the most advanced way of testing, but it is easy to understand and very quick to set up.

Artur

Regarding elasticsearch - Programmatically generating sample logs to feed to logstash?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37725689/

Related articles:

elasticsearch - No mapping found for [@timestamp] to sort on in logstash

redis - Logstash, elasticsearch, Kibana, including IP

amazon-web-services - AWS OpenSearch running in a VPC behind Nginx does not show tenants

angularjs - Integrating a kibana dashboard into an angularjs application

lucene - How to enable stemming in Elasticsearch?

indexing - How to optimize Elasticsearch indexing?

spring - Elasticsearch 5.x repositories with Java Spring Boot

elasticsearch - Finding the most common error strings in logs with ELK

elasticsearch - Is it possible to configure multiple outputs for filebeat?

java - Making ElasticSearch fail on misconfiguration