jdbc - Logstash 'com.mysql.jdbc.Driver' not loaded

Tags: jdbc, logstash, logstash-configuration, logstash-jdbc

I have a problem with jdbc_driver_library. I am using ELK_VERSION = 6.4.2 and I run ELK in Docker.
When I run:

/opt/logstash# bin/logstash -f /etc/logstash/conf.d/mysql.conf
I get this error:
error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Driver path:
root@xxxxxxx:/etc/logstash/conectors# ls
mysql-connector-java-8.0.12.jar 
root@xxxxxxxxxx:/etc/logstash/conectors#
mysql.conf:
input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/conectors/mysql-connector-java-8.0.12.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "demouser"
    jdbc_password => "demopassword"
    statement => "SELECT id,name,city from ads"
  }
}

output {
    stdout { codec => rubydebug }
    elasticsearch {
        index => 'test'
        document_type => 'tes'
        document_id => '%{id}'
        hosts => ['http://localhost:9200']
    }
}
The full error:
root@xxxxx:/opt/logstash# bin/logstash -f /etc/logstash/conf.d/mysql.conf
Sending Logstash logs to /opt/logstash/logs which is now configured via log4j2.properties
[2018-11-10T09:03:22,081][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-11-10T09:03:23,628][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.4.2"}
[2018-11-10T09:03:30,482][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-11-10T09:03:31,479][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-11-10T09:03:31,928][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-11-10T09:03:32,067][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-11-10T09:03:32,076][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-11-10T09:03:32,154][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-11-10T09:03:32,210][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-11-10T09:03:32,267][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-10T09:03:32,760][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x202f727c run>"}
[2018-11-10T09:03:32,980][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-10T09:03:33,877][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-10T09:03:34,315][ERROR][logstash.pipeline        ] A plugin had an unrecoverable error. Will restart this plugin.
  Pipeline_id:main
  Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"demouser", jdbc_password=><password>, statement=>"SELECT id,name,city from ads", jdbc_driver_library=>"/etc/logstash/conectors/mysql-connector-java-8.0.12.jar", jdbc_connection_string=>"jdbc:mysql://localhost:3306/mydb", id=>"233c4411c2434e93444c3f59eb9503f3a75cab4f85b0a947d96fa6773dac56cd", jdbc_driver_class=>"com.mysql.jdbc.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_cf5ab80c-91e4-4bc4-8d20-8c5a0f9f8077", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 +0000}, last_run_metadata_path=>"/root/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
  Error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
  Exception: LogStash::ConfigurationError
  Stack: /opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
/opt/logstash/logstash-core/lib/logstash/pipeline.rb:409:in `inputworker'
/opt/logstash/logstash-core/lib/logstash/pipeline.rb:403:in `block in start_input'
When I build the image and start it with docker run, I get another error:
[2018-11-10T10:32:52,935][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/opt/logstash/data/queue"}
[2018-11-10T10:32:52,966][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/opt/logstash/data/dead_letter_queue"}
[2018-11-10T10:32:54,509][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
The same problem occurs when I use PostgreSQL.
Config file:
input {
   jdbc {
     type => 'test'
     jdbc_driver_library => '/etc/logstash/postgresql-9.1-901-1.jdbc4.jar'
     jdbc_driver_class => 'org.postgresql.Driver'
     jdbc_connection_string => 'jdbc:postgresql://localhost:5432/mytestdb'
     jdbc_user => 'postgres'
     jdbc_password => 'xxxxxx'
     jdbc_page_size => '50000'
     statement => 'SELECT id, name, city FROM ads'
   }
 }
Then I run:
/opt/logstash# bin/logstash -f /etc/logstash/conf.d/psql.conf
Error:
error: org.postgresql.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?

Accepted answer

I ran into the same problem, and the solution below fixed it for me.

For Logstash 6.2.x and later, place the required driver JAR under:

logstash_install_dir/logstash-core/lib/jars/
and do not specify any driver path in the config file.
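As a sketch of that fix for this particular setup (paths taken from the question: the Logstash install at /opt/logstash and the JAR under /etc/logstash/conectors; adjust to your own layout):

# copy the Connector/J JAR into Logstash's bundled jars directory
cp /etc/logstash/conectors/mysql-connector-java-8.0.12.jar /opt/logstash/logstash-core/lib/jars/

# when baking the driver into a Docker image instead, an equivalent Dockerfile line
# (assuming the JAR sits next to the Dockerfile in the build context) would be:
# COPY mysql-connector-java-8.0.12.jar /opt/logstash/logstash-core/lib/jars/

The jdbc input then omits jdbc_driver_library entirely. Note also that Connector/J 8.x normally registers its driver as com.mysql.cj.jdbc.Driver (com.mysql.jdbc.Driver is the deprecated legacy name), so a hedged version of the input block might look like:

input {
  jdbc {
    # no jdbc_driver_library: the JAR is picked up from logstash-core/lib/jars/
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "demouser"
    jdbc_password => "demopassword"
    statement => "SELECT id,name,city from ads"
  }
}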

A similar question about "jdbc - Logstash 'com.mysql.jdbc.Driver' not loaded" can be found on Stack Overflow: https://stackoverflow.com/questions/53237741/
