
mysql - Logstash sql_last_value not updating

Reposted. Author: 行者123. Updated: 2023-11-29 15:18:19

I want to migrate records from a MySQL table to Elasticsearch using a Logstash configuration. I noticed that the file logstash_jdbc_last_run_issued is never changed/updated, so sql_last_value never changes either. When I add a record to the artifact table, the index emp7 keeps inserting duplicate documents without stopping, so the index grows endlessly unless I kill the Logstash process.

Logstash configuration:

input {
  jdbc {
    jdbc_driver_library => "e:\Data\logstash\bin\mysql-connector-java-5.1.18-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/kibana"
    jdbc_user => "userdb"
    jdbc_password => "passdb"
    last_run_metadata_path => "e:\Data\logstash\bin\logstash_jdbc_last_run_issued"
    tracking_column => "issuedDate"
    tracking_column_type => "numeric"
    use_column_value => true
    statement => "SELECT serial, name, issuedDate FROM artifact where issuedDate > :sql_last_value; "
    schedule => "* * * * * *"
  }
}
output {
  elasticsearch {
    hosts => "http://127.0.0.1:9200"
    index => "emp7"
    document_type => "doc"
    user => "user"
    password => "pass"
  }
  stdout {
    codec => rubydebug
  }
}

Table structure: artifact

serial     varchar(40)
name       varchar(40)
issuedDate bigint(20)
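For reference, the table above corresponds to roughly this DDL (a sketch reconstructed from the column listing; any constraints or defaults beyond what is shown are assumptions):

```sql
-- Hypothetical reconstruction of the artifact table from the question.
CREATE TABLE artifact (
  serial     VARCHAR(40),
  name       VARCHAR(40),
  issuedDate BIGINT(20)   -- numeric (epoch-style) timestamp, used as the tracking column
);
```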

Here is the Logstash debug output:

[2019-12-30T11:38:46,004][INFO ][logstash.inputs.jdbc     ] (0.000000s) SELECT serial, name, issuedDate FROM artifact where issuedDate > 0;
[2019-12-30T11:38:46,004][WARN ][logstash.inputs.jdbc ] tracking_column not found in dataset. {:tracking_column=>"issuedDate"}

Contents of the file logstash_jdbc_last_run_issued:

--- 0
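Once tracking works, this YAML-style metadata file should hold the last issuedDate value seen instead of staying at 0, for example (the value below is hypothetical):

```yaml
--- 1577683126004
```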

I am using Logstash 6.0, Elasticsearch 6.0 and Kibana 6.0.

My question: what is missing from my Logstash configuration?

Best Answer

I figured out what was happening. The problem is exactly what the warning says: the tracking column is not found in the dataset ({:tracking_column=>"issuedDate"}).

I added lowercase_column_names => false to the jdbc section, and also clean_run => false, and it finally started working. By default Logstash lowercases column names in the result set, so the tracking column issuedDate came back as issueddate and never matched; disabling the lowercasing fixes the mismatch.

input {
  jdbc {
    jdbc_driver_library => "e:\Data\logstash\bin\mysql-connector-java-5.1.18-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/kibana"
    jdbc_user => "userdb"
    jdbc_password => "passdb"
    last_run_metadata_path => "e:\Data\logstash\bin\logstash_jdbc_last_run_issued"
    tracking_column => "issuedDate"
    use_column_value => true
    lowercase_column_names => false
    clean_run => false
    statement => "SELECT serial, name, issuedDate FROM artifact where issuedDate > :sql_last_value; "
    schedule => "* * * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    index => "emp7"
    document_type => "doc"
    user => "user"
    password => "pass"
  }
  stdout {
    codec => rubydebug
  }
}
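An alternative to disabling the lowercasing is to alias the column to lowercase in the query and track the aliased name. A sketch of just the relevant jdbc options (untested, other settings assumed unchanged; note MySQL does not allow the alias in the WHERE clause, so the original column name is used there):

```conf
statement => "SELECT serial, name, issuedDate AS issueddate FROM artifact WHERE issuedDate > :sql_last_value"
tracking_column => "issueddate"
tracking_column_type => "numeric"
use_column_value => true
```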

Regarding "mysql - Logstash sql_last_value not updating", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/59534678/
