
sql-server - Unable to push data into Elasticsearch with a Logstash conf file; Windows PowerShell shows a "Failed to execute action" error

Reposted · Author: 行者123 · Updated: 2023-12-03 02:33:16

While following the steps in a blog to import data into Kibana, I ran into an error in Logstash.

Here is my config file:

input {
  file {
    path => "C:/SalesJan2009/SalesJan2009.csv"
    type => "csv"
    start_position => "beginning"
    sincedb_path => "C:/SalesJan2009/sinceDb"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Transaction_date","Product","Price","Payment_Type","Name","City","State","Country","Account_Created","Last_Login","Latitude","Longitude"]
    skip_empty_columns => "true"
  }
  mutate {
    convert => [ "Product" => "string" ]
    convert => [ "Price" => "float" ]
    convert => [ "Payment_Type" => "string" ]
    convert => [ "Name" => "string" ]
    convert => [ "City" => "string" ]
    convert => [ "State" => "string" ]
    convert => [ "Country" => "string" ]
    convert => [ "Longitude" => "float" ]
    convert => [ "Latitude" => "float" ]
  }
  date {
    match => ["Transaction_date", "dd-MM-yyyyHH:mm:ss"]
    match => ["Account_Created", "dd-MM-yyyyHH:mm:ss"]
    match => ["Last_Login", "dd-MM-yyyyHH:mm:ss"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "salestansactions2009"
  }
  stdout {
    codec => dots
  }
}

The error is:
PS A:\elk\logstash\logstash-7.4.2\bin> ./logstash -f C:\SalesJan2009\testdata.conf
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to A:/elk/logstash/logstash-7.4.2/logs which is now configured via log4j2.properties
[2019-12-24T12:54:26,310][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-12-24T12:54:26,354][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.2"}
[2019-12-24T12:54:29,991][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ,, ] at line 15, column 32 (byte 465) after filter { \r\n csv { \r\n\t separator => \",\" \r\n columns => [\"Transaction_date\",\"Product\",\"Price\",\"Payment_Type\",\"Name\",\"City\",\"State\",\"Country\",\"Account_Created\",\"Last_Login\",\"Latitude\",\"Longitude\"] \r\n\t skip_empty_columns => \"true\"\r\n\t } \r\n mutate { \r\n \tconvert => [ \"Product\" ", :backtrace=>["A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2584:in `map'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:153:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:26:in `initialize'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}
[2019-12-24T12:54:30,882][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-12-24T12:54:35,502][INFO ][logstash.runner ] Logstash shut down.
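
(For reference: this "Failed to execute action ... LogStash::ConfigurationError" message means the pipeline file failed to compile before any data was read. Logstash can also check a conf file without starting the pipeline; a quick syntax check from the same bin directory, using the same paths as above, would look something like this:

PS A:\elk\logstash\logstash-7.4.2\bin> ./logstash -f C:\SalesJan2009\testdata.conf --config.test_and_exit

On a valid file this should print "Configuration OK" and exit; on this one it reports the same "Expected one of ..." parse error with its line and column.)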

Best answer

The syntax of your mutate convert is wrong.
The error message says:

"Expected one of #, {, ,, ] at line 15, column 32 (byte 465) after filter { \r\n csv { \r\n\t separator => \",\" \r\n columns => [\"Transaction_date\",\"Product\",\"Price\",\"Payment_Type\",\"Name\",\"City\",\"State\",\"Country\",\"Account_Created\",\"Last_Login\",\"Latitude\",\"Longitude\"] \r\n\t skip_empty_columns => \"true\"\r\n\t } \r\n mutate { \r\n \tconvert => [ \"Product\" ", ...



So the parse error occurs right after convert => [ "Product".

Take a look at the documentation.

The value type of convert is a hash, so your use of square brackets ([ ]) is wrong. It should be:
input {
  file {
    path => "C:/SalesJan2009/SalesJan2009.csv"
    type => "csv"
    start_position => "beginning"
    sincedb_path => "C:/SalesJan2009/sinceDb"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Transaction_date","Product","Price","Payment_Type","Name","City","State","Country","Account_Created","Last_Login","Latitude","Longitude"]
    skip_empty_columns => "true"
  }

  mutate {
    convert => {
      "Product" => "string"
      "Price" => "float"
      "Payment_Type" => "string"
      "Name" => "string"
      "City" => "string"
      "State" => "string"
      "Country" => "string"
      "Longitude" => "float"
      "Latitude" => "float"
    }
  }

  date {
    match => ["Transaction_date", "dd-MM-yyyyHH:mm:ss"]
    match => ["Account_Created", "dd-MM-yyyyHH:mm:ss"]
    match => ["Last_Login", "dd-MM-yyyyHH:mm:ss"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "salestansactions2009"
  }
  stdout {
    codec => dots
  }
}
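
One more note on the date section, which is unchanged from the question: each date filter block parses a single field (the first element of match) into a single target, @timestamp by default, so repeating match inside one block will not give you three independent conversions. If the goal is to parse all three timestamp fields, a sketch with one date block per field could look like the following; the pattern here is an assumption (note the space between the date and time parts) and has to match the actual strings in SalesJan2009.csv:

date {
  # Assumed pattern; the original used "dd-MM-yyyyHH:mm:ss" with no space.
  # Adjust this to whatever the CSV actually contains.
  match  => ["Transaction_date", "dd-MM-yyyy HH:mm:ss"]
  target => "@timestamp"
}
date {
  match  => ["Account_Created", "dd-MM-yyyy HH:mm:ss"]
  target => "Account_Created"
}
date {
  match  => ["Last_Login", "dd-MM-yyyy HH:mm:ss"]
  target => "Last_Login"
}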

Regarding "sql-server - Unable to push data into Elasticsearch with a Logstash conf file; Windows PowerShell shows a 'Failed to execute action' error", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/59465452/
