I can't get Logstash to load data into my Elasticsearch node on localhost. I want Logstash to read a CSV file and load that data into Elasticsearch, but nothing works: the only data I can query is what I added to Elasticsearch by hand, and Logstash doesn't seem to do anything.
My Logstash configuration is:
input {
  file {
    path => [ "C:\Users\Michele\Downloads\logstash-1.5.3\logstash-1.5.3\Users\*.csv" ]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["timestamp", "impianto", "tipo_misura", "valore", "unita_misura"]
    separator => ","
  }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    cluster => "elasticsearch"
    node_name => "NCC-1701-A"
    index => "myindex"
    index_type => "pompe"
    workers => 1
  }
}
My CSV file is:
2015-08-03T18:46:00,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:46:10,Abbiategrasso,Pressione gruppo 1,44.4,m
2015-08-03T18:46:20,Abbiategrasso,Pressione gruppo 1,66.6,m
2015-08-03T18:46:30,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:46:40,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:46:50,Abbiategrasso,Pressione gruppo 1,77.7,m
2015-08-03T18:47:00,Abbiategrasso,Pressione gruppo 1,11.1,m
2015-08-03T18:47:10,Abbiategrasso,Pressione gruppo 1,44.4,m
2015-08-03T18:47:20,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:47:30,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:47:40,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:47:50,Abbiategrasso,Pressione gruppo 1,66.6,m
2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:48:10,Abbiategrasso,Pressione gruppo 1,77.7,m
2015-08-03T18:48:20,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:48:30,Abbiategrasso,Pressione gruppo 1,88.8,m
2015-08-03T18:48:40,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:48:50,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:49:00,Abbiategrasso,Pressione gruppo 1,55.5,m
Nothing new appears in the "myindex" index, and I don't know why. The console only shows this warning:
failed action with response of 400, dropping action: ["index", {:_id=>nil,
:_index=>"abbiategrasso", :_type=>"pompe", :_routing=>nil}, #<LogStash::Event:0x
1cea7b7 @metadata_accessors=#<LogStash::Util::Accessors:0x1e577ee @store={"path"
=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiate
grasso.csv", "retry_count"=>0}, @lut={"[path]"=>[{"path"=>"C:\\Users\\Michele\\D
ownloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "retry_coun
t"=>0}, "path"]}>, @cancelled=false, @data={"message"=>["2015-08-03T18:48:00,Abb
iategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-0
9-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Download
s\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015
-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo
1", "valore"=>"66.6", "unita_misura"=>"m"}, @metadata={"path"=>"C:\\Users\\Miche
le\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "retry
_count"=>0}, @accessors=#<LogStash::Util::Accessors:0x2ff785 @store={"message"=>
["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"
1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\
\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso
.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_mi
sura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, @lut={"host
"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"
], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-H
P", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\User
s\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiateg
rasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"
m"}, "host"], "path"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione
gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z",
"host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\l
ogstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "i
mpianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6"
, "unita_misura"=>"m"}, "path"], "message"=>[{"message"=>["2015-08-03T18:48:00,A
bbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015
-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downlo
ads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"20
15-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione grupp
o 1", "valore"=>"66.6", "unita_misura"=>"m"}, "message"], "timestamp"=>[{"messag
e"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version
"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>
"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategr
asso.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tip
o_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, "timest
amp"], "impianto"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gr
uppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "h
ost"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logs
tash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "impi
anto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "
unita_misura"=>"m"}, "impianto"], "tipo_misura"=>[{"message"=>["2015-08-03T18:48
:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>
"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\D
ownloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"
=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione
gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, "tipo_misura"], "valore"=>[{"
message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@v
ersion"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "p
ath"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abb
iategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso"
, "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, "
valore"], "unita_misura"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Press
ione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.50
1Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.
3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00"
, "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"6
6.6", "unita_misura"=>"m"}, "unita_misura"]}>>] {:level=>:warn, :file=>"/Users/M
ichele/Downloads/logstash-1.5.3/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logs
tash-output-elasticsearch-1.0.5-java/lib/logstash/outputs/elasticsearch.rb", :li
ne=>"531", :method=>"submit"}←[0m
So Logstash failed to upload the data... I then changed my configuration to:
input {
  file {
    path => [ "C:\Users\Michele\Downloads\logstash-1.5.3\logstash-1.5.3\Users\abbiategrasso4.csv" ]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["timestamp", "impianto", "tipo_misura", "valore", "unita_misura"]
    separator => ","
  }
  mutate {
    convert => { "valore" => "float" }
  }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    cluster => "elasticsearch"
    node_name => "NCC-1701-A"
    index => "abbiategrasso"
    document_type => "pompe"
    workers => 1
  }
  stdout { codec => rubydebug }
}
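The key addition here is the mutate { convert } step, which turns the "valore" column from a string into a number before indexing. As an illustration only (Logstash does this internally), the same per-line transformation that the csv and mutate filters perform could be sketched in Python like this:

```python
# Sketch of what the csv + mutate filters do to one line of the file:
# split into the five named columns, then convert "valore" to a float.
COLUMNS = ["timestamp", "impianto", "tipo_misura", "valore", "unita_misura"]

def parse_line(line):
    # Strip the trailing "\r" left by Windows line endings
    # (visible in the error log above as "...,m\r").
    values = line.rstrip("\r\n").split(",")
    event = dict(zip(COLUMNS, values))
    event["valore"] = float(event["valore"])  # the mutate { convert } step
    return event

event = parse_line("2015-08-03T18:46:00,Abbiategrasso,Pressione gruppo 1,55.5,m\r\n")
print(event["valore"])  # 55.5
```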
Now it works, and I'll try Kibana next :)
Best answer
When troubleshooting, I would start by printing the events to stdout:
output { stdout { codec => rubydebug } }
Also note that the file input remembers how far it has read each file in a sincedb file, so to force a full re-read on every run you can set:
sincedb_path => "/dev/null"
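Putting these together, a minimal debugging configuration might look like the sketch below. Note that on Windows there is no /dev/null; "NUL" is the usual equivalent. The path shown is a placeholder, not the poster's actual path.

```
input {
  file {
    path => [ "C:/path/to/data/*.csv" ]   # placeholder path
    start_position => "beginning"
    sincedb_path => "NUL"                 # "/dev/null" on Linux; forces a full re-read
  }
}
output {
  stdout { codec => rubydebug }
}
```

Once events show up correctly on stdout, the elasticsearch output can be added back in.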
A similar question about Logstash not loading CSV data into Elasticsearch can be found on Stack Overflow: https://stackoverflow.com/questions/31974293/