
python - Pivoting logs in hive or pyspark


I have a lot of log files in this format:

[Windows user      ] Pâmela
[Host name ] DV6000
[Local time ] 14:25:07
[System time ] 17:25:07
[ASCWebBrowser info] 1.1.1
[Last Write Time ] 07/19/2016 14:01
[HD Info ] Volume name: , Serial: 1713925408, File System: NTFS, Max Component Length: 255
[Network Info
[Index ] 48
[Type ] 1
[Description ] TAP-Win32 Adapter OAS #6
[Name ] {343D77F2-B3CE-414B-AE01-E248D3FC85F6}
[Ip address ] 169.254.92.162
[MAC Address ] 00-FF-34-3D-77-F2
[Gateway ] 0.0.0.0
[Mask ] 255.255.0.0

[Index ] 38
[Type ] 1
[Description ] TAP-Windows Adapter V9 #6
[Name ] {C81FC3F7-19F9-44DD-9470-4982F48A141D}
[Ip address ] 169.254.96.118
[MAC Address ] 00-FF-C8-1F-C3-F7
[Gateway ] 0.0.0.0
[Mask ] 255.255.0.0

[Index ] 36
[Type ] 1
[Description ] TAP-Win32 Adapter OAS #5
[Name ] {72115AC7-4EE2-4CB3-A8D2-
]

I need to turn each line into a column. As you can see, there are one or more Network Info blocks; those would form a child table, and everything else belongs to the parent table. I can already read this log through Hive, but I'm still stuck on how to pivot/transpose it.
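For reference, the shape I'm after (a sketch only; the key names below are illustrative, not fixed) is one parent record per file with the repeating network blocks nested as a child list:

    # Hypothetical target layout, filled in from the sample log above
    target_record = {
        "windows_user": "Pâmela",
        "host_name": "DV6000",
        "local_time": "14:25:07",
        "system_time": "17:25:07",
        "ascwebbrowser_info": "1.1.1",
        "last_write_time": "07/19/2016 14:01",
        "network_info": [  # child table: one entry per [Network Info] block
            {"index": "48", "description": "TAP-Win32 Adapter OAS #6",
             "ip_address": "169.254.92.162", "mask": "255.255.0.0"},
            {"index": "38", "description": "TAP-Windows Adapter V9 #6",
             "ip_address": "169.254.96.118", "mask": "255.255.0.0"},
        ],
    }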

Here is what I have tried so far:

  1. Spark DataFrame pivot. No go, because it requires an aggregation (a sketch of a possible workaround appears after this list).
  2. Pandas DataFrame pivot. It complains about duplicate index entries; the same information can appear in different logs, so the only truly unique value is the file name.
  3. The SQL CASE approach in Hive. It does not line the information up; it produces many NULLs.
  4. Joins. I tried a self-join using the file name as the join column, but it produces a Cartesian result. RowNumber is a column generated by dense_rank() over fname. The problem is that every IP gets joined to every Description, not just its own, so for 2 IPs it creates 4 rows for each Mask, then 8, and so on.

    select coalesce(hn.value, "No_Name") as hostname, d.value as description, 
    g.value as gateway,i.value as "index", p.value as IP, mc.value as MAC,
    m.value as Mask, n.value as "Name", t.value as "Type"
    from net_asclogs_p hn left join net_asclogs_p d on hn.fname=d.fname and d.rownumber= 1
    left join net_asclogs_p g on hn.fname=g.fname and g.rownumber=2
    left join net_asclogs_p i on hn.fname=i.fname and i.rownumber=4
    left join net_asclogs_p p on hn.fname=p.fname and p.rownumber=5
    left join net_asclogs_p mc on hn.fname=mc.fname and mc.rownumber=6
    left join net_asclogs_p m on hn.fname=m.fname and m.rownumber=7
    left join net_asclogs_p n on hn.fname=n.fname and n.rownumber=8
    left join net_asclogs_p t on hn.fname=t.fname and t.rownumber=9
    where hn.rownumber=3;
  5. Tried Brickhouse's collect, but it only brings back the last record, not all of them.

  6. Tried RegexSerDe, but I'm clearly not getting anywhere here, because all the fields come back empty:

    CREATE EXTERNAL TABLE IF NOT EXISTS asclogs1 (host string, index string) 
    ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.RegexSerDe' WITH SERDEPROPERTIES
    ( "input.regex" = "Host name\\s{2,}\\]\\s(\\w+)|Index\\s{2,}\\]\\s(\\w+).*",
    "output.format.string" = "%1$s %2$s" )
    STORED AS TEXTFILE LOCATION 'hdfs:///asclogs/'
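On attempt 1, the aggregation requirement is not necessarily fatal: if each key occurs at most once per group, an aggregate like first() is effectively a no-op. A minimal PySpark sketch of that idea (the kv DataFrame and its fname/block columns are illustrative stand-ins for the parsed log, not my actual data):

    # Sketch of attempt 1: pivot() demands an aggregate, but first() simply
    # picks the single value per cell when keys are unique within a group.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Stand-in for the parsed key/val rows
    kv = spark.createDataFrame(
        [("log1", 1, "Index", "48"), ("log1", 1, "Ip address", "169.254.92.162"),
         ("log1", 2, "Index", "38"), ("log1", 2, "Ip address", "169.254.96.118")],
        ["fname", "block", "key", "val"],
    )

    # One row per (fname, block), one column per distinct key
    wide = kv.groupBy("fname", "block").pivot("key").agg(F.first("val"))
    wide.show()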

Well, I'm out of ideas. The last resort would be to write a custom class in Java. Is there any alternative?

Best Answer

create external table log (key string,val string)
row format serde 'org.apache.hadoop.hive.serde2.RegexSerDe'
with serdeproperties ("input.regex" = "\\s*\\[(.*?)\\s*(?:\\]|$)\\s*(.*)")
;
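To see what that regex captures, here is a quick standalone check in Python (a demo only, not part of the Hive setup; Python's re engine behaves like Java's for this pattern):

    # Group 1 = key (brackets and padding stripped), group 2 = val.
    # The (?:\]|$) alternative is what tolerates the unclosed "[Network Info" line.
    import re

    pattern = re.compile(r"\s*\[(.*?)\s*(?:\]|$)\s*(.*)")
    for line in ["[Host name ] DV6000", "[Network Info"]:
        print(pattern.match(line).groups())
    # ('Host name', 'DV6000')
    # ('Network Info', '')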

select * from log
;

+--------------------+---------------------------------------------------------------------------------+
| key                | val                                                                             |
+--------------------+---------------------------------------------------------------------------------+
| Windows user       | Pâmela                                                                          |
| Host name          | DV6000                                                                          |
| Local time         | 14:25:07                                                                        |
| System time        | 17:25:07                                                                        |
| ASCWebBrowser info | 1.1.1                                                                           |
| Last Write Time    | 07/19/2016 14:01                                                                |
| HD Info            | Volume name: , Serial: 1713925408, File System: NTFS, Max Component Length: 255 |
| Network Info       |                                                                                 |
| Index              | 48                                                                              |
| Type               | 1                                                                               |
| Description        | TAP-Win32 Adapter OAS #6                                                        |
| Name               | {343D77F2-B3CE-414B-AE01-E248D3FC85F6}                                          |
| Ip address         | 169.254.92.162                                                                  |
| MAC Address        | 00-FF-34-3D-77-F2                                                               |
| Gateway            | 0.0.0.0                                                                         |
| Mask               | 255.255.0.0                                                                     |
| (null)             | (null)                                                                          |
| Index              | 38                                                                              |
| Type               | 1                                                                               |
| Description        | TAP-Windows Adapter V9 #6                                                       |
| Name               | {C81FC3F7-19F9-44DD-9470-4982F48A141D}                                          |
| Ip address         | 169.254.96.118                                                                  |
| MAC Address        | 00-FF-C8-1F-C3-F7                                                               |
| Gateway            | 0.0.0.0                                                                         |
| Mask               | 255.255.0.0                                                                     |
| (null)             | (null)                                                                          |
| Index              | 36                                                                              |
| Type               | 1                                                                               |
| Description        | TAP-Win32 Adapter OAS #5                                                        |
| Name               | {72115AC7-4EE2-4CB3-A8D2-                                                       |
| (null)             | (null)                                                                          |
+--------------------+---------------------------------------------------------------------------------+

The key/val rows are then pivoted in two layers: a running count of 'Windows user' keys numbers the logs within each file (log_seq), and a running count of 'Index' keys numbers the network blocks within each log (nwi_seq):

select      max (Windows_user)          as Windows_user
           ,max (Host_name)             as Host_name
           ,max (Local_time)            as Local_time
           ,max (System_time)           as System_time
           ,max (ASCWebBrowser_info)    as ASCWebBrowser_info
           ,max (Last_Write_Time)       as Last_Write_Time
           ,max (HD_Info)               as HD_Info

            -- one struct per network block; nwi_seq = 0 rows are the parent part
           ,collect_list
            (
             case when nwi_seq > 0 then
                  named_struct
                  (
                   'Index'        ,Index
                  ,'Type'         ,Type
                  ,'Description'  ,Description
                  ,'Name'         ,Name
                  ,'Ip_address'   ,Ip_address
                  ,'MAC_Address'  ,MAC_Address
                  ,'Gateway'      ,Gateway
                  ,'Mask'         ,Mask
                  )
             end
            )                           as Network_Info

from       (select  ifn
                   ,log_seq
                   ,nwi_seq

                    -- parent fields appear before the first 'Index' row (nwi_seq = 0)
                   ,max (case when nwi_seq = 0 and key = 'Windows user'       then val end) as Windows_user
                   ,max (case when nwi_seq = 0 and key = 'Host name'          then val end) as Host_name
                   ,max (case when nwi_seq = 0 and key = 'Local time'         then val end) as Local_time
                   ,max (case when nwi_seq = 0 and key = 'System time'        then val end) as System_time
                   ,max (case when nwi_seq = 0 and key = 'ASCWebBrowser info' then val end) as ASCWebBrowser_info
                   ,max (case when nwi_seq = 0 and key = 'Last Write Time'    then val end) as Last_Write_Time
                   ,max (case when nwi_seq = 0 and key = 'HD Info'            then val end) as HD_Info

                    -- network fields; the SerDe regex already trimmed the keys,
                    -- so 'Ip address' must be matched without trailing spaces
                   ,max (case when nwi_seq > 0 and key = 'Index'              then val end) as Index
                   ,max (case when nwi_seq > 0 and key = 'Type'               then val end) as Type
                   ,max (case when nwi_seq > 0 and key = 'Description'        then val end) as Description
                   ,max (case when nwi_seq > 0 and key = 'Name'               then val end) as Name
                   ,max (case when nwi_seq > 0 and key = 'Ip address'         then val end) as Ip_address
                   ,max (case when nwi_seq > 0 and key = 'MAC Address'        then val end) as MAC_Address
                   ,max (case when nwi_seq > 0 and key = 'Gateway'            then val end) as Gateway
                   ,max (case when nwi_seq > 0 and key = 'Mask'               then val end) as Mask

            from   (select  key
                           ,val
                           ,ifn
                           ,log_seq

                            -- running count of 'Index' keys: network-block ordinal within the log
                           ,count (case when key = 'Index' then 1 end) over
                            (
                             partition by ifn, log_seq
                             order by     boif
                            )   as nwi_seq

                    from   (select  key
                                   ,val
                                   ,input__file__name            as ifn
                                   ,block__offset__inside__file  as boif

                                    -- running count of 'Windows user' keys: log ordinal within the file
                                   ,count (case when key = 'Windows user' then 1 end) over
                                    (
                                     partition by input__file__name
                                     order by     block__offset__inside__file
                                    )   as log_seq

                            from    log
                            ) l
                    ) l

            group by    ifn
                       ,log_seq
                       ,nwi_seq
            ) l

group by    ifn
           ,log_seq
;

The query returns a single row (shown transposed here because of its width):

windows_user       : Pâmela
host_name          : DV6000
local_time         : 14:25:07
system_time        : 17:25:07
ascwebbrowser_info : 1.1.1
last_write_time    : 07/19/2016 14:01
hd_info            : Volume name: , Serial: 1713925408, File System: NTFS, Max Component Length: 255
network_info       : [{"index":"48","type":"1","description":"TAP-Win32 Adapter OAS #6","name":"{343D77F2-B3CE-414B-AE01-E248D3FC85F6}","ip_address":"169.254.92.162","mac_address":"00-FF-34-3D-77-F2","gateway":"0.0.0.0","mask":"255.255.0.0"},{"index":"38","type":"1","description":"TAP-Windows Adapter V9 #6","name":"{C81FC3F7-19F9-44DD-9470-4982F48A141D}","ip_address":"169.254.96.118","mac_address":"00-FF-C8-1F-C3-F7","gateway":"0.0.0.0","mask":"255.255.0.0"},{"index":"36","type":"1","description":"TAP-Win32 Adapter OAS #5","name":"{72115AC7-4EE2-4CB3-A8D2-","ip_address":null,"mac_address":null,"gateway":null,"mask":null}]
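If the same approach is wanted in PySpark instead of Hive, the two running counts translate directly to window functions. A minimal sketch, assuming the key/val pairs are already parsed and carry a per-file line number (a stand-in for block__offset__inside__file, which PySpark does not expose):

    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Stand-in for the parsed rows; line_no preserves the original file order
    kv = spark.createDataFrame(
        [("log1", 1, "Windows user", "Pâmela"),
         ("log1", 2, "Host name", "DV6000"),
         ("log1", 3, "Index", "48"),
         ("log1", 4, "Ip address", "169.254.92.162")],
        ["fname", "line_no", "key", "val"],
    )

    by_file = Window.partitionBy("fname").orderBy("line_no")
    by_log = Window.partitionBy("fname", "log_seq").orderBy("line_no")

    kv = (kv
          # running count of 'Windows user' rows = which log inside the file
          .withColumn("log_seq",
                      F.count(F.when(F.col("key") == "Windows user", 1)).over(by_file))
          # running count of 'Index' rows = which network block inside the log
          .withColumn("nwi_seq",
                      F.count(F.when(F.col("key") == "Index", 1)).over(by_log)))

    kv.show()

From there, the same conditional max aggregation by (fname, log_seq, nwi_seq) rebuilds the parent and child columns.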

Regarding python - pivoting logs in hive or pyspark, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44558044/
