sql - Hive join select is very slow

Reposted. Author: 行者123. Updated: 2023-12-02 20:15:46

Hi, I have two tables, user_info and ip_location; one has 50,000 rows and the other 100,000.
I need to look up each user's location attributes: convert the IP from the user table to an integer and compare it against the intervals in ip_location.
My Hive version is 3.0.0, and this version has no indexes.
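The dotted-quad-to-integer conversion used throughout the queries below can be sketched as follows (a minimal illustration, not part of the original post):

```python
# Convert a dotted-quad IPv4 address to its integer value,
# mirroring the split_part/cast arithmetic in the SQL below.
def ipv4_to_int(ip: str) -> int:
    a, b, c, d = (int(part) for part in ip.split("."))
    return a * 256**3 + b * 256**2 + c * 256 + d

print(ipv4_to_int("1.0.0.0"))      # 16777216
print(ipv4_to_int("192.168.1.1"))  # 3232235777
```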
ip_location:
(screenshot of the ip_location table structure)
This operation is very fast in PostgreSQL:

set search_path=res;
select * from(
select ip,
(split_part(ip,'.',1)::bigint*256*256*256
+split_part(ip,'.',2)::bigint*256*256
+split_part(ip,'.',3)::bigint*256
+split_part(ip,'.',4)::bigint)::int8 as ipvalue
from user_info) t1
left join ip_location t2 on
ipv4_val_begin=(select max(ipv4_val_begin) from ip_location where ipv4_val_begin <= ipvalue);
But I could not find an equivalent for this syntax in Hive:
select ip,
       t2.location_country,
       cast(split(ip,"\\.")[0] as bigint)*256*256*256
      +cast(split(ip,"\\.")[1] as bigint)*256*256
      +cast(split(ip,"\\.")[2] as bigint)*256
      +cast(split(ip,"\\.")[3] as bigint) as ipvalue
from source.v_dm_vip_user t1
left join res.ip_location t2 on
ipv4_val_begin=(select max(ipv4_val_begin) from res.ip_location where ipv4_val_begin <= ipvalue);
The error:
(screenshot of the Hive error message)
Changing it to the following SQL makes the query succeed, but it is very slow, taking about one day:
select ip,
       t2.location_country,
       cast(split(ip,"\\.")[0] as bigint)*256*256*256
      +cast(split(ip,"\\.")[1] as bigint)*256*256
      +cast(split(ip,"\\.")[2] as bigint)*256
      +cast(split(ip,"\\.")[3] as bigint) as ipvalue
from source.v_dm_vip_user t1
left join res.ip_location t2 on
    cast(split(ip,"\\.")[0] as bigint)*256*256*256
   +cast(split(ip,"\\.")[1] as bigint)*256*256
   +cast(split(ip,"\\.")[2] as bigint)*256
   +cast(split(ip,"\\.")[3] as bigint) > ipv4_val_begin
and
    cast(split(ip,"\\.")[0] as bigint)*256*256*256
   +cast(split(ip,"\\.")[1] as bigint)*256*256
   +cast(split(ip,"\\.")[2] as bigint)*256
   +cast(split(ip,"\\.")[3] as bigint) < ipv4_val_end;
Is there a better way to write this SQL? I have tried many things without success. Thanks.

Best Answer

I tried views and row-group indexes, but they do not speed it up. I would like to know how to make Hive handle IP-address range lookups faster; Hive on Spark is also slow at this.
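For comparison, what the PostgreSQL correlated subquery (`max(ipv4_val_begin) <= ipvalue`) effectively computes is a binary search over the sorted range starts, which is why it is fast there. A minimal sketch of that lookup, using hypothetical (begin, end, country) ranges (not data from the original thread):

```python
import bisect

# Hypothetical ip_location data: sorted, non-overlapping
# (ipv4_val_begin, ipv4_val_end, location_country) ranges.
ranges = [
    (16777216, 16777471, "AU"),   # 1.0.0.0 - 1.0.0.255
    (16777472, 16778239, "CN"),   # 1.0.1.0 - 1.0.3.255
]
begins = [r[0] for r in ranges]

def lookup(ipvalue):
    # Find max(ipv4_val_begin) <= ipvalue, like the PG subquery,
    # then confirm ipvalue actually falls inside that range.
    i = bisect.bisect_right(begins, ipvalue) - 1
    if i >= 0 and ipvalue <= ranges[i][1]:
        return ranges[i][2]
    return None

print(lookup(16777300))  # AU
```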

Regarding "sql - Hive join select is very slow", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/63995528/
