
hadoop - from_unixtime treats data as string where the data in Hive is int/bigint


My code is below.

SELECT
to_date(from_unixtime(time_first_touch)) AS sDate
FROM (
SELECT
MIN(GET_JSON_OBJECT(swanviraw.textcol,'$.ev_time')) as time_first_touch,
COUNT(*) as number_of_events
FROM swanviraw
) v

It throws the error: Error while compiling statement: FAILED: SemanticException [Error 10014]: Line 2:10 Wrong arguments 'time_first_touch': No matching method for class org.apache.hadoop.hive.ql.udf.UDFFromUnixTime with (string). Possible choices: _FUNC_(bigint)  _FUNC_(bigint, string)  _FUNC_(int)  _FUNC_(int, string)  [ERROR_STATUS]

Now, the point is that the query below works fine, and ev_time holds int/bigint values, since MIN works perfectly here.
 SELECT
MIN(GET_JSON_OBJECT(swanviraw.textcol,'$.ev_time')) as time_first_touch,
COUNT(*) as number_of_events
FROM swanviraw
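
(Note: GET_JSON_OBJECT always returns its result as a STRING, even when the JSON field is numeric, and MIN is also defined for strings, so the inner query running does not by itself prove that ev_time is numeric. A minimal probe, using a hypothetical inline JSON literal rather than the real table data, illustrates the returned type:)

-- hypothetical literal; GET_JSON_OBJECT yields the string '1469620800', not a bigint
SELECT GET_JSON_OBJECT('{"ev_time":1469620800}', '$.ev_time') AS v
FROM swanviraw LIMIT 1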

Your help is sincerely appreciated.

Thanks

Best Answer

Since GET_JSON_OBJECT returns a JSON string, and the error indicates that from_unixtime expects an int or bigint, you need to cast time_first_touch to bigint:

SELECT
to_date(from_unixtime(time_first_touch)) AS sDate
FROM (
SELECT
MIN(cast(GET_JSON_OBJECT(swanviraw.textcol,'$.ev_time') as bigint)) as time_first_touch,
COUNT(*) as number_of_events
FROM swanviraw
) as v

Or:
SELECT
to_date(from_unixtime(time_first_touch)) AS sDate
FROM (
SELECT
MIN(unix_timestamp(GET_JSON_OBJECT(swanviraw.textcol,'$.ev_time'))) as time_first_touch,
COUNT(*) as number_of_events
FROM swanviraw
) as v
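
Either variant can be sanity-checked in isolation. Below is a minimal sketch that assumes ev_time holds epoch seconds, using the hypothetical value 1469620800 (late July 2016); the exact timestamp rendered depends on the session/system time zone:

-- cast the epoch-seconds string to bigint, then format it and take the date part
SELECT from_unixtime(cast('1469620800' as bigint)) AS ts,          -- 'yyyy-MM-dd HH:mm:ss' string
       to_date(from_unixtime(cast('1469620800' as bigint))) AS sDate
FROM swanviraw LIMIT 1

Note that unix_timestamp(string) with no format pattern expects a 'yyyy-MM-dd HH:mm:ss' value, so the unix_timestamp variant above applies when ev_time is stored in that format rather than as epoch seconds.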

Regarding "hadoop - from_unixtime treats data as string where the data in Hive is int/bigint", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/38662673/
