
mysql - Unable to drop a Hive table


I upgraded my CDH cluster from 5.0.0 beta to 5.2.1 using Cloudera Manager. All services are running fine except the Hive metastore. I am using MySQL for the Hive metastore.

When I issue a query like the following, I am unable to drop the table from Hive:

`drop table test;`

I get the following error:

hive> drop table test;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDOException: Exception thrown when executing query
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)
at org.apache.hadoop.hive.metastore.ObjectStore.deleteTableColumnStatistics(ObjectStore.java:6030)
at org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:813)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
at com.sun.proxy.$Proxy0.dropTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1386)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1525)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:106)
at com.sun.proxy.$Proxy5.drop_table_with_environment_context(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:8072)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:8056)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:107)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:502)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column AO.BIG_DECIMAL_HIGH_VALUE in field list

Reference link:

http://community.cloudera.com/t5/Interactive-Short-cycle-SQL/CDH-upgrade-from-4-7-to-CDH-5-2-hive-metastore-issue/td-p/20626

Can anyone offer some advice?

Best Answer

I ran into exactly the same problem. I tried following the suggestions in the referenced link, but I still hit the same error (even after fixing every problem I could see while running the scripts).

Finally, I searched the Hive metastore scripts for BIG_DECIMAL_HIGH_VALUE and found it in every hive-schema-0.1*.0.mysql.sql file, inside the following CREATE statements:

CREATE TABLE IF NOT EXISTS `TAB_COL_STATS` (
`CS_ID` bigint(20) NOT NULL,
`DB_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`TABLE_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`COLUMN_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`COLUMN_TYPE` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`TBL_ID` bigint(20) NOT NULL,
`LONG_LOW_VALUE` bigint(20),
`LONG_HIGH_VALUE` bigint(20),
`DOUBLE_HIGH_VALUE` double(53,4),
`DOUBLE_LOW_VALUE` double(53,4),
`BIG_DECIMAL_LOW_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
`BIG_DECIMAL_HIGH_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
`NUM_NULLS` bigint(20) NOT NULL,
`NUM_DISTINCTS` bigint(20),
`AVG_COL_LEN` double(53,4),
`MAX_COL_LEN` bigint(20),
`NUM_TRUES` bigint(20),
`NUM_FALSES` bigint(20),
`LAST_ANALYZED` bigint(20) NOT NULL,
PRIMARY KEY (`CS_ID`),
CONSTRAINT `TAB_COL_STATS_FK` FOREIGN KEY (`TBL_ID`) REFERENCES `TBLS` (`TBL_ID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

CREATE TABLE IF NOT EXISTS `PART_COL_STATS` (
`CS_ID` bigint(20) NOT NULL,
`DB_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`TABLE_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`PARTITION_NAME` varchar(767) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`COLUMN_NAME` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`COLUMN_TYPE` varchar(128) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL,
`PART_ID` bigint(20) NOT NULL,
`LONG_LOW_VALUE` bigint(20),
`LONG_HIGH_VALUE` bigint(20),
`DOUBLE_HIGH_VALUE` double(53,4),
`DOUBLE_LOW_VALUE` double(53,4),
`BIG_DECIMAL_LOW_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
`BIG_DECIMAL_HIGH_VALUE` varchar(4000) CHARACTER SET latin1 COLLATE latin1_bin,
`NUM_NULLS` bigint(20) NOT NULL,
`NUM_DISTINCTS` bigint(20),
`AVG_COL_LEN` double(53,4),
`MAX_COL_LEN` bigint(20),
`NUM_TRUES` bigint(20),
`NUM_FALSES` bigint(20),
`LAST_ANALYZED` bigint(20) NOT NULL,
PRIMARY KEY (`CS_ID`),
CONSTRAINT `PART_COL_STATS_FK` FOREIGN KEY (`PART_ID`) REFERENCES `PARTITIONS` (`PART_ID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
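
Before comparing the DDL by hand, the live metastore database can be checked for these columns with a single query against INFORMATION_SCHEMA. This is only a sketch: the database name metastore is an assumption, so substitute whatever name your CDH installation actually uses.

-- Sketch: list which of the BIG_DECIMAL columns actually exist in the live
-- metastore tables. 'metastore' is an assumed database name; adjust it to
-- match your installation.
SELECT TABLE_NAME, COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'metastore'
  AND TABLE_NAME IN ('TAB_COL_STATS', 'PART_COL_STATS')
  AND COLUMN_NAME IN ('BIG_DECIMAL_LOW_VALUE', 'BIG_DECIMAL_HIGH_VALUE');

If this returns no rows, the statistics tables are missing the columns that the upgraded Hive code expects.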

When I checked these tables in my Hive metastore database, I got something different:

mysql> desc TAB_COL_STATS;
+-------------------+--------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------------------+--------------+------+-----+---------+-------+
| CS_ID | bigint(20) | NO | PRI | NULL | |
| AVG_COL_LEN | double | YES | | NULL | |
| COLUMN_NAME | varchar(128) | NO | | NULL | |
| COLUMN_TYPE | varchar(128) | NO | | NULL | |
| DB_NAME | varchar(128) | NO | | NULL | |
| DOUBLE_HIGH_VALUE | double | YES | | NULL | |
| DOUBLE_LOW_VALUE | double | YES | | NULL | |
| LAST_ANALYZED | bigint(20) | NO | | NULL | |
| LONG_HIGH_VALUE | bigint(20) | YES | | NULL | |
| LONG_LOW_VALUE | bigint(20) | YES | | NULL | |
| MAX_COL_LEN | bigint(20) | YES | | NULL | |
| NUM_DISTINCTS | bigint(20) | YES | | NULL | |
| NUM_FALSES | bigint(20) | YES | | NULL | |
| NUM_NULLS | bigint(20) | NO | | NULL | |
| NUM_TRUES | bigint(20) | YES | | NULL | |
| TBL_ID | bigint(20) | YES | MUL | NULL | |
| TABLE_NAME | varchar(128) | NO | | NULL | |
+-------------------+--------------+------+-----+---------+-------+
17 rows in set (0.00 sec)

mysql> desc PART_COL_STATS;
+-------------------+--------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------------------+--------------+------+-----+---------+-------+
| CS_ID | bigint(20) | NO | PRI | NULL | |
| AVG_COL_LEN | double | YES | | NULL | |
| COLUMN_NAME | varchar(128) | NO | | NULL | |
| COLUMN_TYPE | varchar(128) | NO | | NULL | |
| DB_NAME | varchar(128) | NO | | NULL | |
| DOUBLE_HIGH_VALUE | double | YES | | NULL | |
| DOUBLE_LOW_VALUE | double | YES | | NULL | |
| LAST_ANALYZED | bigint(20) | NO | | NULL | |
| LONG_HIGH_VALUE | bigint(20) | YES | | NULL | |
| LONG_LOW_VALUE | bigint(20) | YES | | NULL | |
| MAX_COL_LEN | bigint(20) | YES | | NULL | |
| NUM_DISTINCTS | bigint(20) | YES | | NULL | |
| NUM_FALSES | bigint(20) | YES | | NULL | |
| NUM_NULLS | bigint(20) | NO | | NULL | |
| NUM_TRUES | bigint(20) | YES | | NULL | |
| PART_ID | bigint(20) | YES | MUL | NULL | |
| PARTITION_NAME | varchar(767) | NO | | NULL | |
| TABLE_NAME | varchar(128) | NO | | NULL | |
+-------------------+--------------+------+-----+---------+-------+

So, since both tables were empty, I simply dropped them and recreated them using the CREATE statements listed in the hive-schema-0.13.0.mysql.sql file.
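
The repair amounts to the following, shown here only as a sketch. It assumes both statistics tables are genuinely empty and that the metastore database is named metastore; check the row counts first and adjust the database name to your installation.

USE metastore;

-- Confirm there is nothing to lose before dropping.
SELECT COUNT(*) FROM TAB_COL_STATS;
SELECT COUNT(*) FROM PART_COL_STATS;

-- Drop the outdated tables, then re-run the two CREATE TABLE statements
-- shown above, exactly as they appear in hive-schema-0.13.0.mysql.sql.
DROP TABLE IF EXISTS PART_COL_STATS;
DROP TABLE IF EXISTS TAB_COL_STATS;

Re-running only these two CREATE TABLE statements, rather than sourcing the whole schema file, avoids touching tables that are already correct.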

After doing that, I was able to drop Hive tables again.

Regarding "mysql - unable to drop a Hive table", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/27288611/
