
java - Hive count(*) runs out of memory

Reposted. Author: 可可西里. Updated: 2023-11-01 16:42:22

 hive> select count(*) from ipaddress where country='China';
WARNING: Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. tez, spark) or using Hive 1.X releases.
Query ID = pruthviraj_20160922163728_79a0f8d6-5ea6-4cb5-8dd2-d3bb63f8baaf
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Starting Job = job_1474512819880_0032, Tracking URL = http://Pruthvis-MacBook-Pro.local:8088/proxy/application_1474512819880_0032/
Kill Command = /Users/pruthviraj/lab/software/hadoop-2.7.0/bin/hadoop job -kill job_1474512819880_0032
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2016-09-22 16:37:45,094 Stage-1 map = 0%, reduce = 0%
2016-09-22 16:37:52,532 Stage-1 map = 100%, reduce = 0%
2016-09-22 16:37:59,901 Stage-1 map = 100%, reduce = 100%
Ended Job = job_1474512819880_0032
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1 Reduce: 1 HDFS Read: 10393 HDFS Write: 102 SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
Exception in thread "main"
Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "main"
Pruthvis-MacBook-Pro:apache-hive-2.1.0-bin pruthviraj$

I am running this on Mac OS X 10. I have tried increasing the perm max size, but it still does not work. Any help would be appreciated.

Best answer

Go to the env file and increase -Xmx2048m to -Xmx4096m:

-Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m
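The change above can be sketched as edits to `conf/hive-env.sh` in the Hive install directory. This is a guess at the concrete file: the answer only says "the env file", and the `HADOOP_HEAPSIZE` / `HADOOP_CLIENT_OPTS` variables shown are standard Hive/Hadoop client settings, not something confirmed by the answerer:

```shell
# conf/hive-env.sh -- client-side JVM settings picked up by the hive CLI.
# Assumes the stock apache-hive-2.1.0-bin layout; if the file does not
# exist, copy hive-env.sh.template to hive-env.sh first.

# Raise the client heap (in MB) from the previous 2048 to 4096.
export HADOOP_HEAPSIZE=4096

# Equivalent explicit flags, including the PermGen sizing from the answer.
# Note: -XX:PermSize/-XX:MaxPermSize only apply on Java 7 and earlier;
# Java 8+ removed PermGen and the JVM will ignore or warn about them.
export HADOOP_CLIENT_OPTS="-Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m $HADOOP_CLIENT_OPTS"
```

Restart the `hive` shell after editing so the new heap settings take effect; the error here is thrown by the client JVM after the MapReduce job already succeeded, which is why a client-side heap increase is the relevant fix.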

Regarding "java - Hive count(*) runs out of memory", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/39637530/
