HIVE COUNT * OUT OF MEMORY
hive> select count(*) from ipaddress where country='China'; 
WARNING: Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. tez, spark) or using Hive 1.X releases. 
Query ID = pruthviraj_20160922163728_79a0f8d6-5ea6-4cb5-8dd2-d3bb63f8baaf 
Total jobs = 1 
Launching Job 1 out of 1 
Number of reduce tasks determined at compile time: 1 
In order to change the average load for a reducer (in bytes): 
    set hive.exec.reducers.bytes.per.reducer=<number> 
In order to limit the maximum number of reducers: 
    set hive.exec.reducers.max=<number> 
In order to set a constant number of reducers: 
    set mapreduce.job.reduces=<number> 
Starting Job = job_1474512819880_0032, Tracking URL = http://Pruthvis-MacBook-Pro.local:8088/proxy/application_1474512819880_0032/ 
Kill Command = /Users/pruthviraj/lab/software/hadoop-2.7.0/bin/hadoop job -kill job_1474512819880_0032 
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1 
2016-09-22 16:37:45,094 Stage-1 map = 0%, reduce = 0% 
2016-09-22 16:37:52,532 Stage-1 map = 100%, reduce = 0% 
2016-09-22 16:37:59,901 Stage-1 map = 100%, reduce = 100% 
Ended Job = job_1474512819880_0032 
MapReduce Jobs Launched: 
Stage-Stage-1: Map: 1 Reduce: 1 HDFS Read: 10393 HDFS Write: 102 SUCCESS 
Total MapReduce CPU Time Spent: 0 msec 
OK 
Exception in thread "main" 
Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "main" 
Pruthvis-MacBook-Pro:apache-hive-2.1.0-bin pruthviraj$ 

I am running this on Mac OS X 10. I have already tried increasing the PermSize/MaxPermSize settings, but it is still not working. Any help would be appreciated.

Answer


Go to the env file and increase -Xmx2048m to -Xmx4096m:

-Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m
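Note that the MapReduce job itself finished with SUCCESS; the OutOfMemoryError is thrown afterwards in the Hive CLI's own JVM, which is why raising the client-side heap is suggested. A minimal sketch of where these flags could go, assuming "the env file" refers to $HIVE_HOME/conf/hive-env.sh (copied from hive-env.sh.template if it does not exist); HADOOP_HEAPSIZE and HADOOP_CLIENT_OPTS are the variables hive-env.sh exposes for the client JVM:

    # $HIVE_HOME/conf/hive-env.sh -- sketch, adjust values to your machine

    # Heap size (in MB) for the Hive CLI / Hadoop client processes.
    export HADOOP_HEAPSIZE=4096

    # Extra JVM options passed to Hadoop client commands such as the Hive CLI.
    # -XX:PermSize / -XX:MaxPermSize only apply on Java 7 and earlier;
    # Java 8+ ignores them because PermGen was removed.
    export HADOOP_CLIENT_OPTS="-Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m $HADOOP_CLIENT_OPTS"

Restart the Hive CLI after editing so the new heap settings take effect.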


Could you be more specific about what this "-Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m" is? It would be helpful if you provided the correct syntax. –