
hadoop - Why is there a limit on the number of counters allowed in a Hadoop map reduce job?

Repost · Author: 可可西里 · Updated: 2023-11-01 14:19:40

I am using Hadoop map-reduce, and when I tried to create a number of job counters programmatically I got a CountersExceededException. I know I can raise the allowed number of counters through the configuration file, but does anyone know:

a) Why is there a limit on the number of map-reduce counters?

b) Is it good or bad practice to increase the maximum number of map-reduce counters?

I am using Hadoop 0.20.2.
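For reference, the cap is enforced per job and can be raised in the job configuration. In the 0.20 line the property is commonly `mapreduce.job.counters.limit` (later releases renamed it `mapreduce.job.counters.max`); the exact name is version-dependent, so check your build's `mapred-default.xml` before relying on it. A minimal `mapred-site.xml` fragment might look like:

```xml
<property>
  <name>mapreduce.job.counters.limit</name>
  <!-- the default in the 0.20 line is commonly cited as 120 -->
  <value>200</value>
</property>
```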

Best answer

See the section on counters in this post.

Counters

Counters represent global counters, defined either by the Map/Reduce framework or applications. Applications can define arbitrary Counters and update them in the map and/or reduce methods. These counters are then globally aggregated by the framework.

Counters are appropriate for tracking few, important, global bits of information. They are definitely not meant to aggregate very fine-grained statistics of applications. Counters are very expensive since the JobTracker has to maintain every counter of every map/reduce task for the entire duration of the application.
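To make the cost concrete, here is a toy model (plain Java, not the Hadoop API): every task keeps its own counter map, and a central "JobTracker" merges them into global totals. The bookkeeping grows with (#counters × #tasks), which is why the framework caps the number of counters per job. The class and method names below are illustrative only.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CounterAggregation {
    // Merge per-task counter maps into one global map, summing values
    // per counter name — a simplified model of what the JobTracker does
    // for every counter of every map/reduce task for the job's lifetime.
    static Map<String, Long> aggregate(List<Map<String, Long>> perTask) {
        Map<String, Long> global = new HashMap<>();
        for (Map<String, Long> task : perTask) {
            task.forEach((name, value) -> global.merge(name, value, Long::sum));
        }
        return global;
    }

    public static void main(String[] args) {
        Map<String, Long> task1 = new HashMap<>();
        task1.put("RECORDS_SKIPPED", 3L);
        Map<String, Long> task2 = new HashMap<>();
        task2.put("RECORDS_SKIPPED", 5L);

        Map<String, Long> global = aggregate(List.of(task1, task2));
        System.out.println(global.get("RECORDS_SKIPPED")); // prints 8
    }
}
```

With thousands of tasks, each extra counter multiplies this per-task state, which is the cost the quoted passage warns about.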

On "hadoop - Why is there a limit on the number of counters allowed in a Hadoop map reduce job?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/11233283/
