
python - Spark 2.3 memory leak on the executor


I am getting a memory-leak warning; supposedly this was a Spark bug that existed up to version 1.6 and has since been resolved.

Mode: Standalone
IDE: PyCharm
Spark version: 2.3
Python version: 3.6

Below is the stack trace:

2018-05-25 15:00:05 WARN  Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3148
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3152
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3151
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3150
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3149
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3153
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3154
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3158
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3155
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3157
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3160
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3161
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3156
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3159
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3165
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3163
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3162
2018-05-25 15:00:05 WARN Executor:66 - Managed memory leak detected; size = 262144 bytes, TID = 3166

Any insight into why this happens? My job does complete successfully, though.

Edit: Many have flagged this as a duplicate of a question from two years ago, but the answer there calls it a Spark bug, while the Spark JIRA shows that issue as resolved.

So the question is: after so many releases, why am I still seeing the same warning on Spark 2.3? If there is a valid or logical answer to my query, or if this question really does turn out to be redundant, I will gladly delete it.

Best Answer

According to SPARK-14168, the warning stems from not consuming the entire iterator. I ran into the same warning when taking n elements from an RDD in the Spark shell.
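For illustration, here is a minimal PySpark sketch (a hypothetical job, not the original poster's code) of the pattern described above: take() stops pulling rows once it has enough, so the iterator produced by a preceding shuffle may be left only partially consumed, which is what the executor reports as "Managed memory leak detected".

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("leak-warning-demo").getOrCreate()
sc = spark.sparkContext

# groupByKey forces a shuffle; each task's shuffle reader holds managed memory
rdd = (sc.parallelize(range(1000000), numSlices=8)
         .map(lambda x: (x % 100, x))
         .groupByKey())

# take() reads only until it has 5 elements, so the rest of each partition's
# iterator is never consumed and its memory is not released by the consumer
print(rdd.take(5))

spark.stop()

In this situation the warning is benign: the executor still frees the memory when the task finishes, which is consistent with the job completing successfully.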

Regarding python - Spark 2.3 memory leak on the executor, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/50526122/
