
Python faces overhead every 98 executions?


I have a large DataFrame and I just want to assign a constant to a new column. Executions 1 through 97 are fine and the code runs fast; then memory spikes at the 98th iteration, spikes again at the 196th (98 iterations later), and from there on RAM jumps at every iteration i where i is a multiple of 98...

I guess the magic number 98 may vary from PC to PC, so you may have to change the DataFrame size to reproduce the issue.

Here is my code.

Edit: I don't think this is garbage collection, because gc.isenabled() returns False at the end of the code.

import gc

import pandas as pd
import numpy as np

n = 2000000
data = pd.DataFrame({'a': range(n)})
for i in range(1, 100):
    data['col_' + str(i)] = np.random.choice(['a', 'b'], n)

gc.disable()
for i in range(1, 600):
    data['test_{}'.format(i)] = i
    print(str(i))  # slow at every i that is a multiple of 98

gc.isenabled()
> False
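A minimal way to make the slow iterations visible without watching the task manager is to time each insert; this is a sketch (not part of the original question) that assumes only the standard library's time.perf_counter:

import time

import pandas as pd
import numpy as np

n = 2000000
data = pd.DataFrame({'a': range(n)})

for i in range(1, 600):
    start = time.perf_counter()
    data['test_{}'.format(i)] = i
    elapsed = time.perf_counter() - start
    if elapsed > 0.1:  # print only the slow inserts; the threshold is arbitrary
        print('insert {} took {:.2f}s'.format(i, elapsed))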

Here is my memory usage; the peaks occur at iterations i*98 (where i is an integer):

I am using Windows 10, Python 3.6.1 | Anaconda 4.4.0 | Pandas 0.24.2

I have 16 GB of RAM and an 8-core CPU.


Best Answer

First, I can confirm the same behavior on Ubuntu with 16 GB of RAM and GC disabled. So this is definitely not a GC issue or a Windows memory-management issue.

Second, on my system it slows down after every 99 iterations: after the 99th, after the 198th, after the 297th, and so on. In any case, my swap file is rather limited, so when RAM+Swap fills up, it crashes with the following stack trace:

294
295
296
297
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/indexes/base.py", line 2657, in get_loc
    return self._engine.get_loc(key)
  File "pandas/_libs/index.pyx", line 108, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 132, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 1601, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 1608, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'test_298'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/internals/managers.py", line 1053, in set
    loc = self.items.get_loc(item)
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/indexes/base.py", line 2659, in get_loc
    return self._engine.get_loc(self._maybe_cast_indexer(key))
  File "pandas/_libs/index.pyx", line 108, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 132, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 1601, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 1608, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'test_298'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "py-memory-test.py", line 12, in <module>
    data['test_{}'.format(i)] = i
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/frame.py", line 3370, in __setitem__
    self._set_item(key, value)
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/frame.py", line 3446, in _set_item
    NDFrame._set_item(self, key, value)
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/generic.py", line 3172, in _set_item
    self._data.set(key, value)
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/internals/managers.py", line 1056, in set
    self.insert(len(self.items), item, value)
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/internals/managers.py", line 1184, in insert
    self._consolidate_inplace()
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/internals/managers.py", line 929, in _consolidate_inplace
    self.blocks = tuple(_consolidate(self.blocks))
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/internals/managers.py", line 1899, in _consolidate
    _can_consolidate=_can_consolidate)
  File "/usr/local/lib/python3.6/dist-packages/pandas/core/internals/blocks.py", line 3149, in _merge_blocks
    new_values = new_values[argsort]
MemoryError

So it seems that pandas sometimes performs some kind of merging/consolidating/repacking on insert. Let's look at the insert function in core/internals/managers.py, which contains the following lines:

def insert(self, loc, item, value, allow_duplicates=False):
    ...
    self._known_consolidated = False

    if len(self.blocks) > 100:
        self._consolidate_inplace()

I guess this is exactly what we're looking for!

Every time we perform an insert, a new block is created. When the number of blocks exceeds some limit, the extra work (consolidation) is performed. The difference between the 100-block limit in the code and the empirical number of about 98-99 that we observed is probably due to the presence of some additional DataFrame metadata, which also needs some room.
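To watch the consolidation happen, one can print the internal block count after each insert. This is a rough sketch that relies on pandas internals (the BlockManager is exposed as DataFrame._data in pandas 0.24 and renamed to _mgr in newer releases), so treat it as an illustration rather than a stable API:

import pandas as pd

n = 100000  # smaller than the question's 2M rows, enough to see the pattern
data = pd.DataFrame({'a': range(n)})

prev_blocks = 1
for i in range(1, 300):
    data['test_{}'.format(i)] = i
    # _data in pandas 0.24, _mgr in newer releases; both are internal
    mgr = data._mgr if hasattr(data, '_mgr') else data._data
    if mgr.nblocks < prev_blocks:
        print('insert {}: {} -> {} blocks (consolidated)'.format(
            i, prev_blocks, mgr.nblocks))
    prev_blocks = mgr.nblocks

Each plain insert adds one block, so the count climbs toward the limit and then collapses back to a handful of blocks whenever _consolidate_inplace merges the same-dtype blocks into one.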

UPD: To verify this hypothesis I tried changing 100 -> 1000000, and it worked fine: no performance gaps and no MemoryError. However, there is no public API to modify this parameter at runtime; it is simply hardcoded.
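Since the limit is hardcoded, a practical workaround (a sketch, not something tested in the answer above) is to avoid hundreds of single-column inserts altogether and attach all the new columns with a single pd.concat, so the block count never approaches the threshold:

import pandas as pd
import numpy as np

n = 2000000  # as in the question; shrink this if memory is tight
data = pd.DataFrame({'a': range(n)})

# Build all 599 constant columns up front, then attach them in one step;
# pandas lays out the combined int64 block once instead of consolidating
# hundreds of one-column blocks along the way.
new_cols = pd.DataFrame({'test_{}'.format(i): np.full(n, i)
                         for i in range(1, 600)},
                        index=data.index)
data = pd.concat([data, new_cols], axis=1)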

UPD2: Filed an issue with pandas, since a MemoryError does not look like appropriate behavior for such a simple program.

Regarding "Python faces overhead every 98 executions?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/56690909/
