
python - MemoryError when pickling data in Python

Reposted · Author: 太空狗 · Updated: 2023-10-30 00:07:42

I am trying to dump a dictionary into pickle format using Python's `pickle.dump`. The dictionary is about 150 MB in size, but the exception is raised after only about 115 MB has been written. The traceback is:

Traceback (most recent call last):
  File "C:\Python27\generate_traffic_pattern.py", line 32, in <module>
    b.dump_data(way_id_data,'way_id_data.pickle')
  File "C:\Python27\class_dump_load_data.py", line 8, in dump_data
    pickle.dump(data,saved_file)
  File "C:\Python27\lib\pickle.py", line 1370, in dump
    Pickler(file, protocol).dump(obj)
  File "C:\Python27\lib\pickle.py", line 224, in dump
    self.save(obj)
  File "C:\Python27\lib\pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Python27\lib\pickle.py", line 649, in save_dict
    self._batch_setitems(obj.iteritems())
  File "C:\Python27\lib\pickle.py", line 663, in _batch_setitems
    save(v)
  File "C:\Python27\lib\pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Python27\lib\pickle.py", line 600, in save_list
    self._batch_appends(iter(obj))
  File "C:\Python27\lib\pickle.py", line 615, in _batch_appends
    save(x)
  File "C:\Python27\lib\pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Python27\lib\pickle.py", line 599, in save_list
    self.memoize(obj)
  File "C:\Python27\lib\pickle.py", line 247, in memoize
    self.memo[id(obj)] = memo_len, obj
MemoryError

I am really confused, because the exact same code worked fine before.

Best answer

Are you dumping just a single object and nothing else?

If you are calling dump multiple times, then calling Pickler.clear_memo() between dumps will flush the internally stored back-references (which cause the "leak"). Your code should then work fine...
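A minimal sketch of that suggestion: reuse a single Pickler for repeated dumps and clear its memo table between objects, so the references it memoizes (to support back-references within one pickle stream) do not accumulate. The file name and the sample dicts below are stand-ins, not the question's real data.

```python
import pickle

# Hypothetical stand-in for the large data being dumped in pieces.
chunks = [{"way_id": 1, "points": [0.0, 1.0]},
          {"way_id": 2, "points": [2.0, 3.0]}]

with open("way_id_data.pickle", "wb") as saved_file:
    pickler = pickle.Pickler(saved_file, protocol=pickle.HIGHEST_PROTOCOL)
    for chunk in chunks:
        pickler.dump(chunk)
        pickler.clear_memo()  # forget memoized objects before the next dump

# Reading back: one load() per dump() call, in the same order.
with open("way_id_data.pickle", "rb") as f:
    loaded = [pickle.load(f) for _ in chunks]
```

Note that clear_memo() only helps memory held by the Pickler itself; objects that are still referenced elsewhere in the program are unaffected.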

Regarding "python - MemoryError when pickling data in Python", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/16403633/
