
python - Maximum numpy.memmap array size on an x32 machine?

Reposted. Author: 行者123. Updated: 2023-12-01 05:34:08

I'm using 32-bit Python on 32-bit Windows XP.

Sometimes the program fails on this line:

fp = np.memmap('C:/memmap_test', dtype='float32', mode='w+', shape=(rows,cols))

with an error raised from memmap.py:

Traceback (most recent call last):
  fp = np.memmap('C:/memmap_test', dtype='float32', mode='w+', shape=(rows,cols))
  File "C:\Python27\lib\site-packages\numpy\core\memmap.py", line 253, in __new__
    mm = mmap.mmap(fid.fileno(), bytes, access=acc, offset=start)
OverflowError: cannot fit 'long' into an index-sized integer

So I assume there is a limit on the array size. What is the maximum array size maxN = rows*cols?
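A back-of-the-envelope bound for the w+ case (the 2**31 figure below is an assumption about the signed 32-bit index limit, not something the traceback states explicitly):

```python
# Sketch: on a 32-bit interpreter the mapped byte count must fit in a
# signed 32-bit index, i.e. stay below 2**31 bytes.
two_gb = 2**31                       # signed 32-bit byte limit
itemsize = 4                         # bytes per float32 element
max_elements = two_gb // itemsize    # upper bound on rows*cols

print(max_elements)                  # 536870912
print(250000 * 1000 < max_elements)  # True: a single 953 MB array fits
```

By this estimate a single float32 memmap tops out around 536 million elements, well above the 250-million-element array in the update below, which is why the failure must involve more than just one array's size.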

The same question for 1. python x32 on win x64, and 2. python x64 on win x64.

Update:

import numpy as np

# create the array
rows = 250000
cols = 1000
fA = np.memmap('A.npy', dtype='float32', mode='w+', shape=(rows, cols))
# fA1 = np.memmap('A1.npy', dtype='float32', mode='w+', shape=(rows, cols))  # can't create another big memmap
print fA.nbytes/1024/1024  # 953 MB

So it seems there is another limit, not just a 2 GB limit per single memory-mapped array.
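One plausible explanation (an assumption, not something the error message confirms) is that on a 32-bit process every live memmap occupies virtual address space, so the roughly 2 GB budget is shared between all mappings plus the interpreter itself, rather than being a per-array limit:

```python
# Hypothetical budget math for the two-memmap failure above.
rows, cols = 250000, 1000
itemsize = 4  # bytes per float32 element

one_array_mb = rows * cols * itemsize // 1024 // 1024
print(one_array_mb)        # 953, matching fA.nbytes above

budget_mb = 2 * 1024       # ~2048 MB of 32-bit user address space (assumed)
print(2 * one_array_mb)    # 1906 MB: two such arrays nearly exhaust the budget
```

Under this reading, creating fA1 fails not because the file is too big but because there is no contiguous address-space range left to map it into.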

And here is the output of the test provided by @Paul:

working with 30000000 elements
number bytes required 0.240000 GB
works
working with 300000000 elements
number bytes required 2.400000 GB
OverflowError("cannot fit 'long' into an index-sized integer",)
working with 3000000000 elements
number bytes required 24.000000 GB
IOError(28, 'No space left on device')
working with 30000000000 elements
number bytes required 240.000000 GB
IOError(28, 'No space left on device')
working with 300000000000 elements
number bytes required 2400.000000 GB
IOError(28, 'No space left on device')
working with 3000000000000 elements
number bytes required 24000.000000 GB
IOError(22, 'Invalid argument')

Best Answer

Here are some discussions on this topic: How big can a memory-mapped file be? and Why doesn't Python's mmap work with large files?

For the tests below I used the following code:

import numpy
from numpy import arange

baseNumber = 3000000L

for powers in arange(1, 7):
    l1 = baseNumber * 10**powers
    print('working with %d elements' % (l1))
    print('number bytes required %f GB' % (l1 * 8 / 1e9))
    try:
        fp = numpy.memmap('test.map', dtype='float64', mode='w+', shape=(1, l1))
        print('works')
        del fp
    except Exception as e:
        print(repr(e))

python x32 on Windows x32

For 32-bit Windows, the file-size limit is roughly 2-3 GB, so the OS simply cannot create any file larger than that. I don't have access to an x32 machine, but the command will fail once that file-size limit is reached.

python x32 on Windows x64

In this case, because python is 32-bit, we cannot reach the file sizes that win64 would otherwise allow.

%run -i scratch.py

python x32 win x64
working with 30000000 elements
number bytes required 0.240000 GB
works
working with 300000000 elements
number bytes required 2.400000 GB
OverflowError("cannot fit 'long' into an index-sized integer",)
working with 3000000000 elements
number bytes required 24.000000 GB
OverflowError("cannot fit 'long' into an index-sized integer",)
working with 30000000000 elements
number bytes required 240.000000 GB
IOError(28, 'No space left on device')
working with 300000000000 elements
number bytes required 2400.000000 GB
IOError(28, 'No space left on device')
working with 3000000000000 elements
number bytes required 24000.000000 GB
IOError(22, 'Invalid argument')

python x64 on Windows x64

In this case we are initially limited by disk size, but once the array/byte size gets large enough, some overflow occurs:

%run -i scratch.py
working with 30000000 elements
number bytes required 0.240000 GB
works
working with 300000000 elements
number bytes required 2.400000 GB
works
working with 3000000000 elements
number bytes required 24.000000 GB
works
working with 30000000000 elements
number bytes required 240.000000 GB
IOError(28, 'No space left on device')
working with 300000000000 elements
number bytes required 2400.000000 GB
IOError(28, 'No space left on device')
working with 3000000000000 elements
number bytes required 24000.000000 GB
IOError(22, 'Invalid argument')

Summary: the precise point at which the array fails depends on the initial disk size for Windows x64.

python x32 Windows x64: initially we hit the type error you saw, then the disk-size limit, but at some point an invalid-argument error is raised.

python x64 Windows x64: initially we hit the disk-size limit, but at some point other errors appear.
Interestingly, these errors do not seem related to the 2**64 size limit, since 3000000000000*8 < 2**64, and they behave the same way as on win32.

If the disk were large enough we might not see the invalid-argument error, and we might be able to reach the 2**64 limit, although I don't have a big enough disk to test that :)
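A quick arithmetic check of the 2**64 claim above:

```python
# Largest failing test: 3000000000000 float64 elements at 8 bytes each.
largest_bytes = 3000000000000 * 8
print(largest_bytes)           # 24000000000000 bytes, i.e. 24 TB
print(largest_bytes < 2**64)   # True: far below the 64-bit byte limit
```

So the IOError(22, 'Invalid argument') at 24 TB must come from some OS- or filesystem-level restriction, not from a 64-bit integer overflow.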

Regarding python - maximum numpy.memmap array size on an x32 machine?, there is a similar question on Stack Overflow: https://stackoverflow.com/questions/19534178/
