
Python Pillow - ValueError: Decompressed Data Too Large

Reposted. Author: 太空狗. Updated: 2023-10-30 00:59:21

I am using the Pillow library to create thumbnails. I have to create a lot of them, more than 10,000 in fact.

The program works fine, but after processing around 1,500 images, I get the following error:

    Traceback (most recent call last):
      File "thumb.py", line 15, in <module>
        im = Image.open('/Users/Marcel/images/07032017/' + infile)
      File "/Users/Marcel/product-/PIL/Image.py", line 2339, in open
        im = _open_core(fp, filename, prefix)
      File "/Users/Marcel/product-/PIL/Image.py", line 2329, in _open_core
        im = factory(fp, filename)
      File "/Users/Marcel/product-/PIL/ImageFile.py", line 97, in __init__
        self._open()
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 538, in _open
        s = self.png.call(cid, pos, length)
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 136, in call
        return getattr(self, "chunk_" + cid.decode('ascii'))(pos, length)
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 319, in chunk_iCCP
        icc_profile = _safe_zlib_decompress(s[i+2:])
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 90, in _safe_zlib_decompress
        raise ValueError("Decompressed Data Too Large")
    ValueError: Decompressed Data Too Large

My program is very simple:

    import os, sys
    import PIL
    from PIL import Image

    size = 235, 210
    reviewedProductsList = open('products.txt', 'r')
    reviewedProducts = reviewedProductsList.readlines()
    t = map(lambda s: s.strip(), reviewedProducts)

    print "Thumbs to create: '%s'" % len(reviewedProducts)

    for infile in t:
        outfile = infile
        try:
            im = Image.open('/Users/Marcel/images/07032017/' + infile)
            im.thumbnail(size, Image.ANTIALIAS)
            print "thumb created"
            im.save('/Users/Marcel/product-/thumbs/' + outfile, "JPEG")
        except IOError, e:
            print "cannot create thumbnail for '%s'" % infile
            print "error: '%s'" % e

I am running this locally on my MacBook Pro.

Best answer

This is protection against a potential DoS attack on servers running Pillow, caused by decompression bombs. The error is raised when a decompressed image turns out to have overly large metadata. See http://pillow.readthedocs.io/en/4.0.x/handbook/image-file-formats.html?highlight=decompression#png

Here is the CVE report: https://www.cvedetails.com/cve/CVE-2014-9601/

From a recent issue:

If you set ImageFile.LOAD_TRUNCATED_IMAGES to true, it will suppress the error (but still not read the large metadata). Alternately, you can change the values here: https://github.com/python-pillow/Pillow/blob/master/PIL/PngImagePlugin.py#L74

https://github.com/python-pillow/Pillow/issues/2445
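Concretely, either workaround from the quote above can be applied once, before any call to Image.open. A minimal sketch, assuming a standard Pillow install (the names ImageFile.LOAD_TRUNCATED_IMAGES and PngImagePlugin.MAX_TEXT_CHUNK are the module-level settings referenced in the linked issue):

```python
from PIL import ImageFile, PngImagePlugin

# Option 1: suppress the error and load the image anyway
# (the oversized metadata is still not read).
ImageFile.LOAD_TRUNCATED_IMAGES = True

# Option 2: raise the cap that _safe_zlib_decompress enforces on
# decompressed text/iCCP chunks; the default is roughly 1 MB.
PngImagePlugin.MAX_TEXT_CHUNK = 10 * 1024 * 1024  # 10 MB
```

Option 1 is usually preferable for a batch thumbnailing job like the one in the question, since the ICC profile metadata is not needed for the thumbnails; option 2 actually reads the large chunk, so only use it on images you trust.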

Regarding Python Pillow - ValueError: Decompressed Data Too Large, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/42671252/
