
python - How to hash a large file without manually handling data chunks?

Reposted · Author: 行者123 · Updated: 2023-12-05 09:07:34

When we want to compute the hash of a large file in Python using hashlib, we can process the data in chunks of, say, 1024 bytes like this:

import hashlib

m = hashlib.md5()
chunksize = 1024
with open("large.txt", 'rb') as f:
    while True:
        chunk = f.read(chunksize)
        if not chunk:
            break
        m.update(chunk)
print(m.hexdigest())

Or skip chunking entirely, like this:

import hashlib

sha256 = hashlib.sha256()
with open("large.txt", 'rb') as f:
    sha256.update(f.read())
print(sha256.hexdigest())

Finding the best implementation can be tricky and calls for some benchmarking and tuning (1024-byte chunks? 4 KB? 64 KB? etc.); see the questions "Hashing file in Python 3?" and "Getting a hash string for a very large file".

Question: is there a cross-platform, ready-to-use function to compute the MD5 or SHA256 of a large file with Python? (So that we don't have to reinvent the wheel, or worry about the optimal chunk size, etc.)

Something like:

import hashlib

# get the result without having to think about chunks, etc.
hashlib.file_sha256('bigfile.txt')

Best Answer

Are you sure you actually need to optimize this? I did some profiling, and on my machine there is not much to gain as long as the chunk size is not absurdly small:

import os
import timeit

filename = "large.txt"
with open(filename, 'w') as f:
    f.write('x' * 100*1000*1000)  # Create a 100 MB file

setup = '''
import hashlib

def md5(filename, chunksize):
    m = hashlib.md5()
    with open(filename, 'rb') as f:
        while chunk := f.read(chunksize):
            m.update(chunk)
    return m.hexdigest()
'''

for i in range(16):
    chunksize = 32 * 2**i
    print('chunksize:', chunksize)
    print(timeit.Timer(f'md5("{filename}", {chunksize})', setup=setup).repeat(2, 2))

os.remove(filename)

This prints:

chunksize: 32
[1.3256129720248282, 1.2988303459715098]
chunksize: 64
[0.7864588440279476, 0.7887071970035322]
chunksize: 128
[0.5426529520191252, 0.5496777250082232]
chunksize: 256
[0.43311091500800103, 0.43472746800398454]
chunksize: 512
[0.36928231100318953, 0.37598425400210544]
chunksize: 1024
[0.34912850096588954, 0.35173907200805843]
chunksize: 2048
[0.33507052797358483, 0.33372197503922507]
chunksize: 4096
[0.3222631579847075, 0.3201586640207097]
chunksize: 8192
[0.33291386102791876, 0.31049903703387827]
chunksize: 16384
[0.3095061599742621, 0.3061956529854797]
chunksize: 32768
[0.3073280190001242, 0.30928074003895745]
chunksize: 65536
[0.30916607001563534, 0.3033451830269769]
chunksize: 131072
[0.3083479679771699, 0.3039141249610111]
chunksize: 262144
[0.3087183449533768, 0.30319386802148074]
chunksize: 524288
[0.29915712698129937, 0.29429047100711614]
chunksize: 1048576
[0.2932401319849305, 0.28639856696827337]

This suggests you can simply pick a large, but not crazy, chunk size. For example, 1 MB.
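Following that conclusion, a ready-to-use helper with a fixed 1 MiB chunk size might look like this (a sketch; the name file_md5 is ours, not part of the standard library):

```python
import hashlib

def file_md5(filename, chunksize=2**20):
    """Return the MD5 hex digest of a file, read in 1 MiB chunks."""
    m = hashlib.md5()
    with open(filename, "rb") as f:
        # The walrus operator (Python 3.8+) loops until read() returns b"".
        while chunk := f.read(chunksize):
            m.update(chunk)
    return m.hexdigest()
```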

About python - How to hash a large file without manually handling data chunks?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/64730177/
