
python - Complete a multipart_upload with boto3?


Tried this:

import boto3
from boto3.s3.transfer import TransferConfig, S3Transfer

path = "/temp/"
fileName = "bigFile.gz"  # this happens to be a 5.9 gig file
region = "us-east-1"  # assumed; the original snippet used `region` without defining it

client = boto3.client('s3', region_name=region)
config = TransferConfig(
    multipart_threshold=4 * 1024,  # number of bytes
    max_concurrency=10,
    num_download_attempts=10,
)
transfer = S3Transfer(client, config)
transfer.upload_file(path + fileName, 'bucket', 'key')

Result: 5.9 gig file on s3. Doesn't seem to contain multiple parts.

I found this example, but `part` is undefined there; presumably it should be `part1`:

import boto3

bucket = 'bucket'
path = "/temp/"
fileName = "bigFile.gz"
key = 'key'

s3 = boto3.client('s3')

# Initiate the multipart upload and send the part(s)
mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
with open(path + fileName, 'rb') as data:
    part1 = s3.upload_part(
        Bucket=bucket,
        Key=key,
        PartNumber=1,
        UploadId=mpu['UploadId'],
        Body=data,
    )

# Next, we need to gather information about each part to complete
# the upload. Needed are the part number and ETag.
part_info = {
    'Parts': [
        {
            'PartNumber': 1,
            'ETag': part1['ETag'],
        }
    ]
}

# Now the upload works!
s3.complete_multipart_upload(
    Bucket=bucket,
    Key=key,
    UploadId=mpu['UploadId'],
    MultipartUpload=part_info,
)
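
Note, incidentally, that a single part is capped at 5 GB, so a 5.9 GB file can't actually be sent as one upload_part call. A minimal sketch of the low-level flow with the file split into several parts might look like this (bucket, key, and path are placeholders as in the question, and the 100 MB part size is an arbitrary choice):

import boto3

bucket = 'bucket'
key = 'key'
path = "/temp/bigFile.gz"
part_size = 100 * 1024 * 1024  # arbitrary; parts must be 5 MB - 5 GB (last part may be smaller)

s3 = boto3.client('s3')
mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)

parts = []
try:
    with open(path, 'rb') as data:
        part_number = 1
        while True:
            chunk = data.read(part_size)
            if not chunk:
                break
            resp = s3.upload_part(
                Bucket=bucket,
                Key=key,
                PartNumber=part_number,
                UploadId=mpu['UploadId'],
                Body=chunk,
            )
            parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})
            part_number += 1
    s3.complete_multipart_upload(
        Bucket=bucket,
        Key=key,
        UploadId=mpu['UploadId'],
        MultipartUpload={'Parts': parts},
    )
except Exception:
    # Abort so the incomplete parts don't keep accruing storage charges
    s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=mpu['UploadId'])
    raise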

Question: does anyone know how to use boto3's multipart upload?

Best Answer

Your code was already correct. In fact, a minimal example of a multipart upload looks like this:

import boto3
s3 = boto3.client('s3')
s3.upload_file('my_big_local_file.txt', 'some_bucket', 'some_key')

You don't need to explicitly ask for a multipart upload, or use any of the lower-level functions in boto3 that relate to multipart uploads. Just call upload_file, and boto3 will automatically use a multipart upload if your file size is above a certain threshold (which defaults to 8 MB).
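
If you want to control when and how the multipart upload kicks in, upload_file also accepts a TransferConfig through its Config parameter; a small sketch (the 64 MB values are arbitrary examples, not recommendations):

import boto3
from boto3.s3.transfer import TransferConfig

config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,  # 64 MB per part
    max_concurrency=10,                    # parallel part uploads
)

s3 = boto3.client('s3')
s3.upload_file('my_big_local_file.txt', 'some_bucket', 'some_key', Config=config)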

You seem to be confused by the fact that the end result in S3 wasn't made up of multiple parts:

Result: 5.9 gig file on s3. Doesn't seem to contain multiple parts.

...but this is the expected outcome. The whole point of the multipart upload API is to let you upload a single file over multiple HTTP requests, and end up with a single object in S3.
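
If you want to confirm that a multipart upload was in fact used, one common (if unofficial) check is the object's ETag: for multipart uploads S3 reports an ETag with a -<part count> suffix rather than a plain MD5. A sketch, reusing the bucket and key names from above:

import boto3

s3 = boto3.client('s3')
head = s3.head_object(Bucket='some_bucket', Key='some_key')
etag = head['ETag'].strip('"')

# Multipart ETags look like '<hash>-<number of parts>', e.g. '...-94';
# this format is an implementation detail, not a documented guarantee.
if '-' in etag:
    print('multipart upload,', etag.split('-')[1], 'parts')
else:
    print('single-part upload')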

Regarding "python - Complete a multipart_upload with boto3?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/34303775/
