amazon-web-services - Getting an EntityTooSmall exception

Reposted · Author: 行者123 · Updated: 2023-12-03 01:05:35

I am trying to download blocks of data from an Azure blob and then upload those same blocks to an AWS S3 bucket. During the upload I get the exception "Your proposed upload is smaller than the minimum allowed size". One thing I noticed: in the upload response I get a content length of 0. The data I am trying this with is over 300 MB. Any pointers on what could be wrong here? Below is my code snippet:

    var remainingLength = blob.Properties.Length;
    long startPosition = 0;
    List<UploadPartResponse> uploadResponses = new List<UploadPartResponse>();
    int i = 1;
    string uploadId = string.Empty;

    // Step 1: build and send the multipart upload initiation request
    var initiateRequest = new InitiateMultipartUploadRequest
    {
        BucketName = existingBucketName,
        Key = "firstobj"
    };

    var initResponse = client.InitiateMultipartUpload(initiateRequest);
    uploadId = initResponse.UploadId;

    do
    {
        var blockSize = Math.Min(segmentSize, remainingLength);
        using (var ms = new MemoryStream())
        {
            blob.DownloadRangeToStream(ms, startPosition, blockSize);

            // Step 2: upload each chunk (this runs for every chunk, unlike the other steps which run once)
            var uploadRequest = new UploadPartRequest
            {
                BucketName = existingBucketName,
                Key = "firstobj",
                UploadId = uploadId,
                PartNumber = i,
                PartSize = ms.Length,
                InputStream = ms
            };

            // Upload the part and add the response to our list.
            var temp = client.UploadPart(uploadRequest);
            uploadResponses.Add(temp);
        }

        // Step 3: build and send the multipart complete request
        if (blockSize < segmentSize)
        {
            var completeRequest = new CompleteMultipartUploadRequest
            {
                BucketName = existingBucketName,
                Key = "firstobj",
                UploadId = uploadId,
            };

            completeRequest.AddPartETags(uploadResponses);
            client.CompleteMultipartUpload(completeRequest);
        }

        startPosition += blockSize;
        remainingLength -= blockSize;
        i++;
    }
    while (remainingLength > 0);

Best Answer

After some struggle, I found the solution. In step 2, before uploading each part to AWS, we should reset the stream position to 0:

uploadRequest.InputStream.Position = 0;
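The fix works because `DownloadRangeToStream` leaves the `MemoryStream`'s `Position` at the end of the written data, and `UploadPart` reads from the current position, so it sees 0 bytes. S3 then treats the part as empty, which is why the response shows a content length of 0 and a non-final part fails the 5 MB minimum with EntityTooSmall. A minimal, self-contained sketch of the stream behavior (plain .NET, no SDK calls, with an illustrative 1 KB buffer standing in for a downloaded chunk):

```csharp
using System;
using System.IO;

class StreamPositionDemo
{
    static void Main()
    {
        var ms = new MemoryStream();
        var chunk = new byte[1024];
        ms.Write(chunk, 0, chunk.Length);   // simulates DownloadRangeToStream filling the stream

        // After writing, Position sits at the END of the data,
        // so a consumer reading from the current position gets 0 bytes.
        Console.WriteLine(ms.Position);                       // 1024
        Console.WriteLine(ms.Read(new byte[1024], 0, 1024));  // 0

        // The fix from the answer: rewind before handing the stream off.
        ms.Position = 0;
        Console.WriteLine(ms.Read(new byte[1024], 0, 1024));  // 1024
    }
}
```

The same reasoning applies to any API that consumes a stream you just wrote to: rewind it first, or the consumer reads nothing.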

Regarding amazon-web-services - Getting an EntityTooSmall exception, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/49195031/
