
amazon-web-services - Does Amazon S3 have a file size limit?


A bash script that had been running successfully has stopped working. This line:

aws s3 mv $source_file_path $target_path

produces the error:

move failed: ../../../../Users/thisuser/Desktop/somefile.zip to s3://cloudbackups/somefile.zip ('Connection aborted.', error(32, 'Broken pipe'))

The source file is over 8 GB, on Mac OS.

Best Answer

From the S3 FAQ:

Q: How much data can I store in Amazon S3?

The total volume of data and number of objects you can store are unlimited. Individual Amazon S3 objects can range in size from a minimum of 0 bytes to a maximum of 5 terabytes. The largest object that can be uploaded in a single PUT is 5 gigabytes. For objects larger than 100 megabytes, customers should consider using the Multipart Upload capability.

Although the maximum object size is 5 TB, the maximum size of a single PUT operation is 5 GB, which means you cannot upload an 8 GB file in one operation; you will need to use multipart upload. Note that AWS recommends multipart upload for any file larger than 100 MB.

Multipart uploads also have significant reliability benefits: the size and scope of any failure and retry is limited. To do this through the command line interface, you need to become familiar with several different commands, documented here - a rough sketch follows.
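As an illustrative sketch of that low-level workflow (the bucket and object key are taken from the question above; the part file names, the parts.json manifest, and the <UploadId> placeholder are assumptions to replace with your own values):

# Split the 8 GB file into 100 MB parts (creates part_aa, part_ab, ...)
split -b 100m somefile.zip part_

# Start the multipart upload and note the UploadId in the response
aws s3api create-multipart-upload --bucket cloudbackups --key somefile.zip

# Upload each part with its part number, keeping the ETag from each response
aws s3api upload-part --bucket cloudbackups --key somefile.zip --part-number 1 --body part_aa --upload-id <UploadId>

# When every part has been uploaded, assemble the object; parts.json maps each
# PartNumber to its ETag, e.g. {"Parts":[{"PartNumber":1,"ETag":"\"...\""}]}
aws s3api complete-multipart-upload --bucket cloudbackups --key somefile.zip --upload-id <UploadId> --multipart-upload file://parts.json

Each upload-part call is independent, which is what limits the cost of a failure and retry to a single part rather than the whole 8 GB file.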

For more details on multipart uploads, see the documentation:

  • Improved throughput - You can upload parts in parallel to improve throughput.
  • Quick recovery from any network issues - Smaller part size minimizes the impact of restarting a failed upload due to a network error.
  • Pause and resume object uploads - You can upload object parts over time. Once you initiate a multipart upload there is no expiry; you must explicitly complete or abort the multipart upload.
  • Begin an upload before you know the final object size - You can upload an object as you are creating it.
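Following on from the last point above - the parts of an unfinished multipart upload stay stored until the upload is explicitly completed or aborted - here is a minimal sketch for finding and cleaning up abandoned uploads, again assuming the bucket and key from the question and a placeholder <UploadId>:

# List multipart uploads that were started but never completed or aborted
aws s3api list-multipart-uploads --bucket cloudbackups

# Abort one you no longer need so its stored parts are discarded
aws s3api abort-multipart-upload --bucket cloudbackups --key somefile.zip --upload-id <UploadId>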

Regarding amazon-web-services - does Amazon S3 have a file size limit, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/54012602/
