
python - How should I handle this gsutil "parallel composite upload" warning?

Reposted · Author: 行者123 · Updated: 2023-11-28 22:23:25

I'm running a Python script that uses the os library to execute a gsutil command that would normally be run at the Windows command prompt. I have some files on my local machine that I want to upload to a Google Cloud Storage bucket, so I do this:

import os

command = 'gsutil -m cp myfile.csv  gs://my/bucket/myfile.csv'
os.system(command)

I get a message like this:

==> NOTE: You are uploading one or more large file(s), which would run significantly faster if you enable parallel composite uploads. This feature can be enabled by editing the "parallel_composite_upload_threshold" value in your .boto configuration file. However, note that if you do this large files will be uploaded as "composite objects" (https://cloud.google.com/storage/docs/composite-objects), which means that any user who downloads such objects will need to have a compiled crcmod installed (see "gsutil help crcmod"). This is because without a compiled crcmod, computing checksums on composite objects is so slow that gsutil disables downloads of composite objects.

I'd like to get rid of this message, either by suppressing it if it's irrelevant, or by actually doing what it suggests, but I can't find the .boto file. What should I do?

Best Answer

The Parallel Composite Uploads section of gsutil's documentation describes how to resolve this (assuming, as the warning specifies, that the content will be consumed by clients with a working crcmod module available):

gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp bigfile gs://your-bucket

To do this safely from Python, something like the following:

import subprocess

filename = 'myfile.csv'
gs_bucket = 'my/bucket'
parallel_threshold = '150M'  # minimum size for parallel upload; 0 to disable

subprocess.check_call([
    'gsutil',
    '-o', 'GSUtil:parallel_composite_upload_threshold=%s' % (parallel_threshold,),
    'cp', filename, 'gs://%s/%s' % (gs_bucket, filename),
])

Note that here you're providing the argument-vector boundaries explicitly, rather than relying on a shell to split the command for you; this prevents a malicious or buggy filename from performing unwanted actions.
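The difference can be illustrated without touching gsutil at all. A command string handed to a shell gets re-split and interpreted, while an argument list is passed through verbatim. A minimal sketch, using a deliberately hostile filename invented for illustration:

```python
import shlex

# A hostile "filename" that embeds a shell command.
filename = 'myfile.csv; rm -rf ~'

# Passed to a shell as one string (as os.system does), the ';' would
# terminate the gsutil command and start a second, destructive one.
unsafe = 'gsutil cp %s gs://my/bucket/' % filename

# As an argument vector, the whole string is a single argv entry and
# reaches gsutil unmodified -- no shell ever interprets it.
safe_argv = ['gsutil', 'cp', filename, 'gs://my/bucket/']

# If you genuinely must build a shell string, quote each piece.
quoted = 'gsutil cp %s gs://my/bucket/' % shlex.quote(filename)

print(safe_argv[2])  # the filename survives as one argument
print(quoted)
```

This is why the answer uses `subprocess.check_call` with a list rather than `os.system` with a formatted string.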


If you don't know that clients accessing content in this bucket will have the crcmod module available, consider setting parallel_threshold='0' above, which will disable this support.
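Wrapping the argv construction in a small helper makes the threshold easy to flip in one place. A sketch under the same assumptions as the answer above; the function name is my own, not part of gsutil:

```python
def gsutil_cp_args(filename, gs_bucket, parallel_threshold='150M'):
    """Build the gsutil argv for uploading one local file to a bucket.

    Pass parallel_threshold='0' to disable parallel composite uploads
    when downloaders may lack a compiled crcmod.
    """
    return [
        'gsutil',
        '-o', 'GSUtil:parallel_composite_upload_threshold=%s' % parallel_threshold,
        'cp', filename, 'gs://%s/%s' % (gs_bucket, filename),
    ]

# With composite uploads disabled:
argv = gsutil_cp_args('myfile.csv', 'my/bucket', parallel_threshold='0')
print(argv)
```

The resulting list is then run with `subprocess.check_call(argv)` exactly as before.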

Regarding python - How should I handle this gsutil "parallel composite upload" warning?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/47043441/
