
python - Can't upload > ~2GB to Google Cloud Storage

Reposted. Author: 太空狗. Updated: 2023-10-30 01:05:07

Traceback follows.

The relevant Python snippet:

bucket = _get_bucket(location['bucket'])
blob = bucket.blob(location['path'])
blob.upload_from_filename(source_path)

which ultimately raises (from the ssl library):

OverflowError: string longer than 2147483647 bytes

I assume I'm missing some special configuration option?

This may be related to this ~1.5-year-old, apparently still unresolved issue: https://github.com/googledatalab/datalab/issues/784 .

Thanks for the help!

Full traceback:

File "/usr/src/app/gcloud/download_data.py", line 109, in *******
    blob.upload_from_filename(source_path)
File "/usr/local/lib/python3.5/dist-packages/google/cloud/storage/blob.py", line 992, in upload_from_filename
    size=total_bytes)
File "/usr/local/lib/python3.5/dist-packages/google/cloud/storage/blob.py", line 946, in upload_from_file
    client, file_obj, content_type, size, num_retries)
File "/usr/local/lib/python3.5/dist-packages/google/cloud/storage/blob.py", line 867, in _do_upload
    client, stream, content_type, size, num_retries)
File "/usr/local/lib/python3.5/dist-packages/google/cloud/storage/blob.py", line 700, in _do_multipart_upload
    transport, data, object_metadata, content_type)
File "/usr/local/lib/python3.5/dist-packages/google/resumable_media/requests/upload.py", line 97, in transmit
    retry_strategy=self._retry_strategy)
File "/usr/local/lib/python3.5/dist-packages/google/resumable_media/requests/_helpers.py", line 101, in http_request
    func, RequestsMixin._get_status_code, retry_strategy)
File "/usr/local/lib/python3.5/dist-packages/google/resumable_media/_helpers.py", line 146, in wait_and_retry
    response = func()
File "/usr/local/lib/python3.5/dist-packages/google/auth/transport/requests.py", line 186, in request
    method, url, data=data, headers=request_headers, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py", line 508, in request
    resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py", line 618, in send
    r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py", line 440, in send
    timeout=timeout
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpool.py", line 601, in urlopen
    chunked=chunked)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpool.py", line 357, in _make_request
    conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python3.5/http/client.py", line 1106, in request
    self._send_request(method, url, body, headers)
File "/usr/lib/python3.5/http/client.py", line 1151, in _send_request
    self.endheaders(body)
File "/usr/lib/python3.5/http/client.py", line 1102, in endheaders
    self._send_output(message_body)
File "/usr/lib/python3.5/http/client.py", line 936, in _send_output
    self.send(message_body)
File "/usr/lib/python3.5/http/client.py", line 908, in send
    self.sock.sendall(data)
File "/usr/lib/python3.5/ssl.py", line 891, in sendall
    v = self.send(data[count:])
File "/usr/lib/python3.5/ssl.py", line 861, in send
    return self._sslobj.write(data)
File "/usr/lib/python3.5/ssl.py", line 586, in write
    return self._sslobj.write(data)

OverflowError: string longer than 2147483647 bytes

Best answer

The problem is that it tries to send the entire file at once. Following the call chain from upload_from_filename shows that it stats the file and then passes the total size through as a single upload part.

Instead, specifying a chunk_size when creating the blob will trigger an upload in multiple parts:

# chunk_size must be a multiple of 256 KB, per the docstring
CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB
blob = bucket.blob(location['path'], chunk_size=CHUNK_SIZE)
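As a quick sanity check on the numbers involved (pure arithmetic, no GCS client or credentials needed): the 2147483647-byte threshold in the traceback is exactly 2**31 - 1, the largest single write the ssl layer will accept, and a 10 MB chunk satisfies the 256 KB multiple requirement:

```python
# Verify the constants behind the error and the fix.
SSL_WRITE_LIMIT = 2147483647   # from the OverflowError message
CHUNK_UNIT = 256 * 1024        # chunk_size must be a multiple of 256 KB
CHUNK_SIZE = 10 * 1024 * 1024  # the 10 MB chunk used above

print(SSL_WRITE_LIMIT == 2**31 - 1)   # signed 32-bit limit, hence ~2 GB cap
print(CHUNK_SIZE % CHUNK_UNIT == 0)   # 10 MB is a valid chunk_size
```

Any chunk size comfortably below the limit and divisible by 256 KB should work; 10 MB is just one reasonable choice.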

Happy hacking!

Regarding python - Can't upload > ~2GB to Google Cloud Storage, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/47610283/
