
python - Dask Dataframe "ValueError: Data is compressed as snappy but we don't have this installed"


python-snappy appears to be installed, yet Dask returns a ValueError.

Helm configuration for the jupyter and worker pods:

env:
- name: EXTRA_CONDA_PACKAGES
value: numba xarray s3fs python-snappy pyarrow ruamel.yaml -c conda-forge
- name: EXTRA_PIP_PACKAGES
value: dask-ml --upgrade

The containers show python-snappy is installed (via conda list).
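
One way to confirm that snappy is importable in every process involved, and not just inside the notebook container, is to ask the scheduler and workers directly. A minimal sketch, assuming a distributed Client connected to the same cluster (the scheduler address below is a placeholder):

from importlib.util import find_spec
from distributed import Client

client = Client("scheduler-address:8786")  # placeholder scheduler address

def has_snappy():
    # True if the python-snappy module can be imported in this process
    return find_spec("snappy") is not None

print("workers:  ", client.run(has_snappy))             # dict keyed by worker address
print("scheduler:", client.run_on_scheduler(has_snappy))
print("client:   ", has_snappy())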

The dataframe is loaded from a multi-part parquet file generated by Apache Drill:

files = ['s3://{}'.format(f) for f in fs.glob(path='{}/*.parquet'.format(filename))]
df = dd.read_parquet(files)

Running len(df) on the dataframe returns:

distributed.utils - ERROR - Data is compressed as snappy but we don't have this installed
Traceback (most recent call last):
File "/opt/conda/lib/python3.6/site-packages/distributed/utils.py", line 622, in log_errors
yield
File "/opt/conda/lib/python3.6/site-packages/distributed/client.py", line 921, in _handle_report
six.reraise(*clean_exception(**msg))
File "/opt/conda/lib/python3.6/site-packages/six.py", line 692, in reraise
raise value.with_traceback(tb)
File "/opt/conda/lib/python3.6/site-packages/distributed/comm/tcp.py", line 203, in read
msg = yield from_frames(frames, deserialize=self.deserialize)
File "/opt/conda/lib/python3.6/site-packages/tornado/gen.py", line 1099, in run
return
File "/opt/conda/lib/python3.6/site-packages/tornado/gen.py", line 315, in wrapper
future.set_result(_value_from_stopiteration(e))
File "/opt/conda/lib/python3.6/site-packages/distributed/comm/utils.py", line 75, in from_frames
res = _from_frames()
File "/opt/conda/lib/python3.6/site-packages/distributed/comm/utils.py", line 61, in _from_frames
return protocol.loads(frames, deserialize=deserialize)
File "/opt/conda/lib/python3.6/site-packages/distributed/protocol/core.py", line 96, in loads
msg = loads_msgpack(small_header, small_payload)
File "/opt/conda/lib/python3.6/site-packages/distributed/protocol/core.py", line 171, in loads_msgpack
" installed" % str(header['compression']))
ValueError: Data is compressed as snappy but we don't have this installed

Can anyone suggest the correct configuration here, or the steps to fix this?

Best Answer

This error does not actually come from reading your parquet files; it comes from how Dask compresses data when moving it between machines. You can fix it by installing python-snappy consistently, or not at all, across all of your client/scheduler/worker pods.
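
To see which compressors a given process has actually registered for Dask's wire protocol, you can inspect distributed's compression registry. This is an internal module, so the import path below is an assumption and may differ between versions; running the same check on the scheduler and workers (e.g. via Client.run) will expose any mismatch:

# distributed.protocol.compression is internal; this import path is an assumption
from distributed.protocol.compression import compressions, default_compression

def registered_compressors():
    # names of the compressors this interpreter can encode and decode
    return sorted(k for k in compressions if isinstance(k, str))

print(registered_compressors())   # e.g. ['auto', 'snappy', 'zlib'] on the notebook pod
print(default_compression)        # what this process uses when sending data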

You should do one of the following:

  1. Remove python-snappy from the conda package list for your jupyter and worker pods. If you are using pyarrow this is unnecessary anyway; I believe Arrow includes snappy at the C++ level (see the sketch after this list).
  2. Add python-snappy to your scheduler pod as well.
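
To illustrate option 1: when the pyarrow engine is used, snappy decompression of the parquet pages happens inside Arrow's C++ library, so python-snappy is not needed on any pod just to read the files. A sketch with a placeholder S3 path:

import dask.dataframe as dd

# "s3://my-bucket/drill-output/*.parquet" is a placeholder path; Arrow's C++
# code handles the parquet-level snappy decompression, so python-snappy is
# not required for the read itself.
df = dd.read_parquet("s3://my-bucket/drill-output/*.parquet", engine="pyarrow")
print(len(df))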

FWIW, I personally recommend using lz4 for inter-machine compression rather than snappy.
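
If you go that route, lz4 has to be installed on every pod (for example by adding lz4 to EXTRA_CONDA_PACKAGES for the jupyter, scheduler and worker pods). To be explicit rather than relying on auto-detection, you can also pin the comm compression in the Dask config; a sketch, assuming a distributed version that honours the distributed.comm.compression setting:

import dask
from distributed import Client

# Pin comm compression to lz4 in this process. The scheduler and worker pods
# need the same setting (e.g. via the environment variable
# DASK_DISTRIBUTED__COMM__COMPRESSION=lz4) and lz4 installed, since each
# process compresses the data it sends.
dask.config.set({"distributed.comm.compression": "lz4"})

client = Client("scheduler-address:8786")  # placeholder scheduler address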

This question was originally asked on Stack Overflow: https://stackoverflow.com/questions/50340721/
