
python - Creating a large number of datasets with h5py - Unable to register datatype atom (Can't insert duplicate key)

Repost · Author: 太空宇宙 · Updated: 2023-11-04 03:30:50

I am trying to store a large number of numpy structured arrays as datasets in an hdf5 file.
For example,

f['tree1'] = structured_array1

...

f['tree60000'] = structured_array60000

(there are roughly 60,000 trees). About 70% of the way through reading the file, I get the error RuntimeError: Unable to register datatype atom (Can't insert duplicate key).

This problem only occurs with very large ascii files (10e7 lines, 5 GB). It does not happen with a file of about 10e6 lines (500 MB), and it also does not happen if I drop the dtype and store the data as plain numpy arrays of strings.

I can work around the problem by stopping partway through reading the file, closing my terminal, opening it again, and resuming reading from where I left off (I save the line number where I stopped). I tried opening and closing the hdf5 file inside the python function itself, but that did not help.

dt = [
('scale', 'f4'),
('haloid', 'i8'),
('scale_desc', 'f4'),
('haloid_desc', 'i8'),
('num_prog', 'i4'),
('pid', 'i8'),
('upid', 'i8'),
('pid_desc', 'i8'),
('phantom', 'i4'),
('mvir_sam', 'f4'),
('mvir', 'f4'),
('rvir', 'f4'),
('rs', 'f4'),
('vrms', 'f4'),
('mmp', 'i4'),
('scale_lastmm', 'f4'),
('vmax', 'f4'),
('x', 'f4'),
('y', 'f4'),
('z', 'f4'),
('vx', 'f4'),
('vy', 'f4'),
('vz', 'f4'),
('jx', 'f4'),
('jy', 'f4'),
('jz', 'f4'),
('spin', 'f4'),
('haloid_breadth_first', 'i8'),
('haloid_depth_first', 'i8'),
('haloid_tree_root', 'i8'),
('haloid_orig', 'i8'),
('snap_num', 'i4'),
('haloid_next_coprog_depthfirst', 'i8'),
('haloid_last_prog_depthfirst', 'i8'),
('haloid_last_mainleaf_depthfirst', 'i8'),
('rs_klypin', 'f4'),
('mvir_all', 'f4'),
('m200b', 'f4'),
('m200c', 'f4'),
('m500c', 'f4'),
('m2500c', 'f4'),
('xoff', 'f4'),
('voff', 'f4'),
('spin_bullock', 'f4'),
('b_to_a', 'f4'),
('c_to_a', 'f4'),
('axisA_x', 'f4'),
('axisA_y', 'f4'),
('axisA_z', 'f4'),
('b_to_a_500c', 'f4'),
('c_to_a_500c', 'f4'),
('axisA_x_500c', 'f4'),
('axisA_y_500c', 'f4'),
('axisA_z_500c', 'f4'),
('t_by_u', 'f4'),
('mass_pe_behroozi', 'f4'),
('mass_pe_diemer', 'f4')
]
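For reference, the conversion from whitespace-split ascii lines into a structured array works like this (a minimal sketch using only three of the fields above; the sample values are made up):

```python
import numpy as np

# A small subset of the field list above, just to illustrate the structure
dt = [('scale', 'f4'), ('haloid', 'i8'), ('mvir', 'f4')]

# Each data line is split into strings and collected as a tuple, one per row
row = ('0.9945', '12345678', '1.2e12')

# numpy casts each string to the corresponding field's type on conversion
arr = np.array([row], dtype=dt)

print(arr['haloid'][0])   # 12345678
print(arr.dtype.names)    # ('scale', 'haloid', 'mvir')
```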

def read_in_trees(self):
    """Store each tree as an hdf5 dataset."""
    with open(self.fname) as ascii_file:
        with h5py.File(self.hdf5_name, "r+") as f:
            tree_id = ""
            current_tree = []
            for line in ascii_file:
                if line[0] == '#':  # header line: starts a new tree
                    if tree_id:     # flush the previous tree, if any
                        f[tree_id] = np.array(current_tree, dtype=dt)
                    current_tree = []
                    tree_id = line[6:].strip('\n')
                else:               # data line: next element of the current tree
                    current_tree.append(tuple(line.split()))
            if tree_id:             # flush the final tree after the loop ends
                f[tree_id] = np.array(current_tree, dtype=dt)

Error:

/Volumes/My Passport for Mac/raw_trees/bolshoi/rockstar/asciiReaderOne.py in read_in_trees(self)
129 arr = np.array(current_tree, dtype = dt)
130 # depth_sort = arr['haloid_depth_first'].argsort()
--> 131 f[tree_id] = arr
132 current_tree = []
133 first_line = False

/Library/Python/2.7/site-packages/h5py/_objects.so in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2458)()

/Library/Python/2.7/site-packages/h5py/_objects.so in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2415)()

/Library/Python/2.7/site-packages/h5py/_hl/group.pyc in __setitem__(self, name, obj)
281
282 else:
--> 283 ds = self.create_dataset(None, data=obj, dtype=base.guess_dtype(obj))
284 h5o.link(ds.id, self.id, name, lcpl=lcpl)
285

/Library/Python/2.7/site-packages/h5py/_hl/group.pyc in create_dataset(self, name, shape, dtype, data, **kwds)
101 """
102 with phil:
--> 103 dsid = dataset.make_new_dset(self, shape, dtype, data, **kwds)
104 dset = dataset.Dataset(dsid)
105 if name is not None:

/Library/Python/2.7/site-packages/h5py/_hl/dataset.pyc in make_new_dset(parent, shape, dtype, data, chunks, compression, shuffle, fletcher32, maxshape, compression_opts, fillvalue, scaleoffset, track_times)
124
125 if data is not None:
--> 126 dset_id.write(h5s.ALL, h5s.ALL, data)
127
128 return dset_id

/Library/Python/2.7/site-packages/h5py/_objects.so in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2458)()

/Library/Python/2.7/site-packages/h5py/_objects.so in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2415)()

/Library/Python/2.7/site-packages/h5py/h5d.so in h5py.h5d.DatasetID.write (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5d.c:3260)()

/Library/Python/2.7/site-packages/h5py/h5t.so in h5py.h5t.py_create (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5t.c:15314)()

/Library/Python/2.7/site-packages/h5py/h5t.so in h5py.h5t.py_create (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5t.c:14903)()

/Library/Python/2.7/site-packages/h5py/h5t.so in h5py.h5t._c_compound (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5t.c:14192)()

/Library/Python/2.7/site-packages/h5py/h5t.so in h5py.h5t.py_create (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5t.c:15314)()

/Library/Python/2.7/site-packages/h5py/h5t.so in h5py.h5t.py_create (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5t.c:14749)()

/Library/Python/2.7/site-packages/h5py/h5t.so in h5py.h5t._c_float (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5t.c:12379)()

RuntimeError: Unable to register datatype atom (Can't insert duplicate key)

Best Answer

Did you get the error stack? Does it indicate where in your code the error is raised?

You report: RuntimeError: Unable to register datatype atom (Can't insert duplicate key)

In /usr/lib/python3/dist-packages/h5py/_hl/datatype.py:

class Datatype(HLObject):
    # Represents an HDF5 named datatype stored in a file.
    # >>> MyGroup["name"] = numpy.dtype("f")
    def __init__(self, bind):
        """ Create a new Datatype object by binding to a low-level TypeID.

I'm throwing out a guess here. Your dt has 57 fields. I suspect that each time you add an array to the file, it registers every field as a new datatype:

In [71]: (57*10e7*.7)/(2**32)
Out[71]: 0.9289942681789397

70% of 57 * 10e7 is close to 2**32. If Python/numpy uses int32 for the dtype ids, you could be hitting that limit.

We would have to dig deeper into the h5py/numpy code to find out exactly what issues this error message.

By adding the arrays to the file one at a time:

f[tree_id] = arr

you put each array into a new dataset. If each Dataset registers one datatype, or one datatype per field of the array, you could easily reach 2**32 datatypes.

On the other hand, if you could store multiple arr into one group or dataset, you might avoid registering thousands of datatypes. I am not familiar enough with h5py to suggest exactly how to do that.
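One possible shape for that idea (a sketch, not from the original answer): keep a single resizable dataset with the shared compound dtype and append each tree to it, recording each tree's row range on the side. The names `all_trees`, `offsets`, and `trees.h5`, and the two stand-in trees, are made up for illustration; the dtype is a shortened stand-in for the full 57-field one.

```python
import numpy as np
import h5py

dt = np.dtype([('scale', 'f4'), ('haloid', 'i8')])  # shortened stand-in dtype

with h5py.File('trees.h5', 'w') as f:
    # One resizable dataset holds every tree, so the compound type is set up once
    all_trees = f.create_dataset('all_trees', shape=(0,), maxshape=(None,), dtype=dt)
    offsets = {}  # tree_id -> (start, stop) row range, kept in memory here

    for tree_id, n_rows in [('tree1', 3), ('tree2', 5)]:  # stand-in trees
        arr = np.zeros((n_rows,), dtype=dt)      # parsed tree would go here
        start = all_trees.shape[0]
        all_trees.resize((start + n_rows,))      # grow the dataset in place
        all_trees[start:] = arr
        offsets[tree_id] = (start, start + n_rows)

    # The offsets could be persisted too, e.g. as a small index dataset
    f['all_trees'].attrs['n_trees'] = len(offsets)

with h5py.File('trees.h5', 'r') as f:
    print(f['all_trees'].shape)  # (8,)
```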


I wonder whether this sequence would let multiple datasets reuse one datatype:

dt1 = np.dtype(dt)
gg = f.create_group('testgroup')
gg['xdtype'] = dt1              # commit dt1 to the file as a named datatype
# see h5py.Datatype doc
xdtype = gg['xdtype']           # an h5py.Datatype bound to the committed type
x = np.zeros((10,), dtype=dt1)
gg.create_dataset('tree1', data=x, dtype=xdtype)  # reuse the committed type
y = np.zeros((10,), dtype=dt1)
gg.create_dataset('tree2', data=y, dtype=xdtype)

Based on the Datatype docs, I tried registering a named datatype and using it for every dataset added to the group.

In [117]: isinstance(xdtype, h5py.Datatype)
Out[117]: True
In [118]: xdtype.id
Out[118]: <h5py.h5t.TypeCompoundID at 0xb46e0d4c>

So, if I am reading make_new_dset correctly, passing the committed Datatype bypasses the py_create call.
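A self-contained version of that experiment (a sketch under assumptions: the file name `testtype.h5`, the loop of three stand-in trees, and the shortened two-field dtype are made up) that can be run to confirm the committed type is shared across datasets:

```python
import numpy as np
import h5py

dt = [('scale', 'f4'), ('haloid', 'i8')]  # shortened stand-in for the 57-field dtype
dt1 = np.dtype(dt)

with h5py.File('testtype.h5', 'w') as f:
    gg = f.create_group('testgroup')
    gg['xdtype'] = dt1                    # commit the compound dtype once
    xdtype = gg['xdtype']                 # h5py.Datatype bound to the committed type
    for i in range(3):                    # many datasets, one committed type
        x = np.zeros((10,), dtype=dt1)
        x['haloid'] = i
        gg.create_dataset('tree%d' % i, data=x, dtype=xdtype)

with h5py.File('testtype.h5', 'r') as f:
    gg = f['testgroup']
    print(isinstance(gg['xdtype'], h5py.Datatype))  # True
    print(gg['tree2']['haloid'][0])                 # 2
```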

Regarding python - Creating a large number of datasets with h5py - Unable to register datatype atom (Can't insert duplicate key), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31190573/
