
Python dict: does the size affect lookup timing?


Suppose dict A has a single key, while dict B has a billion keys.

Algorithmically, a lookup is O(1).

But will the actual lookup time (program execution time) differ depending on the size of the dict?

import time

# assume one_key_dict holds a single key and manykeys_dict holds a billion keys
onekey_stime = time.time()
print(one_key_dict.get('firstkey'))
onekey_dur = time.time() - onekey_stime

manykeys_stime = time.time()
print(manykeys_dict.get('randomkey'))
manykeys_dur = time.time() - manykeys_stime

Would I see any timing difference between onekey_dur and manykeys_dur?
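As an aside, wrapping time.time() around a single get plus a print mostly measures I/O and timer granularity rather than the lookup itself; the standard timeit module is better suited. A minimal sketch, with illustrative dict names and sizes (one million keys standing in for the billion):

import timeit

one_key_dict = {'firstkey': 1}
manykeys_dict = {str(i): i for i in range(10**6)}  # stand-in for the billion-key dict

# run each lookup a million times so timer noise averages out
print(timeit.timeit("one_key_dict.get('firstkey')", globals=globals(), number=10**6))
print(timeit.timeit("manykeys_dict.get('500000')", globals=globals(), number=10**6))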

Best Answer

In a test with a small and a large dict, the timings are nearly identical:

In [31]: random_key = lambda: ''.join(np.random.choice(list(string.ascii_letters), 20))

In [32]: few_keys = {random_key(): np.random.random() for _ in xrange(100)}

In [33]: many_keys = {random_key(): np.random.random() for _ in xrange(1000000)}

In [34]: few_lookups = np.random.choice(few_keys.keys(), 50)

In [35]: many_lookups = np.random.choice(many_keys.keys(), 50)

In [36]: %timeit [few_keys[k] for k in few_lookups]
100000 loops, best of 3: 6.25 µs per loop

In [37]: %timeit [many_keys[k] for k in many_lookups]
100000 loops, best of 3: 7.01 µs per loop

Edit (for you, @ShadowRanger): lookups that miss are also very close:

In [38]: %timeit [few_keys.get(k) for k in many_lookups]
100000 loops, best of 3: 7.99 µs per loop

In [39]: %timeit [many_keys.get(k) for k in few_lookups]
100000 loops, best of 3: 8.78 µs per loop
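This is the expected result: CPython's dict is an open-addressing hash table that is resized before it gets too dense (kept below roughly two-thirds full), so the expected number of probes per lookup, and hence the lookup time, does not grow with the number of keys. For reference, the transcript above is Python 2 (xrange, passing dict.keys() directly to np.random.choice); a Python 3 sketch of the same benchmark, assuming NumPy is installed, might look like this:

import string
import timeit

import numpy as np

def random_key():
    # 20 random ASCII letters joined into one string
    return ''.join(np.random.choice(list(string.ascii_letters), 20))

few_keys = {random_key(): np.random.random() for _ in range(100)}
many_keys = {random_key(): np.random.random() for _ in range(1_000_000)}  # slow to build

# np.random.choice needs a sequence, so materialize the key views as lists
few_lookups = np.random.choice(list(few_keys), 50)
many_lookups = np.random.choice(list(many_keys), 50)

print(timeit.timeit(lambda: [few_keys[k] for k in few_lookups], number=10_000))
print(timeit.timeit(lambda: [many_keys[k] for k in many_lookups], number=10_000))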

This question and answer, "Python dict: the size affects timing?", come from Stack Overflow: https://stackoverflow.com/questions/35591668/
