
python - What is the unit in Python's lru_cache?

Reposted. Author: 行者123. Updated: 2023-12-03 20:25:29

According to the documentation, the default maxsize of lru_cache from functools is 128, but no unit is defined:

Decorator to wrap a function with a memoizing callable that saves up to the maxsize most recent calls. It can save time when an expensive or I/O bound function is periodically called with the same arguments.

Since a dictionary is used to cache results, the positional and keyword arguments to the function must be hashable.

Distinct argument patterns may be considered to be distinct calls with separate cache entries. For example, f(a=1, b=2) and f(b=2, a=1) differ in their keyword argument order and may have two separate cache entries.

If user_function is specified, it must be a callable. This allows the lru_cache decorator to be applied directly to a user function, leaving the maxsize at its default value of 128.
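As a quick check of the keyword-order behaviour mentioned in the quote above, here is a minimal sketch (my own illustration, not from the original question): in CPython the cache key preserves keyword order, so the same logical call spelled with reordered keywords can occupy two separate cache entries.

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def f(a, b):
    return a + b

f(a=1, b=2)
f(b=2, a=1)  # same logical call, different keyword order

# Both calls miss: the cache key is built from kwargs in insertion order.
print(f.cache_info())
```

On current CPython this reports two misses and a currsize of 2, matching the documentation's warning that such calls "may have two separate cache entries".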



My question is: is there a unit such as bits, bytes, or megabytes attached to this value, or is it an arbitrary number with no simple relationship to the memory used?

Best Answer

Short answer: it is the number of elements stored in the cache.
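To see that maxsize counts entries rather than bytes, consider this small sketch (my own illustration): two cached results of very different byte sizes each occupy exactly one slot.

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def make_blob(n):
    # The returned bytes object may be megabytes large,
    # but it still counts as exactly one cache entry.
    return b"x" * n

make_blob(10_000_000)   # ~10 MB result
make_blob(20_000_000)   # ~20 MB result
print(make_blob.cache_info())  # currsize is 2, regardless of byte size
```

So a cache holding two multi-megabyte blobs and a cache holding two small integers both report currsize == 2.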

We can check the source code of lru_cache [GitHub]. The code is fairly involved, but in short, line 619 already gives the clue:

    full = (cache_len() >= maxsize)


This marks the cache as full once cache_len() is greater than or equal to maxsize.
cache_len is a function that returns the number of records in the dictionary, as we can see in the source code:

    cache = {}
    hits = misses = 0
    full = False
    cache_get = cache.get      # bound method to lookup a key or return None
    cache_len = cache.__len__  # get cache size without calling len()


The logic also branches when it adds a new record: if the cache is full, it will "kick out" one of the elements:

    if key in cache:
        # Getting here means that this same key was added to the
        # cache while the lock was released. Since the link
        # update is already done, we need only return the
        # computed result and update the count of misses.
        pass
    elif full:
        # Use the old root to store the new key and result.
        oldroot = root
        oldroot[KEY] = key
        oldroot[RESULT] = result
        # Empty the oldest link and make it the new root.
        # Keep a reference to the old key and old result to
        # prevent their ref counts from going to zero during the
        # update. That will prevent potentially arbitrary object
        # clean-up code (i.e. __del__) from running while we're
        # still adjusting the links.
        root = oldroot[NEXT]
        oldkey = root[KEY]
        oldresult = root[RESULT]
        root[KEY] = root[RESULT] = None
        # Now update the cache dictionary.
        del cache[oldkey]
        # Save the potentially reentrant cache[key] assignment
        # for last, after the root and links have been put in
        # a consistent state.
        cache[key] = oldroot
    else:
        # Put result in a new link at the front of the queue.
        last = root[PREV]
        link = [last, root, key, result]
        last[NEXT] = root[PREV] = cache[key] = link
        # Use the cache_len bound method instead of the len() function
        # which could potentially be wrapped in an lru_cache itself.
        full = (cache_len() >= maxsize)
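The eviction behaviour above can be observed from the outside. The following sketch (my own illustration, not CPython source) shows the least recently used entry being recomputed after it is kicked out of a cache with maxsize=2:

```python
from functools import lru_cache

evaluations = []  # records every time the body actually runs

@lru_cache(maxsize=2)
def square(x):
    evaluations.append(x)
    return x * x

square(1)  # miss: cache {1}
square(2)  # miss: cache {1, 2} -> full
square(1)  # hit: 1 becomes most recently used
square(3)  # miss on a full cache: evicts 2, the least recently used
square(1)  # still cached, no recomputation
square(2)  # was evicted, so it is recomputed
print(evaluations)  # [1, 2, 3, 2]
```

Note that square(2) runs again at the end even though it was computed earlier: the cache only ever holds two entries, and 2 was the least recently used one when 3 arrived.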

Regarding "python - What is the unit in Python's lru_cache?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62183821/
