Memoizing decorator. Has the same API as functools.lru_cache() in Py3.2 but without the LRU feature, so it takes less memory, runs faster, and doesn't need locks to keep the dictionary in a consistent state.
from collections import namedtuple
from functools import wraps

_CacheInfo = namedtuple("CacheInfo", "hits misses maxsize currsize")

def cache():
    """Memoizing cache decorator.

    Arguments to the cached function must be hashable.
    View the cache statistics named tuple (hits, misses, maxsize, currsize)
    with f.cache_info().  Clear the cache and statistics with f.cache_clear().

    """
    def decorating_function(user_function,
                tuple=tuple, sorted=sorted, len=len, KeyError=KeyError):

        cache = dict()
        hits = misses = 0
        kwd_mark = object()             # separates positional and keyword args

        @wraps(user_function)
        def wrapper(*args, **kwds):
            nonlocal hits, misses
            key = args
            if kwds:
                key += (kwd_mark,) + tuple(sorted(kwds.items()))
            try:
                result = cache[key]
                hits += 1
            except KeyError:
                result = user_function(*args, **kwds)
                cache[key] = result
                misses += 1
            return result

        def cache_info():
            """Report cache statistics"""
            return _CacheInfo(hits, misses, None, len(cache))

        def cache_clear():
            """Clear the cache and cache statistics"""
            nonlocal hits, misses
            cache.clear()
            hits = misses = 0

        wrapper.cache_info = cache_info
        wrapper.cache_clear = cache_clear
        return wrapper

    return decorating_function


# ----- Example ----------------------------------------------------------------

if __name__ == '__main__':

    @cache()
    def fib(n):
        if n < 2:
            return 1
        return fib(n-1) + fib(n-2)

    from random import shuffle
    inputs = list(range(30))
    shuffle(inputs)
    results = sorted(fib(n) for n in inputs)
    print(results)
    print(fib.cache_info())

    expected_output = '''
[1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987, 1597, 2584,
 4181, 6765, 10946, 17711, 28657, 46368, 75025, 121393, 196418, 317811,
 514229, 832040]
CacheInfo(hits=56, misses=30, maxsize=None, currsize=30)
'''
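The key construction inside the wrapper is worth looking at in isolation: a unique sentinel object separates the positional arguments from the sorted keyword items, so that f(1, b=2) can never collide with f(1, 'b', 2). The sketch below pulls that logic out into a standalone helper (make_key is a name introduced here for illustration; it is not part of the recipe):

```python
# Sketch of the cache-key construction used by the recipe's wrapper.
kwd_mark = object()  # unique sentinel separating positional from keyword args

def make_key(args, kwds):
    key = args
    if kwds:
        # Sorting the items gives a canonical order, so f(a=1, b=2)
        # and f(b=2, a=1) produce the same key.
        key += (kwd_mark,) + tuple(sorted(kwds.items()))
    return key

print(make_key((1, 2), {}))              # (1, 2)
print(make_key((1,), {'b': 2, 'a': 1}))  # positional args, sentinel, then sorted kwarg items
```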
Fast, lightweight alternative to the LRU cache in Py3.2. Use the LRU version for long-running processes that need to free up memory. Use this whenever cumulative cache growth isn't an issue.
The @cache() syntax is used instead of @cache to keep the API as close as possible to the LRU cache.
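The parentheses matter because cache() is a decorator factory: calling it returns decorating_function, which is what actually wraps the user function. A stripped-down sketch of that two-layer pattern (positional-only, no statistics; the names and the function double are illustrative, not part of the recipe):

```python
# Minimal sketch of a factory-style decorator, so that @cache()
# (with parentheses) mirrors the @lru_cache(maxsize=...) call syntax.
def cache():
    def decorating_function(user_function):
        memo = {}
        def wrapper(*args):
            if args not in memo:            # compute each argument tuple once
                memo[args] = user_function(*args)
            return memo[args]
        return wrapper
    return decorating_function

@cache()            # the call returns the real decorator
def double(x):
    return 2 * x

print(double(21))   # 42
```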
Isn't that the same thing as functools.lru_cache(maxsize=None)?
Created by Raymond Hettinger on Wed, 1 Dec 2010 (MIT)