documenting 'cached' decorator

This commit is contained in:
Mahmoud Hashemi 2015-04-04 20:54:20 -07:00
parent 390a812b14
commit aec87c8fd1
2 changed files with 29 additions and 0 deletions

@@ -390,6 +390,21 @@ class CachedFunction(object):
def cached(cache, typed=False):
    """\
    Cache any function with the cache instance of your choosing. Note
    that the function wrapped should take only `hashable`_ arguments.

    Args:
        cache (Mapping): Any :class:`dict`-like object suitable for
            use as a cache. Instances of the :class:`LRU` and
            :class:`LRI` are good choices.
        typed (bool): Whether to factor argument types into the cache
            check. Defaults to ``False``; setting it to ``True`` causes
            the cache keys for ``3`` and ``3.0`` to be considered unequal.

    .. _hashable: https://docs.python.org/2/glossary.html#term-hashable
    """
    def cached_func_decorator(func):
        return CachedFunction(func, cache, typed=typed)
    return cached_func_decorator
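To illustrate the behavior documented above, here is a hedged, stdlib-only sketch of a decorator with the same shape; ``cached_sketch`` is a hypothetical stand-in, not boltons' actual :class:`CachedFunction`, which handles more edge cases.

```python
# Hypothetical, simplified stand-in for the cached() decorator described
# above; boltons' real CachedFunction does more, this only shows the idea.
import functools

def cached_sketch(cache, typed=False):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Build a hashable key from the call arguments.
            key = (args, tuple(sorted(kwargs.items())))
            if typed:
                # Factor types in, so 3 (int) and 3.0 (float) key separately.
                key += (tuple(type(a) for a in args),)
            try:
                return cache[key]
            except KeyError:
                ret = cache[key] = func(*args, **kwargs)
                return ret
        return wrapper
    return decorator

my_cache = {}
calls = []

@cached_sketch(my_cache, typed=True)
def square(x):
    calls.append(x)
    return x * x

square(3)
square(3)    # served from my_cache, no second call
square(3.0)  # typed=True: keyed separately from square(3)
```

With ``typed=False`` (the default), the last call would have been a cache hit, since ``3 == 3.0``.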

@@ -28,3 +28,17 @@ LRU has threadsafety built in.
.. autoclass:: boltons.cacheutils.LRU
   :members:

Automatic function caching
--------------------------

Continuing in the theme of cache tunability and experimentation,
``cacheutils`` also offers a way to pluggably cache function return
values: the :func:`cached` function decorator.

.. autofunction:: boltons.cacheutils.cached

Similar functionality can be found in Python 3.4's :mod:`functools`
module, though the :mod:`functools` approach is not built for cache
pluggability and does not support sharing the cache object across
multiple functions.
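To make the sharing point concrete, here is a hedged sketch in which two functions draw on one mapping; a plain dict stands in for an :class:`LRU`, and ``cached_sketch`` is an illustrative stand-in, not boltons' actual decorator.

```python
# Hypothetical sketch of sharing one cache mapping across two functions.
# A plain dict stands in for an LRU; cached_sketch is illustrative only.
def cached_sketch(cache):
    def decorator(func):
        def wrapper(*args):
            # Namespace keys by function name so entries don't collide.
            key = (func.__name__,) + args
            if key not in cache:
                cache[key] = func(*args)
            return cache[key]
        return wrapper
    return decorator

shared_cache = {}

@cached_sketch(shared_cache)
def double(x):
    return x * 2

@cached_sketch(shared_cache)
def triple(x):
    return x * 3

double(2)
triple(2)
# shared_cache now holds one entry per function, in a single mapping
```

Because both decorated functions feed the same mapping, a single eviction policy (e.g. one LRU budget) can govern them together, which per-function caches like :func:`functools.lru_cache` cannot do.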