functools — Higher-order functions and operations on callable objects

Source code: Lib/functools.py

The functools module is for higher-order functions: functions that act on or return other functions. In general, any callable object can be treated as a function for the purposes of this module.

The functools module defines the following functions:

@functools.cache(user_function)

Simple lightweight unbounded function cache. Sometimes called “memoize”.

Returns the same as lru_cache(maxsize=None), creating a thin wrapper around a dictionary lookup for the function arguments. Because it never needs to evict old values, this is smaller and faster than lru_cache() with a size limit.

For example:

@cache
def factorial(n):
    return n * factorial(n-1) if n else 1

>>> factorial(10)      # no previously cached result, makes 11 recursive calls
3628800
>>> factorial(5)       # just looks up cached value result
120
>>> factorial(12)      # makes two new recursive calls, the other 10 are cached
479001600

The cache is threadsafe so that the wrapped function can be used in multiple threads. This means that the underlying data structure will remain coherent during concurrent updates.

It is possible for the wrapped function to be called more than once if another thread makes an additional call before the initial call has been completed and cached.

New in version 3.9.

@functools.cached_property(func)

Transform a method of a class into a property whose value is computed once and then cached as a normal attribute for the life of the instance. Similar to property(), with the addition of caching. Useful for expensive computed properties of instances that are otherwise effectively immutable.

Example:

class DataSet:

    def __init__(self, sequence_of_numbers):
        self._data = tuple(sequence_of_numbers)

    @cached_property
    def stdev(self):
        return statistics.stdev(self._data)

The mechanics of cached_property() are somewhat different from property(). A regular property blocks attribute writes unless a setter is defined. In contrast, a cached_property allows writes.

The cached_property decorator only runs on lookups and only when an attribute of the same name doesn’t exist. When it does run, the cached_property writes to the attribute with the same name. Subsequent attribute reads and writes take precedence over the cached_property method and it works like a normal attribute.

The cached value can be cleared by deleting the attribute. This allows the cached_property method to run again.
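
A minimal sketch, reusing the DataSet class from the example above, showing that writes take precedence over the getter and that deleting the attribute clears the cache:

import statistics
from functools import cached_property

class DataSet:
    def __init__(self, sequence_of_numbers):
        self._data = tuple(sequence_of_numbers)

    @cached_property
    def stdev(self):
        return statistics.stdev(self._data)

ds = DataSet([1, 2, 3, 4, 5])
print(ds.stdev)     # first access runs the getter and caches the result
ds.stdev = 0.0      # writes are allowed and take precedence over the getter
print(ds.stdev)     # 0.0
del ds.stdev        # clears the cached value
print(ds.stdev)     # the getter runs again and re-caches the result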

The cached_property does not prevent a possible race condition in multi-threaded usage. The getter function could run more than once on the same instance, with the latest run setting the cached value. If the cached property is idempotent or otherwise not harmful to run more than once on an instance, this is fine. If synchronization is needed, implement the necessary locking inside the decorated getter function or around the cached property access.

Note, this decorator interferes with the operation of PEP 412 key-sharing dictionaries. This means that instance dictionaries can take more space than usual.

Also, this decorator requires that the __dict__ attribute on each instance be a mutable mapping. This means it will not work with some types, such as metaclasses (since the __dict__ attributes on type instances are read-only proxies for the class namespace), and those that specify __slots__ without including __dict__ as one of the defined slots (as such classes don’t provide a __dict__ attribute at all).

If a mutable mapping is not available or if space-efficient key sharing is desired, an effect similar to cached_property() can also be achieved by stacking property() on top of lru_cache(). See How do I cache method calls? for more details on how this differs from cached_property().
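
A minimal sketch of the stacked approach, again reusing the DataSet class from above. Note that, unlike cached_property(), the lru_cache() here keeps a reference to each instance until the cache is cleared:

import statistics
from functools import lru_cache

class DataSet:
    def __init__(self, sequence_of_numbers):
        self._data = tuple(sequence_of_numbers)

    @property                   # blocks writes, like a regular read-only property
    @lru_cache(maxsize=None)    # caches one result per instance, keyed on self
    def stdev(self):
        return statistics.stdev(self._data)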

New in version 3.8.

Changed in version 3.12: Prior to Python 3.12, cached_property included an undocumented lock to ensure that in multi-threaded usage the getter function was guaranteed to run only once per instance. However, the lock was per-property, not per-instance, which could result in unacceptably high lock contention. In Python 3.12+ this locking is removed.

functools.cmp_to_key(func)

Transform an old-style comparison function to a key function. Used with tools that accept key functions (such as sorted(), min(), max(), heapq.nlargest(), heapq.nsmallest(), itertools.groupby()). This function is primarily used as a transition tool for programs being converted from Python 2 which supported the use of comparison functions.

A comparison function is any callable that accepts two arguments, compares them, and returns a negative number for less-than, zero for equality, or a positive number for greater-than. A key function is a callable that accepts one argument and returns another value to be used as the sort key.

Example:

sorted(iterable, key=cmp_to_key(locale.strcoll)) # locale-aware sort order
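
A hand-written comparison function (a hypothetical one, for illustration) can be converted the same way:

from functools import cmp_to_key

def compare_lengths(a, b):
    # old-style comparison: negative, zero, or positive result
    return len(a) - len(b)

words = ['pear', 'fig', 'banana', 'kiwi']
print(sorted(words, key=cmp_to_key(compare_lengths)))
# ['fig', 'pear', 'kiwi', 'banana']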

For sorting examples and a brief sorting tutorial, see Sorting Techniques.

New in version 3.2.

@functools.lru_cache(user_function)
@functools.lru_cache(maxsize=128, typed=False)

Decorator to wrap a function with a memoizing callable that saves up to the maxsize most recent calls. It can save time when an expensive or I/O bound function is periodically called with the same arguments.

The cache is threadsafe so that the wrapped function can be used in multiple threads. This means that the underlying data structure will remain coherent during concurrent updates.

It is possible for the wrapped function to be called more than once if another thread makes an additional call before the initial call has been completed and cached.

Since a dictionary is used to cache results, the positional and keyword arguments to the function must be hashable.

Distinct argument patterns may be considered to be distinct calls with separate cache entries. For example, f(a=1, b=2) and f(b=2, a=1) differ in their keyword argument order and may have two separate cache entries.
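
A minimal sketch, using a hypothetical function f, showing the effect on the cache statistics:

from functools import lru_cache

@lru_cache
def f(a, b):
    return a + b

f(a=1, b=2)
f(b=2, a=1)              # same logical call, but different keyword order
print(f.cache_info())    # typically reports two misses and no hits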

If user_function is specified, it must be a callable. This allows the lru_cache decorator to be applied directly to a user function, leaving the maxsize at its default value of 128:

@lru_cache
def count_vowels(sentence):
    return sum(sentence.count(vowel) for vowel in 'AEIOUaeiou')

If maxsize is set to None, the LRU feature is disabled and the cache can grow without bound.

If typed is set to true, function arguments of different types will be cached separately. If typed is false, the implementation will usually regard them as equivalent calls and only cache a single result. (Some types such as str and int may be cached separately even when typed is false.)

Note, type specificity applies only to the function’s immediate arguments rather than their contents. The scalar arguments, Decimal(42) and Fraction(42), are treated as distinct calls with distinct results. In contrast, the tuple arguments ('answer', Decimal(42)) and ('answer', Fraction(42)) are treated as equivalent.
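
A short sketch, with a hypothetical function, of how typed=True separates calls by argument type:

from functools import lru_cache

@lru_cache(maxsize=None, typed=True)
def double(x):
    return x * 2

double(3)                     # cached under an int key
double(3.0)                   # cached separately under a float key
print(double.cache_info())    # expect two misses and currsize == 2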

The wrapped function is instrumented with a cache_parameters() function that returns a new dict showing the values for maxsize and typed. This is for information purposes only. Mutating the values has no effect.

To help measure the effectiveness of the cache and tune the maxsize parameter, the wrapped function is instrumented with a cache_info() function that returns a named tuple showing hits, misses, maxsize and currsize.

The decorator also provides a cache_clear() function for clearing or invalidating the cache.

The original underlying function is accessible through the __wrapped__ attribute. This is useful for introspection, for bypassing the cache, or for rewrapping the function with a different cache.
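
For example, with the count_vowels function shown above (a minimal sketch):

count_vowels('aeiou')                # goes through the cache
count_vowels.__wrapped__('aeiou')    # calls the original function, bypassing the cache
count_vowels.cache_clear()           # discards all cached entries and statistics
print(count_vowels.cache_info())     # hits, misses and currsize are back to zero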

The cache keeps references to the arguments and return values until they age out of the cache or until the cache is cleared.

If a method is cached, the self instance argument is included in the cache. See How do I cache method calls?
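
A brief sketch, with a hypothetical class, showing that each instance gets its own cache entries because self is part of the key; note that this also keeps the instances alive for as long as they are cached:

from functools import lru_cache

class Squarer:
    @lru_cache(maxsize=None)
    def square(self, n):        # the cache key includes both self and n
        return n * n

a, b = Squarer(), Squarer()
a.square(3)
b.square(3)                             # a separate cache entry: different self
print(Squarer.square.cache_info())      # two misses, because self differs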

An LRU (least recently used) cache works best when the most recent calls are the best predictors of upcoming calls (for example, the most popular articles on a news server tend to change each day). The cache’s size limit assures that the cache does not grow without bound on long-running processes such as web servers.

In general, the LRU cache should only be used when you want to reuse previously computed values. Accordingly, it doesn’t make sense to cache functions with side-effects, functions that need to create distinct mutable objects on each call (such as generators and async functions), or impure functions such as time() or random().

Example of an LRU cache for static web content:

@lru_cache(maxsize=32)
def get_pep(num):
    'Retrieve text of a Python Enhancement Proposal'
    resource = f'https://peps.python.org/pep-{num:04d}'
    try:
        with urllib.request.urlopen(resource) as s:
            return s.read()
    except urllib.error.HTTPError:
        return 'Not Found'

>>> for n in 8, 290, 308, 320, 8, 218, 320, 279, 289, 320, 9991:
...     pep = get_pep(n)
...     print(n, len(pep))

>>> get_pep.cache_info()
CacheInfo(hits=3, misses=8, maxsize=32, currsize=8)

Example of efficiently computing Fibonacci numbers using a cache to implement a dynamic programming technique:

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

>>> [fib(n) for n in range(16)]
[0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610]

>>> fib.cache_info()
CacheInfo(hits=28, misses=16, maxsize=None, currsize=16)

New in version 3.2.

Changed in version 3.3: Added the typed option.

Changed in version 3.8: Added the user_function option.

Changed in version 3.9: Added the function cache_parameters()

@functools.total_ordering

Given a class defining one or more rich comparison ordering methods, this class decorator supplies the rest. This simplifies the effort involved in specifying all of the possible rich comparison operations:

The class must define one of __lt__(), __le__(), __gt__(), or __ge__(). In addition, the class should supply an __eq__() method.

For example:

@total_ordering
class Student:
    def _is_valid_operand(self, other):
        return (hasattr(other, "lastname") and
                hasattr(other, "firstname"))
    def __eq__(self, other):
        if not self._is_valid_operand(other):
            return NotImplemented
        return ((self.lastname.lower(), self.firstname.lower()) ==
                (other.lastname.lower(), other.firstname.lower()))
    def __lt__(self, other):
        if not self._is_valid_operand(other):
            return NotImplemented
        return ((self.lastname.lower(), self.firstname.lower()) <
                (other.lastname.lower(), other.firstname.lower()))

Note

While this decorator makes it easy to create well behaved totally ordered types, it does come at the cost of slower execution and more complex stack traces for the derived comparison methods. If performance benchmarking indicates this is a bottleneck for a given application, implementing all six rich comparison methods instead is likely to provide an easy speed boost.

Note

This decorator makes no attempt to override methods that have been declared in the class or its superclasses. Meaning that if a superclass defines a comparison operator, total_ordering will not implement it again, even if the original method is abstract.

New in version 3.2.

Changed in version 3.4: Returning NotImplemented from the underlying comparison function for unrecognised types is now supported.

functools.partial(func, /, *args, **keywords)

Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords. If more arguments are supplied to the call, they are appended to args. If additional keyword arguments are supplied, they extend and override keywords. Roughly equivalent to:

def partial(func, /, *args, **keywords):
    def newfunc(*fargs, **fkeywords):
        newkeywords = {**keywords, **fkeywords}
        return func(*args, *fargs, **newkeywords)
    newfunc.func = func
    newfunc.args = args
    newfunc.keywords = keywords
    return newfunc

partial() is used for partial function application which “freezes” some portion of a function’s arguments and/or keywords resulting in a new object with a simplified signature. For example, partial() can be used to create a callable that behaves like the int() function where the base argument defaults to two:

>>> from functools import partial
>>> basetwo = partial(int, base=2)
>>> basetwo.__doc__ = 'Convert base 2 string to an int.'
>>> basetwo('10010')
18

class functools.partialmethod(func, /, *args, **keywords)

Return a new partialmethod descriptor which behaves like partial except that it is designed to be used as a method definition rather than being directly callable.

func must be a descriptor or a callable (objects which are both, like normal functions, are handled as descriptors).

When func is a descriptor (such as a normal Python function, classmethod(), staticmethod(), abstractmethod() or another instance of partialmethod), calls to __get__ are delegated to the underlying descriptor, and an appropriate partial object returned as the result.

When func is a non-descriptor callable, an appropriate bound method is created dynamically. This behaves like a normal Python function when used as a method: the self argument will be inserted as the first positional argument, even before the args and keywords supplied to the partialmethod constructor.

Example:

>>> class Cell:
...     def __init__(self):
...         self._alive = False
...     @property
...     def alive(self):
...         return self._alive
...     def set_state(self, state):
...         self._alive = bool(state)
...     set_alive = partialmethod(set_state, True)
...     set_dead = partialmethod(set_state, False)
...
>>> c = Cell()
>>> c.alive
False
>>> c.set_alive()
>>> c.alive
True

New in version 3.4.

functools.reduce(function, iterable[, initializer])

Apply function of two arguments cumulatively to the items of iterable, from left to right, so as to reduce the iterable to a single value. For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates ((((1+2)+3)+4)+5). The left argument, x, is the accumulated value and the right argument, y, is the update value from the iterable. If the optional initializer is present, it is placed before the items of the iterable in the calculation, and serves as a default when the iterable is empty. If initializer is not given and iterable contains only one item, the first item is returned.

Roughly equivalent to:

def reduce(function, iterable, initializer=None):
    it = iter(iterable)
    if initializer is None:
        value = next(it)
    else:
        value = initializer
    for element in it:
        value = function(value, element)
    return value
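
For example, a short sketch using operator.add to show the role of initializer:

from functools import reduce
import operator

print(reduce(operator.add, [1, 2, 3, 4, 5]))        # 15
print(reduce(operator.add, [1, 2, 3, 4, 5], 100))   # 115: initializer is placed first
print(reduce(operator.add, [], 100))                # 100: default for an empty iterable
print(reduce(operator.add, [7]))                    # 7: a single item is returned as-is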

See itertools.accumulate() for an iterator that yields all intermediate values.

@functools.singledispatch

Transform a function into a single-dispatch generic function.

To define a generic function, decorate it with the @singledispatch decorator. When defining a function using @singledispatch, note that the dispatch happens on the type of the first argument:

>>> from functools import singledispatch
>>> @singledispatch
... def fun(arg, verbose=False):
...     if verbose:
...         print("Let me just say,", end=" ")
...     print(arg)

To add overloaded implementations to the function, use the register() attribute of the generic function, which can be used as a decorator. For functions annotated with types, the decorator will infer the type of the first argument automatically:

>>> @fun.register
... def _(arg: int, verbose=False):
...     if verbose:
...         print("Strength in numbers, eh?", end=" ")
...     print(arg)
...
>>> @fun.register
... def _(arg: list, verbose=False):
...     if verbose:
...         print("Enumerate this:")
...     for i, elem in enumerate(arg):
...         print(i, elem)

types.UnionType and typing.Union can also be used:

>>> @fun.register
... def _(arg: int | float, verbose=False):
...     if verbose:
...         print("Strength in numbers, eh?", end=" ")
...     print(arg)
...
>>> from typing import Union
>>> @fun.register
... def _(arg: Union[list, set], verbose=False):
...     if verbose:
...         print("Enumerate this:")
...     for i, elem in enumerate(arg):
...         print(i, elem)
...

For code which doesn’t use type annotations, the appropriate type argument can be passed explicitly to the decorator itself:

>>> @fun.register(complex)
... def _(arg, verbose=False):
...     if verbose:
...         print("Better than complicated.", end=" ")
...     print(arg.real, arg.imag)
...

To enable registering lambdas and pre-existing functions, the register() attribute can also be used in a functional form:

>>> def nothing(arg, verbose=False):
...     print("Nothing.")
...
>>> fun.register(type(None), nothing)

The register() attribute returns the undecorated function. This enables decorator stacking, pickling, and the creation of unit tests for each variant independently:

>>> @fun.register(float)
... @fun.register(Decimal)
... def fun_num(arg, verbose=False):
...     if verbose:
...         print("Half of your number:", end=" ")
...     print(arg / 2)
...
>>> fun_num is fun
False

When called, the generic function dispatches on the type of the first argument:

>>> fun("Hello, world.")Hello, world.>>> fun("test.", verbose=True)Let me just say, test.>>> fun(42, verbose=True)Strength in numbers, eh? 42>>> fun(['spam', 'spam', 'eggs', 'spam'], verbose=True)Enumerate this:0 spam1 spam2 eggs3 spam>>> fun(None)Nothing.>>> fun(1.23)0.615

Where there is no registered implementation for a specific type, its method resolution order is used to find a more generic implementation. The original function decorated with @singledispatch is registered for the base object type, which means it is used if no better implementation is found.

If an implementation is registered to an abstract base class, virtual subclasses of the base class will be dispatched to that implementation:

>>> from collections.abc import Mapping
>>> @fun.register
... def _(arg: Mapping, verbose=False):
...     if verbose:
...         print("Keys & Values")
...     for key, value in arg.items():
...         print(key, "=>", value)
...
>>> fun({"a": "b"})
a => b

To check which implementation the generic function will choose for a given type, use the dispatch() attribute:

>>> fun.dispatch(float)
<function fun_num at 0x1035a2840>
>>> fun.dispatch(dict)    # note: default implementation
<function fun at 0x103fe0000>

To access all registered implementations, use the read-only registry attribute:

>>> fun.registry.keys()
dict_keys([<class 'NoneType'>, <class 'int'>, <class 'object'>,
           <class 'decimal.Decimal'>, <class 'list'>,
           <class 'float'>])
>>> fun.registry[float]
<function fun_num at 0x1035a2840>
>>> fun.registry[object]
<function fun at 0x103fe0000>

New in version 3.4.

Changed in version 3.7: The register() attribute now supports using type annotations.

Changed in version 3.11: The register() attribute now supports types.UnionType and typing.Union as type annotations.

class functools.singledispatchmethod(func)

Transform a method into a single-dispatch generic function.

To define a generic method, decorate it with the @singledispatchmethod decorator. When defining a function using @singledispatchmethod, note that the dispatch happens on the type of the first non-self or non-cls argument:

class Negator:
    @singledispatchmethod
    def neg(self, arg):
        raise NotImplementedError("Cannot negate a")

    @neg.register
    def _(self, arg: int):
        return -arg

    @neg.register
    def _(self, arg: bool):
        return not arg

@singledispatchmethod supports nesting with other decorators such as @classmethod. Note that to allow for dispatcher.register, singledispatchmethod must be the outermost decorator. Here is the Negator class with the neg methods bound to the class, rather than an instance of the class:

class Negator:
    @singledispatchmethod
    @classmethod
    def neg(cls, arg):
        raise NotImplementedError("Cannot negate a")

    @neg.register
    @classmethod
    def _(cls, arg: int):
        return -arg

    @neg.register
    @classmethod
    def _(cls, arg: bool):
        return not arg

The same pattern can be used for other similar decorators: @staticmethod, @abstractmethod, and others.

New in version 3.8.

functools.update_wrapper(wrapper, wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES)

Update a wrapper function to look like the wrapped function. The optional arguments are tuples to specify which attributes of the original function are assigned directly to the matching attributes on the wrapper function and which attributes of the wrapper function are updated with the corresponding attributes from the original function. The default values for these arguments are the module level constants WRAPPER_ASSIGNMENTS (which assigns to the wrapper function’s __module__, __name__, __qualname__, __annotations__ and __doc__, the documentation string) and WRAPPER_UPDATES (which updates the wrapper function’s __dict__, i.e. the instance dictionary).

To allow access to the original function for introspection and other purposes (e.g. bypassing a caching decorator such as lru_cache()), this function automatically adds a __wrapped__ attribute to the wrapper that refers to the function being wrapped.

The main intended use for this function is in decorator functions which wrap the decorated function and return the wrapper. If the wrapper function is not updated, the metadata of the returned function will reflect the wrapper definition rather than the original function definition, which is typically less than helpful.
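
As a sketch, the same kind of decorator shown in the wraps() example below can be written with a direct call to update_wrapper():

from functools import update_wrapper

def my_decorator(f):
    def wrapper(*args, **kwds):
        print('Calling decorated function')
        return f(*args, **kwds)
    return update_wrapper(wrapper, f)   # copies metadata and sets wrapper.__wrapped__

@my_decorator
def example():
    """Docstring"""
    print('Called example function')

print(example.__name__)      # 'example'
print(example.__wrapped__)   # the original, undecorated function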

update_wrapper() may be used with callables other than functions. Any attributes named in assigned or updated that are missing from the object being wrapped are ignored (i.e. this function will not attempt to set them on the wrapper function). AttributeError is still raised if the wrapper function itself is missing any attributes named in updated.

Changed in version 3.2: The __wrapped__ attribute is now automatically added. The __annotations__ attribute is now copied by default. Missing attributes no longer trigger an AttributeError.

Changed in version 3.4: The __wrapped__ attribute now always refers to the wrapped function, even if that function defined a __wrapped__ attribute. (see bpo-17482)

@functools.wraps(wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES)

This is a convenience function for invoking update_wrapper() as a function decorator when defining a wrapper function. It is equivalent to partial(update_wrapper, wrapped=wrapped, assigned=assigned, updated=updated). For example:

>>> from functools import wraps
>>> def my_decorator(f):
...     @wraps(f)
...     def wrapper(*args, **kwds):
...         print('Calling decorated function')
...         return f(*args, **kwds)
...     return wrapper
...
>>> @my_decorator
... def example():
...     """Docstring"""
...     print('Called example function')
...
>>> example()
Calling decorated function
Called example function
>>> example.__name__
'example'
>>> example.__doc__
'Docstring'

Without the use of this decorator factory, the name of the example function would have been 'wrapper', and the docstring of the original example() would have been lost.

partial Objects

partial objects are callable objects created by partial(). They have three read-only attributes:

partial.func

A callable object or function. Calls to the partial object will be forwarded to func with new arguments and keywords.

partial.args

The leftmost positional arguments that will be prepended to the positional arguments provided to a partial object call.

partial.keywords

The keyword arguments that will be supplied when the partial object is called.

partial objects are like function objects in that they are callable, weak referenceable, and can have attributes. There are some important differences. For instance, the __name__ and __doc__ attributes are not created automatically. Also, partial objects defined in classes behave like static methods and do not transform into bound methods during instance attribute look-up.
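
A brief sketch, with hypothetical names, contrasting a partial stored in a class body with a partialmethod:

from functools import partial, partialmethod

def greet(obj, name):
    return f'{obj.prefix} {name}'

class Greeter:
    prefix = 'Hello'
    greet_world = partial(greet, name='world')          # not a descriptor: no self binding
    greet_world_m = partialmethod(greet, name='world')  # binds self on attribute lookup

g = Greeter()
print(g.greet_world_m())    # 'Hello world' -- self is inserted automatically
print(g.greet_world(g))     # the partial stays unbound, so the instance is passed explicitly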
