partial
Used to create a partial function: it wraps a callable together with some fixed default arguments into a new object, and that result is itself callable.
A partial function fixes some of the arguments of the original function, which makes it simpler to call.
from functools import partial

int2 = partial(int, base=8)
print(int2('123'))  # 83
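partial can fix positional arguments as well as keyword arguments. A minimal sketch (the power function here is a made-up example, not part of functools):

from functools import partial

def power(base, exponent):
    return base ** exponent

square = partial(power, exponent=2)   # fix the keyword argument exponent
double_base = partial(power, 2)       # fix the first positional argument base
print(square(5))        # 25
print(double_base(10))  # 1024  (2 ** 10)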
update_wrapper
Callables created with partial do not carry the original function's __name__ and __doc__ attributes.
The update_wrapper function copies __name__, __doc__, and other attributes from the wrapped function onto the new one.
from functools import update_wrapper

def wrap2(func):
    def inner(*args):
        return func(*args)
    return update_wrapper(inner, func)

@wrap2
def demo():
    print('hello world')

print(demo.__name__)  # demo
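update_wrapper can also be applied directly to a partial object to give it the original function's metadata. A minimal sketch (power is again a made-up example):

from functools import partial, update_wrapper

def power(base, exponent):
    """Raise base to exponent."""
    return base ** exponent

square = partial(power, exponent=2)
update_wrapper(square, power)  # copy __name__, __doc__, etc. from power onto square
print(square.__name__)  # power
print(square.__doc__)   # Raise base to exponent.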
wraps
The wraps function copies the __name__ (and other attributes) of the decorated function onto the wrapper defined inside a decorator.
It is just a thin wrapper around update_wrapper.
from functools import wraps

def wrap1(func):
    @wraps(func)  # remove this line and demo.__name__ would be 'inner'
    def inner(*args):
        print(func.__name__)
        return func(*args)
    return inner

@wrap1
def demo():
    print('hello world')

print(demo.__name__)  # demo
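wraps (via update_wrapper) also stores the original function on the wrapper's __wrapped__ attribute, so the undecorated function stays reachable. Continuing from the demo example above:

print(demo.__wrapped__)  # the original, undecorated demo function
demo.__wrapped__()       # hello world (bypasses inner)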
reduce
In Python 2, reduce was a built-in function; in Python 3 it has to be imported from functools.
Its role is to reduce a sequence to a single output value by repeatedly applying a two-argument function.
reduce(function, sequence[, initial])

from functools import reduce

l = range(1, 50)
print(reduce(lambda x, y: x + y, l))  # 1225
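The optional third argument seeds the accumulator before the first element is consumed. A minimal sketch:

from functools import reduce

# Start the summation at 100 instead of at the first element
print(reduce(lambda x, y: x + y, range(1, 50), 100))  # 1325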
cmp_to_key
Both list.sort() and the built-in sorted() function accept a key argument.
x = ['hello', 'worl', 'ni']
x.sort(key=len)
print(x)  # ['ni', 'worl', 'hello']
Python 2 used to provide a cmp parameter that compared two elements directly; Python 3 removed it.
The cmp_to_key function converts such an old-style comparison function into a key function, as shown in the sketch below.
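A minimal sketch of cmp_to_key (compare_length is a made-up old-style comparison function that returns a negative, zero, or positive number):

from functools import cmp_to_key

def compare_length(a, b):
    return len(a) - len(b)

x = ['hello', 'worl', 'ni']
print(sorted(x, key=cmp_to_key(compare_length)))  # ['ni', 'worl', 'hello']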
lru_cache
Lets us cache a function's return values, and clear that cache, on the fly.
The decorator caches the result of a function call: for functions that are called repeatedly with the same arguments, the cached result is reused instead of recomputed, speeding up the program.
This decorator caches the results of different calls in memory, so be aware of the memory footprint.
from functools import lru_cache

@lru_cache(maxsize=30)  # The maxsize parameter tells lru_cache how many recent results to cache
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

print([fib(n) for n in range(10)])
fib.cache_clear()  # Empty the cache
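Besides cache_clear(), the decorated function exposes cache_info() for inspecting hit/miss statistics. A minimal sketch (the exact numbers depend on the calls made):

from functools import lru_cache

@lru_cache(maxsize=30)
def fib(n):
    return n if n < 2 else fib(n-1) + fib(n-2)

[fib(n) for n in range(10)]
print(fib.cache_info())  # e.g. CacheInfo(hits=16, misses=10, maxsize=30, currsize=10)
fib.cache_clear()
print(fib.cache_info())  # CacheInfo(hits=0, misses=0, maxsize=30, currsize=0)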
singledispatch
Single dispatch, new in Python 3.4, for implementing generic functions.
It decides which implementation to call based on the type of a single argument (the first one).
from functools import singledispatch

@singledispatch
def fun(text):
    print('String:' + text)

@fun.register(int)
def _(text):
    print(text)

@fun.register(list)
def _(text):
    for k, v in enumerate(text):
        print(k, v)

@fun.register(float)
@fun.register(tuple)
def _(text):
    print('float, tuple')

fun('i am is hubo')
fun(123)
fun(['a', 'b', 'c'])
fun(1.23)

print(fun.registry)       # All registered implementations
print(fun.registry[int])  # Get the implementation registered for int

# String:i am is hubo
# 123
# 0 a
# 1 b
# 2 c
# float, tuple
# {<class 'object'>: <function fun at 0x106d10f28>, <class 'int'>: <function _ at 0x106f0b9d8>, <class 'list'>: <function _ at 0x106f0ba60>, <class 'tuple'>: <function _ at 0x106f0bb70>, <class 'float'>: <function _ at 0x106f0bb70>}
# <function _ at 0x106f0b9d8>
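Since Python 3.7, register() can also infer the dispatch type from the first parameter's type annotation. A minimal sketch, assuming Python 3.7 or later (fun2 is a made-up name to keep it separate from the example above):

from functools import singledispatch

@singledispatch
def fun2(arg):
    print('default:', arg)

@fun2.register            # dispatch type taken from the annotation on arg
def _(arg: int):
    print('int:', arg)

fun2(5)      # int: 5
fun2('abc')  # default: abc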