Ever felt like your Python code could be more elegant or efficient? Decorators might be the game-changer you’re looking for. Think of decorators as special modifiers that wrap around your functions, adding functionality with minimal effort.
These powerful tools transform how your functions and classes behave without altering their core code. Python ships with several built-in decorators that can improve your code quality, readability, and performance.
In this article, we’ll look at some of Python’s most practical built-in decorators that you can use every day as a developer—for optimizing performance, creating cleaner APIs, reducing boilerplate code, and much more. Most of these decorators are part of Python’s built-in functools module.
▶️ You can find all the code on GitHub.
1. @property – Clean Attribute Access
The @property decorator transforms methods into attributes, allowing you to add validation logic while maintaining a clean interface.
Here we create a Temperature class with celsius and fahrenheit properties that automatically handle conversion between units. When you set the temperature, it performs validation to prevent physically impossible values (below absolute zero).
class Temperature:
    def __init__(self, celsius=0):
        self._celsius = celsius

    @property
    def celsius(self):
        return self._celsius

    @celsius.setter
    def celsius(self, value):
        if value < -273.15:
            raise ValueError("Temperature below absolute zero is not possible")
        self._celsius = value

    @property
    def fahrenheit(self):
        return (self._celsius * 9/5) + 32

    @fahrenheit.setter
    def fahrenheit(self, value):
        self.celsius = (value - 32) * 5/9
The getters and setters work seamlessly: accessing temp.celsius actually calls a method, but the interface feels like a regular attribute.
temp = Temperature()
temp.celsius = 25  # Clean attribute-like access with validation
print(f"{temp.celsius}°C = {temp.fahrenheit}°F")
Output:
25°C = 77.0°F
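A property defined without a setter gives you a read-only computed attribute. Here’s a minimal sketch (the Circle class is a hypothetical example, not from the code above):

```python
import math

class Circle:
    def __init__(self, radius):
        self._radius = radius

    @property
    def area(self):
        # Recomputed on each access; no setter is defined, so it is read-only
        return math.pi * self._radius ** 2

c = Circle(2)
print(round(c.area, 2))  # → 12.57
# c.area = 10 would raise AttributeError, since no setter exists
```

Attempting to assign to such a property raises AttributeError, which makes accidental mutation impossible.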
2. @functools.cached_property – Lazy Computed Properties
The @cached_property decorator from the functools module combines @property with caching, computing a value only once and then storing it until the instance is deleted.
This example shows how @cached_property can improve performance for expensive computations.
from functools import cached_property
import time

class DataAnalyzer:
    def __init__(self, dataset):
        self.dataset = dataset

    @cached_property
    def complex_analysis(self):
        print("Running expensive analysis...")
        time.sleep(2)  # Simulating heavy computation
        return sum(x**2 for x in self.dataset)
The first time you access complex_analysis, it performs the calculation and caches the result. All subsequent accesses return the cached value instantly without recalculating.
analyzer = DataAnalyzer(range(1000000))

print("First access:")
t1 = time.time()
result1 = analyzer.complex_analysis
t2 = time.time()
print(f"Result: {result1}, Time: {t2 - t1:.2f}s")

print("\nSecond access:")
t1 = time.time()
result2 = analyzer.complex_analysis
t2 = time.time()
print(f"Result: {result2}, Time: {t2 - t1:.2f}s")
For the example we’ve taken, you’ll see the following output:
First access:
Running expensive analysis...
Result: 333332833333500000, Time: 2.17s
Second access:
Result: 333332833333500000, Time: 0.00s
This makes it suitable for data analysis pipelines where the same computation might be referenced multiple times.
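One detail worth knowing: the cached value is stored in the instance’s __dict__, so you can invalidate it with del when the underlying data changes. A minimal sketch (the Stats class is a made-up example):

```python
from functools import cached_property

class Stats:
    def __init__(self, values):
        self.values = values

    @cached_property
    def total(self):
        # Computed once, then served from the instance's __dict__
        return sum(self.values)

s = Stats([1, 2, 3])
print(s.total)   # computes and caches 6
s.values.append(4)
print(s.total)   # still 6 -- the cached value is stale
del s.total      # drop the cached value
print(s.total)   # recomputes: 10
```

This manual invalidation is the trade-off for @cached_property’s speed: the cache has no idea when your data changes.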
3. @functools.lru_cache – Memoization
The @lru_cache decorator caches function results based on their arguments, which speeds up expensive calculations.
Here’s a recursive Fibonacci calculator that would normally be exponentially inefficient. The @lru_cache decorator makes it fast by storing previously calculated values.
from functools import lru_cache

@lru_cache(maxsize=128)
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)
Whenever the function is called with an argument it has seen before, it returns the cached result instantly.
import time

start = time.time()
result = fibonacci(35)
end = time.time()

print(f"Fibonacci(35) = {result}, calculated in {end - start:.6f} seconds")

# Check cache statistics
print(f"Cache info: {fibonacci.cache_info()}")
The cache statistics show how many calls were avoided, demonstrating enormous performance gains for recursive algorithms.
Output:
Fibonacci(35) = 9227465, calculated in 0.000075 seconds
Cache info: CacheInfo(hits=33, misses=36, maxsize=128, currsize=36)
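If you want an unbounded cache, pass maxsize=None (on Python 3.9+, @functools.cache is a shorthand for this), and cache_clear() empties the cache and resets the counters. A quick sketch with a trivial made-up function:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache; same as functools.cache on 3.9+
def square(n):
    return n * n

square(4)
square(4)  # second call is served from the cache
info = square.cache_info()
print(info.hits, info.misses)        # → 1 1
square.cache_clear()                 # empty the cache, reset the counters
print(square.cache_info().currsize)  # → 0
```

Remember that cached arguments must be hashable, so lists and dicts can’t be passed to an lru_cache-decorated function.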
4. @contextlib.contextmanager – Custom Context Managers
The @contextmanager decorator from contextlib lets you create your own context managers without implementing the full __enter__/__exit__ protocol.
Let’s see how to create custom context managers with minimal code. The file_manager function ensures files are properly closed even if exceptions occur:
from contextlib import contextmanager

@contextmanager
def file_manager(filename, mode):
    f = open(filename, mode)
    try:
        yield f
    finally:
        f.close()
The timer function measures the execution time of any code block:
import time

@contextmanager
def timer():
    start = time.time()
    yield
    elapsed = time.time() - start
    print(f"Elapsed time: {elapsed:.6f} seconds")
The @contextmanager decorator handles all the complexity of the context management protocol, letting you focus on the setup and cleanup logic. The yield statement marks where execution transfers to the code inside the with block.
Here’s how you can use these context managers.
with file_manager('test.txt', 'w') as f:
    f.write('Hello, context managers!')

with timer():
    # Code to time
    sum(i*i for i in range(1000000))
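The try/finally around yield is what makes the cleanup exception-safe. You can verify the pattern with a small sketch (the tracked manager and events list are made up for illustration) that records whether teardown ran even though the body raised:

```python
from contextlib import contextmanager

events = []

@contextmanager
def tracked():
    events.append("setup")
    try:
        yield
    finally:
        events.append("cleanup")  # runs even if the with-body raises

try:
    with tracked():
        raise ValueError("boom")
except ValueError:
    pass

print(events)  # → ['setup', 'cleanup']
```

Without the try/finally, an exception inside the with block would skip everything after the yield.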
5. @functools.singledispatch – Function Overloading
The @singledispatch decorator from functools implements single-dispatch generic functions, allowing different implementations based on argument type.
Let’s create a flexible formatting system that handles different data types appropriately. The @singledispatch decorator lets you define a default implementation for the format_output function, then register specialized versions for different types.
from functools import singledispatch
from datetime import date

@singledispatch
def format_output(obj):
    return str(obj)

@format_output.register
def _(obj: int):
    return f"INTEGER: {obj:+d}"

@format_output.register
def _(obj: float):
    return f"FLOAT: {obj:.2f}"

@format_output.register
def _(obj: date):
    return f"DATE: {obj.strftime('%Y-%m-%d')}"

@format_output.register(list)
def _(obj):
    return f"LIST: {', '.join(format_output(x) for x in obj)}"
When you call the function, Python automatically selects the right implementation based on the argument type.
results = [
    format_output("Hello"),
    format_output(42),
    format_output(-3.14159),
    format_output(date(2025, 2, 21)),
    format_output([1, 2.5, "three"])
]

for r in results:
    print(r)
Output:
Hello
INTEGER: +42
FLOAT: -3.14
DATE: 2025-02-21
LIST: INTEGER: +1, FLOAT: 2.50, three
This brings type-based method dispatch (common in languages like Java) to Python in a clean, extensible way.
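Since Python 3.8, functools also provides @singledispatchmethod for dispatching on the first non-self argument of a method. A minimal sketch (the Negator class mirrors the example in the Python docs):

```python
from functools import singledispatchmethod

class Negator:
    @singledispatchmethod
    def neg(self, arg):
        # Fallback when no registered type matches
        raise NotImplementedError(f"Cannot negate {type(arg).__name__}")

    @neg.register
    def _(self, arg: int):
        return -arg

    @neg.register
    def _(self, arg: bool):
        return not arg

n = Negator()
print(n.neg(5))     # → -5
print(n.neg(True))  # → False (bool is more specific than int)
```

Dispatch picks the most specific registered type, which is why True hits the bool implementation even though bool subclasses int.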
6. @functools.total_ordering – Complete Comparison Operations
This decorator generates all comparison methods from a minimal set you define.
Here we create a semantic versioning class with complete comparison capabilities. By defining just __eq__ and __lt__, the @total_ordering decorator automatically generates __le__, __gt__, and __ge__.
from functools import total_ordering

@total_ordering
class Version:
    def __init__(self, major, minor, patch):
        self.major = major
        self.minor = minor
        self.patch = patch

    def __repr__(self):
        # So versions print readably, e.g. v1.9.5
        return f"v{self.major}.{self.minor}.{self.patch}"

    def __eq__(self, other):
        if not isinstance(other, Version):
            return NotImplemented
        return (self.major, self.minor, self.patch) == (other.major, other.minor, other.patch)

    def __lt__(self, other):
        if not isinstance(other, Version):
            return NotImplemented
        return (self.major, self.minor, self.patch) < (other.major, other.minor, other.patch)
This saves considerable boilerplate code while ensuring consistent behavior across all comparison operations. The Version class can now be fully sorted, compared with all operators, and used in any context that requires ordered objects.
versions = [
    Version(2, 0, 0),
    Version(1, 9, 5),
    Version(1, 11, 0),
    Version(2, 0, 1)
]

print(f"Sorted versions: {sorted(versions)}")
print(f"v1.9.5 > v1.11.0: {Version(1, 9, 5) > Version(1, 11, 0)}")
print(f"v2.0.0 >= v2.0.0: {Version(2, 0, 0) >= Version(2, 0, 0)}")
print(f"v2.0.1 < v2.1.0: {Version(2, 0, 1) < Version(2, 1, 0)}")
Output:
Sorted versions: [v1.9.5, v1.11.0, v2.0.0, v2.0.1]
v1.9.5 > v1.11.0: False
v2.0.0 >= v2.0.0: True
v2.0.1 < v2.1.0: True
7. @functools.wraps – Preserving Metadata
When writing custom decorators, @wraps (also from the functools module) preserves the original function’s metadata, making debugging much easier.
This code shows how to create proper decorators that maintain the wrapped function’s identity. The log_execution decorator adds debugging output before and after the function call while preserving the original function’s characteristics.
import functools

def log_execution(func):
    @functools.wraps(func)  # Preserves func's name, docstring, etc.
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__} with args: {args}, kwargs: {kwargs}")
        result = func(*args, **kwargs)
        print(f"{func.__name__} returned: {result}")
        return result
    return wrapper

@log_execution
def add(a, b):
    """Add two numbers and return the result."""
    return a + b
Without @functools.wraps, the decorated function would lose its name, docstring, and other metadata. Preserving metadata is essential for creating production-quality decorators that work correctly with documentation generators and the like.
# Without @wraps, help(add) would show wrapper's info
help(add)  # Shows the original docstring
print(f"Function name: {add.__name__}")  # Shows "add", not "wrapper"
result = add(5, 3)
Output:
Help on function add in module __main__:
add(a, b)
Add two numbers and return the result.
Function name: add
Calling add with args: (5, 3), kwargs: {}
add returned: 8
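To see what @wraps actually buys you, compare a wrapper written without it against one written with it. A quick sketch (the decorator and function names are made up for illustration):

```python
import functools

def without_wraps(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def with_wraps(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@without_wraps
def first():
    """Original docstring."""

@with_wraps
def second():
    """Original docstring."""

print(first.__name__, first.__doc__)    # → wrapper None
print(second.__name__, second.__doc__)  # → second Original docstring.
```

The unwrapped version silently replaces the function’s identity, which breaks help(), documentation tools, and anything that inspects __name__ or __doc__.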
Wrapping Up
Here’s a quick summary of the decorators we’ve learned about:
- @property for clean attribute interfaces
- @cached_property for lazy computation and caching
- @lru_cache for performance optimization
- @contextmanager for resource management
- @singledispatch for type-based method selection
- @total_ordering for complete comparison operators
- @wraps for maintaining function metadata
What else would you add to the list? Let us know in the comments.
Bala Priya C is a developer and technical writer from India. She likes working at the intersection of math, programming, data science, and content creation. Her areas of interest and expertise include DevOps, data science, and natural language processing. She enjoys reading, writing, coding, and coffee! Currently, she’s working on learning and sharing her knowledge with the developer community by authoring tutorials, how-to guides, opinion pieces, and more. Bala also creates engaging resource overviews and coding tutorials.