Python decorators are one of the language’s most elegant features, providing a way to modify function behavior without altering the function’s core logic. I’ve worked with decorators for years and find them indispensable for writing clean, maintainable code. Let me share seven advanced decorator patterns that can significantly improve your Python projects.
Function registration is a pattern I use frequently in larger applications. It allows you to build extensible systems by automatically collecting functions into a registry.
command_registry = {}

def command(name):
    def decorator(func):
        command_registry[name] = func
        return func
    return decorator

@command("start")
def start_application():
    print("Application starting...")

@command("stop")
def stop_application():
    print("Application stopping...")

# Later, execute commands by name
def run_command(command_name, *args, **kwargs):
    if command_name in command_registry:
        return command_registry[command_name](*args, **kwargs)
    else:
        print(f"Unknown command: {command_name}")
This pattern is excellent for creating command-line interfaces, plugin systems, or event handlers. The decorator handles the registration, keeping your function definitions clean and focused on their specific tasks.
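As a rough sketch of how the registry might back a command-line entry point, here is one way to wire it up. The main function and the use of sys.argv are my own illustrative additions, not part of the original example:

import sys

def main():
    # Dispatch the first command-line argument through the registry,
    # e.g. `python app.py start` ends up calling start_application().
    command_name = sys.argv[1] if len(sys.argv) > 1 else "start"
    run_command(command_name)

if __name__ == "__main__":
    main()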
Performance measurement is critical in production applications. A timing decorator provides a clean way to profile code without littering your functions with timing logic.
import time
import functools

def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start_time = time.perf_counter()
        result = func(*args, **kwargs)
        end_time = time.perf_counter()
        execution_time = end_time - start_time
        print(f"{func.__name__} completed in {execution_time:.6f} seconds")
        return result
    return wrapper

@timer
def process_data(items):
    # Simulate processing
    time.sleep(0.1)
    return [i * 2 for i in items]
Notice the use of functools.wraps(): it preserves the original function's metadata, such as its name and docstring, which is important for debugging and documentation.
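To make that concrete, here is a quick check of the preserved metadata. This is an illustrative snippet, not from the original; process_data is redefined with a docstring purely to show the effect:

@timer
def process_data(items):
    """Double every item in the list."""
    time.sleep(0.1)
    return [i * 2 for i in items]

print(process_data.__name__)  # "process_data", thanks to functools.wraps
print(process_data.__doc__)   # "Double every item in the list."
# Without @functools.wraps, __name__ would report "wrapper" and the docstring would be lost.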
Memoization is a technique I’ve found invaluable for optimizing recursive functions or any calculations that are repeatedly called with the same arguments. It caches results to avoid redundant computation.
import functools

def memoize(func):
    cache = {}
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Create a key from the arguments
        # For kwargs, we need to sort to ensure consistent ordering
        key = (args, tuple(sorted(kwargs.items())))
        if key not in cache:
            cache[key] = func(*args, **kwargs)
        return cache[key]
    return wrapper

@memoize
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

# Without memoization, this would be very slow
print(fibonacci(100))  # Instant result with memoization
This implementation dramatically improves performance for functions with expensive calculations. The Fibonacci sequence is a classic example - without memoization, computing larger Fibonacci numbers becomes exponentially slower.
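Note that this hand-rolled cache, like any dictionary-based approach, only works when every argument is hashable. For the common case, the standard library already provides functools.lru_cache (with functools.cache as a shorthand on Python 3.9+), which also gives you size limits and hit/miss statistics:

import functools

@functools.lru_cache(maxsize=None)
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(100))          # Instant, same result as the hand-rolled version
print(fibonacci.cache_info())  # Hit/miss statistics provided for free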
Input validation is essential for robust code. Decorators can enforce type constraints and other validation rules without cluttering your function bodies.
def validate(validation_func, error_message):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if not validation_func(*args, **kwargs):
                raise ValueError(error_message)
            return func(*args, **kwargs)
        return wrapper
    return decorator

# Create specific validators using the generic decorator
def validate_positive(func):
    return validate(
        lambda x: x > 0,
        "Argument must be positive"
    )(func)

@validate_positive
def calculate_square_root(n):
    import math
    return math.sqrt(n)
This pattern is particularly useful for APIs where you need consistent validation across multiple functions. You can create specific validators for common requirements and apply them wherever needed.
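For instance, the same generic decorator could back other checks as well; the validate_non_empty helper below is a hypothetical example of reusing it, not part of the original article:

def validate_non_empty(func):
    return validate(
        lambda items: len(items) > 0,
        "Argument must not be empty"
    )(func)

@validate_non_empty
def average(items):
    return sum(items) / len(items)

print(calculate_square_root(16))  # 4.0
print(average([2, 4, 6]))         # 4.0
# calculate_square_root(-1) or average([]) would raise ValueError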
When working with external services, network calls often fail intermittently. A retry decorator handles temporary failures gracefully without complicating your code.
import time
import functools
import logging

def retry(max_attempts=3, delay_seconds=1, backoff_factor=2, exceptions=(Exception,)):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            current_delay = delay_seconds
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    if attempt == max_attempts - 1:
                        raise
                    logging.warning(
                        f"Attempt {attempt + 1}/{max_attempts} failed with error: {e}. "
                        f"Retrying in {current_delay} seconds..."
                    )
                    time.sleep(current_delay)
                    current_delay *= backoff_factor
        return wrapper
    return decorator

@retry(max_attempts=5, delay_seconds=2, backoff_factor=2, exceptions=(ConnectionError, TimeoutError))
def fetch_data_from_api(url):
    import random
    # Simulate random failures
    if random.random() < 0.7:
        raise ConnectionError("API connection failed")
    return {"data": "success"}
This implementation includes exponential backoff, which I’ve found essential when working with rate-limited APIs. It gradually increases the delay between retry attempts to give the external service time to recover.
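The call site then stays completely free of retry logic. For example (the URL here is just a placeholder):

try:
    result = fetch_data_from_api("https://api.example.com/data")
    print(result)  # {'data': 'success'} once an attempt gets through
except ConnectionError:
    print("All retry attempts failed")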
Resource management can be streamlined with decorators that handle context-specific setup and teardown operations.
import functools
import contextlib

def with_resource(resource_func, *args, **kwargs):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*func_args, **func_kwargs):
            with resource_func(*args, **kwargs) as resource:
                return func(resource, *func_args, **func_kwargs)
        return wrapper
    return decorator

# Example: Database connection handling
@contextlib.contextmanager
def db_connection(connection_string):
    print(f"Connecting to database: {connection_string}")
    # In a real application, this would connect to a database
    connection = {"connected": True}
    try:
        yield connection
    finally:
        print("Closing database connection")
        connection["connected"] = False

@with_resource(db_connection, "postgresql://localhost/mydb")
def get_user_data(db, user_id):
    # In a real application, this would query the database
    print(f"Fetching data for user {user_id} from database")
    return {"id": user_id, "name": "John Doe"}
This pattern separates resource management from business logic, making both aspects easier to maintain and test. It’s particularly useful for database connections, file handling, or API clients.
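Because any context manager works here, the same decorator can cover file handling too. This reuse with the built-in open() is my own sketch, not part of the original example:

@with_resource(open, "report.txt", "w", encoding="utf-8")
def write_report(file, lines):
    # The decorator opens the file, passes it in as the first argument,
    # and closes it when the function returns.
    file.write("\n".join(lines))

write_report(["header", "row 1", "row 2"])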
Deprecation warnings help manage code lifecycle and communicate with other developers about upcoming changes. I use this pattern when preparing to remove or change API functionality.
import functools
import warnings

def deprecated(reason, version=None):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            message = f"{func.__name__} is deprecated"
            if version:
                message += f" and will be removed in version {version}"
            if reason:
                message += f". Reason: {reason}"
            warnings.warn(
                message,
                category=DeprecationWarning,
                stacklevel=2
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated(reason="Use new_function() instead", version="2.0.0")
def old_function():
    return "This function will be removed soon"
When users call the deprecated function, they’ll receive a warning that helps them prepare for future changes. This promotes smoother transitions when updating your codebase.
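Keep in mind that DeprecationWarning is filtered out by default in many contexts, so callers (or your test suite) may need to opt in to see it. A quick way to check the message interactively:

import warnings

warnings.simplefilter("always", DeprecationWarning)
old_function()
# DeprecationWarning: old_function is deprecated and will be removed in version 2.0.0.
# Reason: Use new_function() instead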
These decorators become even more powerful when combined. For example, you might create a function that’s both memoized and timed:
@timer
@memoize
def expensive_calculation(n):
    time.sleep(0.1)  # Simulate expensive operation
    return n * n

# First call is slow
result1 = expensive_calculation(10)
# Second call uses cached result and is much faster
result2 = expensive_calculation(10)
When working with decorators, it's important to understand the order of application. Decorators are applied from bottom to top, so in this example, memoize is applied first, then timer wraps the memoized function.
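In other words, the stacked syntax above is just shorthand for applying the decorators explicitly, from the innermost outward:

def expensive_calculation(n):
    time.sleep(0.1)  # Simulate expensive operation
    return n * n

# Equivalent to stacking @timer above @memoize:
expensive_calculation = timer(memoize(expensive_calculation))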
Creating decorators that can be used with or without arguments requires a slightly different pattern:
def flexible_decorator(func=None, *, option1=False, option2=None):
    def actual_decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Use option1 and option2 here
            print(f"Options: {option1}, {option2}")
            return func(*args, **kwargs)
        return wrapper

    # This is the key part that makes it work both ways
    if func is None:
        return actual_decorator
    return actual_decorator(func)

# Can be used with arguments
@flexible_decorator(option1=True, option2="value")
def function1():
    pass

# Or without arguments
@flexible_decorator
def function2():
    pass
For more complex decorators, classes can provide a cleaner implementation:
import sys
import functools

class TraceCalls:
    def __init__(self, stream=None, indent_step=2):
        self.stream = stream or sys.stdout
        self.indent_step = indent_step
        self.indent = 0
        self.trace_enabled = True

    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if not self.trace_enabled:
                return func(*args, **kwargs)
            args_repr = [repr(a) for a in args]
            kwargs_repr = [f"{k}={v!r}" for k, v in kwargs.items()]
            signature = ", ".join(args_repr + kwargs_repr)
            self.stream.write(f"{' ' * self.indent}{func.__name__}({signature})\n")
            self.stream.flush()
            self.indent += self.indent_step
            try:
                result = func(*args, **kwargs)
                self.indent -= self.indent_step
                self.stream.write(f"{' ' * self.indent}-> {result!r}\n")
                return result
            except Exception as e:
                self.indent -= self.indent_step
                self.stream.write(f"{' ' * self.indent}-> Exception: {e}\n")
                raise
        return wrapper

    def disable(self):
        self.trace_enabled = False

    def enable(self):
        self.trace_enabled = True

tracer = TraceCalls()

@tracer
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n-1)

# Will trace the recursive calls
factorial(3)

# Can be disabled
tracer.disable()
factorial(5)  # No tracing
Class-based decorators maintain state between calls and provide methods to control behavior. This example creates a tracing decorator that can be enabled or disabled during program execution.
In my experience, investing time in creating well-designed decorators pays off significantly in larger projects. They promote separation of concerns, reduce code duplication, and make your codebase more maintainable.
When building your own decorators, remember these best practices:
- Always use functools.wraps to preserve metadata
- Keep decorators focused on a single responsibility
- Document your decorators thoroughly
- Consider how multiple decorators will interact when stacked
- Test your decorators in isolation (a short sketch follows this list)
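As one way to approach that last point, here is a minimal sketch of testing the memoize decorator in isolation with the standard unittest module. The compute function is a stand-in written for the test, not something from the article:

import unittest
from unittest import mock

class MemoizeTests(unittest.TestCase):
    def test_underlying_function_called_once_per_argument(self):
        underlying = mock.Mock(return_value=42)

        @memoize
        def compute(x):
            return underlying(x)

        self.assertEqual(compute(1), 42)
        self.assertEqual(compute(1), 42)   # Served from the cache
        underlying.assert_called_once_with(1)

if __name__ == "__main__":
    unittest.main()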
Python’s decorator pattern is a testament to the language’s expressiveness and flexibility. By mastering these advanced decorator techniques, you’ll write more elegant, maintainable code that better separates concerns and communicates intent clearly.