Advanced Decorators and Metaprogramming

📚 Lesson 17 of 20 ⏱️ 65 min

Decorators are powerful Python features that allow you to modify or extend the behavior of functions and classes without permanently modifying their code. They're essentially functions that take another function as input and return a modified version. Decorators provide a clean, reusable way to add cross-cutting concerns like logging, caching, authentication, validation, and performance monitoring. Understanding decorators is essential for writing elegant, maintainable Python code and building frameworks and libraries.
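As a minimal sketch of the idea, here is a logging decorator (the `log_calls` and `add` names are illustrative, not from any particular library):

```python
import functools

def log_calls(func):
    """Log every call to the wrapped function before delegating to it."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__} with {args}, {kwargs}")
        return func(*args, **kwargs)
    return wrapper

@log_calls
def add(a, b):
    return a + b

result = add(2, 3)  # prints "Calling add with (2, 3), {}" and returns 5
```

The original `add` is untouched; `log_calls` returns a new function that wraps it, which is exactly the "modify behavior without modifying code" idea.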

Python decorators use the `@` syntax, which is syntactic sugar for function composition. When you write `@decorator` above a function, Python calls the decorator with the function and replaces the original function with the decorator's return value. Decorators can be simple functions or classes (callable objects). They can accept arguments (decorator factories), be stacked (multiple decorators), and preserve function metadata using `functools.wraps`. Decorators enable aspect-oriented programming in Python, separating concerns cleanly.
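The two points above, `@` as syntactic sugar and `functools.wraps` preserving metadata, can be seen directly in a small sketch (names are illustrative):

```python
import functools

def shout(func):
    @functools.wraps(func)  # copies __name__, __doc__, etc. onto the wrapper
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    """Return a greeting."""
    return f"hello, {name}"

# The @ syntax is sugar for explicit function composition:
def greet2(name):
    return f"hello, {name}"
greet2 = shout(greet2)  # equivalent to writing @shout above greet2

assert greet("ada") == greet2("ada") == "HELLO, ADA"
assert greet.__name__ == "greet"  # preserved thanks to functools.wraps
```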

Metaprogramming in Python allows you to create code that generates or modifies other code at runtime. Key metaprogramming features include metaclasses (classes that create classes), descriptors (objects that customize attribute access), `__getattr__` and `__setattr__` for dynamic attribute access, and `exec()`/`eval()` for code execution. Metaprogramming is powerful but should be used judiciously—it can make code harder to understand and debug. Common use cases include ORMs (Object-Relational Mappers), frameworks, and code generation tools.
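A simplified sketch of the registration pattern that ORMs build on: a metaclass that records every class it creates. All names here (`Registry`, `Base`, `User`) are hypothetical.

```python
class Registry(type):
    """Metaclass that records every class it creates in a shared mapping."""
    models = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        mcls.models[name] = cls  # register the new class as it is defined
        return cls

class Base(metaclass=Registry):
    pass

class User(Base):  # subclasses inherit the metaclass and self-register
    pass

print(sorted(Registry.models))  # ['Base', 'User']
```

Because class creation itself runs through `Registry.__new__`, no explicit registration call is needed; this is the "classes that create classes" idea in action.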

Advanced decorator patterns include decorator factories (functions that return decorators, enabling parameterized decorators), class decorators (decorators that modify classes), property decorators (combining `@property` with custom decorators), and context manager decorators (using `@contextmanager`). Understanding these patterns enables you to build sophisticated, reusable abstractions. Decorators are widely used in Python frameworks like Flask, Django, and FastAPI for routing, authentication, and middleware.
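Two of these patterns side by side, a decorator factory and an `@contextmanager`-based context manager, in a minimal sketch (all names illustrative):

```python
import functools
from contextlib import contextmanager

def repeat(times):
    """Decorator factory: repeat(3) returns a decorator that runs the function 3 times."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(times):
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator

calls = []

@repeat(times=3)
def ping():
    calls.append("ping")
    return len(calls)

@contextmanager
def tag(name):
    """Context manager from a generator: code before yield is setup, after is teardown."""
    print(f"<{name}>")
    yield
    print(f"</{name}>")

print(ping())  # 3 -- the body ran three times
with tag("b"):
    print("bold")
```

Note the extra level of nesting in `repeat`: the outer call takes the parameters and returns the actual decorator, which is what `@repeat(times=3)` invokes.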

Best practices for decorators include always using `functools.wraps` to preserve function metadata, documenting decorator behavior clearly, handling edge cases and errors gracefully, making decorators composable (stackable), and avoiding side effects that aren't obvious. For metaprogramming, prefer simpler solutions when possible, document metaprogramming code extensively, and test thoroughly as metaprogramming can introduce subtle bugs. Both decorators and metaprogramming are advanced features that should be used when they provide clear benefits over simpler alternatives.
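Composability depends on understanding stacking order: decorators apply bottom-up, so the one nearest the function wraps it first. A small sketch (names illustrative) makes the order visible:

```python
import functools

order = []

def deco(label):
    """Factory producing a decorator that records entry and exit order."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            order.append(f"enter {label}")
            try:
                return func(*args, **kwargs)
            finally:
                order.append(f"exit {label}")
        return wrapper
    return decorator

# Stacking is bottom-up: work = deco("outer")(deco("inner")(work))
@deco("outer")
@deco("inner")
def work():
    return 42

work()
print(order)  # ['enter outer', 'enter inner', 'exit inner', 'exit outer']
```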

Real-world applications of decorators and metaprogramming include web frameworks (route decorators, middleware), testing frameworks (test decorators, fixtures), data validation libraries (validation decorators), caching systems (memoization decorators), and API frameworks (authentication, rate limiting). Mastering these concepts enables you to write more expressive, maintainable code and contribute effectively to Python projects and frameworks.

Key Concepts

  • Decorators modify function/class behavior without changing their code.
  • Decorators use @ syntax and can be stacked or parameterized.
  • Metaprogramming creates or modifies code at runtime.
  • Metaclasses, descriptors, and dynamic attributes enable metaprogramming.
  • functools.wraps preserves function metadata in decorators.
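Descriptors, mentioned above, customize attribute access at the class level. A minimal validating descriptor might look like this (the `Positive` and `Item` names are illustrative):

```python
class Positive:
    """Descriptor that rejects non-positive values on assignment."""

    def __set_name__(self, owner, name):
        self.name = "_" + name  # store the value under a private attribute

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, self.name)

    def __set__(self, obj, value):
        if not isinstance(value, (int, float)) or value <= 0:
            raise ValueError(f"{self.name[1:]} must be a positive number")
        setattr(obj, self.name, value)

class Item:
    quantity = Positive()  # validation runs on every assignment

    def __init__(self, quantity):
        self.quantity = quantity

print(Item(3).quantity)  # 3; Item(0) would raise ValueError
```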

Learning Objectives

Master

  • Creating and using decorators for cross-cutting concerns
  • Understanding decorator factories and parameterized decorators
  • Working with metaclasses and descriptors for metaprogramming
  • Implementing advanced decorator patterns

Develop

  • Aspect-oriented programming thinking
  • Understanding code generation and modification patterns
  • Designing reusable, composable abstractions

Tips

  • Always use functools.wraps to preserve function metadata in decorators.
  • Use decorator factories when you need to pass parameters to decorators.
  • Prefer simpler solutions over metaprogramming when possible.
  • Document decorator behavior and side effects clearly.

Common Pitfalls

  • Forgetting functools.wraps, losing function metadata and docstrings.
  • Creating decorators with hidden side effects that are hard to debug.
  • Overusing metaprogramming, making code difficult to understand.
  • Not handling errors in decorators, causing unexpected failures.
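The first pitfall is easiest to see side by side (a minimal sketch; the decorator names are illustrative):

```python
import functools

def without_wraps(func):
    def wrapper(*args, **kwargs):  # no functools.wraps here
        return func(*args, **kwargs)
    return wrapper

def with_wraps(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@without_wraps
def first():
    """Docstring that gets lost."""

@with_wraps
def second():
    """Docstring that survives."""

print(first.__name__, first.__doc__)    # wrapper None
print(second.__name__, second.__doc__)  # second Docstring that survives.
```

Losing `__name__` and `__doc__` this way also breaks `help()`, debugger output, and any introspection-based tooling, which is why `functools.wraps` is non-negotiable.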

Summary

  • Decorators modify function/class behavior cleanly and reusably.
  • Decorators can be stacked, parameterized, and preserve metadata.
  • Metaprogramming enables code generation and modification at runtime.
  • Use decorators and metaprogramming judiciously for clear benefits.
  • Mastering these concepts enables building frameworks and libraries.

Exercise

Create a comprehensive decorator system that includes caching, logging, and performance monitoring capabilities.

import time
import functools
import logging
from typing import Any, Callable, Dict
from datetime import datetime, timedelta

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

class CacheDecorator:
    """Advanced caching decorator with TTL and size limits."""

    def __init__(self, ttl_seconds: int = 300, max_size: int = 100):
        self.ttl_seconds = ttl_seconds
        self.max_size = max_size
        self.cache: Dict[str, Dict[str, Any]] = {}

    def __call__(self, func: Callable) -> Callable:
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Create cache key
            cache_key = self._create_cache_key(func, args, kwargs)

            # Check if cached and not expired
            if cache_key in self.cache:
                cached_item = self.cache[cache_key]
                if datetime.now() < cached_item['expires_at']:
                    logger.info(f"Cache hit for {func.__name__}")
                    return cached_item['value']
                else:
                    # Remove expired item
                    del self.cache[cache_key]

            # Execute function and cache result
            result = func(*args, **kwargs)
            self._cache_result(cache_key, result)
            logger.info(f"Cache miss for {func.__name__}, result cached")

            return result

        return wrapper

    def _create_cache_key(self, func: Callable, args: tuple, kwargs: dict) -> str:
        """Create a unique cache key for a function call."""
        # Convert args and kwargs to a string representation
        args_str = str(args) + str(sorted(kwargs.items()))
        return f"{func.__name__}:{hash(args_str)}"

    def _cache_result(self, key: str, value: Any) -> None:
        """Cache a result with expiration and size management."""
        # Evict the oldest item if the cache is full
        if len(self.cache) >= self.max_size:
            oldest_key = min(self.cache.keys(), key=lambda k: self.cache[k]['created_at'])
            del self.cache[oldest_key]

        # Add new item
        self.cache[key] = {
            'value': value,
            'created_at': datetime.now(),
            'expires_at': datetime.now() + timedelta(seconds=self.ttl_seconds)
        }

    def clear_cache(self) -> None:
        """Clear all cached items."""
        self.cache.clear()
        logger.info("Cache cleared")

    def get_cache_stats(self) -> Dict[str, Any]:
        """Get cache statistics."""
        now = datetime.now()
        active_items = sum(1 for item in self.cache.values() if item['expires_at'] > now)
        expired_items = len(self.cache) - active_items

        return {
            'total_items': len(self.cache),
            'active_items': active_items,
            'expired_items': expired_items,
            'max_size': self.max_size,
            'ttl_seconds': self.ttl_seconds
        }

def performance_monitor(func: Callable) -> Callable:
    """Decorator to monitor function performance with detailed metrics."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start_time = time.time()
        start_memory = _get_memory_usage()

        try:
            result = func(*args, **kwargs)
            execution_time = time.time() - start_time
            memory_delta = _get_memory_usage() - start_memory

            logger.info(f"{func.__name__} executed in {execution_time:.4f}s, "
                        f"memory delta: {memory_delta:+d} bytes")

            return result
        except Exception as e:
            execution_time = time.time() - start_time
            logger.error(f"{func.__name__} failed after {execution_time:.4f}s: {e}")
            raise

    return wrapper

def retry_on_failure(max_attempts: int = 3, delay_seconds: float = 1.0):
    """Decorator factory to retry failed calls with exponential backoff."""
    def decorator(func: Callable) -> Callable:
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exception = None

            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    last_exception = e
                    if attempt < max_attempts - 1:
                        wait_time = delay_seconds * (2 ** attempt)
                        logger.warning(f"{func.__name__} attempt {attempt + 1} failed, "
                                       f"retrying in {wait_time}s: {e}")
                        time.sleep(wait_time)

            logger.error(f"{func.__name__} failed after {max_attempts} attempts")
            raise last_exception

        return wrapper
    return decorator

def validate_input(*validators: Callable):
    """Decorator factory to validate positional arguments.

    Each validator is paired with the positional argument at the same index.
    """
    def decorator(func: Callable) -> Callable:
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Pair each validator with its corresponding positional argument
            for validator, arg in zip(validators, args):
                if not validator(arg):
                    raise ValueError(
                        f"Validation failed for argument {arg!r} using {validator.__name__}")

            return func(*args, **kwargs)

        return wrapper
    return decorator

# Utility functions
def _get_memory_usage() -> int:
    """Get current memory usage in bytes (0 if psutil is unavailable)."""
    try:
        import psutil
        return psutil.Process().memory_info().rss
    except ImportError:
        return 0

def is_positive_number(value: Any) -> bool:
    """Validator: check if value is a positive number."""
    return isinstance(value, (int, float)) and value > 0

def is_string(value: Any) -> bool:
    """Validator: check if value is a string."""
    return isinstance(value, str)

def is_list(value: Any) -> bool:
    """Validator: check if value is a list."""
    return isinstance(value, list)

# Example usage: decorators apply bottom-up, so validation runs first,
# then retries, then performance monitoring, with caching outermost.
@CacheDecorator(ttl_seconds=60, max_size=50)
@performance_monitor
@retry_on_failure(max_attempts=3, delay_seconds=0.5)
@validate_input(is_positive_number, is_string)
def expensive_operation(number: int, text: str) -> str:
    """Simulate an expensive operation that might fail."""
    # Simulate some processing time
    time.sleep(0.1)

    # Simulate occasional failures
    if number % 7 == 0:
        raise RuntimeError(f"Simulated failure for number {number}")

    return f"Processed {text} with number {number}"

def demonstrate_decorators():
    """Demonstrate the decorator system in action."""
    print("=== Advanced Decorator System Demo ===\n")

    # Test the decorated function
    try:
        result1 = expensive_operation(5, "Hello")
        print(f"Result 1: {result1}")

        result2 = expensive_operation(10, "World")
        print(f"Result 2: {result2}")

        # This fails on every attempt (7 % 7 == 0) and exhausts the retries
        result3 = expensive_operation(7, "Test")
        print(f"Result 3: {result3}")

    except Exception as e:
        print(f"Final failure: {e}")

    # Test caching
    print("\n=== Testing Caching ===")
    start_time = time.time()
    expensive_operation(5, "Cached")
    first_call_time = time.time() - start_time

    start_time = time.time()
    expensive_operation(5, "Cached")
    second_call_time = time.time() - start_time

    print(f"First call: {first_call_time:.4f}s")
    print(f"Second call (cached): {second_call_time:.4f}s")
    print(f"Speed improvement: {first_call_time / max(second_call_time, 1e-9):.1f}x")

if __name__ == "__main__":
    demonstrate_decorators()
