Unclever Code: Metaprogramming for Mortals
A practical guide to metaprogramming in Python without losing your sanity
be me
tired of writing the same validation code
“wait, you can make code write code?”
discovers Python decorators
first decorator: simple logging
second decorator: input validation
third decorator: caching
“I AM BECOMING UNLIMITED”
one year later
trying to explain my “framework” to new team member
they’re crying
i’m crying
the code is crying
They say the road to production hell is paved with clever abstractions. Metaprogramming—the practice of “code that manipulates code”—might just be the express lane. One minute you’re feeling like a coding deity, orchestrating an elegant dance of decorators and metaclasses. The next, you’re seven layers deep in a stack trace, trying to figure out which of your middleware functions decided to silently convert your integers to strings.
Behold, the “hat on a hat” of decorative metaprogramming solutions:
# this function is so top heavy it's about to tip over
@log_everything
@add_metrics
@middle_decorator  # goes in the middle
@handle_errors
@validate_input
@cache_results
def business_logic():
    # the real treasure was all those decorators
    # we passed through along the way
    return "No"
I’m here to present a different perspective: metaprogramming isn’t about clever tricks or reducing lines of code—it’s about extending a language (here: Python) to better express your domain’s concepts.
Common Utility Patterns
Before diving into domain-specific territory, let’s address those utility patterns that every developer discovers eventually. Logging, caching, retries—they’re like the gateway drug to metaprogramming:
@cache(ttl=3600)
@retry(max_attempts=3)
@log_calls
def fetch_data():
    pass
While these patterns are useful, they should be:
- Used sparingly and explicitly
- Composed thoughtfully
- Focused on operational needs
Here’s what thoughtful utility usage looks like:
# Being explicit about what we're caching and why
@cache_to_redis(
    ttl="1h",
    key_prefix="user_data",
    invalidate_on=["user_update"]
)
def get_user(user_id: int) -> User:
    return db.fetch_user(user_id)

# Clear about retry behavior because network calls are fickle
@retry(
    max_attempts=3,
    on_exceptions=[NetworkError],
    backoff_factor=1.5
)
def external_api_call():
    pass
These utilities still shrink your overall code footprint; the small amount of extra verbosity at the point of use is a tradeoff worth making.
Why Use Metaprogramming?
1. Domain Expression
Sometimes Python’s built-in syntax just doesn’t naturally fit the concepts you’re trying to express. That’s where metaprogramming shines.
Take Django’s database models, an example of making Python speak database:
class User(Model):
    name = CharField(max_length=100)
    email = EmailField(unique=True)

@receiver(post_save)
def send_welcome_email(sender, instance, created, **kwargs):
    if created:
        send_email(instance.email)
It’s Python, but it’s databases. Clarity.
2. Clarity at the Point of Use
When you want to hide complexity but keep intent clear.
Take FastAPI, the much-loved web framework. One of its key selling points: routes are readable and come with useful free stuff (type hints become API documentation, because you know developers will not actually write documentation).
# FastAPI: Easy to read, easy to use
@app.get("/users/{user_id}")
@requires_auth
def get_user(user_id: int) -> User:
    return db.get_user(user_id)
3. Enforcing Patterns
When you need to ensure consistent behavior without writing the same boilerplate 500 times:
# Pydantic: Because runtime errors are so 2010
class UserCreate(BaseModel):
    name: str
    age: int = Field(gt=0, lt=150)
    email: EmailStr
The Three Tools of Metaprogramming
1. Decorators
Think of decorators as function modifiers—they let you wrap existing functions with new behaviors. The beauty is that they’re explicit about what they do, right at the point of use (note: beauty is still in the eye of the beholder).
# A simple route decorator shows exactly what this endpoint does
@app.post("/users/", status_code=201)
def create_user(user: UserCreate):
    return db.create_user(user)

# Authentication decorators make security requirements clear
@requires_permission("admin")
def delete_user(user_id: int):
    return db.delete_user(user_id)
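`requires_permission` here is a stand-in for whatever your auth layer provides. A hypothetical sketch of how it might work, with `current_user` faked as a module-level dict instead of real request context:

```python
import functools

# Hypothetical: in a real app this would come from the request context.
current_user = {"name": "alice", "permissions": {"admin"}}

def requires_permission(permission):
    """Reject the call up front if the current user lacks the permission."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if permission not in current_user["permissions"]:
                raise PermissionError(
                    f"{current_user['name']} lacks '{permission}'"
                )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@requires_permission("admin")
def delete_user(user_id):
    return f"deleted user {user_id}"

print(delete_user(42))  # deleted user 42
```

The security rule lives in one decorator instead of an `if` check copy-pasted into every handler, which is the whole point.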
2. Metaclasses
Metaclasses are the behind-the-scenes directors of class creation—they determine how your classes are built and behave. They’re powerful but complex—like a chainsaw, they can either help you build something amazing or cause spectacular disasters.
class ModelMetaclass(type):
    def __new__(cls, name, bases, attrs):
        # This is where the magic happens...and where
        # stack traces go to die
        new_class = super().__new__(cls, name, bases, attrs)
        for key, value in attrs.items():
            if isinstance(value, Field):
                value.contribute_to_class(new_class, key)
        return new_class

class Model(metaclass=ModelMetaclass):
    pass
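To watch the whole mechanism actually run, here’s a self-contained toy version. The `Field` class and its `contribute_to_class` method are illustrative stand-ins (the name mirrors Django’s convention, not its implementation):

```python
class Field:
    """Toy field that registers its attribute name on the owning class."""
    def contribute_to_class(self, cls, name):
        self.name = name
        cls._fields = getattr(cls, "_fields", []) + [name]

class ModelMetaclass(type):
    def __new__(cls, name, bases, attrs):
        # Build the class first, then let each Field hook into it
        new_class = super().__new__(cls, name, bases, attrs)
        for key, value in attrs.items():
            if isinstance(value, Field):
                value.contribute_to_class(new_class, key)
        return new_class

class Model(metaclass=ModelMetaclass):
    pass

class User(Model):
    name = Field()
    email = Field()

print(User._fields)  # ['name', 'email']
```

Every subclass of `Model` gets its fields collected automatically at class-definition time; nobody ever calls the metaclass directly, which is both the power and the debugging hazard.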
3. Descriptors
Descriptors give you fine-grained control over attribute access. They’re perfect for when you need to add validation, computation, or tracking to class attributes:
class Positive:
    def __get__(self, obj, objtype=None):
        return obj._value

    def __set__(self, obj, value):
        if value <= 0:
            raise ValueError("Must be positive")
        obj._value = value

class Account:
    balance = Positive()  # Now balance must always stay positive
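Here’s the same idea exercised end to end, with one idiomatic upgrade: `__set_name__` gives each descriptor its own storage slot, so two validated attributes on one class won’t clobber a shared `_value`:

```python
class Positive:
    """Descriptor that rejects non-positive values."""
    def __set_name__(self, owner, name):
        self.private_name = "_" + name  # per-attribute storage, not a shared slot

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self  # accessed on the class, not an instance
        return getattr(obj, self.private_name)

    def __set__(self, obj, value):
        if value <= 0:
            raise ValueError("Must be positive")
        setattr(obj, self.private_name, value)

class Account:
    balance = Positive()

acct = Account()
acct.balance = 100
print(acct.balance)  # 100

try:
    acct.balance = -5
except ValueError as e:
    print(e)  # Must be positive
```

The validation fires on every assignment, everywhere, without any caller having to remember it exists.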
Common Pitfalls and Solutions
The Stack Trace Trap
Ever seen a stack trace that looks like it’s been through a paper shredder? Decorators are usually the culprit:
# The problematic child
def log_calls(func):
    def wrapper(*args, **kwargs):
        print(f"Calling {func}")  # Which function? Who knows!
        return func(*args, **kwargs)
    return wrapper

@log_calls
def important_calculation(x, y):
    return x + y

# The responsible adult
from functools import wraps
import logging

def log_calls(func):
    @wraps(func)  # b/c stack traces should be helpful
    def wrapper(*args, **kwargs):
        logging.info(f"Calling {func.__name__} with args={args}, kwargs={kwargs}")
        result = func(*args, **kwargs)
        logging.info(f"{func.__name__} returned {result}")
        return result
    return wrapper

@log_calls
def important_calculation(x, y):
    """Adds two numbers together."""
    return x + y
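The payoff of `@wraps` is easy to demonstrate side by side:

```python
import functools

def naive(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def responsible(func):
    @functools.wraps(func)  # copies __name__, __doc__, etc. onto the wrapper
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@naive
def add(x, y):
    """Adds two numbers."""
    return x + y

@responsible
def mul(x, y):
    """Multiplies two numbers."""
    return x * y

print(add.__name__)  # wrapper   <- identity lost, stack traces suffer
print(mul.__name__)  # mul       <- identity preserved
print(mul.__doc__)   # Multiplies two numbers.
```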
Performance Pitfalls
Metaprogramming often involves inspecting or modifying code at runtime, which can introduce unexpected performance costs. Congratulations, you’ve written decorators that make your DB queries look fast by comparison.
# Slow: Inspecting function attributes on every call
def log_decorator(func):
    def wrapper(*args, **kwargs):
        # This inspection happens every time
        print(f"Calling {func.__name__} with {func.__dict__}")
        return func(*args, **kwargs)
    return wrapper

# Better: Cache expensive operations
from functools import lru_cache

@lru_cache(maxsize=32)
def expensive_meta_operation():
    # Complex reflection or code manipulation here
    return calculate_complex_result()

# Even Better: Do expensive work once at import time
PRECALCULATED_MAPPINGS = {
    # Calculate mappings once when module loads
    # instead of every function call
    'mapping1': calculate_mapping1(),
    'mapping2': calculate_mapping2()
}

def fast_operation():
    return PRECALCULATED_MAPPINGS['mapping1']
The Golden Rules
- Keep it simple (If you can’t explain why you need it, you probably don’t)
- Look to existing frameworks for patterns—they’ve already made the mistakes you’re about to make. Django’s model system and SQLAlchemy’s declarative base aren’t accidentally complex
- Focus on making your domain concepts clear
- Consider the poor soul who’ll maintain your code (it might be future you)
- When in doubt, write it twice before abstracting
Metaprogramming is a powerful tool, but like a katana, it’s best used with skill and intent, not because many neckbeards think it’s the ultimate weapon.
Use it to make your code clearer, not clevererer.