Best Practices for Using Pydantic in Python

Nuno Bispo · Published in Django Unleashed · Jul 16, 2024 · 12 min read

This guide provides best practices for using Pydantic in Python projects, covering model definition, data validation, error handling, and performance optimization.


Pydantic is a Python library that simplifies data validation using type hints. It ensures data integrity and offers an easy way to create data models with automatic type checking and validation.

In software applications, reliable data validation is crucial to prevent errors, security issues, and unpredictable behavior.


Installing Pydantic

To install Pydantic, use pip, the Python package installer, with the command:

pip install pydantic

This command installs Pydantic and its dependencies.

Basic Usage

Create Pydantic models by making classes that inherit from BaseModel. Use Python type annotations to specify each field's type:

from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    email: str

Pydantic supports various field types, including int, str, float, bool, list, and dict. You can also define nested models and custom types:

from typing import List, Optional
from pydantic import BaseModel

class Address(BaseModel):
    street: str
    city: str
    zip_code: Optional[str] = None

class User(BaseModel):
    id: int
    name: str
    email: str
    age: Optional[int] = None
    addresses: List[Address]

Once you’ve defined a Pydantic model, create instances by providing the required data. Pydantic will validate the data and raise errors if any field doesn’t meet the specified requirements:

user = User(
    id=1,
    name="John Doe",
    email="john.doe@example.com",
    addresses=[{"street": "123 Main St", "city": "Anytown", "zip_code": "12345"}]
)

print(user)
# Output:
# id=1 name='John Doe' email='john.doe@example.com' age=None addresses=[Address(street='123 Main St', city='Anytown', zip_code='12345')]


Defining Pydantic Models

Pydantic models use Python type annotations to define data field types.

They support various built-in types, including:

  • Primitive types: str, int, float, bool
  • Collection types: list, tuple, set, dict
  • Optional types: Optional from the typing module for fields that can be None
  • Union types: Union from the typing module to specify a field can be one of several types

Example:

from typing import List, Dict, Optional, Union
from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float
    tags: List[str]
    metadata: Dict[str, Union[str, int, float]]

class Order(BaseModel):
    order_id: int
    items: List[Item]
    discount: Optional[float] = None

Custom Types

In addition to built-in types, you can define custom types using Pydantic’s conint, constr, and other constraint functions.

These allow you to add additional validation rules, such as length constraints on strings or value ranges for integers.

Example:

from pydantic import BaseModel, conint, constr

class Product(BaseModel):
    name: constr(min_length=2, max_length=50)
    quantity: conint(gt=0, le=1000)
    price: float

product = Product(name="Laptop", quantity=5, price=999.99)

Required vs. Optional Fields

By default, fields in a Pydantic model are required unless explicitly marked as optional.

If a required field is missing during model instantiation, Pydantic will raise a ValidationError.

Example:

from pydantic import BaseModel, ValidationError

class User(BaseModel):
    id: int
    name: str
    email: str

try:
    user = User(id=1, name="John Doe")
except ValidationError as e:
    print(e)

# Output (abridged):
# email
#   Field required [type=missing, input_value={'id': 1, 'name': 'John Doe'}, input_type=dict]

Optional Fields with Default Values

Fields can be made optional by using Optional from the typing module and providing a default value.

Example:

from pydantic import BaseModel
from typing import Optional

class User(BaseModel):
    id: int
    name: str
    email: Optional[str] = None

user = User(id=1, name="John Doe")

In this example, email is optional and defaults to None if not provided.

Nested Models

Pydantic allows models to be nested within each other, enabling complex data structures.

Nested models are defined as fields of other models, ensuring data integrity and validation at multiple levels.

Example:

from pydantic import BaseModel
from typing import Optional, List

class Address(BaseModel):
    street: str
    city: str
    zip_code: Optional[str] = None

class User(BaseModel):
    id: int
    name: str
    email: str
    addresses: List[Address]

user = User(
    id=1,
    name="John Doe",
    email="john.doe@example.com",
    addresses=[{"street": "123 Main St", "city": "Anytown"}]
)

Best Practices for Managing Nested Data

When working with nested models, it’s important to:

  • Validate data at each level: Ensure each nested model has its own validation rules and constraints.
  • Use clear and consistent naming conventions: This makes the structure of your data more readable and maintainable.
  • Keep models simple: Avoid overly complex nested structures. If a model becomes too complex, consider breaking it down into smaller, more manageable components, as sketched below.
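
As a brief sketch of these points (the field names here are just for illustration), each nested model below carries its own constraints while the parent model stays small:

from typing import List, Optional
from pydantic import BaseModel, constr

class Address(BaseModel):
    street: constr(min_length=1)   # validated at the Address level
    city: constr(min_length=1)
    zip_code: Optional[str] = None

class User(BaseModel):
    id: int
    name: str
    addresses: List[Address]       # every nested Address is validated on its own

user = User(id=1, name="John Doe", addresses=[{"street": "1 Main St", "city": "Anytown"}])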

Data Validation

Pydantic includes a set of built-in validators that handle common data validation tasks automatically.

These validators include:

  • Type validation: Ensures fields match the specified type annotations (e.g., int, str, list).
  • Range validation: Enforces value ranges and lengths using constraints like conint, constr, confloat.
  • Format validation: Checks specific formats, such as EmailStr for validating email addresses.
  • Collection validation: Ensures elements within collections (e.g., list, dict) conform to specified types and constraints.

These validators simplify the process of ensuring data integrity and conformity within your models.

Here are some examples demonstrating built-in validators:

from pydantic import BaseModel, EmailStr, conint, constr

class User(BaseModel):
    id: conint(gt=0)                           # id must be greater than 0
    name: constr(min_length=2, max_length=50)  # name must be between 2 and 50 characters
    email: EmailStr                            # email must be a valid email address
    age: conint(ge=18)                         # age must be 18 or older

user = User(id=1, name="John Doe", email="john.doe@example.com", age=25)

In this example, the User model uses built-in validators to ensure the id is greater than 0, the name is between 2 and 50 characters, the email is a valid email address, and the age is 18 or older.

To use the EmailStr validator, install Pydantic's optional email extra:

pip install pydantic[email]

Custom Validators

Pydantic allows you to define custom validators for more complex validation logic.

Custom validators are defined using the @field_validator decorator within your model class.

Example of a custom validator:

from pydantic import BaseModel, field_validator

class Product(BaseModel):
    name: str
    price: float

    @field_validator('price')
    def price_must_be_positive(cls, value):
        if value <= 0:
            raise ValueError('Price must be positive')
        return value

product = Product(name="Laptop", price=999.99)

Here, the price_must_be_positive validator ensures that the price field is a positive number.

Custom validators are registered automatically when you define them within a model using the @field_validator decorator. Validators can be applied to individual fields or across multiple fields.

Example of registering a validator for multiple fields:

from pydantic import BaseModel, field_validator

class Person(BaseModel):
    first_name: str
    last_name: str

    @field_validator('first_name', 'last_name')
    def names_cannot_be_empty(cls, value):
        if not value:
            raise ValueError('Name fields cannot be empty')
        return value

person = Person(first_name="John", last_name="Doe")

In this example, the names_cannot_be_empty validator ensures that both the first_name and last_name fields are not empty.

Using Config Classes

Pydantic models can be customized using an inner Config class.

This class allows you to set various configuration options that affect the model’s behavior, such as validation rules, JSON serialization, and more.

Example of a Config class:

from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    email: str

    class Config:
        str_strip_whitespace = True  # Strip whitespace from strings
        str_min_length = 1           # Minimum length for any string field

user = User(id=1, name=" John Doe ", email="john.doe@example.com")

print(user)

# Output:
# id=1 name='John Doe' email='john.doe@example.com'

In this example, the Config class is used to strip whitespace from string fields and enforce a minimum length of 1 for any string field.

Some common configuration options in Pydantic’s Config class include:

  • str_strip_whitespace: Automatically strip leading and trailing whitespace from string fields.
  • str_min_length: Set a minimum length for any string field.
  • validate_default: Validate all fields, even those with default values.
  • validate_assignment: Enable validation on assignment to model attributes.
  • use_enum_values: Use the values of enums directly instead of the enum instances (see the sketch after this list).
  • json_encoders: Define custom JSON encoders for specific types.
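
As a quick sketch of two of these options (the Role enum and Account model are made up for illustration), validate_assignment re-checks values whenever an attribute is set, and use_enum_values stores the enum's value instead of the enum member:

from enum import Enum
from pydantic import BaseModel, ValidationError

class Role(str, Enum):
    admin = "admin"
    member = "member"

class Account(BaseModel):
    name: str
    role: Role

    class Config:
        validate_assignment = True  # re-validate fields whenever they are assigned
        use_enum_values = True      # store the enum's value ("admin"), not the member

account = Account(name="Ada", role=Role.admin)
print(account.role)  # admin

try:
    account.role = "owner"  # not a valid Role, so assignment validation rejects it
except ValidationError as e:
    print(e.errors()[0]["msg"])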

Error Handling

When Pydantic finds data that doesn’t conform to the model’s schema, it raises a ValidationError.

This error provides detailed information about the issue, including the field name, the incorrect value, and a description of the problem.

Here’s an example of how default error messages are structured:

from pydantic import BaseModel, ValidationError, EmailStr

class User(BaseModel):
    id: int
    name: str
    email: EmailStr

try:
    user = User(id='one', name='John Doe', email='invalid-email')
except ValidationError as e:
    print(e.json())

# Output:
# [{"type":"int_parsing","loc":["id"],"msg":"Input should be a valid integer, unable to parse string as an integer","input":"one","url":"https://errors.pydantic.dev/2.8/v/int_parsing"},{"type":"value_error","loc":["email"],"msg":"value is not a valid email address: An email address must have an @-sign.","input":"invalid-email","ctx":{"reason":"An email address must have an @-sign."},"url":"https://errors.pydantic.dev/2.8/v/value_error"}]

In this example, the error message will indicate that id must be an integer and email must be a valid email address.

Customizing Error Messages

Pydantic allows you to customize error messages for specific fields by raising exceptions with custom messages in validators or by setting custom configurations.

Here’s an example of customizing error messages:

from pydantic import BaseModel, ValidationError, field_validator

class Product(BaseModel):
    name: str
    price: float

    @field_validator('price')
    def price_must_be_positive(cls, value):
        if value <= 0:
            raise ValueError('Price must be a positive number')
        return value

try:
    product = Product(name='Laptop', price=-1000)
except ValidationError as e:
    print(e.json())

# Output:
# [{"type":"value_error","loc":["price"],"msg":"Value error, Price must be a positive number","input":-1000,"ctx":{"error":"Price must be a positive number"},"url":"https://errors.pydantic.dev/2.8/v/value_error"}]

In this example, the error message for price is customized to indicate that it must be a positive number.

Best Practices for Error Reporting

Effective error reporting involves providing clear, concise, and actionable feedback to users or developers. Here are some best practices:

  • Log errors: Use logging mechanisms to record validation errors for debugging and monitoring purposes.
  • Return user-friendly messages: When exposing errors to end-users, avoid technical jargon. Instead, provide clear instructions on how to correct the data.
  • Aggregate errors: When multiple fields are invalid, aggregate the errors into a single response to help users correct all issues at once.
  • Use consistent formats: Ensure that error messages follow a consistent format across the application for easier processing and understanding.

Examples of best practices in error reporting:

from pydantic import BaseModel, ValidationError, EmailStr
import logging

logging.basicConfig(level=logging.INFO)

class User(BaseModel):
    id: int
    name: str
    email: EmailStr

def create_user(data):
    try:
        user = User(**data)
        return user
    except ValidationError as e:
        logging.error("Validation error: %s", e.json())
        return {"error": "Invalid data provided", "details": e.errors()}

user_data = {'id': 'one', 'name': 'John Doe', 'email': 'invalid-email'}
response = create_user(user_data)

print(response)

# Output:
# ERROR:root:Validation error: [{"type":"int_parsing","loc":["id"],"msg":"Input should be a valid integer, unable to parse string as an integer","input":"one","url":"https://errors.pydantic.dev/2.8/v/int_parsing"},{"type":"value_error","loc":["email"],"msg":"value is not a valid email address: An email address must have an @-sign.","input":"invalid-email","ctx":{"reason":"An email address must have an @-sign."},"url":"https://errors.pydantic.dev/2.8/v/value_error"}]
# {'error': 'Invalid data provided', 'details': [{'type': 'int_parsing', 'loc': ('id',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'one', 'url': 'https://errors.pydantic.dev/2.8/v/int_parsing'}, {'type': 'value_error', 'loc': ('email',), 'msg': 'value is not a valid email address: An email address must have an @-sign.', 'input': 'invalid-email', 'ctx': {'reason': 'An email address must have an @-sign.'}}]}

In this example, validation errors are logged, and a user-friendly error message is returned, helping maintain application stability and providing useful feedback to the user.

Performance Considerations

Lazy initialization is a technique that postpones the creation of an object until it is needed.

In Pydantic, this can be useful for models with fields that are costly to compute or fetch. By delaying the initialization of these fields, you can reduce the initial load time and improve performance.

Example of lazy initialization:

from functools import cached_property
from pydantic import BaseModel, computed_field

class DataModel(BaseModel):
    name: str

    @computed_field  # expose the lazily computed value as part of the model
    @cached_property
    def expensive_computation(self) -> str:
        # Simulate an expensive computation; this body runs only on first access
        return "Computed Value"

data_model = DataModel(name="Test")
print(data_model.expensive_computation)

In this example, the expensive_computation value is computed only the first time it is accessed and then cached, avoiding unnecessary work during model initialization.

Redundant Validation

Pydantic models automatically validate data during initialization.

However, if you know that certain data has already been validated or if validation is not necessary in some contexts, you can disable validation to improve performance.

This can be done using the model_construct method, which bypasses validation:

Example of avoiding redundant validation:

from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    email: str

# Constructing a User instance without validation
data = {'id': 1, 'name': 'John Doe', 'email': 'john.doe@example.com'}
user = User.model_construct(**data)

In this example, User.model_construct is used to create a User instance without triggering validation, which can be useful in performance-critical sections of your code.

Efficient Data Parsing

When dealing with large datasets or high-throughput systems, efficiently parsing raw data becomes critical.

Pydantic provides the model_validate_json method, which can be used to parse JSON or other serialized data formats directly into Pydantic models.

Example of efficient data parsing:

from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    email: str

json_data = '{"id": 1, "name": "John Doe", "email": "john.doe@example.com"}'
user = User.model_validate_json(json_data)

print(user)

In this example, model_validate_json is used to parse JSON data into a User model directly, providing a more efficient way to handle serialized data.

Controlling Validation

Pydantic models can be configured to validate data only when necessary.

The validate_default and validate_assignment options in the Config class control when validation occurs, which can help improve performance:

  • validate_default: When set to False (the default), default values are not validated; only the data you actually provide is checked.
  • validate_assignment: When set to True, validation is performed on field assignment after the model is created.

Example configuration:

from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    email: str

    class Config:
        validate_default = False    # Only validate fields set during initialization
        validate_assignment = True  # Validate fields on assignment

user = User(id=1, name="John Doe", email="john.doe@example.com")
user.email = "new.email@example.com"  # This assignment will trigger validation

In this example, validate_default is set to False to avoid unnecessary validation during initialization, and validate_assignment is set to True to ensure that fields are validated when they are updated.

Settings Management

Pydantic’s BaseSettings class, provided by the separate pydantic-settings package, is designed for managing application settings, supporting environment variable loading and type validation.

This helps in configuring applications for different environments (e.g., development, testing, production).

Consider this .env file:

database_url=db
secret_key=sk
debug=False

Example of using BaseSettings:

from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    database_url: str
    secret_key: str
    debug: bool = False

    class Config:
        env_file = ".env"

settings = Settings()
print(settings.model_dump())

# Output:
# {'database_url': 'db', 'secret_key': 'sk', 'debug': False}

In this example, settings are loaded from environment variables, and the Config class specifies that variables can be loaded from a .env file.

To use BaseSettings, install the additional pydantic-settings package:

pip install pydantic-settings

Managing settings effectively involves a few best practices:

  • Use environment variables: Store configuration values in environment variables to keep sensitive data out of your codebase.
  • Provide defaults: Define sensible default values for configuration settings to ensure the application runs with minimal configuration.
  • Separate environments: Use different configuration files or environment variables for different environments (e.g., .env.development, .env.production).
  • Validate settings: Use Pydantic’s validation features to ensure all settings are correctly typed and within acceptable ranges (a combined sketch follows this list).
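
Here is a minimal sketch combining these ideas; the setting names and the APP_ENV variable are assumptions made up for this example:

import os
from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    database_url: str          # required: comes from the environment, never hard-coded
    secret_key: str            # required: keeps secrets out of the codebase
    debug: bool = False        # sensible default
    timeout_seconds: int = 30  # validated as an int, with a default

    class Config:
        # Pick the env file per environment, e.g. .env.development or .env.production
        env_file = f".env.{os.getenv('APP_ENV', 'development')}"

settings = Settings()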

Common Pitfalls and How to Avoid Them

One common mistake when using Pydantic is misapplying type annotations, which can lead to validation errors or unexpected behavior.

Here are a few typical mistakes and their solutions:

Misusing Union Types: Using Union incorrectly can complicate type validation and handling.

Optional Fields without Default Values: Forgetting to provide a default value for optional fields can lead to None values causing errors in your application.

Incorrect Type Annotations: Assigning incorrect types to fields can cause validation to fail. For example, using str for a field that should be an int.
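
To make these pitfalls concrete, here is a small sketch (the Profile model is invented for this example) showing the corrected patterns side by side:

from typing import Optional, Union
from pydantic import BaseModel

class Profile(BaseModel):
    user_id: int                    # correct annotation: an int, not a str
    nickname: Optional[str] = None  # optional field with an explicit default
    reference: Union[int, str]      # keep Union members few and clearly distinct

profile = Profile(user_id=1, reference=42)
print(profile)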

Ignoring Performance Implications

Ignoring performance implications when using Pydantic can lead to slow applications, especially when dealing with large datasets or frequent model instantiations.

Here are some strategies to avoid performance bottlenecks:

Leverage Configuration Options: Use Pydantic’s configuration options like validate_default and validate_assignment to control when validation occurs.

Optimize Nested Models: When working with nested models, ensure that you are not over-validating or duplicating validation logic.

Use Efficient Parsing Methods: Utilize model_validate_json and model_validate for efficient data parsing.

Avoid Unnecessary Validation: Use the model_construct method to create models without validation when the data is already known to be valid.
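
As a rough sketch of the parsing-related points, model_validate handles data that is already a Python dict, while model_construct skips validation entirely for data you trust:

from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    email: str

# Already-parsed data (e.g. from a message queue): validate the dict directly
payload = {"id": 1, "name": "John Doe", "email": "john.doe@example.com"}
user = User.model_validate(payload)

# Data you already trust (e.g. rows from your own database layer): skip validation
trusted_user = User.model_construct(**payload)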

Overcomplicating Models

Overcomplicating Pydantic models can make them difficult to maintain and understand. Here are some tips to keep models simple and maintainable:

Document Your Models: Use docstrings and comments to explain complex validation rules or business logic embedded in models.

Encapsulate Logic Appropriately: Keep validation and business logic within appropriate model methods or external utilities to avoid cluttering model definitions.

Use Inheritance Sparingly: While inheritance can promote code reuse, excessive use can make the model hierarchy complex and harder to follow.

Avoid Excessive Nesting: Deeply nested models can be hard to manage. Aim for a balanced level of nesting.
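
As a small sketch of the documentation tip (the Invoice model is invented for this example), a flat, documented model with Field descriptions is usually easier to maintain than a deeply nested hierarchy:

from pydantic import BaseModel, Field

class Invoice(BaseModel):
    """A single customer invoice.

    Amounts are stored in cents to avoid floating-point rounding issues.
    """

    number: str = Field(description="Human-readable invoice number, e.g. 'INV-001'")
    amount_cents: int = Field(ge=0, description="Total amount in cents")
    customer_email: str = Field(description="Billing contact for this invoice")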

Conclusion

In this guide, we have covered various best practices for using Pydantic effectively in your Python projects.

We began with the basics of getting started with Pydantic, including installation, basic usage, and defining models. We then delved into advanced features like custom types, serialization and deserialization, and settings management.

Key performance considerations, such as optimizing model initialization and efficient data parsing, were highlighted to ensure your applications run smoothly.

We also discussed common pitfalls, such as misusing type annotations, ignoring performance implications, and overcomplicating models, and provided strategies to avoid them.

Applying these best practices in your real-world projects will help you leverage the full power of Pydantic, making your code more robust, maintainable, and performant.

Follow me on Twitter: https://twitter.com/DevAsService

Follow me on Instagram at: https://www.instagram.com/devasservice/

Follow me on TikTok at: https://www.tiktok.com/@devasservice
