How Can You Effortlessly Manage Multiple Databases in FastAPI?

Navigating the Multiverse of Databases with FastAPI: A Tale of Configuration and Connection

Let’s dive into the world of managing multiple databases in a FastAPI project. It may sound like a maze at first, but with a sprinkle of the right tools and steps, we can make it a breeze.

First things first, we need to get a few libraries on board. For this, you’ll want to grab FastAPI, SQLAlchemy, and Alembic. These are our main players: FastAPI serves the API, SQLAlchemy handles object-relational mapping, and Alembic drives database migrations.

pip install fastapi sqlalchemy alembic

If MySQL is your database of choice, make sure to add the mysql-connector-python library too.

pip install mysql-connector-python

Now, let’s get started with configuring our database connections. For multiple databases, we need to set up separate connection strings for each. We can keep this tidy in our configuration file.

from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

# Database configs
db1_config = {
    'username': 'username1',
    'password': 'password1',
    'ip': 'localhost',
    'name': 'db1',
    'port': '3306'
}

db2_config = {
    'username': 'username2',
    'password': 'password2',
    'ip': 'localhost',
    'name': 'db2',
    'port': '3306'
}

# Create engines for each database
engine1 = create_engine(f'mysql+mysqlconnector://{db1_config["username"]}:{db1_config["password"]}@{db1_config["ip"]}:{db1_config["port"]}/{db1_config["name"]}')
engine2 = create_engine(f'mysql+mysqlconnector://{db2_config["username"]}:{db2_config["password"]}@{db2_config["ip"]}:{db2_config["port"]}/{db2_config["name"]}')

# Session makers
Session1 = sessionmaker(bind=engine1)
Session2 = sessionmaker(bind=engine2)

# Base model classes
Base1 = declarative_base()
Base2 = declarative_base()

Once the connection setup is done, we move on to defining our database models. Each model inherits from the Base class that belongs to its database.

from sqlalchemy import Column, Integer, String

# Models for Database 1
# Note: MySQL needs an explicit length for String (VARCHAR) columns
class User1(Base1):
    __tablename__ = "users1"
    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(100), index=True)
    email = Column(String(255), unique=True, index=True)


# Models for Database 2
class User2(Base2):
    __tablename__ = "users2"
    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(100), index=True)
    email = Column(String(255), unique=True, index=True)
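
Before wiring up migrations, it can be handy to create the tables directly from each Base’s metadata for quick local testing; in a real project, the Alembic setup shown later would normally own schema changes. A minimal sketch using the engines defined above:

# Create the tables for each database on its own engine (handy for quick local testing)
Base1.metadata.create_all(bind=engine1)
Base2.metadata.create_all(bind=engine2)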

One of the key features of FastAPI is its dependency injection system, which simplifies the management of database sessions. This lets us define functions that yield database sessions on a per-request basis.

def get_db1():
    db = Session1()
    try:
        yield db
    finally:
        db.close()

def get_db2():
    db = Session2()
    try:
        yield db
    finally:
        db.close()

Now we can use these dependency functions in our FastAPI routes to interact with the corresponding databases.

from fastapi import FastAPI, Depends
from sqlalchemy.orm import Session

app = FastAPI()

@app.get("/users1/")
def read_users1(db: Session = Depends(get_db1)):
    users = db.query(User1).all()
    return users

@app.get("/users2/")
def read_users2(db: Session = Depends(get_db2)):
    users = db.query(User2).all()
    return users
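
Nothing stops a single endpoint from depending on both sessions at once. Here is a small sketch (the route path and response shape are just illustrative) that reads from both databases in one request:

@app.get("/users/all/")
def read_all_users(db1: Session = Depends(get_db1), db2: Session = Depends(get_db2)):
    # Each dependency opens and closes its own session for this request
    return {
        "db1_users": db1.query(User1).all(),
        "db2_users": db2.query(User2).all(),
    }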

Sometimes you might need to choose which database to use dynamically, based on the request itself. Middleware can save the day here: it can inspect each incoming request, open a session against the right database, and attach it to request.state for the route handlers to use.

from fastapi import FastAPI, Request
from starlette.middleware.base import BaseHTTPMiddleware

class DynamicDBMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next):
        # Pick the session factory based on the request path and open a session
        session_factory = Session1 if "api1" in request.url.path else Session2
        request.state.db = session_factory()
        try:
            response = await call_next(request)
        finally:
            # Always close the session, even if the route raised an error
            request.state.db.close()
        return response

app.add_middleware(DynamicDBMiddleware)
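
A route can then pull the session straight off the request. This is a minimal sketch assuming the middleware above is registered; the /api1/users/ path is only illustrative:

@app.get("/api1/users/")
def read_users_dynamic(request: Request):
    # The middleware has already attached the session for this path to request.state
    db: Session = request.state.db
    return db.query(User1).all()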

When it comes to database migrations, Alembic is the tool of choice. Initialize Alembic in your project to get started:

alembic init alembic

Then, configure Alembic to be aware of your multiple databases in its env.py file.

from logging.config import fileConfig

from sqlalchemy import create_engine
from alembic import context

from config import db1_config, db2_config
from models import Base1, Base2  # adjust this import to wherever your Base classes live

config = context.config
fileConfig(config.config_file_name)

def build_url(cfg):
    return (f"mysql+mysqlconnector://{cfg['username']}:{cfg['password']}"
            f"@{cfg['ip']}:{cfg['port']}/{cfg['name']}")

# Pair each database's metadata with its connection URL
databases = [
    (Base1.metadata, build_url(db1_config)),
    (Base2.metadata, build_url(db2_config)),
]

def run_migrations_offline():
    # Offline mode emits SQL scripts without connecting to the databases
    for metadata, url in databases:
        context.configure(url=url, target_metadata=metadata, literal_binds=True)
        with context.begin_transaction():
            context.run_migrations()

def run_migrations_online():
    # Online mode connects to each database in turn and applies the migrations
    for metadata, url in databases:
        engine = create_engine(url)
        with engine.connect() as connection:
            context.configure(connection=connection, target_metadata=metadata)
            with context.begin_transaction():
                context.run_migrations()

if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
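
With env.py in place, the usual Alembic commands drive the migrations; upgrade head will walk each configured database in turn. A minimal sketch (the revision message is just an example):

alembic revision -m "create users tables"
alembic upgrade head

Keep in mind that this simple setup shares one revision history across both databases. If each database needs its own independent history, Alembic’s built-in multidb template (alembic init --template multidb) is designed for exactly that.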

And there you go. Managing multiple databases in a FastAPI project doesn’t have to be daunting. With clear configuration, FastAPI’s dependency injection system, and a touch of middleware, much of the complexity falls away. Alembic keeps the schemas of your different databases consistent, which makes the application easier to scale and maintain. This structured approach not only makes database management more efficient but also helps your application handle different scenarios and environments smoothly.

Keywords: FastAPI, SQLAlchemy, alembic, multiple databases, MySQL, dependency injection, database migration, dynamic database, middleware, database configuration


