Deploying FastAPI Apps with AWS Lambda for a Hassle-Free Serverless Setup
FastAPI has quickly become a go-to framework for building web APIs, thanks to its speed, robust features, and simplicity. However, deploying these applications in a serverless environment like AWS Lambda requires a bit of finesse. Let’s walk through the process of taking your FastAPI app and making it seamlessly run on AWS Lambda.
What’s the Big Deal about Serverless Deployment?
Going serverless can make your life easier while also potentially cutting costs. With AWS Lambda, you only pay for the compute time you use, meaning you don’t need to fret over server maintenance or management. It scales automatically, so you can focus your energy on writing great code and let AWS worry about the heavy lifting.
What You Need Before You Start
A few things are essential to have in place before you jump into deployment. Make sure you have:
- Python: This is where your FastAPI app will live and breathe; this guide targets Python 3.9 to match the Lambda runtime used later.
- Docker: Handy for containerizing your app. It’s not mandatory, but highly recommended for consistency between your local and deployed environments.
- AWS CLI: This will let you interact with AWS services straight from your command line.
- AWS Account: Because you’ll need one to use AWS Lambda, API Gateway, and (optionally) ECR.
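If you follow the Serverless Framework route below, you’ll also need its CLI, which is usually installed through npm (so Node.js is an extra, implicit prerequisite). A quick, optional sanity check of the toolchain might look like this:

```bash
# Verify the tools are on your PATH (exact versions will vary)
python --version
docker --version
aws --version

# Install the Serverless Framework CLI (requires Node.js/npm)
npm install -g serverless
serverless --version
```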
Kicking Off Your FastAPI Application
Let’s start with a small FastAPI app. Trust me, this is as easy as it gets:
```python
from fastapi import FastAPI

app = FastAPI()


@app.get("/")
async def root():
    return {"message": "Hello World!"}


@app.get("/hello/{name}")
async def hello(name: str):
    return {"message": f"Hello from FastAPI, {name}!"}
```
This little snippet sets up two simple endpoints – one to say “Hello World!” and another to greet users by their name.
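Before touching Lambda at all, you can sanity-check those routes locally. Here’s a minimal sketch using FastAPI’s TestClient, assuming you saved the snippet above as app.py and have httpx installed (the test client depends on it):

```python
# local_test.py - quick check of the two routes, no server required
from fastapi.testclient import TestClient

from app import app  # assumes the snippet above lives in app.py

client = TestClient(app)

print(client.get("/").json())           # {'message': 'Hello World!'}
print(client.get("/hello/Ada").json())  # {'message': 'Hello from FastAPI, Ada!'}
```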
Making FastAPI Lambda-Compatible with Mangum
To get your FastAPI app up and running on AWS Lambda, you need Mangum – a nifty library that acts as a bridge between AWS Lambda events and FastAPI (ASGI) applications.
First, install Mangum:
```bash
pip install mangum
```
Next, wrap your FastAPI app with Mangum:
```python
from fastapi import FastAPI
from mangum import Mangum

app = FastAPI()


@app.get("/")
async def root():
    return {"message": "Hello World!"}


@app.get("/hello/{name}")
async def hello(name: str):
    return {"message": f"Hello from FastAPI, {name}!"}


handler = Mangum(app)  # the Lambda entry point that wraps the ASGI app
```
And just like that, your FastAPI app is Lambda-ready.
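If you want to convince yourself before deploying, you can call the handler directly with a hand-rolled event shaped like what API Gateway’s HTTP API sends. This is only a rough sketch: real events carry many more fields, and the exact keys below are assumptions based on the HTTP API (payload v2) format.

```python
from app import handler  # the Mangum-wrapped app from above (assumes app.py)

# Minimal, hand-crafted HTTP API (payload v2) event for GET /
sample_event = {
    "version": "2.0",
    "routeKey": "GET /",
    "rawPath": "/",
    "rawQueryString": "",
    "headers": {"host": "localhost"},
    "requestContext": {
        "http": {
            "method": "GET",
            "path": "/",
            "protocol": "HTTP/1.1",
            "sourceIp": "127.0.0.1",
        },
    },
    "isBase64Encoded": False,
}

# Prints a Lambda-style response dict (status code, headers, body)
print(handler(sample_event, None))
```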
Deploying with Serverless Framework
Serverless Framework is a powerful ally when it comes to deploying serverless apps. Let’s break down how to get your FastAPI app deployed:
- Create a `serverless.yml` File: This file is the lifeblood of your deployment configuration.
```yaml
service: serverless-fastapi

frameworkVersion: '3'

provider:
  name: aws
  runtime: python3.9

functions:
  api:
    handler: app.handler
    events:
      - httpApi: '*'
```
- List Your Dependencies: Pop your dependencies into a `requirements.txt` file (see the note on bundling them after these steps).
```text
fastapi==0.89.1
mangum==0.17.0
```
- Deploy: Time to run the magic command:
```bash
serverless deploy
```
This command will package your application, send it over to AWS, and handle the rest.
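One caveat worth flagging: by default, the Serverless Framework packages your code but does not install the packages from `requirements.txt` into the artifact. A common way to handle this (an addition on top of the configuration above, not something it already includes) is the serverless-python-requirements plugin:

```yaml
# In serverless.yml: bundle the packages listed in requirements.txt at deploy time.
# Install the plugin first: serverless plugin install -n serverless-python-requirements
plugins:
  - serverless-python-requirements
```

With the plugin in place, `serverless deploy` builds your dependencies into the package for you.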
Using AWS API Gateway for Routing
API Gateway is your friend here, routing requests to your Lambda function.
- Create an API Gateway: Serverless Framework usually creates this for you, but you can also set it up manually through the AWS console.
- Route Requests: The configuration in the `serverless.yml` file takes care of routing through the `httpApi` event.
- Test Your API: Once deployed, head to the URL provided by API Gateway to test your endpoints.
```text
https://a7kxkebqij.execute-api.us-east-1.amazonaws.com/dev
```
This URL lets you test if your Lambda function returns the expected responses.
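A quick way to check is curl. Substitute the base URL that `serverless deploy` printed for your stack; the one above is just an example:

```bash
curl https://a7kxkebqij.execute-api.us-east-1.amazonaws.com/dev/           # should return the Hello World message
curl https://a7kxkebqij.execute-api.us-east-1.amazonaws.com/dev/hello/Ada  # should return the personalized greeting
```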
Optional: Containerizing Your App
Although it’s optional, containerizing your app with Docker adds consistency and gives you tight control over your dependencies. One thing to note: Lambda runs container images through its runtime API, so the image should be built from an AWS Lambda base image (which ships with the runtime interface client) rather than simply launching uvicorn. Here’s how to do it:
- Create a Dockerfile:
```dockerfile
# Use the AWS Lambda Python base image so the runtime interface client is included
FROM public.ecr.aws/lambda/python:3.9

COPY requirements.txt ${LAMBDA_TASK_ROOT}
RUN pip install -r requirements.txt

# Copy the application code and point Lambda at the Mangum handler in app.py
COPY . ${LAMBDA_TASK_ROOT}
CMD ["app.handler"]
```
- Build and Push the Image: Build your Docker image and push it to Amazon Elastic Container Registry (ECR). If the push is rejected for authentication, see the note after these steps.

```bash
docker build -t my-fastapi-app .
docker tag my-fastapi-app:latest <account_id>.dkr.ecr.<region>.amazonaws.com/my-fastapi-app:latest
docker push <account_id>.dkr.ecr.<region>.amazonaws.com/my-fastapi-app:latest
```
- Deploy the Containerized Application: Create a Lambda function from the ECR image and connect it to API Gateway.
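The push step assumes the ECR repository already exists and that your Docker client is logged in to it. If not, these standard AWS CLI commands (using the same placeholders as above) take care of both:

```bash
# One-time: create the repository
aws ecr create-repository --repository-name my-fastapi-app --region <region>

# Authenticate Docker with your ECR registry
aws ecr get-login-password --region <region> | \
  docker login --username AWS --password-stdin <account_id>.dkr.ecr.<region>.amazonaws.com
```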
Heads-Up and Handy Hints
Deploying FastAPI on AWS Lambda is awesome, but here are a few things to keep in mind:
- Cold Start: The first request to a fresh execution environment takes longer while Lambda initializes your function; requests that reuse a warm environment respond quickly. If cold starts matter for your workload, provisioned concurrency can help (see the sketch after this list).
- Package Size: AWS Lambda allows 250 MB for unzipped .zip deployment packages (container images can be up to 10 GB). Keep your deployment package lean by excluding unnecessary files (also in the sketch after this list).
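Both concerns can be addressed in `serverless.yml`. The snippet below is an optional sketch, not part of the minimal configuration earlier: `provisionedConcurrency` keeps a warm instance around (at extra cost), and `package.patterns` trims what gets uploaded.

```yaml
functions:
  api:
    handler: app.handler
    provisionedConcurrency: 1   # keep one warm execution environment (billed separately)
    events:
      - httpApi: '*'

package:
  patterns:                     # exclude files that don't need to ship to Lambda
    - '!.git/**'
    - '!tests/**'
    - '!**/__pycache__/**'
```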
Wrapping Up
Deploying a FastAPI app on AWS Lambda with the Serverless Framework and Mangum is pretty straightforward. It lets you enjoy the perks of a serverless architecture, making your API scalable and cost-effective. Whether you choose to go the Docker route or stick with a direct deployment, the key is to have a well-configured setup optimized for performance.
With this guide, you’re all set to create an efficient, serverless FastAPI app. Go ahead, deploy with confidence, and let your API soar!