Deploying a FastAPI application to cloud platforms like AWS or Google Cloud can be a breeze when you use CI/CD pipelines. This approach ensures your application gets built, tested, and deployed smoothly, slashing the time and hassle it takes to go live.
First off, you need your FastAPI application set up the right way. Here’s a simple illustration:
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}

@app.get("/items/{item_id}")
def read_item(item_id: int):
    return {"item_id": item_id}
This example lays out a straightforward FastAPI application with two endpoints: one for the root URL and another for fetching items by their ID.
Now, let’s talk about containerization using Docker. To run your FastAPI app on cloud platforms, you’ll package it in a Docker container. That means writing a Dockerfile, which describes how to build your application’s Docker image. Here’s an example of a Dockerfile:
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8080"]
And in the requirements.txt file, your app might need something like:
fastapi==0.110.3
uvicorn==0.29.0
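One small addition worth making alongside these files is a .dockerignore, so local virtual environments and caches don’t get copied into the image by the `COPY . .` step. A minimal sketch (the entries are just common candidates, adjust to your project):

.dockerignore:
__pycache__/
*.pyc
.venv/
.git/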
When it comes to deploying on Google Cloud Platform (GCP), you’ve got options like App Engine and Cloud Run.
For App Engine, get an app.yaml file ready:
runtime: python310
entrypoint: gunicorn -k uvicorn.workers.UvicornWorker main:app
instance_class: F1
automatic_scaling:
  max_instances: 1
handlers:
- url: /.*
  script: auto
A couple of things to note: the runtime matches the Python 3.10 image used in the Dockerfile, and because FastAPI is an ASGI application, the entrypoint runs gunicorn with uvicorn workers; add gunicorn to your requirements.txt for this to work.
After that, deploy with the gcloud CLI:
gcloud app create --project=your-project-id
gcloud app deploy app.yaml --project=your-project-id
Just swap your-project-id with the GCP project ID you’re using. Once it’s live, your app will be reachable at something like your-project-id.appspot.com.
For Cloud Run, it’s a bit different:
First, build and push your Docker image:
docker build -t gcr.io/your-project-id/your-image-name .
docker push gcr.io/your-project-id/your-image-name
Then, deploy to Cloud Run:
gcloud run deploy your-service-name --image gcr.io/your-project-id/your-image-name --platform managed --allow-unauthenticated
You’ll get a URL from the deployment command that you can use to access your app.
Turning to AWS, services like AWS Elastic Container Service (ECS) and AWS Lambda offer great solutions.
For ECS, here’s the rundown:
First, authenticate Docker with Amazon ECR, then build and push your image:
aws ecr get-login-password --region <region> | docker login --username AWS --password-stdin <account_id>.dkr.ecr.<region>.amazonaws.com
docker build -t your-image-name .
docker tag your-image-name:latest <account_id>.dkr.ecr.<region>.amazonaws.com/your-repo-name:latest
docker push <account_id>.dkr.ecr.<region>.amazonaws.com/your-repo-name:latest
Then, set up an ECS cluster and task definition, either via the AWS Management Console or the AWS CLI.
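The task definition is where you tell ECS which image to run and how. Here’s a minimal Fargate-style sketch; the names, role ARN, and CPU/memory sizes are placeholders you’d adapt to your account:

{
  "family": "your-task-definition",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::<account_id>:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "your-container-name",
      "image": "<account_id>.dkr.ecr.<region>.amazonaws.com/your-repo-name:latest",
      "portMappings": [{ "containerPort": 8080 }],
      "essential": true
    }
  ]
}

Saved as task-definition.json, it can be registered with aws ecs register-task-definition --cli-input-json file://task-definition.json.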
Finally, deploy it:
aws ecs create-service --cluster your-cluster-name --service-name your-service-name --task-definition your-task-definition --desired-count 1
You can then access it via the load balancer or service endpoint.
Now, let’s delve into CI/CD integration. Automating the build, test, and deployment processes is key to keeping your app updated without breaking a sweat. GitHub Actions is an excellent option for both AWS and GCP.
For AWS, here’s an example GitHub Actions workflow. It configures credentials, logs in to Amazon ECR, pushes the image, and forces a new ECS deployment so the service pulls the fresh :latest tag:
name: Deploy to AWS ECS
on:
  push:
    branches:
      - main
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-west-2
      - name: Login to Amazon ECR
        uses: aws-actions/amazon-ecr-login@v2
      - name: Build and push image
        run: |
          docker build -t ${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.us-west-2.amazonaws.com/your-repo-name:latest .
          docker push ${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.us-west-2.amazonaws.com/your-repo-name:latest
      - name: Deploy to ECS
        run: |
          aws ecs update-service --cluster your-cluster-name --service your-service-name --force-new-deployment
And for GCP, here’s the GitHub Actions setup, authenticating with a service-account key stored in the GCP_CREDENTIALS secret:
name: Deploy to GCP App Engine
on:
  push:
    branches:
      - main
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Authenticate to GCP
        uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_CREDENTIALS }}
      - name: Set up the gcloud CLI
        uses: google-github-actions/setup-gcloud@v2
      - name: Deploy to App Engine
        run: |
          gcloud app deploy app.yaml --project=your-project-id --quiet
Monitoring and logging are also crucial once your app is live. On GCP, the gcloud CLI helps you keep track by streaming logs:
gcloud app logs tail -s default
AWS CloudWatch is your go-to for logging and monitoring on AWS. Set up your ECS tasks to send logs to CloudWatch:
{
  "containerDefinitions": [
    {
      "name": "your-container-name",
      "image": "<account_id>.dkr.ecr.<region>.amazonaws.com/your-repo-name:latest",
      "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
          "awslogs-group": "your-log-group",
          "awslogs-region": "us-west-2",
          "awslogs-stream-prefix": "your-app"
        }
      }
    }
  ]
}
Deploying a FastAPI app to the cloud with continuous integration and deployment not only keeps your app up-to-date but also ensures it runs like a well-oiled machine. Containers with Docker, automated deployments with GitHub Actions, and proper logging and monitoring tools take the heavy lifting off your shoulders. Whether you go for App Engine, Cloud Run, or ECS, the key is to build a workflow that suits your team’s needs, making sure your app is always ready to deliver to your users efficiently.