Is Your FastAPI App Missing the Magic of CI/CD with GitHub Actions?

FastAPI Deployment: From GitHub Actions to Traefik Magic

Deploying FastAPI apps efficiently can be a game-changer, especially if you harness the power of continuous integration and continuous deployment (CI/CD) pipelines. GitHub Actions is one of the best tools to make this magic happen. Let’s dive into how to set up a smooth CI/CD pipeline for FastAPI applications using GitHub Actions.

First things first, get your FastAPI project ready. Your code should already be nestled comfortably in a GitHub repository. Dependencies need to be managed with tools like pip or poetry, and you should have unit tests in place, possibly using pytest.
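If you're starting from scratch, a minimal project is enough to follow along. Here's a sketch, assuming the app lives in main.py and the tests in test_main.py (both names are just placeholders for this example):

# main.py -- a minimal FastAPI application
from fastapi import FastAPI

app = FastAPI()

@app.get("/health")
def health_check():
    return {"status": "ok"}

# test_main.py -- a pytest unit test using FastAPI's TestClient
from fastapi.testclient import TestClient

from main import app

client = TestClient(app)

def test_health_check():
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}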

Now, GitHub Actions is where the fun really begins. It’s like having an assistant that automates your build, test, and deployment workflows straight from your GitHub repository. Imagine triggering specific workflows whenever you push code or open a pull request—it’s a coder’s dream.

Creating a CI/CD pipeline begins with defining a workflow in a YAML file. Place this file in the .github/workflows directory of your repository. Here’s an example of a no-frills workflow that kicks in when you open a pull request on the master branch:

name: Pull Request

on:
  pull_request:
    branches:
      - master

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.9"]

    steps:
      - uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install flake8
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi

      - name: Static Code Linting with flake8
        run: |
          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
          flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics

      - name: Unit Testing with pytest
        # Uncomment and fill in any environment variables your tests need:
        # env:
        #   EXAMPLE_SETTING: value   # placeholder name
        run: |
          pytest

With this workflow, your code gets checked out, Python is set up, dependencies installed, code linted with flake8, and unit tests run with pytest.

Next up, let’s talk Docker. Docker containers wrap up your FastAPI application neatly, making it easier to ship and deploy. Extend the workflow to build a Docker image, push it to a registry like Docker Hub, and deploy it. Start by logging in to the registry, then building and pushing the image:

- name: Log in to Docker Hub
  # DOCKER_HUB_USERNAME and DOCKER_HUB_TOKEN are placeholder secret names; create your own.
  run: echo "${{ secrets.DOCKER_HUB_TOKEN }}" | docker login -u "${{ secrets.DOCKER_HUB_USERNAME }}" --password-stdin

- name: Build and push Docker image
  run: |
    docker build -t my-fastapi-app .
    docker tag my-fastapi-app:latest <your-docker-hub-username>/my-fastapi-app:latest
    docker push <your-docker-hub-username>/my-fastapi-app:latest
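This assumes a Dockerfile at the root of the repository. If you don’t have one yet, a minimal sketch might look like this (the base image, port, and uvicorn command are assumptions to adapt to your project):

FROM python:3.9-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

EXPOSE 8000

# Serve the FastAPI instance `app` defined in main.py with uvicorn
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]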

Then, deploy it to a remote server:

- name: Deploy to remote server
  uses: appleboy/ssh-action@v1
  with:
    host: ${{ secrets.REMOTE_SERVER_HOST }}
    username: ${{ secrets.REMOTE_SERVER_USERNAME }}
    key: ${{ secrets.REMOTE_SERVER_PRIVATE_KEY }}
    script: |
      docker pull <your-docker-hub-username>/my-fastapi-app:latest
      docker-compose up -d

This uses appleboy/ssh-action to open an SSH session on your remote server and run the script there, pulling the latest Docker image and spinning the containers back up with Docker Compose. Because the action handles the SSH connection itself, the script doesn’t need to call ssh again.
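For docker-compose up -d to do anything useful, the server needs a compose file that points at the pushed image. A minimal sketch, assuming it sits in the directory where the script runs:

# docker-compose.yml on the remote server (minimal sketch)
version: '3'
services:
  fastapi:
    image: <your-docker-hub-username>/my-fastapi-app:latest
    restart: always
    ports:
      - "8000:8000"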

Continuous deployment (CD) takes it a step further, ensuring that your app automatically gets deployed to production after passing all tests and checks. Here’s how you can set that up:

name: Deploy to Production

on:
  push:
    branches:
      - master

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.9'

      - name: Log in to Docker Hub
        # DOCKER_HUB_USERNAME and DOCKER_HUB_TOKEN are placeholder secret names; create your own.
        run: echo "${{ secrets.DOCKER_HUB_TOKEN }}" | docker login -u "${{ secrets.DOCKER_HUB_USERNAME }}" --password-stdin

      - name: Build and push Docker image
        run: |
          docker build -t my-fastapi-app .
          docker tag my-fastapi-app:latest <your-docker-hub-username>/my-fastapi-app:latest
          docker push <your-docker-hub-username>/my-fastapi-app:latest

      - name: Deploy to remote server
        uses: appleboy/ssh-action@v1
        with:
          host: ${{ secrets.REMOTE_SERVER_HOST }}
          username: ${{ secrets.REMOTE_SERVER_USERNAME }}
          key: ${{ secrets.REMOTE_SERVER_PRIVATE_KEY }}
          script: |
            docker pull <your-docker-hub-username>/my-fastapi-app:latest
            docker-compose up -d

Every time you push code to the master branch, this workflow rolls out the updates automatically.

To jazz things up even more, you could put Traefik in front as a reverse proxy that handles HTTPS. Traefik routes incoming requests to your FastAPI application and terminates TLS, so traffic to your domain travels over encrypted connections with proper SSL certificates.

Here’s a sneak peek into configuring Traefik alongside your FastAPI service using Docker Compose:

version: '3'
services:
  traefik:
    image: traefik:v2.4
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./traefik.yml:/etc/traefik/traefik.yml
      - ./certs:/etc/traefik/certs
      - /var/run/docker.sock:/var/run/docker.sock
    command: --log.level=DEBUG

  fastapi:
    build: .
    ports:
      - "8000:8000"
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.fastapi.rule=Host(`your-domain.com`)"
      - "traefik.http.routers.fastapi.tls=true"
      - "traefik.http.routers.fastapi.tls.certresolver=letsencrypt"
      # Tell Traefik explicitly which container port to forward traffic to
      - "traefik.http.services.fastapi.loadbalancer.server.port=8000"

This configuration ensures Traefik handles HTTP and HTTPS requests, routing them right into your FastAPI app.
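Note that the compose file mounts a traefik.yml static configuration that isn’t shown above. A minimal sketch, assuming you want the Docker provider plus a Let’s Encrypt resolver named letsencrypt to match the certresolver label:

# traefik.yml (static configuration, minimal sketch)
entryPoints:
  web:
    address: ":80"
  websecure:
    address: ":443"

providers:
  docker:
    exposedByDefault: false   # only route containers that set traefik.enable=true

certificatesResolvers:
  letsencrypt:
    acme:
      email: you@your-domain.com          # placeholder address
      storage: /etc/traefik/certs/acme.json
      httpChallenge:
        entryPoint: web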

Let’s not forget about secrets. Any sensitive information like server credentials or API keys should be managed wisely. GitHub Actions lets you store these secrets securely, easily referenced in your workflows using ${{ secrets.SECRET_NAME }}.
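You can add them under your repository’s Settings, or straight from the terminal if you use the GitHub CLI:

# Store deployment secrets in the repository (values are placeholders)
gh secret set REMOTE_SERVER_HOST --body "203.0.113.10"
gh secret set REMOTE_SERVER_USERNAME --body "deploy"
gh secret set REMOTE_SERVER_PRIVATE_KEY < ~/.ssh/id_ed25519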

For more complex needs, you might opt for a self-hosted GitHub Actions runner. This allows actions to run on your own infrastructure, giving you even more control.

Setting up the runner involves following the official guide to get everything installed and configured. Run it as a service so it keeps picking up jobs after reboots or dropped sessions.
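Once the runner is registered, pointing a job at it is just a matter of changing the runs-on key:

jobs:
  deploy:
    # Runs on your own registered machine instead of a GitHub-hosted VM
    runs-on: self-hosted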

Wrapping it all up, getting your FastAPI app deployed with GitHub Actions is a series of steps: prepping your environment, defining workflows, building with Docker, and rolling out the app while keeping secrets safe. Automate as much as possible to reduce human errors and maintain consistency.

Using GitHub Actions, you’ll have a nimble and reliable CI/CD pipeline, keeping your FastAPI application always up-to-date and performance-ready.

Keywords: FastAPI, CI/CD, GitHub Actions, Docker, pytest, continuous deployment, YAML workflows, Docker Compose, Traefik, secrets management


