I’ve been blogging for a few years now. I write about CI/CD, containers, automation tools, and my experiences in the cloud. I’ve published some posts on Medium and dev.to,
but I missed the satisfaction of publishing under my own domain.
I considered Hashnode, but as a Cloud & DevOps engineer I wanted to take responsibility for hosting my website in a secure, highly available, and automated way!

As an AWS Community Builder it’s not entirely coincidental that I chose AWS as the cloud provider to host my website. The website itself should be very fast and support Markdown. After doing my research I decided to go with Hugo as my open-source static site generator. Let’s first take a look at the architecture.

Amazon Simple Storage Service (S3)

Let’s start at the bottom. I’m using Amazon S3 (standard storage class) to store my content.
I won’t go into too much detail, but the chance of losing data stored in S3 is almost non-existent. Durability isn’t the only strength of S3; so is availability. In the worst-case scenario your data would be unavailable for a few hours per year.
S3 is a regional service, which means your data is stored in a specific AWS region (eu-west-1 in my case). For my personal website that’s fine; there is no need to set up cross-region replication to copy the data to additional AWS regions.
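
Later in this post I define all of this with the AWS CDK in Python. As a rough sketch (construct IDs and the bucket_name attribute are illustrative, not my exact code), the bucket could be defined like this:

# Sketch: a private S3 bucket for the site content (illustrative names).
from aws_cdk import aws_s3 as s3

def __create_bucket(self):
    self.bucket = s3.Bucket(
        self, 'WebsiteBucket',
        bucket_name=self.__bucket_name,  # passed in via CDK context
        # Keep the bucket private; in this sketch CloudFront reads it through
        # an origin access identity created by the distribution (next section).
        block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        encryption=s3.BucketEncryption.S3_MANAGED,
    )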

Amazon CloudFront

Keeping my data in a single geographical region does have one important disadvantage: latency. I live in Belgium, so connecting to the eu-west-1 region in Dublin, Ireland is fine for me.
Visitors from Australia, however, will see noticeably higher latency. AWS offers CloudFront as a solution to this. Amazon CloudFront is a fast content delivery network (CDN) service that securely delivers my data to people globally with low latency and high transfer speeds.
CloudFront uses edge locations where the data is cached close to the visitors. By default the data is cached for 24 hours. The red arrows below show the short latency between visitors and the edge locations. The blue lines represent the one-time connection an edge location makes to the bucket to fetch the content and cache it.
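
In CDK (Python) terms the distribution could look roughly like this. It is a sketch rather than my exact code; the certificate and domain name attributes come from the ACM and Route53 sections below:

# Sketch: CloudFront distribution in front of the bucket (illustrative names).
from aws_cdk import aws_cloudfront as cloudfront
from aws_cdk import aws_cloudfront_origins as origins

def __create_distribution(self):
    self.distribution = cloudfront.Distribution(
        self, 'WebsiteDistribution',
        default_behavior=cloudfront.BehaviorOptions(
            # S3Origin creates an origin access identity so the private
            # bucket is only reachable through CloudFront.
            origin=origins.S3Origin(self.bucket),
            viewer_protocol_policy=cloudfront.ViewerProtocolPolicy.REDIRECT_TO_HTTPS,
            # CACHING_OPTIMIZED uses a default TTL of 24 hours.
            cache_policy=cloudfront.CachePolicy.CACHING_OPTIMIZED,
        ),
        domain_names=[self.__domain_name, self.__www_domain_name],
        certificate=self.certificate,
        default_root_object='index.html',
    )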

Amazon CloudFront Functions

To improve security, we can use CloudFront Functions to manage security headers. Security headers are a group of headers in the HTTP response from a server that tell your browser how to behave when handling your site’s content.
For example, X-XSS-Protection is a header that Internet Explorer and older versions of Chrome respect to stop pages from loading when they detect cross-site scripting (XSS) attacks.
The security score of a website improves when you define these headers.
CloudFront Functions are fairly new. In many cases they can replace the older Lambda@Edge solution. This article explains the differences well.
Below you can find the code I use to add the headers.

function handler(event) {
    var response = event.response;
    var headers = response.headers;
    
    headers['strict-transport-security'] = { value: 'max-age=63072000; includeSubdomains; preload'}; 
    //CSP clashes with hugo inline scripts:
    //headers['content-security-policy'] are not (yet) set
    headers['x-content-type-options'] = { value: 'nosniff'}; 
    headers['x-xss-protection'] = {value: '1; mode=block'};
    headers['referrer-policy'] = {value: 'same-origin'};
    headers['x-frame-options'] = {value: 'DENY'};
    
    return response;
}
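
The function above runs on the viewer response, so it has to be associated with the distribution as a viewer-response function. In CDK (Python) that could look like the following sketch; the file path and construct IDs are illustrative:

# Sketch: register the JavaScript above as a CloudFront Function (inside the stack class).
from aws_cdk import aws_cloudfront as cloudfront

def __create_headers_function(self):
    self.headers_fn = cloudfront.Function(
        self, 'SecurityHeadersFunction',
        # illustrative path to the JavaScript shown above
        code=cloudfront.FunctionCode.from_file(file_path='functions/security_headers.js'),
    )
    # Attached inside cloudfront.BehaviorOptions(...) of the distribution:
    # function_associations=[cloudfront.FunctionAssociation(
    #     function=self.headers_fn,
    #     event_type=cloudfront.FunctionEventType.VIEWER_RESPONSE,
    # )]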

My site achieves a ‘B’ instead of an ‘A+’ score because I didn’t enable the Content-Security-Policy response header. It clashed with some inline scripts used by my Hugo template.

The score before adding the CloudFront function
The score after adding the CloudFront function

I use another CloudFront Function to redirect https://www.lvthillo.com to https://lvthillo.com. Without this function both the www and non-www URLs would work, but that’s bad from an SEO perspective: Google would treat them as two separate websites.
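
I haven’t included my exact redirect code here, but a minimal viewer-request function doing this redirect, created inline from the CDK stack, could look like the following sketch (names are illustrative):

# Sketch: www -> apex redirect as a viewer-request CloudFront Function.
from aws_cdk import aws_cloudfront as cloudfront

def __create_redirect_function(self):
    redirect_js = """
    function handler(event) {
        var request = event.request;
        var host = request.headers.host.value;
        if (host.indexOf('www.') === 0) {
            // 301 to the apex domain, keeping the requested path
            return {
                statusCode: 301,
                statusDescription: 'Moved Permanently',
                headers: { 'location': { value: 'https://' + host.substring(4) + request.uri } }
            };
        }
        return request;
    }
    """
    self.redirect_fn = cloudfront.Function(
        self, 'WwwRedirectFunction',
        code=cloudfront.FunctionCode.from_inline(redirect_js),
    )
    # Attached via function_associations with FunctionEventType.VIEWER_REQUEST,
    # the same pattern as the security headers function above.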

Amazon Route53

CloudFront provides a domain name for the distribution, such as d111111abcdef8.cloudfront.net. If you want to use your own domain name, you can add an alternate domain name to your CloudFront distribution and create an alias or CNAME record in Route53.
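
In CDK (Python), alias records for both the apex and the www name could be created roughly like this (a sketch; construct IDs are illustrative):

# Sketch: alias records for the apex and www names, pointing at CloudFront.
from aws_cdk import aws_route53 as route53
from aws_cdk import aws_route53_targets as targets

def __create_dns_records(self, hosted_zone):
    cloudfront_target = route53.RecordTarget.from_alias(
        targets.CloudFrontTarget(self.distribution))
    route53.ARecord(self, 'ApexAliasRecord',
        zone=hosted_zone,
        record_name=self.__domain_name,
        target=cloudfront_target,
    )
    route53.ARecord(self, 'WwwAliasRecord',
        zone=hosted_zone,
        record_name=self.__www_domain_name,
        target=cloudfront_target,
    )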

AWS Certificate Manager

I’m using ACM to request an SSL/TLS certificate, which I’ll attach to my distribution.
Public SSL/TLS certificates provisioned through AWS Certificate Manager are free.
The certificate must be requested in us-east-1 if you want to attach it to a CloudFront distribution!

AWS Cloud Development Kit

The AWS Cloud Development Kit (AWS CDK) is an open-source software development framework for defining your cloud resources using familiar programming languages.
I’m using Python to describe my infrastructure as code (IaC).
CDK lets you define your infrastructure using higher-level abstractions, but it still leverages CloudFormation under the hood. You can find the code on my GitHub.

$ npm install -g aws-cdk
$ cdk --version
$ git clone https://github.com/lvthillo/blog-infra.git 
$ cd blog-infra
$ python3 -m venv .venv
$ source .venv/bin/activate
$ pip install -r requirements.txt

The stack requires two context variables. The following command defines those variables and prints the CloudFormation template for the stack.

$ cdk synth -c bucket_name=lvthillo-bucket -c domain_name=lvthillo.com
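
Inside the CDK app those context values are read with try_get_context; a minimal sketch (the stack class name is illustrative, the real stack lives in the repo):

# app.py (sketch): read the -c context values and hand them to the stack.
from aws_cdk import core

app = core.App()
bucket_name = app.node.try_get_context('bucket_name')
domain_name = app.node.try_get_context('domain_name')

# BlogInfraStack is an illustrative name for the stack class:
# BlogInfraStack(app, 'blog-infra', bucket_name=bucket_name, domain_name=domain_name)

app.synth()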

I’m hosting my website in the eu-west-1 region. It’s important to note that some services, like Route53 and CloudFront, operate as global services.
I’ve already mentioned that when you want to attach an SSL/TLS certificate to your CloudFront distribution, you need to create it in us-east-1.
That’s why I need to bootstrap CDK, which makes it possible for my stack to deploy resources in multiple regions.

The region for the certificate is hardcoded (always us-east-1):
def __create_certificate(self, hosted_zone):
    self.certificate = acm.DnsValidatedCertificate(self, 'CrossRegionCertificate',
        domain_name=self.__domain_name,
        subject_alternative_names=[self.__domain_name, self.__www_domain_name],
        hosted_zone=hosted_zone,
        region='us-east-1',
    )

Now I bootstrap CDK and deploy the stack.

$ cdk bootstrap aws://123456789012/eu-west-1 -c bucket_name=lvthillo-bucket -c domain_name=lvthillo.com
$ cdk deploy -c bucket_name=lvthillo-bucket -c domain_name=lvthillo.com

The deploy will make the following IAM statement and policy changes.

After the deploy I checked all resources in the blog-infra CloudFormation stack.
Finally, I had to build my Hugo site and upload it to my S3 bucket. For this I’m using GitHub Actions.

name: GitHub S3 Deploy

on:
  push:
    branches:
      - main
  pull_request:

jobs:
  deploy:
    runs-on: ubuntu-20.04
    concurrency:
      group: ${{ github.workflow }}-${{ github.ref }}
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: true  # Fetch Hugo themes (true OR recursive)
          fetch-depth: 0    # Fetch all history for .GitInfo and .Lastmod

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: '0.85.0'

      - name: Build
        run: hugo --minify

      - name: Deploy to S3
        run: aws s3 sync public/ s3://{s3-bucket-name}/
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: eu-west-1  # region of the S3 bucket

The website is deployed and I’m able to check the performance!

Now I have a fully automated way to deploy and host my website. I used AWS CDK to describe my infrastructure and deploy the stack that hosts my static website.
The site content itself is deployed automatically using GitHub Actions!