In 2022, I came across Forrest Brazeal’s Cloud Resume Challenge on LinkedIn and then his personal website. The premise is simple: build your resume as a website using cloud services. But given my developer background and current role as an AWS employee, I decided to make it harder and implement best practices in a highly cost-optimised architecture.

Choosing the Hard Path

I’ve played around with single-page web apps in the past using Hugo as the template engine, but this was the first time I actually deployed a website beyond my dev machine onto the cloud. I wanted a direct path from pushing updates to the git repository through to the deployed website on S3. At the time, CodeCommit and CodePipeline were reasonable choices (CodeCommit has since been deprecated and then revived… but I’ve moved on), and I’ve since updated my CI/CD to use GitHub with GitHub Actions for the build and deploy. I also wanted hands-on experience similar to my enterprise customers: a multi-account AWS Organisation with separate dev, test, and prod accounts, fully automated CI/CD, infrastructure as code, and a completely serverless architecture.

The initial deployment cost $0.02/month. As I’ve added more content, images, and video assets, it’s grown to around $0.12/month. I’m aiming to keep the cost below $1/month as I add demos and more content to the site (I haven’t yet implemented S3 lifecycle rules to claw back my $0.10/month cost increase…).

Architecture Overview

The architecture spans three AWS accounts (dev, test, prod) with two GitHub repositories and separate CloudFormation stacks in each account.

Infrastructure Repository

The infrastructure repo manages four CloudFormation stacks:

  • Website Stack — S3 bucket with static website hosting enabled, bucket policy for public read access. Each account gets its own bucket.

  • CICD Stack — The original CodePipeline setup (now legacy). This created a V2 CodePipeline with CodeCommit as source, an S3 artifact bucket, and IAM roles for pipeline execution. This was the first CI/CD approach before migrating to GitHub Actions.

  • Resume Stack — The original Hugo build pipeline using Lambda. A Python Lambda function with custom layers (Hugo binary, libstdc++, unzip) would receive source code from CodePipeline, build the Hugo site, and deploy to S3. EventBridge rules triggered the pipeline on CodeCommit pushes.

  • GitHub OIDC Stack — IAM OIDC identity provider and role for GitHub Actions. This replaced the CodeCommit/CodePipeline approach with keyless authentication from GitHub to AWS using OpenID Connect. The role has scoped permissions for S3 deployment, CloudFront invalidation, CloudFormation operations, and IAM role management.
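The OIDC stack is the interesting one. A minimal CloudFormation sketch of the idea is below — the role name, repository path, and thumbprint are placeholders, not the values from my actual stack:

```yaml
# Sketch of a GitHub OIDC provider + deploy role (all names/IDs assumed)
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  GitHubOidcProvider:
    Type: AWS::IAM::OIDCProvider
    Properties:
      Url: https://token.actions.githubusercontent.com
      ClientIdList:
        - sts.amazonaws.com
      ThumbprintList:
        - ffffffffffffffffffffffffffffffffffffffff  # placeholder

  GitHubActionsRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: github-actions-deploy  # assumed name
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Federated: !Ref GitHubOidcProvider
            Action: sts:AssumeRoleWithWebIdentity
            Condition:
              StringEquals:
                token.actions.githubusercontent.com:aud: sts.amazonaws.com
              StringLike:
                # Scope the trust to a single repository
                token.actions.githubusercontent.com:sub: repo:my-org/resume:*
```

The `sub` condition is what makes this safe: only workflows from the named repository can assume the role, with no long-lived access keys stored in GitHub.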

Resume Repository

The resume repo contains the Hugo site source code and a GitHub Actions workflow that:

  1. Authenticates to the dev account via OIDC to pull media assets from S3
  2. Installs Hugo modules and Node dependencies
  3. Builds the site with a stage-aware base URL (dev uses S3 website endpoint, test/prod use custom domains)
  4. Re-authenticates to the target stage account
  5. Syncs the built site to S3 (excluding the media prefix)
  6. For test/prod: assumes the prod account role to invalidate CloudFront cache
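The steps above can be sketched as a GitHub Actions workflow. Account IDs, role names, bucket names, and the media prefix here are illustrative assumptions, not the real values:

```yaml
# Hedged sketch of the build-and-deploy workflow (all identifiers assumed)
name: build-and-deploy
on:
  push:
    branches: [main]
permissions:
  id-token: write   # required for OIDC federation
  contents: read
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Authenticate to dev account for media assets
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::111111111111:role/github-actions-deploy
          aws-region: ap-southeast-2

      - name: Pull media assets from S3
        run: aws s3 sync s3://dev-media-bucket/media/ assets/images/

      - name: Build site with stage-aware base URL
        run: |
          hugo mod get
          npm ci
          hugo --baseURL "${{ vars.BASE_URL }}" --minify

      - name: Re-authenticate to the target stage account
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::222222222222:role/github-actions-deploy
          aws-region: ap-southeast-2

      - name: Sync built site to S3 (excluding media prefix)
        run: aws s3 sync public/ s3://stage-site-bucket --delete --exclude "media/*"
```

The CloudFront invalidation step for test/prod would follow the final sync, after assuming the prod account role.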

Content Delivery

  • Dev — S3 static website hosting served directly (no CloudFront, no custom domain), used for developing the site and integrating AWS components
  • Test — CloudFront distribution, S3 origin, CloudFront Function for index.html resolution on subdirectories
  • Prod — CloudFront distribution, same setup as test
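S3 origins don’t resolve directory-style URLs like `/blog/` to `/blog/index.html` the way the S3 website endpoint does, which is why test and prod need the CloudFront Function. A sketch of that function as a CloudFormation resource (the resource and function names are assumptions):

```yaml
# Sketch of the index.html rewrite function (names assumed)
IndexRewriteFunction:
  Type: AWS::CloudFront::Function
  Properties:
    Name: index-rewrite
    AutoPublish: true
    FunctionConfig:
      Comment: Append index.html to directory-style URIs
      Runtime: cloudfront-js-2.0
    FunctionCode: |
      function handler(event) {
        var request = event.request;
        var uri = request.uri;
        // /blog/ -> /blog/index.html ; /blog -> /blog/index.html
        if (uri.endsWith('/')) {
          request.uri = uri + 'index.html';
        } else if (!uri.includes('.')) {
          request.uri = uri + '/index.html';
        }
        return request;
      }
```

The function is associated with the distribution’s default cache behaviour as a viewer-request function, so it runs before CloudFront looks up the object in S3.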

Media Asset Pipeline

Images and videos are stored in the dev account S3 bucket under a dedicated media prefix (assets are synced from my local machine to that prefix before building the site). During CI/CD builds, the workflow pulls these assets into the Hugo assets/images/ directory before building. Hugo’s module mount system serves them both as processed assets (for resources.Get in templates) and as static files (for markdown image references). This keeps binary files out of the git repository while maintaining a single source of truth for all media.
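The dual mount might look something like this in a Hugo YAML config — the exact paths are assumptions based on the description above:

```yaml
# Hypothetical hugo.yaml module mounts: one directory, two roles
module:
  mounts:
    - source: assets
      target: assets            # processed assets for resources.Get
    - source: assets/images
      target: static/images     # same files served verbatim for markdown refs
```

With this setup a single synced image is reachable both through Hugo’s asset pipeline and at a stable `/images/...` URL.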

Local Development

For local development, I use a containerised approach with Finch (AWS’s open-source Docker alternative). A simple Dockerfile pulls the hugomods/hugo:exts-0.146.0 image, and an entrypoint script installs Hugo modules and Node dependencies at container start. Docker Compose mounts the source directory and persists the Hugo cache and node_modules as named volumes.
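A minimal compose file for that setup could look like the sketch below (service and volume names are assumptions; the image tag matches the one mentioned above):

```yaml
# Sketch of the local dev compose file (names assumed)
services:
  hugo:
    build: .                        # Dockerfile based on hugomods/hugo:exts-0.146.0
    ports:
      - "1313:1313"                 # Hugo's default dev server port
    volumes:
      - .:/src                      # mount the site source
      - hugo-cache:/root/.cache/hugo
      - node-modules:/src/node_modules
    command: hugo server --bind 0.0.0.0

volumes:
  hugo-cache:                       # persist module/build cache between runs
  node-modules:                     # keep npm installs out of the host tree
```

Finch supports Compose files, so `finch compose up` gives the same workflow as Docker Compose without Docker Desktop.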

What’s Next

I’m now using this website as a platform to recertify my AWS skills and build some cool demos for fun. I’ll have a go at some demo projects for each certification (when it makes sense) and document them as blog posts. It’s also been hard for me to explain some of my background in mechanical engineering and subsea pipelines, so I’ll give that a go here as well.

For me, the Cloud Resume Challenge was never just about building a resume. It was about using the tech I work with every day for something that’s meaningful to me, and giving me a space to share my story.