2026-01-24 02:19:02
From phones and wallets to car keys and coffee receipts, there's never been a perfect way to carry a polished PDF resume.
So I built a resume I can carry 24/7. A live website on AWS, secured with HTTPS, delivered globally through a CDN, and backed by a serverless visitor counter. In this post, I break down the milestones I completed over 18 days, from S3 + CloudFront (OAC private origin) to API Gateway, Lambda, SAM (SAM CLI), and DynamoDB.
Live Site: https://resume.michael-burbank.com/
I designed and built a cloud-hosted resume that's secure, automated, and powered by a serverless backend, not just "another static webpage".
I built my resume as a real webpage using HTML for structure and CSS for styling. This makes it easy to update and share, and gives me more flexibility than a PDF attachment that can easily go missing or fall out of date.
I store the resume’s static files (HTML/CSS/JS) in an S3 bucket and use it as a private CloudFront origin. The bucket is not public—CloudFront accesses it using Origin Access Control (OAC) so objects can only be fetched through CloudFront.
To serve the site securely over HTTPS and improve load times, I placed CloudFront in front of the S3 bucket. CloudFront caches the site at edge locations and delivers it globally. When the resume changes, I upload the new content and create a CloudFront invalidation so the edge caches fetch the latest version from the S3 origin.
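The invalidation step can be scripted with boto3. This is a minimal sketch of that pattern, not my actual pipeline code; the distribution ID and paths are placeholders:

```python
import time

def invalidation_batch(paths):
    """Build the InvalidationBatch payload for CloudFront's create_invalidation."""
    return {
        "Paths": {"Quantity": len(paths), "Items": list(paths)},
        # CallerReference must be unique per request; a timestamp works
        "CallerReference": str(time.time()),
    }

def invalidate_cache(distribution_id, paths=("/*",)):
    """Ask CloudFront to refresh the given paths from the S3 origin."""
    import boto3  # imported lazily; actually running this requires AWS credentials
    cloudfront = boto3.client("cloudfront")
    return cloudfront.create_invalidation(
        DistributionId=distribution_id,  # e.g. "E123EXAMPLE" (placeholder)
        InvalidationBatch=invalidation_batch(paths),
    )
```

Invalidating `/*` is the simplest option for a small site; for larger sites you would invalidate only the changed paths, since CloudFront bills per invalidation path beyond the free tier.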
My main domain points to an Amazon EC2-hosted personal website, running on Amazon Linux (AL) 2023. So for the Cloud Resume Challenge, I created the subdomain resume.michael-burbank.com and routed it to the CloudFront distribution that serves my S3 resume. This keeps the two sites separated while still living under the same domain.
Whenever a visitor loads the site, JavaScript calls my API and renders the updated visitor count on the page.
Instead of letting the browser communicate directly with DynamoDB, I used API Gateway to expose a REST API endpoint the website can call over HTTPS. API Gateway triggers Lambda, and Lambda is the only component that reads and updates the DynamoDB table.
Lambda handles the visitor count logic. It runs only when invoked, increments the count, and returns the updated value.
Within my Lambda function, I used Python and the AWS SDK (boto3) to interact with DynamoDB and return a clean JSON response back to the website.
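The handler pattern looks roughly like this. This is a minimal sketch rather than my exact code; the table name `VisitorCount` and the key `resume` are illustrative placeholders:

```python
import json

def make_response(count):
    """Shape the API Gateway proxy response the frontend's JavaScript expects."""
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},  # CORS for the site
        "body": json.dumps({"count": count}),
    }

def lambda_handler(event, context):
    # boto3 is bundled in the Lambda runtime; imported here so the module
    # stays importable (and testable) without the AWS SDK installed
    import boto3
    table = boto3.resource("dynamodb").Table("VisitorCount")  # placeholder table name
    # Atomically increment and read back the new value in one request
    resp = table.update_item(
        Key={"id": "resume"},  # placeholder partition key
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    return make_response(int(resp["Attributes"]["visit_count"]))
```

Using a single `UpdateExpression` with `ADD` keeps the increment atomic, so concurrent visitors never read-modify-write over each other.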
I used DynamoDB on-demand capacity to store the visitor count, keeping costs low and removing capacity planning.
I defined the backend infrastructure, DynamoDB, API Gateway, and Lambda using AWS SAM, so I can deploy with the SAM CLI instead of clicking around in the AWS console. Provisioning and configuring infrastructure using IaC is my favorite part of building different software projects.
I wrote pytest tests for the Lambda logic so changes can be validated automatically before deploying to prod.
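A test along these lines can exercise the increment logic against a stubbed table, so no AWS call is needed. The names here are illustrative, not my actual test suite:

```python
class FakeTable:
    """Minimal stand-in for a DynamoDB Table that supports update_item."""
    def __init__(self, count=0):
        self.count = count

    def update_item(self, **kwargs):
        self.count += 1
        return {"Attributes": {"visit_count": self.count}}

def increment_count(table):
    """The counter logic under test: increment and return the new value."""
    resp = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(resp["Attributes"]["visit_count"])

def test_increment_returns_updated_count():
    assert increment_count(FakeTable(count=41)) == 42

def test_increment_starts_from_zero():
    assert increment_count(FakeTable()) == 1
```

Running `pytest` in CI before the SAM deploy step means a broken change never reaches prod.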
I initially configured a working pipeline using GitHub Actions as the challenge suggests, then migrated to GitLab CI so both repos use a single, consistent CI/CD approach. Each repo uses a .gitlab-ci.yml pipeline to reduce manual steps, automate deployments, and improve repeatability.
Maintaining my GitHub presence is a must, since many companies and developers reference GitHub more often than GitLab. My solution: use two remote repositories while still meeting the challenge's source-control milestone. I mirrored my GitLab repositories to GitHub, the first time I had ever kept two parallel remotes in sync. When I push to main in GitLab (origin) from the CLI, the same commits are automatically pushed to the corresponding GitHub repo, keeping GitHub in sync with GitLab. I configured this for both the frontend and backend repositories.
Below are the runtime request flows (frontend + backend), followed by the deployment pipeline that ships changes to AWS.

CloudFront terminates HTTPS using ACM, serves cached content from edge locations, and uses OAC so only CloudFront can fetch objects from the private S3 origin.

The browser calls API Gateway over HTTPS (CORS enabled). API Gateway invokes Lambda (proxy integration). Lambda uses boto3 to GetItem (GET) or UpdateItem (POST/PUT) in DynamoDB, then returns JSON { "count": n }. CloudWatch captures logs/metrics.

GitLab CI runs tests, deploys serverless resources via SAM/CloudFormation, syncs static site assets to S3 and invalidates CloudFront so edge caches refresh.
I learned a lot through this project - not just AWS services, but how the puzzle pieces fit together in terms of a real production workflow. These were the biggest skills I gained while building and shipping my Cloud Resume Challenge end-to-end:
I started with GitHub Actions (as the challenge suggested) but migrated mid-project to GitLab CI to standardize my pipelines. That forced me to think through runner behavior, environment variables, and deployment steps instead of copying a template. It also gave me a real "change-in-flight" experience without breaking production.
Defining the backend in SAM strengthened my ability to treat infrastructure as versioned code. Instead of "click ops", I can deploy repeatably, roll out changes safely, and keep my API, Lambda, and DynamoDB configuration consistent across updates.
I configured my workflows so one push to main in GitLab also syncs to GitHub. This helped me keep my GitHub presence active while using GitLab CI as the primary CI/CD platform instead of GitHub Actions. It was also my first time running a multi-remote workflow cleanly.
I strengthened my ability to keep frontend and backend behavior separate while maintaining an efficient contract between the two. Designing the endpoint around "increment and return the updated count" made the frontend less complex and kept the database logic server-side, where it rightfully belongs.
API Gateway became the front door: HTTPS access, routing, and a clean interface between the browser and Lambda. It made the whole design feel like a real system rather than "JavaScript communicating with the database".
Building pipelines for both the front and backend reinforced how much automation reduces human error. Once it worked, deployments stopped being a "process" and became a push-to-main routine.
Using Docker for local API/DB testing taught me how to validate logic without relying on cloud deployments for every change. That feedback loop is faster, cheaper, and much closer to how teams develop and test, a part of the Software Development Life Cycle (SDLC).
If you’re doing the Cloud Resume Challenge, drop your link — I’ll check it out!
2026-01-24 02:13:14
I searched for an easy way to find the best contrasting color for the colors on my website.
There's contrast-color() in CSS, but it only works in Firefox and Safari.
There are other options with limited availability.
I didn't see any formulas I liked.
Google Search AI offered steps. I'm hesitant to use anything AI offers without another source where I can verify it. The steps were also written differently enough that it took my brain a few minutes to understand what each one was saying and how they would work together.
In my xs_style.js library (which would be open source, except that I'm the only one maintaining it), I used those steps to build xs_color_contrast_enhanced. It lets you pick the best contrast from 2, 3, or 4 colors, or from any number of colors: just pass them in an array. Yes, it can handle all 16,777,216 colors, the maximum number available on a website, but your array of colors should be much smaller for performance's sake.
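The underlying idea is the common WCAG approach: compute each color's relative luminance, then pick the candidate with the highest contrast ratio against the background. This is my own reconstruction of that technique, not the actual xs_color_contrast_enhanced code:

```python
def relative_luminance(hex_color):
    """WCAG 2.x relative luminance of an sRGB hex color like '#336699'."""
    hex_color = hex_color.lstrip("#")
    channels = []
    for i in (0, 2, 4):
        c = int(hex_color[i:i + 2], 16) / 255
        # Linearize the gamma-encoded sRGB channel
        channels.append(c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio, ranging from 1:1 up to 21:1."""
    la, lb = relative_luminance(color_a), relative_luminance(color_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

def best_contrast(background, candidates):
    """Return the candidate color with the highest contrast against background."""
    return max(candidates, key=lambda c: contrast_ratio(background, c))
```

For example, `best_contrast("#ffffff", ["#000000", "#777777"])` picks black, since white-on-black hits the maximum 21:1 ratio.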
xs_style.js also contains xs_color_rainbow_array, which worked best for contrast.
However, there were still a few colors on my Many Colors page (see the link at the bottom of this post) where the contrast/text color was off. That page displays about 1,000 colors. There's another array that includes white and black in addition to the rainbow colors, and that one was further off. Mostly, some greens should have had white/light text and some reds should have had black/dark text. I added a couple of arrays to that webpage so I could override the contrasting colors that looked visually off. Until I find a contrast formula that's closer, this will have to do. Only 114 colors were off, which isn't bad. I considered checking the hue to see whether a color was red or green, but that would have "corrected" too many colors incorrectly, and there were other colors where the contrasting color was off as well.
I thought that evenly spacing the colors across the possibilities might work. Maybe it would have if they were evenly spaced by luminosity, but spacing them by hex value didn't help.
Still, since the result is close in most situations, I added it to my webpage where you can input a color and find the closest official colors. You can find that page at the following link.
https://stubbart.com/computer_consulting/color_themes/color_all.html
If you know of better ways to find the contrasting color, please let me know. And, always feel free to ask questions.
2026-01-24 02:10:44
Building Affly 🚧 A platform to discover trusted deals & affiliates without spam.
Launching soon 👀 #BuildingInPublic #Affly
2026-01-24 02:07:15
TL;DR
Font Awesome, Lucide, Heroicons… When choosing an icon library for work projects, I suspect many of us don't give much thought to licensing.
I used to think "it's MIT licensed, so it's fine" and left it at that. But when I looked into it more closely, I discovered quite a few things I didn't know. I thought it might be helpful to share what I found.
When you hear "free icon library," you probably think "no cost." But in the open source world, there's another meaning.
"Think free as in free speech, not free beer."
These are the words of Richard Stallman, founder of the free software movement.
For icon libraries:
This distinction matters when it comes to choosing licenses.
In the 2000s, web icons were mostly individual PNG images. They weren't scalable, and they required many HTTP requests.
In 2011, Twitter Bootstrap (now Bootstrap) shipped with Glyphicons bundled in. Glyphicons was an icon set created by Jan Kovařík — originally a paid product, but around 250 icons were provided free of charge for Bootstrap. Then in 2012, Dave Gandy released Font Awesome, and it became the most popular new project on GitHub.
In 2016, Font Awesome raised $1,076,960 (over $1 million) on Kickstarter, reportedly setting a record for software projects at the time.
Glyphicons was standard in Bootstrap 3, but it was removed in Bootstrap 4. The reasons included: file size bloat for users who didn't need icons, many users were already using Font Awesome or other icon fonts, and it made sense to delegate icon development to specialized projects.
In 2017, Cole Bemis released Feather Icons. It was a minimal and beautiful icon set, but maintenance gradually slowed down, with over 300 issues left unaddressed.
In 2020, the community forked it and launched Lucide. It has since grown into an active project with over 1,500 icons. This is open source succession at its finest.
| License | Commercial Use | Modify/Redistribute | Attribution | Copyleft | Notes |
|---|---|---|---|---|---|
| MIT | ✅ | ✅ | In source code | None | Simplest and most popular on GitHub |
| ISC | ✅ | ✅ | In source code | None | Nearly identical to MIT. npm's default |
| Apache 2.0 | ✅ | ✅ | In source code | None | Includes patent clause. Common in Google projects |
| CC BY 4.0 | ✅ | ✅ | Required in principle | None | Designed for content. Used by Font Awesome icons |
| SIL OFL 1.1 | ✅ | ✅ | In source code | None | Font-specific. Cannot be sold standalone |
| GPL | ✅ | ✅ | Derivatives must use same license | Yes | If you distribute derivatives, you must release source code under the same license |
Keep the copyright notice and license text in your source code
That's basically all you need to do. No visible attribution in the UI is required. When you run npm install, license files are automatically included in locations like node_modules/lucide-react/LICENSE — just don't delete them and you're fine.
Lucide, Heroicons, Tabler Icons, Phosphor Icons, and Bootstrap Icons all fall into this category. These licenses are quite easy to work with for commercial projects.
Similar to MIT, but includes explicit permission regarding patents. Often adopted by projects involving large corporations.
Common in Google's Material Symbols and Android-related projects. Attribution is "welcomed but not required."
Creative Commons was originally designed for content like images, music, and text. "BY" stands for "Attribution."
Font Awesome Free's icon SVGs and JS files use this license. In principle, attribution is required, but Font Awesome has made special accommodations (more on this below).
Font Awesome Free isn't under a single license. This might come as a surprise.
| Component | License | Specific Files |
|---|---|---|
| Icons (design) | CC BY 4.0 | SVG files |
| Font files | SIL OFL 1.1 | .woff2, .ttf, etc. |
| Code | MIT | CSS, JavaScript |
Even when using the SVG-JS model via npm, these three licenses are all in play.
The official LICENSE.txt states:
Attribution is required by MIT, SIL OFL, and CC BY licenses. Downloaded Font Awesome Free files already contain embedded comments with sufficient attribution, so you shouldn't need to do anything additional when using these files normally.
In other words, if you keep the embedded comments in the downloaded files, no additional attribution is required.
Modern build tools like Webpack, Vite, and esbuild extract comments starting with @license, @preserve, or /*! into separate files (.LICENSE.txt) by default. Font Awesome's files include these comments, so the license information is preserved. However, since the comments are removed from the original JS files, it's worth checking your build output to be sure.
If you copy individual SVGs from the website, they may not include comments. In that case, you can add a license comment inside the SVG. This way, you don't need to display attribution in your footer or UI.
<!-- Font Awesome Free 6.x by @fontawesome - https://fontawesome.com License - CC BY 4.0 -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512">...</svg>
Of course, you can also add attribution in your about page or footer.
<footer>
Icons by <a href="https://fontawesome.com">Font Awesome</a> (CC BY 4.0)
</footer>
Brand icons included in Font Awesome (company logos, etc.) involve trademark issues.
Even if the license is MIT or CC BY, you must follow each company's Brand Guidelines. For example, modifying Twitter (X)'s logo without permission or using it in a competitor's context is not allowed. This is separate from licensing.
Font Awesome Pro requires an API token when installing via npm. Here's the difference from the Free version.
npm install --save @fortawesome/fontawesome-free
That's all you need.
The Pro version is fetched from Font Awesome's private npm registry, so an authentication token is required. Configure it in your project's .npmrc file:
# .npmrc
@fortawesome:registry=https://npm.fontawesome.com/
//npm.fontawesome.com/:_authToken=${FONTAWESOME_NPM_AUTH_TOKEN}
Manage your token via environment variables — don't write it directly in the .npmrc file.
There are plenty of alternatives to Font Awesome.
| Library | License | Icon Count | Attribution | Token |
|---|---|---|---|---|
| Font Awesome Free | CC BY 4.0 + SIL OFL + MIT | 2,000+ | Required (embedded OK) | Not required |
| Font Awesome Pro | Commercial license | 30,000+ | Not required | Required |
| Lucide | ISC | 1,500+ | Not required | Not required |
| Heroicons | MIT | 450+ | Not required | Not required |
| Tabler Icons | MIT | 5,900+ | Not required | Not required |
| Bootstrap Icons | MIT | 2,000+ | Not required | Not required |
| Phosphor Icons | MIT | 9,000+ | Not required | Not required |
| Material Symbols | Apache 2.0 | 2,500+ | Recommended (not required) | Not required |
If simplicity is your priority:
If icon count is your priority:
If ecosystem matters:
| License | Source Code/LICENSE | UI (Footer, etc.) |
|---|---|---|
| MIT | ✅ Required | Optional |
| ISC | ✅ Required | Optional |
| Apache 2.0 | ✅ Required | Optional |
| CC BY 4.0 | ✅ Required | Recommended if comments removed |
| SIL OFL 1.1 | ✅ Required | Optional |
For MIT-licensed libraries, visible attribution in HTML, CSS, or UI is generally not considered necessary.
# THIRD_PARTY_LICENSES.md
## Icons
### Lucide Icons
- License: ISC
- https://lucide.dev/license
With MIT or ISC licenses, you don't need visible attribution in the UI. However, you do need to keep license information somewhere in your project's source code. Creating a file like this can be helpful when legal review comes around.
There are also tools to auto-generate license lists:
npx license-checker --json > licenses.json
When choosing an icon library for commercial projects, it's easy to default to "just use Font Awesome." But understanding the license structure can give you peace of mind.
Personally, I tend to use Lucide or Heroicons for new projects. They have simple, single licenses (MIT/ISC), no token management required, and official React/Vue components — less friction in practice.
Font Awesome Free is perfectly fine for commercial use, so if you're already using it in an existing project, there's no need to migrate. Just keep the embedded comments and you'll meet the attribution requirements.
I hope this article helps someone out there.
Licenses
Icon Libraries
History
2026-01-24 02:04:53
Installing Arch Linux can feel overwhelming at first, not because it is difficult, but because it gives you many choices from the very beginning.

This post documents a clean and practical way I approach Arch Linux installation, focused on understanding each step instead of blindly copy-pasting commands.

This is not meant to replace the Arch Wiki. Instead, it is a practical walkthrough for users who prefer a structured approach while still learning how things work.
Those topics deserve their own dedicated guides.
Arch Linux encourages learning by doing. Having a clear reference helps reduce confusion while still respecting Arch's philosophy.
I documented the complete installation with explanations and notes here:
Complete Arch Linux Installation Guide (MusaBase)
A lightweight reference version is also available on GitHub:
Lightweight Reference on GitHub
Let me know in the comments if you’d like a follow-up on desktop setup or gaming tweaks!