The Practical Developer

A constructive and inclusive social network for software developers.

RSS preview of the blog of The Practical Developer

I cheated to maintain GitHub streak and I don't regret it

2026-01-24 22:13:23

Since September 2022, I have been actively maintaining a GitHub streak: at least one new contribution every day. And yes, I totally agree with you, it is meaningless. But it has meaning for me. I want to do it, and I want to keep it. Even if some days the contribution has little to no value, like just adding a newly read article to one of my news lists.

I used this widget to display the number of days on my GitHub profile. It was great until it stopped working some time ago. Some API changes probably. Maybe it'll come back eventually. But I didn't want to wait. I found an alternative that works here and now.

Except it doesn't work the same way as the previous one, or as the GitHub commit graph itself: it only counts "real" commits and doesn't recognize code reviews.

And this revealed a flaw in my beloved streak. One day, June 1st, 2024, I somehow forgot to actually commit something. I "only" did two reviews for my Dependabot updates. I didn't notice, because when I opened my GitHub page, the day was green. But my streak suddenly dropped by more than half 🥹

I could have just let it go and accepted the bitterness. But I didn't want to. On a nearby day, I had made four commits! What if I just virtually traveled back in time and altered one of them to fill the gap? 💡

And so I did. And the timeline was fixed. And I can enjoy my 1236-and-counting streak. End of story.

But this wasn't meant to be just a confession. And I don't ask for redemption. The title says I don't regret it. Here I wanted to explain why not.

Because thanks to this "cheating", I got to know Git better than before.

Git is one of those things you can adopt in a few minutes and then spend a lifetime trying to really understand. By the way, it took just 10 days to forge this backbone of modern development. Stories like that keep reminding me how mediocre I actually am.

To be able to change a year and a half's worth of my project's Git history, I needed to understand how to work with the git rebase command: how to find the correct commit and how to alter it via the interactive mode. The biggest Aha! moment was realizing that once you do this, you effectively erase the current history starting from the altered commit and get a new one. The changes and messages remain the same (except for what was changed), but all commits from that point on become new commits with new hashes. I had used git rebase before, but only to fix very recent issues, so I successfully overlooked this fact. Now I am smarter.
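These are not my exact commands, but a sketch of the kind of session involved, run end-to-end in a throwaway repo with made-up names and dates. The interactive todo list is edited non-interactively here (via GIT_SEQUENCE_EDITOR) so the whole thing is reproducible:

```shell
set -e
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email "demo@example.com"
git config user.name "Demo"

# Two commits of fake history.
echo one > f.txt && git add f.txt && git commit -qm "first"
echo two >> f.txt && git commit -qam "second"

# Interactive rebase over the whole history; mark the first commit "edit"
# instead of opening an editor by hand.
GIT_SEQUENCE_EDITOR='sed -i "1s/^pick/edit/"' git rebase -qi --root

# Backdate the commit we stopped at (both timestamps), then replay the rest.
GIT_COMMITTER_DATE="2024-06-01T12:00:00" \
  git commit --amend --no-edit --date="2024-06-01T12:00:00"
git rebase --continue

# The oldest commit now carries the altered date; every commit from it
# onward has a brand-new hash.
git log --format="%ad %cd" --date=short | tail -n 1
```

In a real repo you would rebase onto the parent of the target commit (`git rebase -i <hash>^`) instead of `--root`, and then force-push the rewritten history.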

Another 🤯 thing I learned is that every Git commit carries two dates: the original Author Date (marking the point in time when the commit was created) and the Commit Date (which changes whenever the commit gets edited, or rebased). I learned this the hard way after I force-pushed my changes to GitHub and suddenly 150 commits were "made" on 17th January 2026 😰
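A tiny throwaway-repo sketch (made-up names and dates) shows the two timestamps drifting apart: amending a commit keeps the author date but stamps a fresh committer date.

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email "demo@example.com"
git config user.name "Demo"

echo hello > f.txt && git add f.txt
# Author date is fixed at creation time.
GIT_AUTHOR_DATE="2020-06-01T10:00:00" git commit -qm "original"

# Rewriting the commit (here: a simple amend) refreshes only the committer date.
git commit --amend -qm "original (amended)"

# Author stays 2020-06-01; committer shows today's date.
git show -s --format="author:    %ad%ncommitter: %cd" --date=short
```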

That unfortunate event led directly to my third discovery: the git filter-branch command and the (more modern) git filter-repo tool. These can be called into action to do bulk updates over commits, and to undo lapses like mine. So no worries: after another trial-and-error session with Copilot, I managed to repair my flawed Git history, and you could never tell.
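As a sketch of the bulk-repair idea (not my exact commands), here is the built-in git filter-branch resetting every commit's committer date back to its author date in a throwaway repo; git filter-repo offers an equivalent, friendlier interface:

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email "demo@example.com"
git config user.name "Demo"

# A commit whose committer date was "freshly" stamped by a rewrite.
echo a > f.txt && git add f.txt
GIT_AUTHOR_DATE="2020-06-01T10:00:00" GIT_COMMITTER_DATE="2026-01-17T10:00:00" \
  git commit -qm "looks freshly rewritten"

# Bulk-rewrite all refs: copy each commit's author date over its committer date.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f --env-filter \
  'export GIT_COMMITTER_DATE="$GIT_AUTHOR_DATE"' -- --all

# Both dates now agree again.
git show -s --format="author: %ad%ncommitter: %cd" --date=short
```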

And that concludes my story. If you already knew all the above, good for you. It took me like 8 years of working with Git to dig that deep. I guess one can go even deeper, but I will take a break for a while. If you have stories to share or questions to ask, feel free to express them in the comments below. And stay tuned for another Alois discovers trivial things article.

Disclaimer: You should be careful when tampering with your Git timeline and force-pushing to a repository. If you have changes on other branches, or even uncommitted changes in your local checkouts, you may get into trouble. If the team is bigger than just you, the trouble may be even bigger. I could afford to ignore those concerns because it was my private repo with no work in progress. But this is not always the case.

Inside the Command Center

2026-01-24 22:02:10

If Kubernetes is the Captain of the ship, the Control Plane is the bridge where all the decisions are made, and the Worker Nodes are the deckhands doing the heavy lifting.

Let's pull back the curtain on the "brains" and "brawn" of a K8s cluster.

1. The Brains: The Control Plane

The Control Plane lives exclusively on Linux. It’s a suite of services that work together to make sure your "desired state" (what you want) matches the "actual state" (what is happening).

The API Server: The Gatekeeper

Every single thing that happens in Kubernetes goes through the API Server.

  • Think of it as the cluster's "Front Desk."
  • When you send a YAML file (your app's blueprint), the API Server checks your ID (Authentication), makes sure you're allowed to do it (Authorization), and then records the plan.
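For instance, a minimal blueprint might look like the following sketch (the names and image are illustrative, not from the article):

```yaml
# A minimal Deployment "blueprint" the API Server would validate and record.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app
spec:
  replicas: 3          # desired state: three copies
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
        - name: web
          image: nginx:1.27   # the image the container runtime will pull
```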

The Cluster Store (etcd): The Source of Truth

Kubernetes needs a memory. It uses etcd, a tiny but mighty distributed database.

  • The "Quorum" Rule: etcd prefers odd numbers (3 or 5). Why? Because if a network wire gets cut, the cluster needs a majority to make decisions.
  • The "Split Brain" Problem: If you have 4 nodes and they split 2-and-2, no one knows who is in charge. With 3 nodes, if 1 goes down, the other 2 still have a majority.

The Scheduler: The Logistics Expert

The Scheduler is like a high-stakes puzzle solver. When a new task comes in, it looks at every worker node and asks:

  1. Does this node have enough RAM?
  2. Is it already too busy?
  3. Does it have the right "tags" (Affinity)?

If it finds a match, it assigns the work. If not, it can trigger the Autoscaler to go buy more servers!

The Controllers: The Watchdogs

Controllers are the reason Kubernetes "self-heals."

  • A Deployment Controller says: "The boss wants 3 copies of this web app."
  • If one copy crashes, the controller notices the count is now 2.
  • It immediately tells the Scheduler: "Hey, we're missing one! Start a new one now."
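That watch-and-repair behavior can be sketched as a toy reconciliation function (names invented; a real controller talks to the API Server rather than returning strings):

```python
# Toy sketch of a controller's reconciliation step: compare desired vs.
# actual replica counts and decide what to ask the Scheduler for.

def reconcile(desired: int, actual: int) -> str:
    """Return the action a controller would request."""
    if actual < desired:
        return f"start {desired - actual} new replica(s)"
    if actual > desired:
        return f"stop {actual - desired} replica(s)"
    return "no action"

print(reconcile(desired=3, actual=2))  # one copy crashed -> start 1 new replica(s)
```

Real controllers run this loop continuously, which is why the cluster converges back to the desired state without human intervention.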

2. The Brawn: Worker Nodes

While the Control Plane must be Linux, the Worker Nodes (where your apps live) can be Linux or Windows. This allows you to run modern cloud-native apps alongside legacy Windows services.

Every Worker Node has three essential tools:

The Kubelet: The On-Site Manager

The Kubelet is an agent that runs on every node. It watches the API Server like a hawk. When the API Server says, "Run this pod on your node," the Kubelet gets to work. It reports back constantly: "All good here!" or "Help, this container won't start!"

The Container Runtime: The Engine

This is the "engine" that actually pulls the images and runs them. While Docker started the fire, most modern clusters use containerd or CRI-O. They are leaner, faster, and built specifically for the Kubelet to talk to.

The Kube-proxy: The Traffic Cop

How does a user's request find its way to a specific container? Kube-proxy. It manages the networking rules on each node, ensuring that traffic is load-balanced and reaches the right destination without getting lost.

The Workflow: How a "Click" Becomes a "Container"

  1. The Human: "Here is a YAML file. I want 5 web servers."
  2. API Server: "Accepted. Saving this to the etcd database."
  3. Scheduler: "I see a new task! Node B has plenty of room. Assigning it there."
  4. Kubelet (on Node B): "I see a new assignment! Hey Container Runtime, pull the image and start it."
  5. Kube-proxy: "I'll open the gates so people can actually visit this new web server."

Summary

The Control Plane makes the plans, and the Worker Nodes execute them. By separating the Brains from the Brawn, Kubernetes ensures that even if a server fails or a container crashes, the "Command Center" stays in control.

Day 2: What is EEG

2026-01-24 21:59:20

Introduction
In this post, I will introduce the four fundamental foundations of my research: the brain, what EEG is, how it is recorded, and its applications.

1. Brain
From the system level perspective, the brain is organized into different parts with specific properties.
For a better explanation, I created this map.

2. The Electroencephalogram (EEG), Rhythms, and Waveforms
The combined electrical activity of the cerebral cortex is called a brain wave or rhythm. These signals change over time and often form repeating wave patterns. This happens because many brain cells (neurons) are active together.
When large groups of neurons work at the same time, they create an electrical signal strong enough to be measured from the scalp using EEG.

These brain waves can have different speeds (frequencies) and strengths (amplitudes).
EEG is very sensitive to changes in mental state. It can reflect many conditions, such as stress, alertness, calmness, deep rest, hypnosis, and sleep.

Think of a crowded football stadium. One person clapping is quiet, but when thousands of people clap together in rhythm, the sound becomes loud and clear.
EEG works in a similar way. A single neuron is too weak to measure, but when many neurons fire together in a rhythm, their combined activity becomes strong enough to detect on the scalp.
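The stadium idea can be sketched numerically (a toy model, all numbers invented): one hundred noisy 10 Hz "neurons" are averaged into a single "electrode" signal, and a naive discrete Fourier transform still picks out the shared rhythm.

```python
import math, random

fs, n = 256, 256                     # sample rate (Hz) and number of samples
random.seed(0)

# One "neuron": a 10 Hz oscillation buried in noise twice its amplitude.
def neuron(t: float) -> float:
    return math.sin(2 * math.pi * 10 * t) + random.gauss(0, 2.0)

# The "scalp electrode" sees only the average of 100 such neurons.
signal = [sum(neuron(i / fs) for _ in range(100)) / 100 for i in range(n)]

# Naive DFT magnitude at integer frequencies 1..30 Hz.
def magnitude(freq: int) -> float:
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    return math.hypot(re, im)

peak = max(range(1, 31), key=magnitude)
print(f"dominant rhythm: {peak} Hz")   # the shared 10 Hz alpha-like rhythm wins
```

Averaging suppresses the independent noise but not the shared oscillation, which is exactly why synchronized populations become visible at the scalp.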

3. EEG Recording Techniques
Electrodes capture the signal from the scalp, amplifiers bring the microvolt signals into a range where they can be digitized accurately, the A/D converter turns the signals from analog into digital form, and a processing/storage device stores, processes, or displays the recorded data.
I made a simple workflow diagram of how it works.

4. Application: Sleep and Circadian Rhythm Disorder Investigation
Different sleep disorders show different patterns in EEG signals. These patterns reflect how the brain behaves during various stages of sleep.
To properly study sleep disorders, it is important to track how sleep stages change over time. By measuring these changes in a clear and quantitative way, EEG helps reveal problems in sleep structure, timing, and quality.
This makes EEG a valuable tool for understanding how the brain moves between wakefulness, light sleep, deep sleep, and dreaming.

Final Thoughts
The brain carries out countless functions, generating complex patterns of electrical activity. Sleep is not merely a period of rest for the body—it is an active, dynamic process driven by the brain. Remarkably, these processes can be observed and studied through EEG.
This post is the foundation of my journey: to observe, measure, and verify my understanding of how EEG reveals the brain at work.

DNS Record Types Explained

2026-01-24 21:54:44

Have you ever wondered what happens when you type google.com in the browser?
On the surface it looks simple, but behind the screen many things happen, and one of the most important is DNS (Domain Name System). Let's understand what DNS is, why we need it, and how it works.

What is DNS?

Think of DNS as a phone book for the internet.

When you tap a friend's name on your phone, your phone already knows which number to call. Similarly, when you type google.com in the browser, DNS knows which IP address to connect to.

  • Website names are human-readable
  • IP addresses are machine-readable

Since a computer doesn't understand a website name, it needs the IP address. So basically, DNS is the Domain Name System, which resolves a website name to its IP address.

Why do we need DNS?
A domain name by itself does nothing.
DNS works with DNS records, which are sets of instructions that tell the internet important details about a domain:

  • who controls the domain
  • where the website's server is located
  • whether the domain is allowed to send emails
  • where the email should be delivered

Phone book comparison:

  • a user's phone number
  • a nickname
  • an email address
  • a home address

How does DNS work?
Let's say you type google.com in the browser:

  1. DNS query begins: Your computer checks whether the IP address already exists in your local cache.
  2. DNS resolver: If it's not found, your computer asks a DNS resolver, usually provided by your ISP, or a public one:
    • Google: 8.8.8.8
    • Cloudflare: 1.1.1.1
  3. Root DNS server: The resolver asks a root DNS server where to find information about the TLD. In google.com, .com is the TLD (top-level domain). There are 13 root server identities globally, coordinated by ICANN.
  4. TLD server: The resolver contacts the .com TLD server to find the domain's authoritative DNS server.
  5. Authoritative DNS server: This server provides the final IP address.
  6. Website loads: The browser connects to the web server at that IP address and loads the website.
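The delegation chain can be mimicked with a toy resolver in which each layer only knows whom to ask next (all server names and the address are invented for illustration):

```python
# Toy model of DNS delegation: the root knows the TLD servers, a TLD
# server knows the authoritative server, and only the authoritative
# server knows the final A record.

ROOT = {"com": "tld-com-server"}
TLD = {"tld-com-server": {"example.com": "ns1-example"}}
AUTHORITATIVE = {"ns1-example": {"example.com": "93.184.216.34"}}

CACHE: dict[str, str] = {}

def resolve(domain: str) -> str:
    if domain in CACHE:                        # step 1: check the local cache
        return CACHE[domain]
    tld = domain.rsplit(".", 1)[-1]            # "example.com" -> "com"
    tld_server = ROOT[tld]                     # step 3: ask a root server
    auth_server = TLD[tld_server][domain]      # step 4: ask the TLD server
    ip = AUTHORITATIVE[auth_server][domain]    # step 5: authoritative answer
    CACHE[domain] = ip                         # remember it for next time
    return ip

print(resolve("example.com"))   # prints 93.184.216.34
```

A second lookup for the same name is answered straight from the cache, which is why repeat visits to a site skip most of this chain.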

Types of DNS Records:

  1. NS (Name Server): NS records tell who is responsible for your domain's DNS. They connect your domain to the name servers that manage all of its records. Think of an apartment building: any query about the building goes through the front desk. The NS records act as that front desk.
  2. A: Points the domain to an IPv4 address.
  3. AAAA: Same as an A record, but points the domain to an IPv6 address.
  4. MX (Mail Exchange): Tells where email for the domain should be delivered.
  5. CNAME: Creates an alias, i.e. one domain can point to another domain.
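The CNAME alias idea can be illustrated with a toy lookup that follows aliases until an A record answers (the records are invented for illustration):

```python
# Toy record set: www is a CNAME alias, the apex domain holds the A record.
RECORDS = {
    ("www.example.com", "CNAME"): "example.com",
    ("example.com", "A"): "93.184.216.34",
    ("example.com", "MX"): "mail.example.com",
}

def lookup_a(name: str) -> str:
    """Follow CNAME aliases until an A record answers."""
    for _ in range(10):                        # guard against alias loops
        if (name, "A") in RECORDS:
            return RECORDS[(name, "A")]
        name = RECORDS[(name, "CNAME")]        # hop to the alias target
    raise RuntimeError("CNAME chain too long")

print(lookup_a("www.example.com"))   # prints 93.184.216.34
```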

Hope you now understand how a website gets loaded in your browser.

I Tried Vibe Coding for the First Time Using Warp — Here’s What I Learned

2026-01-24 21:48:13

Today, I experimented with vibe coding for the first time using Warp, and honestly… it felt wild.

I didn’t write a single line of code manually.

Instead, I just described:

  • what I wanted to build
  • what should happen next

And Warp generated the code for me.

For quick prototyping, this felt incredibly fast and smooth.

The Good Part

Warp did a solid job with:

  • Generating project structure
  • Writing boilerplate code
  • Moving ideas forward quickly

For early-stage development, this can save a lot of time and mental energy.
If your goal is to test an idea fast, vibe coding definitely helps.

The Reality Check
Once I reviewed the generated code properly, I noticed:

  • Multiple bugs
  • Logical issues
  • Edge cases not handled well

The code worked, but it wasn’t production-ready.

This is where developer experience still matters:

  • Debugging
  • Refactoring
  • Understanding what the code is actually doing

AI can write code, but it doesn’t fully understand context, intent, or long-term maintainability.

Current Status of the Project

The site is still under development.

I’m actively:

  • Fixing bugs
  • Improving logic
  • Cleaning up the codebase

Once everything is stable, I’ll share:

  • What the site does
  • How it was built
  • What worked well with vibe coding (and what didn’t)

Key Takeaway

AI-assisted coding is a powerful accelerator, not a replacement for fundamentals.

Tools like Warp are great for:

  • Rapid experimentation
  • Reducing repetitive work
  • Exploring ideas faster

But understanding the code and taking responsibility for quality is still on us as developers.

If you’ve tried vibe coding or AI-first workflows, I’d love to hear your experience 👇

What worked for you, and what didn’t?

Happy building 🚀

Day 1: Why I’m Doing 60 Days of EEG Data Collection

2026-01-24 21:47:04

Introduction
My behavior often pushes me to research things that are real problems in my life. When something affects me deeply, I feel the need to understand it better. This is how most of my learning starts.

Why I’m doing 60 days of EEG data collection
I decided to collect data for 60 days to organize my research journey. Starting step by step is always better than trying to do everything at once.

  1. Studying EEG and machine learning is not easy to explain in a single post. Breaking the work into smaller parts helps me learn better and also share my progress clearly.
  2. Dividing and counting small steps has always worked for me.
  3. Studying brain activity has always been meaningful to me because I had two close friends with similar brain-related conditions.

This made me more curious and more serious about understanding how the brain works.

Conclusion
I took this research in a more personal way, and I’m really happy to start this journey and work with others along the way.