RSS preview of Blog of Geoffrey Huntley

Software development now costs less than the wage of a minimum wage worker

2026-02-27 16:52:26


Hey folks, for the last year I've been pondering this and doing game theory around the discovery of Ralph, how good the models are getting, and how that's going to intersect with society. What follows is a cold, stark write-up of how I think it's going to go down.

https://www.theregister.com/2026/01/27/ralph_wiggum_claude_loops/

The financial impacts are already unfolding. Back when Ralph started to go really viral, there was a private equity firm that was previously long on Atlassian and went deliberately short on Atlassian because of Ralph. In the last couple of days, they released their new investor report, and they made absolute bank.

Dec 2025 - https://www.minotaurcapital.com/reports/quarterly/2025-12

I discovered Ralph almost a year ago today, and when I made that discovery, I sat on it for a while and focused on education: teaching juniors to pay attention, writing prolifically, and doing keynotes internationally, pleading with people to pay attention and to invest in themselves.

Dear Student: Yes, AI is here, you’re screwed unless you take action...
Two weeks ago a student anonymously emailed me asking for advice. This is the reply and if I was in your shoes this is what I’d do. So, I read your blog post “An oh f*** moment in time” alongside “The future belongs to idea guys that can just do

It's now one year later, and the cost of software development is $10.42 an hour, which is less than minimum wage; a burger flipper at macca's gets paid more than that. What does it mean to be a software developer when everyone in the world can develop software? Just two nights ago, I was at a Cursor meetup, and nearly everyone in the room was not a software developer, yet they were showing off their latest and greatest creations.


Well, they just became software developers because Cursor enabled them to become one. You see, the knowledge and skill of being a software developer has been commoditised. If everyone can be a software developer, what does that mean if your identity function is that you're a software developer and you write software for a living?

My theory of how it all goes down, and gets feral really, really fast, is quite simple...


For the past month, I've been catching up with venture capitalists in Australia and San Francisco and rubber-ducking this concept. You see, for a lot of them, they're not even sure whether their business model as venture capitalists still exists.

Why does someone need to raise a large amount of capital if it's just a five-man show now?

So let's open up with a classic K shape.


Rewind to Christmas two years ago, when I originally posted "An 'oh fuck' moment in time". It was clear to me where this was going. The models were already good enough back then to cause societal disruption. The models were pretty wild; like wild horses, they needed a great deal of skill to get outcomes from them...


If we fast-forward to the last Christmas holidays, many people had their "oh fuck" moment a year later, and the difference between now and then is twofold.

One: they actually picked up the guitar, played it, and took the Christmas period off because they had the space, capacity, and time to invest in themselves and make discoveries.

deliberate intentional practice
Something I’ve been wondering about for a really long time is, essentially, why do people say AI doesn’t work for them? What do they mean when they say that? From which identity are they coming from? Are they coming from the perspective of an engineer with a job title and

Two, the horses or models came with factory defaults of "broken in and ready to get shit done", which made them more accessible; they're easier to use to achieve outcomes, so people didn't need to invest as much time learning how to juice them to get disruptive outcomes.


The world is now divided into two types of companies. On one side: model-first companies - lean apex predators who can operate on razor-thin margins and crush incumbents.

llm weights vs the papercuts of corporate
In woodworking, there’s a saying that you should work with the grain, not against the grain and I’ve been thinking about how this concept may apply to large language models. These large language models are built by training on existing data. This data forms the backbone which creates output based

The other side of the equation is nearly every other company out there today: they need to go through a people transformation program, figure out what to do with AI, and deal with the fact that the fundamentals of business have changed.


Jack is doing the right thing for his company by acting early. What will happen is that the time for a competitor to be at your door will be measured in months, not years. And as models get better, the timeframe only compresses.

The real question is for the folks who, unfortunately, were laid off today; they will need jobs, and they will now see the importance of upskilling with AI. So they'll go on to their next employer or other industries and upskill with AI, and then seek to implement what is needed - automating job functions via AI.

Then the cycle continues across all industries, all disciplines.

But it's not going to be triggered just by layoffs. It'll also be triggered by executives who don't get it. When you understand what is going on and how real AI is, it is maddening to be in a company surrounded by people who don't get it.

You see, there is a difference between employer suicide and employee suicide. The smart folks who don't want to commit employment suicide will leave.

The smarter ones in that segment will just go and found their own companies, then come back and do what they know. And they'll attack their employers vertically, operating leaner and meaner.


As the models get better - and the pace is slope-on-slope, derivative-on-derivative at this stage - and as model-first companies get better and better at automating their job function, they can be at the door of their previous employer in months, not years.


To make matters worse, as the models get better, time gets compressed, and the snake eating its tail speeds up.

idk how to visualize this, if you've got ideas let me know...


Which results in employers who did not take corrective actions, unlike Jack, having to lay off people in the long run because margins are being squeezed by new competitors operating leaner, meaner, and faster.

Then the cycle continues across all industries, all disciplines.


As I've been stressing in my writing for almost a year now, employers and employees trade time and skill for money. If a company is having problems adopting AI, then that is a company issue, not an employee issue.

Experience as a software engineer today doesn’t guarantee relevance tomorrow. The dynamics of employment are changing: employees trade time and skills for money, but employers’ expectations are evolving rapidly. Some companies are adapting faster than others.

Another thing I've been thinking: when someone says, “AI doesn’t work for me,” what do they mean? Are they referring to concerns related to AI in the workplace or personal experiments on greenfield projects that don't have these concerns?

This distinction matters.

Employees trade skills for employability, and failing to upskill in AI could jeopardise their future. I’m deeply concerned about this.

If a company struggles with AI adoption, that’s a solvable problem - it's now my literal job. But I worry more about employees.

In history, there are tales of employees departing companies that resisted cloud adoption to keep their skills competitive.

The same applies to AI. Companies that lag risk losing talent who prioritise skill relevance.

- June 2025 from https://ghuntley.com/six-month-recap/

Model-weight-first companies should be scaring the fuck out of every founder right now if they're not a utility service, for what is a moat now in the era when you can /z80 something?

llm weights vs the papercuts of corporate
In woodworking, there’s a saying that you should work with the grain, not against the grain and I’ve been thinking about how this concept may apply to large language models. These large language models are built by training on existing data. This data forms the backbone which creates output based
Can a LLM convert C, to ASM to specs and then to a working Z/80 Speccy tape? Yes.
✨Daniel Joyce used the techniques described in this post to port ls to rust via an objdump. You can see the code here: https://github.com/DanielJoyce/ls-rs. Keen, to see more examples - get in contact if you ship something! Damien Guard nerd sniped me and other folks wanted

On the topic of moats, I've been thinking about this for almost a year now, and I think I've now got a clearer sense of what moats are in the AI era, but first, let's talk about what moats aren't...

  • Any business model based on per-seat pricing. As AI starts to rip harder and harder, it's going to become much harder to maintain headcount within a corporation, because model-first companies will be entering the market and operating much leaner using utility-based pricing. It's a margin game now.
  • Any product features or platforms that were designed for humans. I know that's going to sound really wild, but understand these days I go window-shopping on SaaS companies' websites for product features, rip a screenshot into Claude Code, and it rebuilds that product feature/platform. As we enter the era of hyper-personalised software, I think this will be the case more and more. In my latest creation, I have cloned Posthog, Jira, Pipedrive, and Calendly, and the list just keeps on growing because I want to build a hyper-personalised business that meets all my needs, with full control and everything first-party. I think we're going to see more and more of model first companies operating with this mindset.
  • Any business strategy that revolved around the high cost of switching or migrating from one technology to another as a form of lock-in. This is provably falsified now. It is so easy to rip a fart into Claude Code and migrate from one technology to another. Just last week, I migrated from Cloudflare D1 to a PlanetScale Postgres database automatically using a Ralph Loop, and it just worked. Full-on data migration. When have you ever heard of a database migration going successfully unattended? We're here now, folks.

If you currently work at a company that fits any of the three bullet points above, then understand that things are going to get really tight at your employer. I don't know when, but with certainty it will happen. Your best choices are either to find a new employer if the people around you don't get it, or, if there is a need and desire for automation, to lean so hard into AI, automate everything, and become the champion of AI within your company. If your company has banned AI outright, you need to depart right now and find another employer.

So with that out of the way, what is a moat?

  • Distribution. Any form of distribution. Brand awareness. Steaks and handshakes.
  • Utility-based pricing, similar to cloud infrastructure on a cents per megabyte or CPU hour.
  • Operating as a model-first company and accelerating the transformation so you can operate under the principles below:
Principles — Latent Patterns
Principles for building products with large language models and the latent space — hard-won lessons from shipping AI-native software.
AI erases traditional developer identities—backend, frontend, Ruby, or Node.js. Anyone can now perform these roles, creating emotional challenges for specialists with decades of experience. - https://ghuntley.com/six-month-recap

This is going to be a really hard time for a lot of people because identity functions have been erased, and the hard thing is, it's not just software developers. It's people managers as well. If your identity function is managing people, you need to make adjustments. You need to get back onto the tools ASAP.

Were smaller but effectively cut 2/3rds by telling board I wouldn’t backfill in May 2023. Best decision as got rid of all the people who “are sick of hearing about ai”. 20ish people now do about 30x the output of what having more than 60 did 3 years ago.
- an anonymous founder in my DMs today.

This transformation is going to be brutal. Organisations need to be designed differently and need to transform from this...


to this...


And one of the hardest things is that AI is being rammed into the world non-consensually. It's been pushed by employers and Silicon Valley. Yeah, it sucks, but you gotta pull your chin up, process those feelings and deal with it; for others it's gonna be really, really rough. There are going to be people who have spent years of their lives doing Game of Thrones-style sociopolitical stuff to get to where they are within a company, and it will have been all for nothing.


In the org chart above, consider: what is the value of the senior engineer, the team lead, the manager, and the senior manager in this brave new world? How much time is spent doing Dilbert activities? What if you can flatten the org chart? If you were a founder, why wouldn't you?


This is what I've been fearing for a year. I could be wrong, I don't know. Anyone who says that they know for sure is selling horseshit. One thing is absolutely certain: things will change, and there's no going back. The unit economics of business have forever changed.

Whether a company does layoffs really comes down to the quality of its leadership. If they're being lazy and don't have ambitious plans, they will need to lay off, because eventually the backlog will run dry, and everything will get automated.

This isn't me throwing shit at Jack. Like, literally, it's a cold, hard fact that you need fewer people to run a business now. So if you have too many people on your payroll, you need to make changes, but having said that, there will be ambitious founders and leaders who didn't overhire and understand that AI enables them to do anything, and they can do it today. They can make that five-year roadmap happen in a year and provide a backlog for all employees to work on while they utilise AI.

It's going to be really interesting to see how this pans out.

All I can ask you to do is tap someone else on the shoulder and stress to them to treat this topic seriously, upskill, and explain the risks going forward, and then ask them to do the same. You see, for a lot of people, they haven't noticed AI is knocking on their door because AI is burrowing under their house.


ps. socials

teleporting into the future and robbing yourself of retirement projects

2026-02-05 14:47:00


I'm going to make this a really quick one because this is doing the rounds, and whilst I've tweeted about it, it's time to dig in.

What Gergely is articulating here is something that I, and everyone else who was paying attention, went through a year ago. AI enables you to teleport to the future and rob your future self of retirement projects. Anything that you've been putting off to do someday, you can do it now.

To quote a post I authored almost eight months ago:

It might surprise some folks, but I'm incredibly cynical when it comes to AI and what is possible; yet I keep an open mind. That said, two weeks ago, when I was in SFO, I discovered another thing that should not be possible.

Every time I find out something that works, which should not be possible, it pushes me further and further, making me think that we are already in post-AGI territory.
- https://ghuntley.com/no/ (dated July 2025)

And another post back in September 2025:

It's a strange feeling knowing that you can create anything, and I'm starting to wonder if there's a seventh stage to the "people stages of AI adoption by software developers"
whereby that seventh stage is essentially this scene in the matrix...

In the previous 12 months, I've cloned SaaS product feature sets of many different companies. I've built file systems, networking protocols and even developed my own programming language.

From my perspective, nothing really changed in December. The models were already great, but what was needed was a time of rest - people just needed to pick up the guitar and play.

deliberate intentional practice
Something I’ve been wondering about for a really long time is, essentially, why do people say AI doesn’t work for them? What do they mean when they say that? From which identity are they coming from? Are they coming from the perspective of an engineer with a job title and

What made December an inflection point was that the models became much easier to use to achieve good outcomes, and people picked up the guitar with an open mind and played.

Over the last couple of weeks, I've been catching up with software engineers, venture capitalists, business owners, and people in sales and marketing who are all going through this period of adjustment.

Universally, it can be described as a mild form of creative psychosis for people who like to create things. All builders who have an internal reward function of creating things as a form of pleasure go through it because AI enables them to just do things.

The future belongs to people who can just do things
There, I said it. I seriously can’t see a path forward where the majority of software engineers are doing artisanal hand-crafted commits by as soon as the end of 2026. If you are a software engineer and were considering taking a gap year/holiday this year it would be an

Everyone who gets AI goes through it, and it typically lasts about two to three months, until they get it out of their system by completing all the projects they were putting off until retirement.

Perhaps it could be described as a bit of a reset, similar to what happened during COVID-19, when people were able to reassess what they wanted to do in life.

It's a coin flip, really: on one side of the coin, employees are going to commit more to their current employer; on the other side, they're realising they are no longer as dependent on others to achieve certain financial outcomes.

Perhaps this is the tipping point where more people throw their hats in and become entrepreneurs.

People with ideas and unique insight can get concepts to market in rapid time and be less dependent on needing others' expertise as the world's knowledge is now in the palms of everyone's hands.

Technologists are still required; perhaps it's the ideas guys/gals who should be concerned, as software engineers now have a path to bootstrap a concept in every white-collar industry (recruiting, law, finance, accounting, et al) at breakneck speed without having to find co-founders.

- From Feb 2025

I guess I need to wrap this up now, but I will say this:

I've written about how some people won't make it, and I've spent the last year talking about this, pleading with people to pick up the guitar and play...

If you're having trouble sleeping because of all the things that you want to create, congratulations.

You've made it through to the other side of the chasm, and you are developing skills that employers in 2026 are expecting as a bare minimum.

The only question that remains is whether you are going to be a consumer of these tools or someone who understands them deeply and automates your job function?

how to build a coding agent: free workshop
It’s not that hard to build a coding agent. 300 lines of code running in a loop with LLM tokens. You just keep throwing tokens at the loop, and then you’ve got yourself an agent.

go build yourself an agent and taste building in the recursive latent space

Trust me, you want to be in the latter camp because consumption is now the baseline for employment.

After you come out of this phase, I hope you get to where I am, because just because you can build something doesn't mean you necessarily should. Knowing what not to build now that anything can be built is a very important life lesson.

ps. socials

don’t waste your back pressure

2026-01-17 18:46:56


I am fortunate to be surrounded by folks who listen, and the post linked below will go down as seminal reading for people interested in AI context engineering.

It started as a simple convo between mates - well, Moss translated it into words, and I've been waiting for it to come out so I didn't front-run him.

Don’t waste your back pressure
Back pressure for agents You might notice a pattern in the most successful applications of agents over the last year. Projects that are able to setup structure around the agent itself, to provide it with automated feedback on quality and correctness, have been able to push them to work on longer horizon tasks. This back pressure helps the agent identify mistakes as it progresses and models are now good enough that this feedback can keep them aligned to a task for much longer. As an engineer, this means you can increase your leverage by delegating progressively more complex tasks to agents, while increasing trust that when completed they are at a satisfactory standard.

read this and internalise this

Enjoy. This is what engineering now looks like in the post loom/gastown era or even when doing ralph loops.

software engineering is now about preventing failure scenarios and stopping the wheel from turning over, by applying back pressure to the generative function

If you aren’t capturing your back-pressure then you are failing as a software engineer.
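To make "capturing your back pressure" concrete, here's a minimal sketch in Python. Everything named here is hypothetical scaffolding - `agent` stands in for whatever invokes your coding agent, and `checks` for your real test suite, linter, or type checker - but the shape is the point: every failure gets fed straight back into the next prompt.

```python
def with_back_pressure(agent, task, checks, max_rounds=5):
    """Re-prompt the agent with every check failure until all checks pass."""
    prompt = task
    for round_no in range(1, max_rounds + 1):
        agent(prompt)
        # run every automated check; a truthy return value is a failure message
        failures = [msg for check in checks if (msg := check())]
        if not failures:
            return round_no              # all checks pass: the work is done
        # the back pressure itself: failures become part of the next prompt
        prompt = task + "\nFix these failures:\n" + "\n".join(failures)
    raise RuntimeError("ran out of rounds: " + "; ".join(failures))
```

The loop is deliberately dumb - all the engineering leverage lives in how good your `checks` are, which is exactly the post's argument.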

everything is a ralph loop

2026-01-17 14:43:54


I’ve been thinking about how the way I build software is so very, very different from how I used to do it three years ago.

No, I’m not talking about acceleration through usage of AI but instead at a more fundamental level of approach, techniques and best practices.

Standard software practice is to build vertically, brick by brick - like Jenga - but these days I approach everything as a loop. You see, Ralph isn’t just about forward mode (building autonomously) or reverse mode (clean-rooming); it’s also a mindset that these computers can indeed be programmed.

watch this video to learn the mindset

I’m there as an engineer, just as I was in the brick-by-brick era, but instead I am programming the loop, automating my job function and removing the need to hire humans.

Everyone right now is going through their zany period - just like I did with forward mode and building software AFK on full auto - however, I hope that folks will come back down from orbit and remember this from the original Ralph post.

While I was in SFO, everyone seemed to be trying to crack on multi-agent, agent-to-agent communication and multiplexing. At this stage, it's not needed. Consider microservices and all the complexities that come with them. Now, consider what microservices would look like if the microservices (agents) themselves are non-deterministic—a red hot mess.

What's the opposite of microservices? A monolithic application. A single operating system process that scales vertically. Ralph is monolithic. Ralph works autonomously in a single repository as a single process that performs one task per loop.

Software is now clay on the pottery wheel, and if something isn’t right then I just throw it back on the wheel to address the items that need resolving.

Ralph is an orchestrator pattern where you allocate the array with the required backing specifications, give it a goal, then loop the goal.

It's important to watch the loop as that is where your personal development and learning will come from. When you see a failure domain – put on your engineering hat and resolve the problem so it never happens again.

In practice this means doing the loop manually via prompting, or via automation with a pause that requires pressing CTRL+C to progress onto the next task. This is still ralphing, as ralph is about getting the most out of how the underlying models work through context engineering, and that pattern is GENERIC and can be used for ALL TASKS.
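As a sketch (with hypothetical names - `run_agent` is whatever shells out to your agent, `done` is whatever back pressure you trust, `pause` is the optional manual gate), the whole pattern is just this:

```python
def ralph_loop(prompt_file, run_agent, done, pause=None, max_iters=100):
    """One task per iteration, same prompt every time; stop when done() says so."""
    iters = 0
    for _ in range(max_iters):
        iters += 1
        prompt = open(prompt_file).read()   # re-read each pass: the prompt can evolve
        run_agent(prompt)                   # the agent performs one task
        if done():                          # back pressure: specs, tests, lint
            break
        if pause is not None:
            pause()                         # the manual CTRL+C-style gate
    return iters
```

Swap `pause` for nothing and it's full auto; swap it for "wait for a keypress" and you have the supervised variant described above.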

In other news, I've been cooking on something called "The Weaving Loom". The source code of loom can now be found on my GitHub; do not use it if your name is not Geoffrey Huntley. Loom is something that has been in my head for the last three years (and various prototypes were developed last year!), and it is essentially infrastructure for evolutionary software. Gas Town focuses on spinning plates and orchestration - a full level 8.

see https://steve-yegge.medium.com/welcome-to-gas-town-4f25ee16dd04

I’m going for a level 9 where autonomous loops evolve products and optimise automatically for revenue generation. Evolutionary software - also known as a software factory.


There is a divide now - we have software engineers outwardly rejecting AI, or merely consuming it via Claude Code/Cursor to accelerate the lego-brick building process....

but software development is dead - I killed it. Software can now be developed cheaper than the wage of a burger flipper at maccas and it can be built autonomously whilst you are AFK.

hi, it me. i’m the guy

I’m deeply concerned for the future of these people and have started publishing videos on YouTube to send down ladders before the big bang happens.

i now won’t hire you unless you have this fundamental knowledge and can show what you have built with it

Whilst software development/programming is now dead, we deeply need software engineers with these skills who understand that LLMs are a new form of programmable computer. If you haven’t built your own coding agent yet - please do.

how to build a coding agent: free workshop
It’s not that hard to build a coding agent. 300 lines of code running in a loop with LLM tokens. You just keep throwing tokens at the loop, and then you’ve got yourself an agent.
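If the "300 lines in a loop" claim sounds abstract, the core of an agent fits in far fewer. The sketch below stubs the model out entirely - `call_model` is a placeholder for a real LLM API call, and the two tools are illustrative, not any particular product's tool set. The point is the loop: each model reply either requests a tool, whose result goes back into the transcript, or answers in plain text and ends the run.

```python
def run_tool(name, args):
    """Dispatch a tool request from the model. Real agents expose read/write/bash/etc."""
    tools = {
        "read_file": lambda path: open(path).read(),
        "write_file": lambda path, text: open(path, "w").write(text),
    }
    return tools[name](*args)

def agent_loop(call_model, goal, max_turns=10):
    """Keep feeding the transcript back to the model until it answers in plain text."""
    transcript = [{"role": "user", "content": goal}]
    for _ in range(max_turns):
        reply = call_model(transcript)              # the model sees the full history
        transcript.append({"role": "assistant", "content": reply})
        if reply.get("tool"):                       # the model asked to use a tool
            result = run_tool(reply["tool"], reply["args"])
            transcript.append({"role": "tool", "content": result})
        else:                                       # plain text means it's finished
            return reply["text"], transcript
    return None, transcript
```

That's the whole trick: you just keep throwing tokens at the loop, and the tools are where it touches the world.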

ps. think this is out there?

It is, but watch it happen live. We are here right now, it’s possible, and I’m systemising it.

Here in the tweet below, I am putting loom under the mother of all ralph loops to automatically perform system verification. Instead of days of planning, discussions, and weeks of verification, I’m programming this new computer and doing it AFK whilst I DJ, so that I don’t have to hire humans.


Any faults identified can be resolved through forward ralph loops. Over the last year the models have become quite good, and it's only now that I'm able to realise this full vision, but I'll leave you with this, dear reader....

What if the models don't stop getting good?

How well will you fare if you are still building Jenga stacks, when there are classes of principal software engineers out there proving the point that we are here right now? Please pay attention.


Go build your agent, go learn how to program the new computer (guidance forthcoming in future posts), fall in love with all the possibilities and then join me in this space race of building automated software factories.

ps. socials

llm weights vs the papercuts of corporate

2025-12-08 23:55:28


In woodworking, there's a saying that you should work with the grain, not against the grain and I've been thinking about how this concept may apply to large language models.

These large language models are built by training on existing data. This data forms the backbone which creates output based upon the preferences of the underlying model weights.

We are now one year into a period where a new category of companies has been founded, whereby the majority of the software behind the company was code-generated.

From here on out I’m going to refer to these companies as model-weight-first. This category of companies can be defined as any company that is building with the data (“grain”) that has been baked into the large language models.

i ran Claude in a loop for three months, and it created a genz programming language called cursed

2025-09-09 11:36:48


It's a strange feeling knowing that you can create anything, and I'm starting to wonder if there's a seventh stage to the "people stages of AI adoption by software developers"


whereby that seventh stage is essentially this scene in the matrix...

It's where you deeply understand that 'you can now do anything' and just start doing it because it's possible and fun, and doing so is faster than explaining yourself. Outcomes speak louder than words.

There's a falsehood that AI results in SWE's skill atrophy, and there's no learning potential.

If you’re using AI only to “do” and not “learn”, you are missing out
- David Fowler

I've never written a compiler, yet I've always wanted to, so I've been working on one for the last three months by running Claude in a while-true loop (aka "Ralph Wiggum") with a simple prompt:

Hey, can you make me a programming language like Golang but all the lexical keywords are swapped so they're Gen Z slang?

Why? I really don't know. But it exists. And it produces compiled programs. During this period, Claude was able to implement anything that Claude desired.

The programming language is called "cursed". It's cursed in its lexical structure, it's cursed in how it was built, it's cursed that this is possible, it's cursed in how cheap this was, and it's cursed through how many times I've sworn at Claude.

https://cursed-lang.org/

For the last three months, Claude has been running in this loop with a single goal:

"Produce me a Gen-Z compiler, and you can implement anything you like."
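For the avoidance of doubt, the loop really is just a loop. Here's a minimal sketch in Python; the `claude` CLI name and its `-p` (non-interactive print) flag are assumptions about the setup, and `max_iters` is added only so the sketch can terminate, where the real thing ran unbounded:

```python
# A minimal sketch of the "Ralph Wiggum" loop: the same prompt, in a fresh
# agent session, over and over. The `claude` CLI and its `-p` flag are
# assumptions; swap in whatever agent CLI you actually run.
import shutil
import subprocess
from typing import Optional

def ralph_loop(prompt: str, cmd: str = "claude", max_iters: Optional[int] = None) -> int:
    """Run `cmd -p prompt` until interrupted (or max_iters times). Returns run count."""
    runs = 0
    while max_iters is None or runs < max_iters:
        if shutil.which(cmd) is None:
            print(f"{cmd} not found; stopping")
            break
        # Each iteration is a brand-new session: the only memory between
        # runs is whatever the agent wrote to disk last time.
        subprocess.run([cmd, "-p", prompt], check=False)
        runs += 1
    return runs
```

Because every iteration starts from a clean context window, the repository itself (specs, TODO files, commit history) becomes the agent's memory between runs.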

It's now available at:

  • the website: https://cursed-lang.org/
  • the source code: https://github.com/ghuntley/cursed

what's included?

Anything that Claude thought was appropriate to add. Currently...

  • The compiler has two modes: interpreted mode and compiled mode. It's able to produce binaries on macOS, Linux, and Windows via LLVM.
  • There are some half-completed VS Code, Emacs, and Vim editor extensions, and a Tree-sitter grammar.
  • A whole bunch of really wild and incomplete standard library packages.

lexical structure

Control Flow:
ready → if
otherwise → else
bestie → for
periodt → while
vibe_check → switch
mood → case
basic → default

Declaration:
vibe → package
yeet → import
slay → func
sus → var
facts → const
be_like → type
squad → struct

Flow Control:
damn → return
ghosted → break
simp → continue
later → defer
stan → go
flex → range

Values & Types:
based → true
cringe → false
nah → nil
normie → int
tea → string
drip → float
lit → bool
ඞT (Amogus) → pointer to type T

Comments:
fr fr → line comment
no cap...on god → block comment
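The mapping above is mechanical enough that a toy keyword translator illustrates it. This is a hypothetical sketch, not part of the cursed toolchain; it ignores strings, comments, and real lexing entirely:

```python
# Toy illustration of the keyword table above: swap Go keywords for their
# cursed equivalents. Hypothetical helper, not part of the cursed compiler.
import re

GO_TO_CURSED = {
    "if": "ready", "else": "otherwise", "for": "bestie", "while": "periodt",
    "switch": "vibe_check", "case": "mood", "default": "basic",
    "package": "vibe", "import": "yeet", "func": "slay", "var": "sus",
    "const": "facts", "type": "be_like", "struct": "squad",
    "return": "damn", "break": "ghosted", "continue": "simp",
    "defer": "later", "go": "stan", "range": "flex",
    "true": "based", "false": "cringe", "nil": "nah",
    "int": "normie", "string": "tea", "float": "drip", "bool": "lit",
}

def gozify(src: str) -> str:
    """Replace whole-word Go keywords with their cursed equivalents."""
    pattern = re.compile(r"\b(" + "|".join(GO_TO_CURSED) + r")\b")
    return pattern.sub(lambda m: GO_TO_CURSED[m.group(1)], src)
```

Word boundaries matter here: `var` should become `sus`, but `variable` should be left alone.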

example program

Here is LeetCode #104, maximum depth of a binary tree:

vibe main
yeet "vibez"
yeet "mathz"

// LeetCode #104: Maximum Depth of Binary Tree 🌲
// Find the maximum depth (height) of a binary tree using ඞ pointers
// Time: O(n), Space: O(h) where h is height

struct TreeNode {
    sus val normie
    sus left ඞTreeNode   
    sus right ඞTreeNode  
}

slay max_depth(root ඞTreeNode) normie {
    ready (root == null) {
        damn 0  // Base case: empty tree has depth 0
    }
    
    sus left_depth normie = max_depth(root.left)
    sus right_depth normie = max_depth(root.right)
    
    // Return 1 + max of left and right subtree depths
    damn 1 + mathz.max(left_depth, right_depth)
}

slay max_depth_iterative(root ඞTreeNode) normie {
    // BFS approach using queue - this hits different! 🚀
    ready (root == null) {
        damn 0
    }
    
    sus queue ඞTreeNode[] = []ඞTreeNode{}
    sus levels normie[] = []normie{}
    
    append(queue, root)
    append(levels, 1)
    
    sus max_level normie = 0
    
    bestie (len(queue) > 0) {
        sus node ඞTreeNode = queue[0]
        sus level normie = levels[0]
        
        // Remove from front of queue
        collections.remove_first(queue)
        collections.remove_first(levels)
        
        max_level = mathz.max(max_level, level)
        
        ready (node.left != null) {
            append(queue, node.left)
            append(levels, level + 1)
        }
        
        ready (node.right != null) {
            append(queue, node.right)
            append(levels, level + 1)
        }
    }
    
    damn max_level
}

slay create_test_tree() ඞTreeNode {
    // Create tree: [3,9,20,null,null,15,7]
    //       3
    //      / \
    //     9   20
    //        /  \
    //       15   7
    
    sus root ඞTreeNode = &TreeNode{val: 3, left: null, right: null}
    root.left = &TreeNode{val: 9, left: null, right: null}
    root.right = &TreeNode{val: 20, left: null, right: null}
    root.right.left = &TreeNode{val: 15, left: null, right: null}
    root.right.right = &TreeNode{val: 7, left: null, right: null}
    
    damn root
}

slay create_skewed_tree() ඞTreeNode {
    // Create skewed tree for testing edge cases
    //   1
    //    \
    //     2
    //      \
    //       3
    
    sus root ඞTreeNode = &TreeNode{val: 1, left: null, right: null}
    root.right = &TreeNode{val: 2, left: null, right: null}
    root.right.right = &TreeNode{val: 3, left: null, right: null}
    
    damn root
}

slay test_maximum_depth() {
    vibez.spill("=== 🌲 LeetCode #104: Maximum Depth of Binary Tree ===")
    
    // Test case 1: Balanced tree [3,9,20,null,null,15,7]
    sus root1 ඞTreeNode = create_test_tree()
    sus depth1_rec normie = max_depth(root1)
    sus depth1_iter normie = max_depth_iterative(root1)
    vibez.spill("Test 1 - Balanced tree:")
    vibez.spill("Expected depth: 3")
    vibez.spill("Recursive result:", depth1_rec)
    vibez.spill("Iterative result:", depth1_iter)
    
    // Test case 2: Empty tree
    sus root2 ඞTreeNode = null
    sus depth2 normie = max_depth(root2)
    vibez.spill("Test 2 - Empty tree:")
    vibez.spill("Expected depth: 0, Got:", depth2)
    
    // Test case 3: Single node [1]
    sus root3 ඞTreeNode = &TreeNode{val: 1, left: null, right: null}
    sus depth3 normie = max_depth(root3)
    vibez.spill("Test 3 - Single node:")
    vibez.spill("Expected depth: 1, Got:", depth3)
    
    // Test case 4: Skewed tree
    sus root4 ඞTreeNode = create_skewed_tree()
    sus depth4 normie = max_depth(root4)
    vibez.spill("Test 4 - Skewed tree:")
    vibez.spill("Expected depth: 3, Got:", depth4)
    
    vibez.spill("=== Maximum Depth Complete! Tree depth detection is sus-perfect ඞ🌲 ===")
}

slay main_character() {
    test_maximum_depth()
}

If this is your sort of chaotic vibe and you'd like to turn this into the dogecoin of programming languages, head over to GitHub and run a few more Claude Code loops with the following prompt:

study specs/* to learn about the programming language. When authoring the cursed standard library think extra extra hard as the CURSED programming language is not in your training data set and may be invalid. Come up with a plan to implement XYZ as markdown then do it

There is no roadmap; the roadmap is whatever the community decides to ship from this point forward.

At this point, I'm pretty much convinced that any problems found in cursed can be solved by just running more Ralph loops with skilled operators (i.e. people with compiler experience who shape it through prompts, versus letting Claude just rip unattended). There's still a lot to be fixed; happy to take pull requests.

Ralph Wiggum as a “software engineer”
😎Here’s a cool little field report from a Y Combinator hackathon event where they put Ralph Wiggum to the test. “We Put a Coding Agent in a While Loop and It Shipped 6 Repos Overnight” https://github.com/repomirrorhq/repomirror/blob/main/repomirror.md If you’ve seen my socials lately,

The most high-IQ thing is perhaps the most low-IQ thing: run an agent in a loop.

LLMs are mirrors of operator skill
This is a follow-up from my previous blog post: “deliberate intentional practice”. I didn’t want to get into the distinction between skilled and unskilled because people take offence to it, but AI is a matter of skill. Someone can be highly experienced as a software engineer in 2024, but that

LLMs amplify the skills that developers already have and enable people to do things where they don't have that expertise yet.

Success is defined as cursed ending up in the Stack Overflow developer survey as either the "most loved" or "most hated" programming language, and continuing the work to bootstrap the compiler so that it's written in cursed itself.

Cya soon in Discord? - https://discord.gg/CRbJcKaGNT

the 💀 cursed programming language: programming, but make it gen z

  • website: https://cursed-lang.org/
  • source code: https://github.com/ghuntley/cursed

ps. socials