Jim Nielsen

Designer. Engineer. Writer. 20+ years at the intersection of design & code on the web.

Tradeoffs to Continuous Software?

2025-05-29 03:00:00

I came across this post from the tech collective crftd. about how software is in a process of “continuous disintegration”:

One of the uncomfortable truths we sometimes have to break to people is that software isn't just never “done”. Worse even, it rots…

The practices of continuous integration act as enablers for us to keep adding value and keeping development maintainable, but they cannot stop the inevitable: The system will eventually fail in unexpected ways, as is the nature of complex systems.

That all resonates with me — software is rarely “done”; it generally has a shelf life and starts rotting the moment you ship it — but what really made me pause was this line:

The practices of continuous integration act as enablers for us

I read “enabler” there in the negative sense of the word, like in addiction, where an “enabler” is someone who encourages or facilitates another person’s pattern of self-destructive behavior.

Is CI/CD an enabler?

I’d only ever thought of moving towards CI/CD as a net positive. Is it possible that, like everything, CI/CD has its tradeoffs and isn’t always the Best Thing Ever™️?

What are the trade-offs of CI/CD?

The thought occurred to me that CI stands for “continuous investment” because that’s what it requires to keep it working — a continuous investment in both the infrastructure that delivers the software and the software itself.

Everybody complains nowadays about how software requires a subscription. Why is that? Could it be, perhaps, because of CI/CD? If you want continuous updates to your software, you’re going to have to pay for it continuously.

We’ve made delivering software continuously easy, which means we’ve made creating software that’s “done” hard — be careful of what you make easy.

In some sense — at least on the web — I think you could argue that we don’t know how to make software that’s done (e.g. software that ships on a CD). We’re inundated with tools and practices and norms that enable the opposite of that.

And, perhaps, we’re trading something there?

When something comes along and enables new capabilities, it often severs others.



Could I Have Some More Friction in My Life, Please?

2025-05-26 03:00:00

A clip from “Buy Now! The Shopping Conspiracy” features a former executive of an online retailer explaining how motivated they were to make buying easy. Like, incredibly easy. So easy, in fact, that their goal was to “reduce your time to think a little bit more critically about a purchase you thought you wanted to make.” Why? Because if you pause for even a moment, you might realize you don’t actually want whatever you’re about to buy.

Been there. Ready to buy something and the slightest inconvenience surfaces — like when I can’t remember the exact digits of my credit card’s CVV and realize I’ll have to find my credit card and look it up — and that’s enough for me to say, “Wait a second, do I actually want to move my slug of a body and find my credit card? Nah.”

That feels like the socials too.

The algorithms. The endless feeds. The social interfaces. All engineered to make you think less about what you’re consuming, to think less critically about reacting or responding or engaging.

Don’t think, just scroll.

Don’t think, just like.

Don’t think, just repost.

And now, with AI, don’t think at all.[1]

Because if you have to think, that’s friction. Friction is an engagement killer on content, especially the low-grade stuff. Friction makes people ask, “Is this really worth my time?”

Maybe we need a little more friction in the world. More things that merit our time. Fewer things that don’t.

It’s kind of ironic how the things we need present so much friction in our lives (like getting healthcare) while the things we don’t need that siphon money from our pockets (like online gambling[2]) present so little friction you could almost inadvertently slip right into them.

It’s as if The Good Things™️ in life are full of friction while the hollow ones are frictionless.


  1. Nicholas Carr said, “The endless labor of self-expression cries out for the efficiency of automation.” Why think when you can prompt a probability machine to stitch together a facade of thinking for you?
  2. John Oliver did a segment on sports betting if you want to feel sad.


WebKit’s New Color Picker as an Example of Good Platform Defaults

2025-05-24 03:00:00

I’ve written about how I don’t love the idea of overriding basic computing controls. Instead, I generally favor respecting user choice and providing the controls their platform does.

Of course, this means platforms need to surface better primitives rather than supplying basic ones with an ability to opt out.

What am I even talking about? Let me give an example.

The WebKit team just shipped a new API for <input type=color> which gives users the ability to pick colors with wide-gamut P3 color and alpha transparency. The entire API is just a little bit of declarative HTML:

<label>
  Select a color:
  <input type="color" colorspace="display-p3" alpha>
</label>

From that simple markup (on iOS) you get this beautiful, robust color picker.

Screenshot of the native color picker in Safari on iOS

That’s a great color picker, and if you’re choosing colors a lot on iOS and repeatedly encountering this particular UI, that’s even better — like, “Oh hey, I know how to use this thing!”
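
On the page side, presumably nothing new is needed to consume the result: you listen for input events and read the value, same as any other <input>. Here’s a minimal sketch in TypeScript, assuming the value serializes to a CSS color string (something like color(display-p3 1 0 0 / 0.5)) rather than a hex code; that serialization is an assumption on my part, not something the markup above guarantees:

const picker = document.querySelector<HTMLInputElement>('input[type="color"]');

if (picker) {
  picker.addEventListener('input', () => {
    // Assumption: with colorspace="display-p3" and alpha, `value` is a CSS
    // color string like "color(display-p3 1 0 0 / 0.5)", usable anywhere
    // CSS accepts a color.
    document.body.style.backgroundColor = picker.value;
  });
}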

With a picker like that, how many folks really want additional APIs to override that interface and style it themselves?

This is the kind of better platform default I’m talking about. A little bit of HTML markup, and boom, a great interface to a common computing task that’s tailored to my device and uniform in appearance and functionality across the websites and applications I use. What more could I want? You might want more, like shoving your brand down my throat, but I really don’t need to see BigFinanceCorp Green™️ as a themed element in my color or date picker.

If I could give HTML an aspirational slogan, it would be something along the lines of Mastercard’s old one: there are a few use cases platform defaults can’t solve; for everything else, there’s HTML.



Product Pseudoscience

2025-05-21 03:00:00

In his post about “Vibe Driven Development”, Robin Rendle warns against what I’ll call the pseudoscientific approach to product building prevalent across the software industry:

when folks at tech companies talk about data they’re not talking about a well-researched study from a lab but actually wildly inconsistent and untrustworthy data scraped from an analytics dashboard.

This approach has all the theater of science — “we measured and made decisions on the data, the numbers don’t lie” etc. — but is missing the rigor of science.

Like, for example, corroboration.

Independent corroboration is a vital practice of science that we in tech conveniently gloss over in our (self-proclaimed) objective data-driven decision making.

In science you can observe something, measure it, analyze the results, and draw conclusions, but nobody accepts it as fact until there are multiple instances of independent corroboration.

Meanwhile in product, corroboration is often merely a group of people nodding along at a PowerPoint with some numbers backing a foregone conclusion — “We should do X, that’s what the numbers say!”

(What’s worse is when we have the hubris to think our experiments, anecdotal evidence, and conclusions should extend to others outside our own teams, despite zero independent corroboration — looking at you, Medium articles.)

Don’t get me wrong, experimentation and measurement are great. But let’s not pretend there is (or should be) a science to everything we do. We don’t hold a candle to the rigor of science. Software is as much art as science. Embrace the vibe.



Multiple Computers

2025-05-19 03:00:00

I’ve spent so much time, had so many headaches, and encountered so much complexity from what, in my estimation, boils down to this: trying to get something to work on multiple computers.

It might be time to just go back to having one computer — a personal laptop — do everything.

No more commit, push, and let the cloud build and deploy.

No more making it possible to do a task on my phone and tablet too.

No more striving to make it possible to do anything from anywhere.

Instead, I should accept the constraint of doing specific kinds of tasks when I’m at my laptop. No laptop? Don’t do it. Save it for later. Is it really that important?

I think I’d save myself a lot of time and headache with that constraint. No more continuous over-investment of my time in making it possible to do some particular task across multiple computers.

It’s a subtle, but fundamental, shift in thinking about my approach to computing tasks.

Today, my default posture is to defer control of tasks to cloud computing platforms. Let them do the work, and I can access and monitor that work from any device. Like, for example, publishing a version of my website: git commit, push, and let the cloud build and deploy it.

But beware, there be possible dragons! The build fails. It’s not clear why, but it “works on my machine”. Something is different between my computer and the computer in the cloud. Now I’m troubleshooting an issue unrelated to my website itself. I’m troubleshooting an issue with the build and deployment of my website across multiple computers.

It’s easy to say: the build works on my machine, deploy it! It’s deceptively time-consuming to take that one more step and say: let another computer build it and deploy it.

So rather than taking the default posture of “cloud-first”, i.e. push to the cloud and let it handle everything, I’d rather take a “local-first” approach where I choose one primary device to do tasks on and ensure I can do them from there. Everything beyond that, i.e. getting it to work on multiple computers, is a “progressive enhancement” in my workflow. I can invest the time if I want to, but I don’t have to. This stands in contrast to where I am today: if a build fails in the cloud, I have to invest the time, because that’s how I’ve set up my workflow. I can only deploy via the cloud, so I have to figure out how to get the cloud’s computer to build my site, even when my laptop is doing it just fine.
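
To make that concrete, here is a purely hypothetical sketch of what that “progressive enhancement” baseline could look like: a tiny script that builds on my laptop and copies the output to the server. The build command, output folder, and destination are invented for illustration, not my actual setup.

// deploy.ts: build locally, then push the built files to the server.
// Everything here (npm run build, dist/, the rsync destination) is illustrative.
import { execSync } from "node:child_process";

// Build on this machine, the one computer I've committed to supporting.
execSync("npm run build", { stdio: "inherit" });

// Copy the output up. No cloud build, nothing to troubleshoot but my own laptop.
execSync("rsync -az --delete dist/ user@example.com:/var/www/site/", {
  stdio: "inherit",
});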

It’s hard to make things work identically across multiple computers.

I get it, that’s a program, not software. And that’s the work. But sometimes a program is just fine. Wisdom is knowing the difference.



Notes from Alexander Petros’ “Building the Hundred-Year Web Service”

2025-05-15 03:00:00

I loved this talk from Alexander Petros titled “Building the Hundred-Year Web Service”. What follows is a summation of my note-taking from watching the talk on YouTube.


Is what you’re building for future generations:

  • Useful for them?
  • Maintainable by them?
  • Adaptable by them?

Actually, forget about future generations. Is what you’re building aligned with those goals for future you, 6 months or 6 years from now?

While we’re building codebases which may not be useful, maintainable, or adaptable by someone two years from now, the Romans built a bridge thousands of years ago that is still being used today.

It should be impossible to imagine building something in Roman times that’s still useful today. But if you look at [Trajan’s Bridge in Portugal, which is still used today] you can see there’s a little car on it and a couple pedestrians. They couldn’t have anticipated the automobile, but nevertheless it is being used for that today.

That’s a conundrum. How do you build for something you can’t anticipate? You have to think resiliently.

Ask yourself: What’s true today that was true for a software engineer in 1991? One simple answer is: Sharing and accessing information with a uniform resource identifier. That was true 30+ years ago, and I would venture to bet it will be true in another 30 years — and more!

There [isn’t] a lot of source code that can run unmodified in software that is 30 years apart.

And yet, the first website ever made can do precisely that. The source code of the very first web page — which was written for a line-mode browser — still runs today on a touchscreen smartphone, which is not a device that Tim Berners-Lee could have anticipated.

Alexander goes on to point out how interaction with web pages has changed over time:

  • In the original line-mode browser, links couldn’t be represented as blue underlined text. They were represented more like footnotes on screen where you’d see something like this[1] and then this[2]. If you wanted to follow that link, there was no GUI to point and click. Instead, you would hit that number on your keyboard.
  • In desktop browsers and GUI interfaces, we got blue underlines to represent something you could point and click on to follow a link.
  • On touchscreen devices, we got “tap” with your finger to follow a link.

While these methods for interaction have changed over the years, the underlying medium remains unchanged: information via uniform resource identifiers.

The core representation of a hypertext document is adaptable to things that were not at all anticipated in 1991.

The durability guarantees of the web are absolutely astounding if you take a moment to think about it.

If you’re sprinting you might beat the browser, but it’s running a marathon and you’ll never beat it in the long run.

If your page is fast enough, [refreshes] won’t even repaint the page. The experience of refreshing a page, or clicking on a “hard link” is identical to the experience of partially updating the page. That is something that quietly happened in the last ten years with no fanfare. All the people who wrote basic HTML got a huge performance upgrade in their browser. And everybody who tried to beat the browser now has to reckon with all the JavaScript they wrote to emulate these basic features.

