
More stray observations — on Liquid Glass, on Apple’s lack of direction, then zooming out, on technological progress

2025-06-29 04:30:07

What sparked my long-form article on Liquid Glass, and all the criticism I’ve posted on Mastodon since the WWDC25 keynote, is simply that the Liquid Glass redesign made me angry. Yes, there are better things to get angry about in the world right now, and I want you to know that I’m very angry about them all.

In the past, technology used to be my coping space. A place for a knowledge worker like me to nerd out about his tools and related passions — user interfaces, UI/UX design, typography, and so on. And if I have developed these passions and interests, it is largely because of Apple. Apple has had a huge impact on my life ever since I started using their computers. Back in 1989, I carried out my apprenticeship in desktop publishing on a workstation made up of a Macintosh SE, a Bernoulli Box external drive, and a LaserWriter printer. I’ve always appreciated the care and attention to detail Apple put not only into their hardware design but also into their UI design.

But it’s true — something important died with Steve Jobs. He was really Apple’s kernel, for better and for… less better. This Apple has been dismantling Mac OS as if it were a foreign tool to them. They’ve bashed its UI around. And they seem to have done that not to improve it, but simply to change it, adapting it to their (mostly misguided) idea of unifying the interfaces of different devices by bringing them down to the lowest common denominator.

And this Liquid Glass facelift just makes me almost irrationally furious. Everything I’m seeing of Liquid Glass — at least on Mac OS — is a terrible regression. And I don’t want it on my production Mac. I do not want to look at that shit 14 hours a day. It’s ultimately that simple.

When Steve Jobs introduced the iPhone in January 2007, speaking about the software powering the iPhone, he said:

Now, software on mobile phones is like baby software. It’s not so powerful, and today we’re going to show you a software breakthrough. Software that’s at least five years ahead of what’s on any other phone. Now how do we do this? Well, we start with a strong foundation. iPhone runs OS X.

Now, why would we want to run such a sophisticated operating system on a mobile device? Well, because it’s got everything we need. It’s got multi-tasking. It’s got the best networking. It already knows how to power manage. We’ve been doing this on mobile computers for years. It’s got awesome security. And the right apps. It’s got everything from Cocoa and the graphics and it’s got core animation built in and it’s got the audio and video that OS X is famous for. It’s got all the stuff we want. And it’s built right in to iPhone. And that has let us create desktop-class applications and networking. Not the crippled stuff that you find on most phones. This is real, desktop-class applications.

Now it’s all going backwards, with Macs running a version of Mac OS that feels more and more simplified and fossilised iteration after iteration. The process isn’t yet complete, but I dread what’s coming next: Macs running something that is more iOS than Mac OS. Functionally interchangeable. To reuse Jobs’s words above, Macs running the crippled stuff that you find on most phones and tablets. Macs running smartphone-class applications (this is, sadly, already happening, as I’m painfully reminded every time I open an app whose interface and interaction design were clearly meant for a phone or tablet).

I’ve often wondered why we’ve come to this. Why Apple has come to this. On one side we have developers and expert users asking for more functionality, versatility, and features, while on the other side Apple just adds a few little things here and there, seemingly more interested in continuously retouching the Mac’s user interface at a cosmetic level — and doing that with a kind of awkwardness (incompetence?) that still manages to interfere with the UI’s functionality.

Francisco Tolmasky, during an exchange we had on Mastodon, expressed with great clarity a sinking feeling I’ve had for a while, a feeling I was actually trying to verbalise for an upcoming article (emphasis mine):

Well I think it is very clear that Apple does not believe there are new ideas to be had. This is a much deeper discussion, but to me all of their actions are representative of a company that believes technology is ‘mature’ and all that is left to do, at best, is polish. Setting aside whether one agrees with Apple’s decisions/taste/whatever, I think it is not up for discussion that while these changes may be disruptive, they are not, nor are intended to be, “transformative”.

Baked into the explanation that Liquid Glass “frees your content from the tyranny of the UI” is the inescapable admission that you have determined that the highest priority item left for iOS is to “return roughly 40px of screen real estate, or 3% of the vertical space of an iPhone, to users”. That is the important part here. Not whether Liquid Glass does or doesn’t deliver, but rather that Apple did not find, and thus does not believe there exists, anything more interesting to do in all of 2025.

This is the thing that should actually be concerning about Apple. Not whether or not they’ve lost their design skills, but whether they no longer believe they have any good new ideas. This better explains all their recent behavior than ‘greed’ or ‘hubris’. They waste time on redesigns because they don’t know what else to do. They fight so hard for recurring revenue and the App Store cut not because they’re dicks… but because they think there probably isn’t another Mac or iPhone in their future.

By the way, that ‘freeing the content from the tyranny of the UI’ is just tragicomic to me, and a sign that content for Apple is just something you consume passively or stare at. The act of creating things on a computer, working with them, interacting with them in any way, very much implies the reliance on a UI. And when I’m writing, translating, switching between documents, editing photos, engaging in a written conversation with someone, extracting information from a web browser, etc., I don’t want the UI to ‘recede’, I don’t want affordances out of my way.

Tolmasky then makes another valid point, bringing up something I admit I hadn’t even noticed:

Image: Apple, Inc.

This screenshot from their marketing page told me everything I needed to know about this year’s iPadOS update. No one plans a trip, reads a recipe, emails, and… learns violin at once. This is nonsensical. Cartoonish multitasking is not what multi-window support is for. But then you realize… they don’t know what it’s for. They don’t know why people keep asking for it. They actually have no idea why anyone would want it. There’s a reason the Mac is more single-window every day.

This screenshot is not some anomaly. It’s true across all their marketing materials and the keynote. You quickly realize that there isn’t a single mildly interesting example, let alone a ‘killer use case’. The entire pitch is, “You asked for it, here it is”. They make no effort to appeal to someone who didn’t already want this. During the keynote they play off the absurd demos as part of the ‘jokey act’, but notice they can come up with practical uses for the iPhone features that they show.

I’ve been sifting through my blog’s archives looking for a particular piece where I’m sure I expressed my concern about Apple’s seeming lack of direction (and ideas). After a couple of hours I still haven’t found it, but I found a lot of breadcrumbs of the same nature — quick notes, passing observations before or after past WWDCs. To bring the discourse back to user interface concerns, there’s this bit from This nine-year chasm (October 2020):

As I’ve repeatedly stated in my observations about Big Sur now that I’ve been testing the betas since August, the next version of Mac OS shines when it comes to performance, responsiveness, and stability — that’s my experience, at least — but when we examine the look and feel of its user interface, it mostly feels directionless. Where is the purpose? Why these changes? Is it to make the interface more usable? Is it to make that interaction work better or to make that element just look sleeker? It’s often hard to see the intention or even the logic behind some of them. The background colour of the System Preferences pane has subtly changed at least three times in the course of five betas. Things you used to do with one click now take two or more clicks, just because someone at Apple felt like touching up a certain part of the interface for no apparent reason other than ‘trying something different’ or ‘fixing a previous, equally arbitrary cosmetic change’.

Ah, I think I found the article I was looking for. It’s from July 2021, but for the most part I could have written it last week: Habits, UI changes, and OS stagnation. It’s hard to extract brief quotes from it, so I urge you to read it in full when you have time. Here are a few bits that hopefully can stand on their own, with the last one being perhaps the closest to what Tolmasky was saying:

Under Cook and the new executive branch, Apple has app-ified Mac OS. Forgive the atrocious expression, but that’s how it feels to me. While I don’t deny that there have been significant innovations under the bonnet […], Apple’s approach when presenting the last few major Mac OS releases has always felt as if the most important thing to work on in an operating system were its look & feel, rather than how this foundational tool can actually improve people’s work or tasks.

[…] I’ve had a lot of experience dealing with regular, non-tech-savvy users over the years. What some geeks may be shocked to know is that most regular people don’t really care about these changes in the way an application or operating system looks. What matters to them is continuity and reliability. Again, this isn’t being change-averse. Regular users typically welcome change if it brings something interesting to the table and, most of all, if it improves functionality in meaningful ways. Like saving mouse clicks or making a multi-step workflow more intuitive and streamlined.

But making previous features or UI elements less discoverable because you want them to appear only when needed (and who decides when I need something out of the way? Maybe I like to see it all the time) — that’s not progress. It’s change for change’s sake. […]

The self-imposed yearly OS update cycle doesn’t help, either. Apple feels compelled to present something ‘new’ every year, but you can’t treat Mac OS development as iPhone hardware development. […]

I’ve also been thinking that this self-imposed yearly update cycle is ultimately an obstacle to a deeper kind of development — the kind that makes an operating system evolve as a tool. In a recent discussion on Twitter, note Léo Natan’s response, the reason he gives as to why older operating systems were essentially less user-hostile than what we have today:

That’s because they were trying to make a difficult concept, computing, easier for the mass public. That has, to a large extent, been achieved. Now you have overpaid “““designers””” that need to show “““impact””” every year, so they have to reinvent the wheel over and over.

This act of ‘reinventing the wheel over and over’ has been incredibly stifling and has, in my opinion, largely led to operating system stagnation. Roughly from Mac OS X 10.7 Lion onward, Mac OS has gained a few cool features, but it has also been losing entire apps and services, and certain facilities — like Disk Utility — have been dumbed down. Meanwhile the system hasn’t really gone anywhere. On mobile, iOS started out excitingly, and admittedly still seems to be moving along an evolving trajectory, but on the iPad front there has been a lot of wheel reinventing to make the device behave more like a traditional computer, instead of embarking both the device and its operating system on a journey of revolution and redefinition of the tablet experience in order to truly start a ‘Post-PC era’.

And with Mac OS it feels like its journey is over: the operating system has found a place to settle and has remained there for years. Building new stuff, renovating, rearranging, etc., but always on site, so to speak.

In other words, if we look at Mac OS as a metro railway line, it’s like Apple has stopped extending it and creating new stations. What they’ve been doing for a while now has been routine maintenance, and giving the stations a fresh coat of paint every year. Only basic and cosmetic concerns, yet sometimes mixing things up to show that more work has gone into it, a process that invariably results in inexplicable and arbitrary choices like moving station entrances around, shutting down facilities, making the train timetables less legible, making the passages that lead to emergency exits more convoluted and longer to traverse, and so on — hopefully you know what I mean here.

However, with every yearly iteration of all the operating systems and platforms Apple maintains, it’s becoming clearer and clearer to me that Mac OS isn’t the only one in trouble. The atrophy is spreading. And if your next objection is, Rick, it’s virtually impossible to make technological leaps and bounds every year. You can’t expect innovation at every corner, then re-read the quoted passage above.

When you self-impose timelines and cadences that are essentially marketing-driven and don’t really reflect technological research and development, you become a prisoner in a prison of your own making. Your goals and priorities become narrower in scope. You reduce your freedom of movement because you stop thinking in terms of creating the next technological breakthrough or innovative device; you just look at the calendar and have to come up with something by the end of the next quarter, while also having to fix the bugs left over from the previous rush job… which keep accumulating on top of the bugs from the rush job before that, and so forth.

This is what I mean when I sometimes say that Apple feels progressively more directionless to me. Rather than actually going somewhere, they’re moving in circles more and more often. The very shape of Apple Park is hugely symbolic here.

Of course the classic pushback I get after many of my long-form critiques is people asking me for solutions, people asking me what kind of technological innovation I want to see.

Instead of merely copy-pasting a quote from a piece I wrote more than five years ago, I’ll rephrase it. From what I’ve understood by examining the evolution of computer science and computer history, scientists and technologists of past decades seemed to have an approach that could be described as, ‘ideas & concepts first, technology later’. Many figures in the history of computing are rightly considered visionaries because they had visions — sometimes very detailed ones — of what they wanted computers to become, of applications where computers could make a difference, of ways in which a computer could improve a process, or could help solve a real problem.

And sometimes there were no detailed plans, but intuitions, insights, that were enough to point towards a direction. When a technological advancement was achieved, such as the microprocessor, it made previously theorised designs and applications happen for real. Ideas were tested, put into practice, re-tested and refined. But no one rested on their laurels. There was always the urge of ‘What’s next?’ — ‘What kind of opportunities and avenues has this stage opened up?’

The reason I published the revised transcripts of the interviews with Larry Tesler, Steve Jobs, and Alan Kay (Part 1, Part 2) of the 1992 documentary series The Machine that Changed the World; the reason I published the annotated transcription of the lecture Origins of the Apple human interface, given in 1997 by Larry Tesler and Chris Espinosa at the Computer History Museum in California, is because I wanted my readers to understand how these people thought and worked. And to realise just how dramatically things have changed in tech.

What I’m seeing today is more like the opposite approach — ‘technology first, ideas & concepts later’: a laser focus on profit-driven technological advancements, in the hope of extracting some good ideas and use cases from them. Where there are some ideas, or sparks, they seem hopelessly limited in scope or unimaginatively iterative, short-sightedly anchored to the previous incarnation or design. The questions are something like, How can we make this look better, sleeker, more polished?

Steve Jobs once said, There’s an old Wayne Gretzky quote that I love. ‘I skate to where the puck is going to be, not where it has been.’ And we’ve always tried to do that at Apple. Since the very, very beginning. And we always will. If I may take that image, I’d say that today a lot of tech companies seem more concerned with the skating itself and with continuing to hit the puck in profitable ways.

Today I don’t see many thinkers, visionaries, technologists asking questions like, What’s next? Where do we go from here? How can we circumvent these interface limitations? How can we meaningfully change the way X is done? Can we create new advanced methods to achieve X, to actually make things better? — and so forth. You know, general questions, larger in scope, not tied to a single product or even the previous iteration of the same product. Not tied to what can make them the most money in the shortest term.

Today, both manufacturers and users have this fascination for the product, the gadget, the tool. People want the faster horse, tech companies give them faster horses and focus almost exclusively on how to make the next horses even faster. It’s why so many people appear fascinated by all the ‘AI’ hype and tech companies — utterly starved for ideas and mostly atrophied when it comes to actual research and development — are happy to play along and try to make the most out of that.

Perhaps I’m being hopelessly idealistic here, but I would like to see more fascination for the purpose, for the exploration of different ways to do things and achieve goals, for the end more than the mere means to an end. For things that help us evolve rather than making us dumber and ever-entertained. (Please go and watch, or rewatch, WALL·E).

More assorted notes on Liquid Glass

2025-06-27 20:26:01

Over the past couple of weeks, I’ve been trying to make sense of Apple’s latest user-interface redesign — Apple calls it Liquid Glass — that will affect all their platforms in the next iteration of their respective OS versions. But it’s hard to make sense of it when, after checking Apple’s own guidance, I’m mostly left with the feeling that at Apple they’re making things up as they go.

If you’ve been following me on Mastodon, you’ll already be familiar with a lot of what follows. I just wanted to gather my posts there into a more organic piece here.

Let’s start with a few notes on Adopting Liquid Glass, part of the Technology Overviews Apple has made available on their Developer site.

In the Navigation section, we find this figure:

The text above it says:

Key navigation elements like tab bars and sidebars float in this Liquid Glass layer to help people focus on the underlying content.

Now take a look at the area I’ve highlighted in the image. Why would you want to “focus on the underlying content” here? Tab bars and toolbars still cover the underlying content, and the more transparent/translucent they are, the worse. When something fades to the background, it literally ceases to be in the foreground, so there’s no point in focusing on it. This is like proposing an interface that helps you focus your sight on your peripheral vision.

Below the figure, in the paragraph starting with Establish a clear navigation hierarchy, developers are advised to:

Ensure that you clearly separate your content from navigation elements, like tab bars and sidebars, to establish a distinct functional layer above the content layer.

Which is in direct contrast with what’s shown in the image above. First you propose to blur the lines between controls and content, then you advise developers to “clearly separate your content from navigation elements”. Which is it? If you stop and think about it, it’s ironic that “Ensure that you clearly separate your content from navigation elements, like tab bars and sidebars, to establish a distinct functional layer above the content layer” is the exact description of what’s happening in the ‘Before’ image!
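To see how little is actually being asked here, consider that the ‘distinct functional layer above the content layer’ is just the ordinary container/content split apps already have. Here is a minimal SwiftUI sketch of that structure (the view name, folder list, and placeholder text are my own illustration, not Apple sample code):

    import SwiftUI

    // The navigation container (the sidebar) is the layer the system now
    // renders in Liquid Glass; the scrollable detail view is the 'underlying
    // content' the guidance says it should float above.
    struct MailboxView: View {
        let folders = ["Inbox", "Drafts", "Archive"]

        var body: some View {
            NavigationSplitView {
                // Navigation layer
                List(folders, id: \.self) { folder in
                    Text(folder)
                }
                .navigationTitle("Folders")
            } detail: {
                // Content layer
                ScrollView {
                    Text("Message body goes here.")
                        .padding()
                }
            }
        }
    }

If Apple’s guidance is taken at face value, a standard container like this already provides the ‘clear separation’ being prescribed; what changes is only how the system chooses to draw the navigation layer.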

Moving on, we get to this figure related to the Extend content beneath sidebars and inspectors paragraph:

In other words, create the illusion of an image that extends under a sidebar: you won’t actually be able to see the part of the image under the sidebar, while the transparency effect applied to the sidebar will make the text on it less legible overall. A great lose-lose situation, visually, don’t you think? Also, this might be just a matter of personal perception, but to my eyes the blank area below the image that you can see behind the sidebar looks weird, as if there’s something missing.

In Organisation and layout we find this:

To give content room to breathe, organizational components like lists, tables, and forms have a larger row height and padding. Sections have an increased corner radius to match the curvature of controls across the system.

Which is largely unnecessary. It reduces the amount of information displayed on screen, and you’ll have to scroll more as a consequence. Look at the Before and After layouts: the Before layout doesn’t need solutions to increase its clarity. You’re just injecting white space everywhere. It’s also ironic that where more space and ‘breathing room’ are actually necessary, the header (“Single Table Row” in the figure) is pushed even nearer to the status bar.

And don’t get me started on those redesigned, stretched-out switches. They’re the essence of ‘change for change’s sake’.

 


 

Then I went to have a look at the current Human Interface Guidelines and I still can’t get over that introduction:

Let’s start with Hierarchy: “Elevate and distinguish the content beneath them”: is this really the role of controls and interface elements? Should content and controls even occupy the same space? Should the lines be blurred between them?

In my opinion, the best way both controls and content can shine is by having each their own space: controls are out of content’s way, letting it shine and helping the user focus on it. And in their own space, controls can be clear, neatly organised, ready to be accessed in order to manipulate the content.

Next, Harmony:

Align with the concentric design of the hardware and software […]

No, seriously, how does one align in a concentric context? Is that a matter of picking a circle, an arc, a shape? All snark aside, this just sounds poorly worded to me. I get what Apple means here: in your app design, you should pick shapes that resemble the contours of the hardware — the shape of a MacBook’s display and bezel, for example — and the typical shapes that you find in the system’s UI. Pretty obvious stuff that’s wrapped in ‘pretentious designer vocabulary’.

Last but not least, Consistency:

[…] to maintain a consistent design that continuously adapts […]

The definition of consistent is something that is “unchanging in nature, standard, or effect over time”. So, how does a consistent design continuously adapt?

This paragraph should have read something like: Adopt platform conventions to create a design that remains visually and functionally consistent across window sizes and displays.

Making icons less iconic

In the Design section of the guidelines for App icons, we find this:

Find a concept or element that captures the essence of your app or game, make it the core idea of your icon, and express it in a simple, unique way with a minimal number of shapes. Prefer a simple background, such as a solid color or gradient […]

Not only is this a recipe for blandness, it’s also borderline contradictory. Like, Make a unique dish using a minimal number of simple ingredients. While it’s possible to make a few different dishes using just two or three things, you hit the ceiling of uniqueness and variety pretty damn soon.

Another thing that irks me about this obsession with icon simplification is that when you abstract things this much, you dilute their meaning instead of distilling it. Take the progressive degradation of the Dictionary icon, for example. In its subsequent iterations (as soon as it loses the ‘book’ shape), it could just as well be the icon for a font-management app, because it ends up losing a lot (if not all) of its uniqueness.

This image is taken from this post on the history of some Mac OS icons by Basic Apple Guy. Go take a look at that post and you’ll see a pattern emerge with application icons: they get progressively abstracted to the point that they barely represent what they’re supposed to represent. The icon for Stickies goes from being an actual depiction of a few yellow sticky notes to some small, vague rounded rectangles inside a clear rounded rectangle. The icon for Notes goes from representing an actual notepad to being a flat square with two lines and a coloured top area. The icon for Calculator, same thing: from depicting a calculator to something that looks more like a security keypad. Game Centre: from an icon representing different types of games to… a group of colourful bubbles.

The most recent iteration of Migration Assistant’s icon is yet another example:

Migration Assistant icon in Mac OS 15 Sequoia (left) and how it appears in Mac OS 26 Tahoe Beta 2 (right)

Look at it. It’s utterly meaningless. Maybe it can work in an airport to mark an emergency exit or something. The old one is so simple and clear. From an ‘old, now inactive’ system to a ‘fresh new one’. Migration, indeed. Right there. All while preserving the Mac identity. This once again feels like changing things for change’s sake and nothing else.

I’m pretty sure that if you were to interview one of the designers at Apple responsible for this icon devolution, they would say something about reducing icons to their essence. To me, this looks more like squeezing all the life out of them. Icons in Mac OS X used to be inventive, well crafted, distinctive, with a touch of fun and personality. Mac OS X’s user interface was sober, utilitarian, intuitive, peppered with descriptive icons that made the user experience fun without signalling ‘this is a kid’s toy’.

Same for NeXTSTEP, from which Mac OS X originates. Here, some icons have a more 3D effect, others are flatter; some are logos (like the icon for the Webster’s Dictionary), others are descriptive to a fault (the user’s Home folder is an illustration of a tiny house), but they’re instantly memorable. They do what icons are supposed to do and they take full advantage of the high resolution monitors NeXT sold for their workstations (also remember that some of those monitors were greyscale, so icons had to work even with limited palettes).

In recent years, the reverse has happened: Apple has been infantilising and dumbing down Mac OS’s user interface in order to be more similar to simpler mobile devices and to their UIs, while transforming the icons into something bland and ‘corporate’.

But it gets worse.

In the iOS 5 days, the HIG for icons weren’t too restrictive, apart from some basic requirements and guidance. This gave developers plenty of freedom, and the results (if you exclude the usual trash apps) were tasteful and varied; some opted for a rich, skeuomorphic look; others for flatter designs; others for something in between. Apps were instantly recognisable.

 

Now Apple gives you the option of removing colour and depth from all icons. To make everything look samey and nondescript…

…So that you can “complement your wallpaper.”

On my main Mac I’ve left the default Ventura wallpaper because the only time I see it is when I wake the Mac mini from sleep and I’m presented with the Login screen. People who actually work with computers and mobile devices don’t stare at wallpapers and matching icons.

But it’s not just that: these ‘Icon Appearances’ also remove colour, depth, and personality from third-party app icons. This further dictates (and interferes with) what kind of design a third-party developer may choose for their apps. All this after recommending “a minimal number of shapes” and “prefer[ring] a simple background”.

As C.M. Harrington rightly notes:

I’ve said this before, but Apple is forcing third party devs to be in service of Apple. The guidelines and rules are meant to sublimate the brands of the third party, and replace it with Apple.

And I also must quote Louie Mantia who, in his brilliant piece Rose-Gold-Tinted Liquid Glasses writes (emphasis mine):

Apple has effectively infinite resources and operates on their own timeline, but everyone else does not have this kind of luxury. Springing big changes like this all at once forces so many independent developers, entire companies, and the industry as a whole to freeze their own development schedules to accommodate Apple’s design system.

It’s asking a lot. For almost nothing in return. I keep looking at all the changes Liquid Glass brings, and I cannot find one instance where it has markedly improved the experience in any way.

Everything that got rounder—except for the things that didn’t — why? Everything that got inset that wasn’t before — why? Everything that is now blurry — why? I don’t think it’s a secret that the content area of some apps decreased. The margins and padding increased — except where it didn’t.

In some ways, there’s almost more UI variance than there was before, which doesn’t make any sense. But in other ways, everything feels far more restrictive than it once was. Which I admit, also doesn’t make much sense. App icons weren’t just more expressive on OS X, they could be a much wider-range of materials than merely glass.

I know I can still draw anything I want within that square, and that the glass appearance on objects inside of it is purely optional. But the edge of every icon now has a glass appearance I can’t do anything about. If my icon is paper, wood, metal, or—god forbid—leather? It has a glass specular highlight. On macOS, it’s currently locked at a 45° angle. Which is not something I agreed to.

Swinging for the fences like this comes with substantial risk. Especially for matured products like macOS. This product is almost 25 years old, and I would hope there would be a little more caution when expecting effort from and forcing changes upon a developer community you’ve largely lost your goodwill with. These kinds of decisions have long-lasting effects and I’m sure many developers would’ve appreciated their time being considered before asking them to incorporate a design they did not sign up for.

And in the paragraph just preceding this section I’ve quoted, Mantia writes (emphasis his):

In a way, one could say Liquid Glass is like a new version of Aqua. It has reflective properties reminiscent of that. One could also say it’s an evolution of whatever iOS 7 was, leaning into the frosted panels and bright accent colors. But whatever Liquid Glass seems to be, it isn’t what many of us were hoping for.

Mantia’s piece is so good it’s difficult to extract a few quick quotes. Please take your time and go read it in full.

In Adopting Liquid Glass there are a few passages that unequivocally convey the message that Apple is in control of your app’s appearance (or part of it). Take for example this, in the Visual Refresh section:

Any custom backgrounds and appearances you use in these elements might overlay or interfere with Liquid Glass or other effects that the system provides, such as the scroll edge effect. […] Prefer to remove custom effects and let the system determine the background appearance […]

Or this, under App icons:

Let the system handle applying masking, blurring, and other visual effects, rather than factoring them into your design.
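In practice, the first of those two passages asks developers to do subtractive work: delete the appearance code an app already ships and accept whatever the system draws instead. A hypothetical before/after sketch in UIKit (the view controller and the indigo background are my own illustration, not Apple sample code):

    import UIKit

    final class LibraryViewController: UITableViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            // Before: the app supplied its own opaque bar background.
            // let custom = UINavigationBarAppearance()
            // custom.configureWithOpaqueBackground()
            // custom.backgroundColor = .systemIndigo
            // navigationController?.navigationBar.standardAppearance = custom
            // navigationController?.navigationBar.scrollEdgeAppearance = custom

            // After: drop the customisation and let the system decide how the
            // bar background (and effects such as the scroll edge effect) are drawn.
            navigationController?.navigationBar.standardAppearance = UINavigationBarAppearance()
            navigationController?.navigationBar.scrollEdgeAppearance = nil
        }
    }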

Compare and contrast this with the language used in the 2010 iOS Human Interface Guidelines under Application Icons:

Try to balance eye appeal and clarity of meaning in your icon so that it’s rich and beautiful and clearly conveys the essence of your application’s purpose. Also, it’s a good idea to investigate how your choice of image and color might be interpreted by people from different cultures.

After recommending that you create different sizes of your application icon for different devices, the guidelines note that

When it’s displayed on an iPhone Home screen, iOS adds rounded corners, a drop shadow, and a reflected shine.

But:

You can prevent iOS from adding the shine to your application icon. To do this, you need to add the UIPrerenderedIcon key to your application’s Info.plist file […]
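For context, that key was a simple boolean; a minimal Info.plist fragment along the lines of what the 2010 guidance describes (shown here purely as an illustration) would be:

    <!-- Tells iOS the icon is already 'pre-rendered', so the system
         should not add its reflective shine on top of it. -->
    <key>UIPrerenderedIcon</key>
    <true/>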

Sure, even back then there were visual requirements for icons, but I wouldn’t define this short list as particularly restrictive:

Ensure your icon is eligible for the visual enhancements iOS provides. You should produce an image that:

  • Has 90° corners
  • Does not have any shine or gloss (unless you’ve chosen to prevent the addition of the reflective shine)
  • Does not use alpha transparency

The language in these guidelines from 2010 strikes me as supportive, like in this passage:

Create a 512×512 pixel version of your application icon for display in the App Store. Although it’s important that this version be instantly recognizable as your application icon, it can be subtly richer and more detailed. There are no visual effects added to this version of your application icon.

The language in the Adopting Liquid Glass document is overall more prescriptive and impersonal, and as I was reading all the various recommendations, I couldn’t help but feel the underlying message, We created this beautiful look based on glass effects, don’t you dare ruin it with your custom designs, effects, materials, brand identity.

The language in the current guidelines for app icons isn’t much different. It also reflects Apple’s current philosophy of ‘keeping it simple’ which, out of context, could be valid design advice — you’re designing icons with small-ish dimensions, not full-page detailed illustrations for a book, so striving for simplicity isn’t a bad thing.

And yet — and I might be wrong here — I keep reading between the lines and feel that these guidelines are more concerned with ensuring that developers maintain the same level of blandness and unimaginativeness of Apple’s own redesigned app icons:

Embrace simplicity in your icon design. Simple icons tend to be easiest for people to understand and recognize. An icon with fine visual features might look busy when rendered with system-provided shadows and highlights, and details may be hard to discern at smaller sizes. Find a concept or element that captures the essence of your app or game, make it the core idea of your icon, and express it in a simple, unique way with a minimal number of shapes. Prefer a simple background, such as a solid color or gradient, that puts the emphasis on your primary design — you don’t need to fill the entire icon canvas with content.

Going back to the Mac OS X Human Interface Guidelines from 2009 is like entering a different dimension. The chapter dedicated to icon design starts off like this:

Aqua offers a photo-illustrative icon style — it approaches the realism of photography but uses the features of illustrations to convey a lot of information in a small space. Icons can be represented in 512×512 pixels to allow ample room for detail. Anti-aliasing makes curves and nonrectilinear lines possible. Alpha channels and translucency allow for complex shading and dimensionality. All of these qualities allow you to create lush, vibrant icons that capture the user’s attention. […]

Icon genres help communicate what users can do with an application before they open it. Applications are classified by role — user applications, software utilities, and so on — and each category, or genre, has its own icon style. Creating icons that express this differentiation helps users distinguish between types of icons in the Dock.

For example, the icons for user applications are colorful and inviting, whereas icons for utilities have a more serious appearance. Figure 11–2 shows user application icons in the top row and utility icons in the bottom row.

You may argue that these are simply different icon design guidelines from different eras, reflecting different tastes and aesthetic sensibilities, and that it’s not a matter of one being better than the other, or of right versus wrong, and I’ll concede that. But the older guidelines were written in such a thoughtful way as to give third-party developers a lot of room for creativity and a wide range of choices while remaining within the required system-wide aesthetics of the time. If you look at Figure 11–2 above, you could have very illustrative icons like the ones for Disk Utility (the hard disk with a stethoscope) or Front Row (the theatre armchair), but also more minimalistic designs such as the icons for the Terminal and AirPort Utility applications.

Tangentially, I found this bit ironic given where we are now:

Use transparency only when it is convincing and when it helps complete the story the icon is telling. You would never see a transparent sneaker, for example, so don’t use one in your icon.

This piece of advice is reiterated in the 2013 edition of Mac OS X’s Human Interface Guidelines:

Use transparency when it makes sense. Transparency in an icon can help depict glass or plastic, but it can be tricky to use convincingly. You would never see a transparent tree, for example, so don’t use one in your icon. The Preview and Pages app icons incorporate transparency effectively.

Also, since the introduction of Retina (high-resolution, high-density) displays in 2012, this part was added to the HIG:

Take Advantage of High-Resolution Display

Retina display allows you to show high-resolution versions of your art and icons. If you merely scale up your existing artwork, you miss out on the opportunity to provide the beautiful, captivating images users expect. Instead, you should rework your existing image resources to create large, higher-quality versions that are:

  • Richer in texture
  • More detailed
  • More realistic

The aesthetics for icon design may have changed dramatically in the intervening years, but I just find it sad that, with the gorgeous displays we have today, Apple recommends simple designs made out of a few boring shapes, and everything is now in service of a ‘liquid glass’ effect the system superimposes on every aspect of the user interface — as if this surface gimmick is more important than the elements it distorts.

I’m sorry to sound like a broken record by now, but this is, once again, form before function, looks before workings. And don’t bother deviating from this new norm, because your app will be assimilated.

In case of emergency, break glass

2025-06-12 08:24:40

A few observations after Apple’s WWDC25 keynote

The title of my article obviously refers to the new UI Apple presented on 9 June, which they call Liquid Glass. I won’t beat around the bush: my very first impression is that we’re in UI emergency territory, but we won’t be able to break this particular glass. Only Apple can, and obviously they won’t because they’re very proud of it.

I truly don’t know where to begin with my observations, as I’m still trying to rein in my many reactions to what I’ve seen of this new UI. Let’s see if I can break it down into sections.

Consistency and depth

Apple’s UI ‘reset’ is also accompanied by the decision to homogenise all version numbers for their platforms, so that instead of having iOS 19, watchOS 12, tvOS 19, Mac OS 16, visionOS 3, and iPadOS 19, the next iteration of all these operating systems will indicate the year (or maybe season) of their release, so we’ll have iOS 26, watchOS 26, tvOS 26, Mac OS 26, and so forth. When I first heard about this, my immediate reaction was something like, Well, it goes to show just how much Apple cares about the feedback of developers, pundits, and power users. All of them have pointed out repeatedly, ad nauseam even, that Apple’s software keeps being too buggy and that Apple should really rethink the yearly release approach. And Apple’s response has been to rebrand all their platforms’ versions so that they’ll be identified by year, to reflect their yearly releases. Sigh.

Anyway, the decision to reset the version numbers and to introduce the new Liquid Glass UI design for all platforms has seemingly been taken to emphasise consistency and boost visual familiarity across these platforms. Something I’ve always been opposed to, for reasons that should be obvious to anyone applying a bit of common sense. Here’s a quick example. Among other things, Bosch manufactures washing machines, dishwashers, and microwave ovens — why do their interfaces differ? They’re all appliances made by the same brand! Because they have different purposes and you use them in different ways. Appliances made by the same brand may share some design choices — e.g. they all feature touch buttons — so you know that if you choose that brand, you can expect touch buttons instead of switches or knobs or regular push buttons. But that’s it.

Back in March 2021, in Follow-up: the feedback on my articles about Snow Leopard, and more about user interface design, I wrote:

What I’ll never tire of pointing out is that the mere fact of altering Mac OS’s interface to make it more similar to iOS and iPadOS’s works against its very usability. If the idea behind this insistence on homogenising these interfaces is to bring new users to the Mac — that is, people who only know and use Apple’s mobile devices — and welcome them with a familiar interface, then Apple is not really doing them a favour.

Having a Mac OS release (Big Sur) with an interface that superficially resembles iOS’s interface and sometimes behaves in a similar way is less user-friendly than it seems. Because when behaviours do differ — due to the fact that a traditional computer, with an interface that revolves around the desktop metaphor and mouse+keyboard as input devices, is different from a phone or tablet with a Multi-touch interface — then you actually add an amount of that cognitive load you originally wanted to remove by making the two UIs (of Mac OS and iOS) more uniform. If it looks like a duck, walks like a duck, but then it barks, things may get a bit confusing.

With this premise, it’s easy to think that making Mac OS also behave more like iOS is the necessary next step. This is likely what Apple has in mind for the future of the (Apple Silicon) Macs. But if you think about it, a design method that starts from the visuals and then has the visuals influence the workings of a system, is a method that works backwards with respect to what’s typically considered good design. The interface of a Mac, an iPhone, and an iPad should be focused on being the best for each specific device.

In the case of Liquid Glass, several design cues appear to be borrowed from visionOS, and that is hugely ironic to me if you want to bring consistency and familiarity, given that the user base of Apple Vision Pro is the smallest of all Apple’s platforms at the moment.

Also, talking about ‘depth’ and ‘physicality’ when discussing the visuals of Liquid Glass is something I find rather amusing. As a material, glass is commonly used to reduce both depth and physicality. You use glass to make objects feel lighter. If you put a similarly sized glass jar and a wooden container side by side, the wooden container will feel bulkier and more ‘present’ than the glass jar; it may even be perceived as heavier while actually being the lighter of the two. If you put a glass layer over a surface, you won’t really add depth to it, at least visually. To do so, you have to at least simulate a thick, textured layer of glass, like Nothing does with their wallpaper Glass effect for the Nothing Phone’s lock screen:

 

‘Depth’ in Liquid Glass reminds me of the way Apple talked about depth when introducing iOS 7 in 2013:

Source: Apple website, September 2013

In practice, of course, there’s barely any depth. Just very thin layers:

And when you invoke Search, what you see is hardly depth, but a very 2D space divided between the search field, the virtual keyboard, and what little remains of the home screen:

 

Control Centre in iOS 26 beta looks like the transparent door of one of those chest freezers you see in supermarkets:

Source: Craig Grannell, Apple’s Liquid Glass looks like it’s beamed in from the movies. I don’t think that’s a good thing — Stuff.tv

And this isn’t depth either:

Source: Danielle Foré on Mastodon. Check the whole thread starting here; she makes some insightful remarks I fully agree with.

Those notifications look like transparent stickers applied over a window pane. The distance between background and foreground elements appears minimal exactly because these are glass effects with too much transparency and very little opacity and contrast. The separation is very faint.

In iOS 6, depth was achieved through ‘material’ textures and by visibly blurring or obscuring the elements that had to lose focus, in a sort of exaggerated camera depth-of-field effect. Look what happens when I select a folder in iOS 6 — you can clearly see what’s in focus and what is not. You can easily distinguish the hierarchy of layers. You can perceive depth. It’s almost tangible.

In Mac OS, Liquid Glass does an even worse job at conveying depth. For starters, Finder windows look amorphous, the differentiation between active (in focus) and inactive (not in focus) windows is barely noticeable, and some details are still rough around the edges (no pun intended), as highlighted in my annotation of Mario Guzmán’s screenshot here:

In general, the treatment of Finder windows looks like a terrible ‘flat’ port of a VR experience — in this case, the visionOS environment.

How the fixed UI elements in a Finder window interact with its contents is also worth noting, and it’s worth pointing out that this is an atrocious way of conveying depth:

Source: Niki Tonsky on Mastodon

The visual hierarchy is muddled: why have a seemingly 3D toolbar while the three window controls (the ‘traffic lights’) on the left remain flat and 2D? Here, it seems that the sidebar area of the window is flat, and the area on the right with the toolbar and the window’s contents is 3D and layered, while the area on the far right, with the additional info on the selected item, has thin layers that make it appear as a sort of intermediate state between 2D and 3D:

Safari’s UI also looks bizarre, in general, but in this other screenshot by Niki Tonsky the weird mix between flatness and depth is pretty on the nose:

It reminds me of the Impossible trident:

In past versions of Mac OS, depth and visual hierarchy were handled much better. This is a screenshot I took on my iBook G4 running Mac OS X 10.4.11 Tiger:

You can easily see a semi-transparent Dock (probably the best Dock Mac OS had in all its history) that is above the elements placed on the Desktop (icons, windows), in the same plane as the menu bar. Finder windows and app windows have visible depth, thanks to their drop shadows. And the different amount of drop shadow also clearly indicates the order in which the windows overlap (here it’s obvious because we only have two windows, but imagine a messier Desktop on a bigger display).

Also, within a window, it’s very clear which elements are contained and which are the containers. The window chrome is well defined. Buttons have depth. Active/inactive states are unequivocal.

Which brings me to the next point.

The obsession with dynamic UI elements

I’ll never tire of citing this quote from Alan Dye, Apple’s VP of Human Interface Design, who said at WWDC 2020:

We’ve reduced visual complexity to keep the focus on users’ content. Buttons and controls appear when you need them, and they recede when you don’t.

This epitomises the visual downward spiral of Mac OS from Big Sur onwards. In my article A retrospective look at Mac OS X Snow Leopard (February 2021), after mentioning this quote, I wrote:

I still believe this is not a good approach in general, and especially for essential elements like scroll bars, which should always be visible by default, because they are UI elements whose usefulness isn’t limited to when you use them or interact with them — they signal something even when not strictly needed. In the case of the scroll bars it’s a visual estimate of how many elements a folder contains, how long a list of items is, and more importantly your current position when scrolling.

The first time I discussed this quote by Dye, back in July 2020, I wrote this:

And that’s one of the main things that bother me about Big Sur’s UI. I’m not a VP of Human Interface, but I’d say that a desktop operating system you interact with using complex and precise input methods and devices, can in fact afford a certain visual complexity without getting in the user’s way. Which is what I (and I suspect many other people) have always loved about Mac OS. An operating system characterised by a user-friendly, easy-to-use, but not-dumbed-down interface. I’d hate to see a progressive oversimplification of the Mac’s UI that could potentially introduce the same discoverability issues that are still present in iPadOS.

I’ve always considered the look of an operating system to be a by-product of how it works, rather than a goal to achieve, if you know what I mean. If something is well-designed in the sense that it works well, provides little to no friction during use, and makes you work better, it’s very rare that it also ends up being something ugly or inelegant from a visual standpoint. How it works shapes how it looks. If you put the look before the how-it-works, you may end up with a gorgeous-looking interface that doesn’t work as well as it looks.

Followed by this, which rings even truer today:

The renewed insistence on transparency and the alarming amount of reduced contrast present in many places of the UI makes the experience look as if it was designed by twenty-somethings with perfect vision for twenty-somethings with perfect vision. The Accessibility preference pane looks more and more like a place that is not devoted to people with physical impairments, but to people who are not on Apple’s design team or who are not within the trendiest segment of the intended target audience.

The more time passes, and the more I see Mac OS’s interface simplify and degrade before my eyes, the more apparent it becomes that the ‘visual complexity’ Alan Dye hates so much is exactly what makes Mac OS a distinctive and very functional desktop operating system. But since it clashes with Apple’s agenda of ‘making Mac OS more like iOS’, it’s something to be reduced or effectively eliminated.

And let’s be perfectly honest while we’re on the subject: what visual complexity? Windows with well-defined chrome and distinguishable areas like a title bar, a toolbar, a sidebar, a status bar, and a content list? Always-visible scroll bars? Prominent drop shadows? A clearly delimited menu bar? Are these problematic UI elements that give the user cognitive overload? Give me a break.

On the contrary, these are all elements that help make the user interface more usable, more predictable, less ambiguous, with fewer discoverability issues. Go and have another look at the Mac OS 26 beta UI. Visual complexity has been replaced by a systematic (and system-wide) blurring of the lines of the entire UI structure, making it utterly amorphous and shallow. And all for the sake of ‘the looks’, all for the sake of ‘familiarity with iOS and iPadOS’. The designers at Apple today seem to forget that there are people out there who have to work with this interface for many, many hours a day. Regular and professional users are not the actors in Apple’s marketing videos who just look, starry-eyed, at the interface as if they’re window shopping.

Here’s the same Finder window, in Mac OS 15 Sequoia (with button shapes turned on and always-on scroll bars) and in Mac OS 26 beta with a few annotations. (Source: Jonathan Fischer on Mastodon)

  • Notice how in the Sequoia Finder window, the window is clearly divided into two main areas, the sidebar on the left, and the ‘information area with controls’ on the right. While not as good as the Finder window design in past Mac OS versions (see Leopard below) — which divided the window in more distinguishable areas more prominently — it’s still a better implementation than what is proposed in Mac OS 26, where different parts of the window are treated as thin layers floating on top of the window’s base surface creating varying degrees of separation.
  • In Sequoia, at least with button shapes enabled, the row of buttons at the top still retains the semblance of a toolbar. In Mac OS 26, those buttons become little isolated blobs, given a certain three-dimensionality by the use of drop shadows. They hover over the window’s contents… but also over the folder name? But also over the sidebar? This layer hierarchy looks random and arbitrary. And it messes with my sense of depth, because this assembly of UI elements looks like a mishmash of 2D and 3D effects, as I pointed out earlier.
  • Contrast is off, too. Notice how in Sequoia the selected Home folder in the sidebar has more prominent highlighting than in Mac OS 26. Conversely, the scroll bars in Mac OS 26’s Finder window look almost too contrasty compared with the rest of the elements. They jump out at you from this expanse of white and faint grey. In general, it looks like a mix of too little contrast and too much contrast.
  • Apparently, with Mac OS 26, we’re back to having monochrome sidebar elements. At least in Sequoia you have colours that help visually differentiate the various sections listed in the sidebar.

And here’s a similar Finder window in Mac OS X 10.5 Leopard. The difference between how things were and how they have become is staggering. It is pretty evident how the window chrome, the hierarchy of the various areas and UI elements, and the general window structure have degraded, even dissolved, over time. From a functional and usability standpoint, this isn’t progress.

In Leopard — but also in other versions of Mac OS that came before and after, at least until the Big Sur redesign — the Finder window structure and hierarchy are well defined and self-evident: the main chrome is the area above and below the window’s contents, represented by the title bar, the toolbar below it, and the status bar at the bottom of the window. The chrome clearly frames the Finder window. Then, inside, we have the sidebar on the left and the folder contents on the right; it’s more or less the same structure as a Web browser. The folder contents are the Web page, the sidebar shows a list of places (or bookmarks, if you like), and the status bar at the bottom works in a similar fashion to a browser’s status bar. It’s a clear representation of what is content versus what are controls. Content and controls don’t bleed into each other’s territory.

But in the world of Alan Dye, it’s all content inside roundrects with thin bezels, and controls hover above it in quasi-borderless states, options become little treasures hidden behind ‘More…’ icons (the circle with three dots in it), panels and windows get deconstructed like those ‘designer dishes’ you see in fancy restaurants.

You could argue that there’s nothing wrong with updating a user interface design every now and then, and I would agree with you. But updating its visuals is one thing; dismantling its solid foundations, the tried-and-trusted principles and paradigms that have always made Mac OS the better, more user-friendly operating system, is quite another.


I already talked about the fundamental problem behind the dumbification of Mac OS in Safari 15 on Mac OS, a user interface mess (July 2021):

The utter user-interface butchery happening to Safari on the Mac is once again the work of people who put iOS first. People who by now think in iOS terms. People who view the venerable Mac OS user interface as an older person whose traits must be experimented upon, plastic surgery after plastic surgery, until this person looks younger. Unfortunately the effect is more like this person ends up looking… weird.

These people look at the Mac’s UI and (that’s the impression, at least) don’t really understand it. Its foundations come from a past that almost seems inscrutable to them. Usability cues and features are all wrinkles to them. iOS and iPadOS don’t have these strange wrinkles, they muse. We must hide them. We’ll make this spectacular facelift and we’ll hide them, one by one. Mac OS will look as young (and foolish, cough) as iOS!

I have little hope this trend will stop, because things have only got worse since Big Sur.

Looks first, usability later (if ever)

Craig Federighi, talking about the new look for Mac OS:

The menu bar is now completely transparent, making your display feel larger.

My first reaction when I saw this was, of course, You know what could actually make the display feel larger here? Removing that stupid notch! But really, this is an unnecessary change that only makes the icons and information on the menu bar more difficult to read or distinguish against most wallpapers, whether static or dynamic. And it also doesn’t help with apps that may put colourful icons in the menu bar, e.g. to accentuate different app states.

I still think the best way to introduce translucency was in Snow Leopard:

 

 

First of all, having a translucent menu bar was an option, and secondly, even when the translucency wasn’t that subtle with certain wallpapers, the menu bar’s identity and space were preserved and identifiable. And — translucency or not — colourful menu extras could happily coexist with monochromatic ones. Whether transparency was enabled or not, all elements in the menu bar maintained a high degree of legibility.

(Yes, I’m aware that that Siri icon in the Mac OS 26 menu bar is in colour, but I only realised it after a double take. I rest my case.)

When the new Control Centre appears during the keynote, the legibility issue is even more acute:

No depth, no opacity, very little contrast. I bet that, with a lighter wallpaper behind it, some of the text on these labels becomes barely visible.

And if you want even more confusing visuals and an even more amorphous look, you can choose what Federighi calls ‘clear look’:

 

 

As I was watching the keynote and these examples popped up, I got mad. This is not a matter of personal taste, it’s a matter of common sense: how can those at Apple who looked at this and approved it think it can be a functional design? It looks good (enough) in an arranged screenshot meant for show, but in day-to-day use? How could you want the icons in the Dock to be all the same non-colour? Apple has already forced third-party developers to create app icons with the same identical shape (squircle); if you also take their colours away with this setting, you’ll definitely have a hard time distinguishing app icons from one another. Not to mention the text in the widgets or in Control Centre (imagine them appearing over a stack of open Finder windows).

Let’s get back to Control Centre and look at what happens when you want to customise it:

Relying on excessive visual subtlety to distinguish UI elements — especially if you rely on a background’s colour to be the sole provider of contrast — is a dangerous route. In this image taken from the keynote, the background colour is good enough for contrast, but what if you like more subdued colours or images for your wallpapers? What if there are multiple Finder and app windows behind that frosted glass panel?

When it comes to user interface and user interaction, I’m just an enthusiast who has perused the relevant literature for the past 30 years; I don’t have a degree in human-computer interaction, just some professional involvement in it in the past. But when I see stuff like this, I think that the least one can do when designing a new system-wide look is to test that look in all possible situations and evaluate whether it actually works or not. It seems such a basic step in the design process to me.

Instead, as revealed by a lot of screenshots of iOS 26 beta and Mac OS 26 beta I’ve seen on social media after the keynote, it appears that such testing has been minimal at best. Barely readable text inside icons/buttons, continuous and distracting colour changes to maintain contrast and legibility, the constant obsession with blurring the contours between fixed UI elements and content, the constant obsession with removing affordances and unambiguous elements that help users interact and manipulate objects in Mac OS’s environment and workspace. All in the name of prioritising looks over substance. It’s design at its shallowest. The ‘how it looks’ part without the ‘how it works’. It’s the affluent, elderly woman who goes to an art gallery and wants to buy that painting with a lot of beige and orange in it because it matches the colour scheme of her living-room decor.

And these issues are not new, either. It’s the iOS 7 situation all over again, with added glass effects. It’s a slightly different kind of flatness, but it’s flatness all the same. In every sense.

Other assorted remarks

1.

Generally speaking, what mostly disheartens me is the progressive crippling treatment Mac OS has been receiving for the past 6–7 years. Many directions and approaches could have been taken. For example, the transition to the vastly more powerful Apple Silicon chips could have been accompanied by a similar boost in the system software department, creating more sophisticated first-party tools and applications to take advantage of all that power. Another direction could have been to finally realise that, to have the best of both worlds (Mac OS and iOS/iPadOS), the more sensible path would be to develop each platform’s OS in distinctive ways, with user interfaces and interaction models that are appropriate for each platform and each device, considering their different uses and, well, interfaces.

Instead, Apple is clearly opting for convergence at all costs. And it’s not just from a visual standpoint, but also a functional one. Instead of enhancing each operating system’s strengths, they’re working towards the lowest common denominator. Instead of striving to make Mac OS, iOS, iPadOS excellent each in its own environment and user experience, they’re levelling them down, and we’re reaching a point where Mac OS will become virtually interchangeable with iOS/iPadOS, which is just a terrible outcome.

2.

Regarding the most obvious flaws and misguided UI choices in Liquid Glass, a common sentiment I’ve seen on social media is something like, Well, this is just the first beta. Hopefully things will improve by the time the official releases are out. While I understand this, I also implore people to stop cutting Apple so much slack in these matters. It’s not a two-year-old startup. This is one of the richest companies in the world, with resources and (supposedly) more than 40 years of experience in UI/UX design. Has nobody at Apple — at any stage of design development — noticed all the issues we’ve been noticing since the Liquid Glass reveal? And if they have noticed them and approved them anyway, shouldn’t that be worrying? Isn’t it tiring and exasperating that, after all these years, developers and end users still get to be Apple’s free beta testers, when the lion’s share of issues should be studied and resolved internally before even showing things publicly? This drives me up the wall every single time. C.M. Harrington on Mastodon rightly observes:

It’s especially egregious because, sure, this is the first dev beta. But it’s also 30 days before a public beta. Considering their cadence for releasing a new OS every year (ugh), they really can’t just pop something like this out in a half-baked state, as there are fundamental issues with the premise that need to be fixed… and won’t be before it ships ‘for real’.

3.

Another frustrating aspect of these periodical ‘resets’ and ‘redesigns’ from Apple is that, to me, this looks more and more like a strategy to sweep older bugs and issues under the rug, forcing developers — and to a less pronounced extent, users — to focus on the new look, the new features, and especially on the new requirements these ‘resets’ and ‘redesigns’ mandate. Really, it’s iOS 7 and Mac OS X 10.10 Yosemite all over again.

Or maybe even slightly worse — the Liquid Glass aesthetic is the most rushed-out-the-door and amateurish endeavour I’ve seen from Apple in forever. And forcing developers to adopt this style is, indirectly, a way to degrade their work. I’m not a developer, but I can imagine many developers asking themselves, “What if these glass effects don’t work at all with the distinctive style of my app?” What about those who don’t want to give their apps the glass treatment because it clashes with their design (they may have opted for opinionated solutions or for certain distinctive textures and effects)?

Whether they like it or not, all developers will be forced to reexamine their apps in one way or another, and that kind of overhead is all on them, while Apple smiles, rubs its hands, and thanks them for its cut of their App Store revenue.

And what about the end users? Don’t like the new Liquid Glass UI? Can’t work with it because its visuals are all over the place, it’s more difficult to parse, etc.? Well, it sucks to be you seems to be the attitude of Apple’s designers. They’re evidently all in their 20s and 30s, with perfect vision, and yet can’t see past it — ironically enough.

These kinds of opinionated redesigns have an impact. And this new UI aesthetic, in my opinion, has the same impact as no longer producing a smaller phone. Do you have small hands? Do you prefer a more pocketable device? Do you yearn for a comfortable device that can be operated one-handed? Well, it sucks to be you. Take this 6.1” slate and be on your merry way. Oh, you want a more ‘pro’ iPhone? It comes in 6.3” and 6.9” sizes. Bye now!

But everybody loves futuristic interfaces, right? Craig Grannell, in his piece at Stuff.tv, says it best:

Apple’s interfaces now look like they’re auditioning to cameo in Minority Report or The Avengers. You know the look: transparent displays with see-through elements sliding around. All very futuristic, until you actually use one.

Of course, in the movies, glass makes practical sense, because you can see the stars, even if the camera is behind the screen. The goal is to see Tom Cruise’s mug, not for Tom Cruise to read his to-do list. In the real world, however, there are – for good reason – distinctly few interfaces comprising sheets of overlapping glass.

Still, Apple went to extraordinary lengths to convince everyone (including, I suspect, itself) that Liquid Glass was the new black. There was endless talk of dynamic animations and reflected light. In one memorable moment, we were shown tvOS and told how “playback controls refract the content underneath, beautifully complementing the action without distracting from the story”. Because nothing says ‘not distracting’ like James Bond’s face reflected in a pause button.

4.

If this article ever ends up on Hacker News or some similar site, I bet there will be some smart-arse — it wouldn’t be the first time — who’s going to comment, Who is this dude, anyway? Who does he think he is? Does he know better than Apple’s designers and engineers? I don’t think so!

This dude (I’m pointing my thumbs at myself) may not be a developer or a designer by trade, but has worked with developers and designers in the past; has studied plenty of UI/UX literature for the past 30+ years; started using computers in 1981; has examined the user interfaces of the majority of operating systems out there, past and present, desktop and mobile; has conducted usability tests with sample groups on behalf of a few software studios; has been involved in the gaming industry consulting on game UI and UX (can’t name names, unfortunately, what with NDAs and the like)… Do I need to continue?

I hate to be spewing out this stuff (a friend of mine uses the expression ‘flashing the badge’, like in a display of authority) but I also hate when people don’t take me seriously. I constantly do my homework and make an effort to be as detailed and articulate in my observations as possible, in a way that should convey that I know a thing or two about what I’m talking about. I’m not always right and I don’t pretend to be. I welcome corrections and listen to people who I know know more than I do. And yet, these comments — which I have seen on Hacker News, and received privately on social media and via email — either reveal poor reading skills, lack of critical thinking, or come straight from a place of misguided fanboyism: “Apple is infallible — who are you to criticise?”

5.

Speaking of reading skills, I hope it’s clear that what I have talked about and criticised in this article is limited to my first impressions of the first iteration of the Liquid Glass UI Apple announced on 9 June. I have not discussed any particular new feature in iOS 26 or iPadOS 26 or Mac OS 26 Tahoe. Feature-wise, I did see some interesting things during the WWDC25 keynote, and maybe I’ll talk about them in a future piece. This piece was specifically about Liquid Glass, and my criticisms, for now, are mostly about that.

UI ambiguity, exhibit №917

2025-06-01 05:19:15

I maintain several vintage and obsolete Macs and iOS devices. The main reason is that I like to squeeze any possible residual usefulness out of them while enjoying their operating systems’ user interfaces, which were better designed than what we have now. But another, equally important reason is that all these devices are little UI time capsules: with Mac OS, I can go back to Mac OS X 10.3 Panther by firing up an old iBook G3; with iOS, there’s still a first-generation iPod touch in the household, running iPhone OS 3.1.3 (yes, the iOS name debuted with version 4 in June 2010). I like to routinely be able to examine the UI of older versions of Mac OS and iOS in a ‘live’ setting rather than having to do Web searches fishing for screenshots.

Now, with these older iOS devices in particular, battery life is what it is, and I don’t always remember to keep them all charged at all times. It happens with my Mac laptops as well. Whenever I revive one of these devices, if it’s still able to access iCloud and other Apple ID-related services, I get a notification on all my other Apple devices that a certain device now has access to FaceTime and iMessage.

The wording in this notification has changed for the worse in more recent versions of Mac OS and iOS/iPadOS. This is the current wording:

Warning for New device added to your account - New wording

‘A Mac’? Which Mac? If I don’t recognise this device (whose name you’re not telling me straight away), I can remove it in Settings. Yes, I can take the extra step of going to Settings > Apple ID (or Apple Account) and looking through the — long, in my case — list of devices to see if some new device with a name I don’t recognise has perhaps appeared there.

There are, of course, far worse examples of bad or ambiguous UI; what’s annoying for me in this case is that this is yet another interface regression. The older version of this notification — which I still see on iPhones running iOS 12 and on Macs running Mac OS 10.13 High Sierra and 10.14 Mojave, for example — was clearer and did not require me to take the extra step of verifying the device in Settings:

Warning for New device added to your account - Old wording

I can immediately recognise which Mac (or iOS device) it is because the notification itself is telling me its name. And yes, to be perfectly pedantic, this should generally be a non-issue because such a notification is expected after signing in on a recently-revived Mac. But the notification doesn’t appear immediately afterwards; there is always some delay, and there have been times in the past when I saw this warning pop up on my iPhone while I was out and about, and it caught me slightly unawares. Given the vagueness of the new wording, I did stop in my tracks and — since it wasn’t a good time to fiddle with my phone — I rushed to find a quiet spot to enter Settings and check my devices. The device list took a long time to finally load, and while I waited I recalled I had recharged my 11-inch 2013 MacBook Air the previous evening, so the warning was probably about that sign-in. Even so, there were moments of uneasy trepidation as I waited for the device list to display. When it finally did, nothing was out of the ordinary.

Some may argue that the fact that the new wording for such a warning ‘makes you look’ and check is a sign of better security and better UI. But I don’t agree, and the reason is that people very quickly learn to dismiss any warning that has become predictable and annoying. At least, with the old wording, I can dismiss the warning while seeing a device name I recognise right there. (Of course, if your name is John Smith, I hope you’ll call your Mac something other than John Smith’s MacBook Pro.) Dismissing a warning with a more generic wording in this case is a bit riskier, because what if someone else has actually gained access to your Apple ID and added their devices?

Attention to details, Apple. Do you remember?

Update: On Mastodon, Gregory makes a great point, something I overlooked due to the fact that I see this alert mostly on my desktop Macs, and it’s a bit less intrusive there:

The problem is deeper. It’s that this is a modal. It demands your attention right this moment. It stands in your way when you’re clearly in the middle of something else.

These kinds of in-your-face attention-diverting modals are a pet peeve of mine. And I absolutely don’t understand how Apple — the company that always prides itself on its UX prowess, and that is endlessly imitated — could be fine with this for as long as iOS has existed.

It could’ve been a notification. It could’ve been an email. It could’ve been any number of things that allow the user to deal with it on their own time, but for some unfathomable reason, Apple thinks it’s okay to rudely interrupt the user like that when they unlock their device with a clear goal in mind. Same goes for low battery alerts, by the way.

I couldn’t agree more. I’ll also leave a link to a six-year-old (but evergreen) post by Gregory, where he talks more about similarly disrespectful disruptions: Respect your users.

Tech fog

2025-05-01 04:54:52

(In dialogue form)

– You’ve been quiet lately.

– I have. It’s increasingly hard and wearing to find something to talk about when we talk about technology. And when I see something or read something that could push me to write an article here, it always feels rant‑y. In my head and sometimes in the article’s draft, if I even get to that stage.

– So you’re telling me that you’re going back to that feeling of ‘tech fatigue’ you often spoke about last year?

– Nah, it’s different now. It’s not fatigue, or lack of enthusiasm. It’s becoming flat-out disappointment. Wariness. Distrust. And even beyond that, as we’ll see later if we touch on the subject again. I still remember a time when I felt ‘empowered’ (to use a buzzword) by tech because it felt like we were on the same side, wanting the same things. Today, the tech world, tech companies and entities… it’s like dealing with banks and taxes — necessary, virtually unavoidable elements.

– I don’t know, maybe you should talk about that, about what’s wrong with tech today.

– You see, the funny thing is: on the one hand it’s a bit of a daunting task. Making a point, then doing research, then writing something that looks like a dissertation, chock-full of links and footnotes… And it’s not that I’m lazy and don’t want to do it. It’s that it increasingly feels pointless when the average interlocutor starts dismissing your arguments because he or she ‘feels/believes’ differently or ‘has heard differently’, and everything you present them with gets thrown out of the window as circumstantial evidence. And then there are like-minded people, who don’t need to examine Exhibits 1 to 45 to agree with you.

On the other hand, I often find myself shouting at the screen, Isn’t it clear enough already what’s wrong with tech today!? It’s getting more and more frustrating.

– Something I find increasingly annoying is listening to tech people (I mean tech company people). They sound utterly detached from the rest of us.

– Most of them are. Elon Musk said something like “empathy should be eliminated”, which speaks volumes about his own sociopathy. Empathy is one of the superior traits of any human being. A world without empathy is self-destructive. Don’t even get me started on Zuckerberg, another prime example of a sociopath. But these two are obvious. There are dozens, hundreds of less prominent tech bros who see the world as a sandbox in which to play their profit-driven games. The crypto bros, the ‘AI’ bros… The other day I was reading this article on TechCrunch. Headline: Perplexity CEO says its browser will track everything users do online to sell ‘hyper personalized’ ads. Small excerpts:

CEO Aravind Srinivas said this week on the TBPN podcast that one reason Perplexity is building its own browser is to collect data on everything users do outside of its own app. This so it can sell premium ads. […]

Srinivas believes that Perplexity’s browser users will be fine with such tracking because the ads should be more relevant to them.

See, what this CEO and many like him do not seem to get is that people don’t want ads. They don’t. Personalised or not, they do not want them. And people don’t want to be tracked.

– Also… advertising rarely truly works anymore.

– Right!? When was the last time you actually bought something because you saw an ad for it when you clicked on a YouTube video? I block everything on my browsers. If I’m reading something in a magazine, or even online on a site that manages to show me ads despite the ad-blocker, my eyes are basically trained to focus on what I’m reading and skip all extraneous content. In 95% of cases, when I’m watching a YouTube video and the host starts saying “And now a word from our sponsor”, or “Today’s sponsor is…”, I skip forward.

Perhaps it’s a superficial analysis, perhaps people are more gullible than I give them credit for, but I think that the more advertising you throw at people’s faces, the less effective it becomes. But these tech bros think that the reason for advertising’s ineffectiveness is that it’s not targeted, or not targeted enough. No, no, it’s because it’s too fucking much.

– Earlier you were saying something about you not feeling tech on your side anymore.

– Yes. I remember a time when tech companies designed products and sold you products as if they were like any other tool: a hammer, a screwdriver, a pen, an eraser, etc. They wanted you to have the best tools for your task. They put care in their products, especially software products. They designed (or tried to design) decent interfaces. They focused on the user. If you read about how much research and testing went into creating the operating system’s UI for the Apple Lisa and the first Macintosh, you can’t help but notice the sheer amount of care and attention. It’s true that people were essentially illiterate in the 1980s when it came to personal computing, which was a new thing then, but still, these designers and engineers went above and beyond in creating an interface and interaction paradigm that were as intuitive and friendly as possible. And when that wasn’t enough, if you pick any application software or system software manual written in that era, you’ll find the best tech writing imaginable. Especially as a Mac user, all this really made me feel as if Apple was on my side and that the company wanted what I wanted: tools that made me work better, and tools that could provide meaningful entertainment when I wasn’t working.

Tech companies today for the most part do not focus on what their users want or what’s best for their users. They want the user to focus on them. The products they offer are rarely finished, polished tools the user can acquire and use the way they see fit. Instead, the products are tokens, tokens that get the user hooked on whatever service the company provides. Tech companies today want to create a ‘relationship’ with the user, but within the confines of a lock-in… which means this ‘relationship’ becomes unbalanced and abusive pretty fucking quickly. Products become pretexts: where tools used to be ends, now products are means to an end. It’s the Gillette model: tech companies give you the razor, but the aim is to sell you interchangeable blades indefinitely.

– I also think this could be another way to explain the general decrease of software quality.

– In a way, sure. When everything is about providing users with good products, and good software tools, you put everything into them — again, especially true when talking about Apple. When the product or software tools are reduced to being a cog in a more complicated mechanism, then quality is less of a concern — the widget can simply be ‘good enough’ to keep the user within a certain software or brand ecosystem.

– The way app stores are designed has also killed the value of software as a product.

People already struggled with giving digital goods a fair value, but the onslaught of $0.99 apps in the App Store was a fatal blow to the whole value system. I could practically hear some of my past clients yelling in their offices, “See!? This stuff is really worth one dollar!” How do you sell these people software applications and utilities at $50, $35, hell even $15? The reaction to this race to the bottom, in my opinion, has turned a lot of smaller developers into little tech companies that have to adopt the same tactics as the bigger tech companies to keep selling and to keep growing. There are always notable exceptions, but with subscriptions — now everywhere — it’s the same mechanism: less focus on the product for what it is, more focus on providing the product as a service, more focus on access rather than ownership. Keep paying to keep the lights on for me, the developer, and I in turn will guarantee you your ‘fix’.

– Hashtag: not all developers.

– No, of course; the spectrum of this racket is more nuanced. There are genuinely good developers who keep writing useful and well-designed applications that are definitely worth the asking price, whether upfront or via subscription. I’d still prefer they all offered one-time payments, and with certain tools I wouldn’t mind paying, say, $50 or more. And I would certainly be a returning customer if I were satisfied with their products, gladly purchasing a paid upgrade for a major release.

At the opposite extreme there are downright frauds: honeypot apps designed to attract enough attention and make you subscribe at exorbitant prices (charged weekly) through the use of dark patterns, like presenting you with a welcome splash screen at launch where pressing the bright Continue button automatically opts you in to a trial period of, say, one day, after which you’re automatically subscribed. You should have tapped the minuscule ‘X’ icon in the upper right corner, in a colour just a shade darker than the splash screen’s background.

Between these two extremes there are all kinds of intermediate situations, with varying degrees of trustworthiness. Often you find stuff that is a bit more legit, from people who repeat the same platitudes, justifying the subscription by saying something like, “It guarantees that the app stays updated and maintained, blah blah blah”. Then you peruse the app’s description and metadata in the App Store, see that it hasn’t been updated in 16 months, and you’re like, why should I give these guys money?

– Further diluting quality and poisoning the well today we also have AI…

– ‘AI’, always in quotes. Artificial intelligence doesn’t exist.

– Okay. ‘AI’.

– A.k.a. let’s turn the entire world into a giant Mechanical Turk that vacuums up all kinds of data to create the best stuff assembled by a predictive engine. It really looks like a recycling plant that tries to work things backwards: give me enough broken displays and motherboards, and I’ll produce a perfectly working display for your desktop computer.

– Well, metaphor aside, that’s what’s supposed to happen with e‑waste recycling. The end result is producing working devices.

– Yes, well, I meant it quite literally, with the e‑waste entering this huge machine on one side, and coming out in perfect working order on the other.

– We could say it’s like a giant machine that you feed with all kinds of organic garbage in the hope it spits out edible food.

– Yes, edible food, perfectly cooked and plated by a Michelin chef.

Anyway, whenever I’m asked about ‘AI’, I always say, go subscribe to Ed Zitron’s newsletter. Ed is more knowledgeable and articulate than I am on ‘AI’, and he’s more up to date, too. I vehemently share his (negative) stance on ‘AI’ and the ‘AI’ industry, and he can certainly provide more specific examples and reasons as to why.

I’m just baffled at how a lot of regular people seem eager to accept ChatGPT and similar tools in their lives almost without question. “I’ll just ask ChatGPT about this and that”. Instead of doing Web searches, they just use the damn chatbot. They forfeit their critical thinking and critical method, and take the shortcut of least resistance (a shortcut which, I’ll reiterate, doesn’t guarantee you’ll get where you want).

It reminds me of some of my mates back in high school: why make any effort to collect information from different sources, when we can just copy and paste from the encyclopedia? And these people obsessed with ChatGPT make the same mistake my mates did — the point of the homework isn’t to copy and paste information the professor already knows. The point is that you learn to search for information, you learn to collate it, you learn to process different facts and sources to make your deductions. Presenting correct information is of course essential, but the enriching part is the journey you make from the beginning to the end of the assignment.

– Well, sometimes asking ChatGPT is done for fun. Like, they know they’ll probably get a shit answer. They just want to tease the machine.

– Sure, maybe some do this. But many already blindly rely on these tools for their queries and research. My wife not long ago told me, “You’d be horrified to know just how many students at the university [where she works] use ChatGPT”, and she’s right, I am. I wish they realised that, when it comes to trusting the answers, asking these ‘AI’ bots for information is really no different from asking the first stranger you meet in the street.

– At least if you ask someone in the street about something and they don’t know anything about it, they’ll tell you they don’t know.

– Exactly, while ‘AI’ literally fabricates: it assembles coherent sentences that may or may not contain trustworthy information. I think some people believe that these chatbots are like the supercomputer you see in so much science fiction, this all-knowing black box you interact with by asking questions in natural language and receiving accurate information with a zero failure rate. It’s science fiction, and will continue to be science fiction for a while longer.

The miracles of ‘AI’ are just like the miracles of cryptocurrency. It’s astounding that so many people understood just how much crypto is hypeware, but don’t seem to realise that ‘AI’ is essentially the same kind of hypeware. And resource-intensive hypeware at that, in times like these, when we ought not to waste this much energy. Even if there are benefits in large language models, even if there are useful applications for these LLMs, I’m still convinced that — at least for now — the costs vastly outweigh the benefits.

– I’m starting to understand why you’re talking about ‘tech fog’.

– Well, yes, it’s all this (makes an all-encompassing gesture with his hand) but I’m also thinking of tech fog as a kind of present, generalised feeling connected to brain fog, and brain rot. But I’m also thinking that, banally, this fog is what’s making me uncertain about the very tools I’m using in my day-to-day tech life. I don’t trust Apple system software updates any more, because the company is getting terribly careless about software quality, and terribly controlling when it comes to hardware. I’m still using Macs and Mac OS because a complete switch to another platform isn’t feasible yet for me. But on the hardware front today’s Macs — and tomorrow’s Macs, unless something systemic changes — are black boxes that can’t be tinkered with, not even for banal interventions on the part of an intermediate-to-expert user. I used to be a power user, and troubleshooting Macs in case of issues was something I was entirely capable of — in the PowerPC and Intel era. With Apple Silicon, troubleshooting has become unnecessarily complicated and user-hostile. Because ‘security’.

On the software front, each major update seems to bring features I’m not interested in, or that I downright do not want in my Macs or iOS devices and can’t opt out of. And each minor update fixes a couple of bugs only to introduce four more. In how many languages do I have to say this, that most users need stability and reliability in the software tools they use all the time and make a living with!? That your OS isn’t something users stare at like an artwork, contemplating the cleanness and translucency of its UI, the smoothness of its animations, and the shapes of icons and buttons!? People need to work. They need a functional, coherent, usable, accessible, friendly user interface that gets out of their way. The best UIs are the best because of these characteristics: the how it works of their design is so good that it shapes the how it looks. The other way round doesn’t work this well. Often it doesn’t work at all. But nobody seems to listen, and I mean the people who could make the necessary changes, not the many like-minded users who agree with me when I rant on social media.

So, this tech fog is sort of making me aimless, directionless. The newest Mac OS and iOS don’t work for me. I’m not going to update for the foreseeable future. Yes, I’m at this point. And it’s not just Apple. A lot of tech is like this today — either you’re okay with the changes companies arbitrarily introduce, or you have to take a ‘rebellious’ stance and do something drastic, like staying on an older version of your operating system or even of your software tools. I said it earlier, it’s like dealing with banks. You know how, every time they unilaterally change something in your contract, the fine print tells you something like, If you agree with the aforementioned changes, no action is required on your part. Otherwise you’re free to terminate the contract before the changes take effect. In other words, either you like our ways or you can change banks. All or nothing. And I wish there were more options than just caving in and adapting as best you can, or tearing everything down and switching to another platform — where things aren’t ultimately that different.

– This sounds like you’re past tech fatigue.

– I’m indeed past tech fatigue, way past. This is pure, unadulterated frustration. Frustration, uncertainty, and doubt.

I’m not sure that “What others have on the market is worse”

2025-03-20 21:57:43

Re-reading some of the quotes curated by Michael Tsai in the already-discussed Rotten commentary round-up, I noticed this bit by Om Malik, which had escaped my attention for some reason:

I have my own explanation, something my readers are familiar with, and it is the most obvious one. Just as Google is trapped in the 10-blue-link prison, which prevents it from doing something radical, Apple has its own golden handcuffs. It’s a company weighed down by its market capitalization and what stock market expects from it.

They lack the moral authority of Steve Jobs to defy the markets, streamline their product lineup, and focus the company. Instead, they do what a complex business often does: they do more. Could they have done a better job with iPadOS? Should Vision Pro receive more attention?

The answer to all those is yes. Apple has become a complex entity that can’t seem to ever have enough resources to provide the real Apple experience. What you get is “good enough.” And most of the time, I think it is enough – because what others have on the market is worse. They know how to build great hardware; it’s the software where they falter.

I agree with this almost completely, save for the part I have emphasised in bold.

From a hardware standpoint, the gap between Apple and the competition isn’t as wide as it was when the first generation of Apple Silicon CPUs debuted in 2020. Intel chips have got better, but AMD has especially upped their game. The AMD Ryzen AI MAX+ 395 (codename: Strix Halo) delivers impressive performance, as shown in this video by Dave2D reviewing the Asus Flow Z13 — which is, in turn, an impressive gaming 2‑in‑1 ’Surface-like’ device.

Design-wise, there are brands like Lenovo or Asus which, on the one hand offer home and business laptops with austere, iterative designs, but on the other hand get more creative in other lineups (Asus with their gaming laptops, Lenovo with their convertibles). Then there are brands like Framework: their laptops may not win industrial design awards, but what they’re doing on the modularity and repairability fronts is perhaps unmatched at the moment. And they, too, have recently upped their game with their new offerings.

Apple’s hardware design is still remarkable, but increasingly more on the inside than the outside of their machines. MacBooks haven’t changed much since the unibody chassis was introduced in 2008, and what seems to characterise them today is a notch on the top centre of their displays, which is among the most idiotic hardware design choices I’ve seen in more than 30 years.

Software-wise, well, I have a bit of a bias, having used Macs since 1989. I clearly know the Mac (and iOS/iPadOS) ecosystem and software selection far better than any other platform. But in recent years I’ve been familiarising myself with Linux (mainly Ubuntu and Crunchbangplusplus) and have been using Windows 10 and 11 on my ThinkPads, my Surface Pro, and my Lenovo Legion 7i gaming laptop. And overall it’s been a pleasant experience. With hiccups here and there, but again, mostly deriving from lack of habit or familiarity. I think Windows 10 and Windows 11 have been good examples of UI improvement on Microsoft’s part. And as far as reliability goes, I’ve been using my Legion 7i gaming laptop for more than a year now, and I’ve had zero issues with Windows 11. No weird crashes, no instability, no misbehaving apps, nothing.

I’ve also recently discovered the probably-still-niche world of e‑ink tablets, devices that are mostly used for note-taking, sketching, and reading. These devices may not compete with iPads in terms of versatility, but the fact that they don’t want to be jacks-of-all-trades is also their core strength, in my opinion. It means they are more focused devices that don’t suffer from an identity crisis like the iPad, and in my experience (I own one of them) they actually offer a better user experience when it comes to handwriting ‘feel’; an e‑ink display still looks more natural to me when writing and reading, especially during long sessions. Other advantages are the much reduced eye strain, of course, and the exceptional battery life.

I know that, from a financial standpoint, suggesting you actually and extensively try different platforms may be unfeasible. I’ve done it largely by acquiring second-hand devices and computers, and even by receiving generous gifts from readers of this blog who wanted to get rid of stuff without increasing e‑waste. Of course we all ultimately have our preferences, but I think it’s healthier to have preferences without prejudices. I was guilty of this myself until circa 2016. Before that, I was using Apple devices and software 95% of the time; the rest was superficial knowledge, mostly gathered through hearsay and sporadic usage of non-Apple platforms.

And yes, coincidentally, Apple also had the best UI during Jobs’s tenure, along with some striking industrial design. But it’s important to understand that the ‘good enough’ Apple offers today isn’t necessarily better than what the competition offers — everyone in tech is ‘good enough’ now, though for some this ‘good enough’ is a step up from their previous mediocrity. For Apple, it’s a step down, especially in the UI and UX departments, where they inarguably used to excel.