Six Colors provides daily coverage of Apple, other technology companies, and the intersection of technology and culture.

Does iPadOS 26 steer the iPad in the wrong direction? ↦

2025-07-26 03:57:29

My longtime colleague Harry McCracken has a new piece about the future of the iPad at Fast Company that is definitely worth your consideration:

As someone who’s used an iPad as my main computer for almost 14 years, I can’t join the chorus of unbridled enthusiasm for iPadOS 26’s embrace of Mac conventions such as floating, overlapping windows and a menu bar at the top of the screen. Apple may well be making the right decision to please the largest pool of people who want to get work done on its tablet. But it’s also moving decisively away from some of the philosophies that attracted me to the platform in the first place, and I’m trepidatious about where that might lead.

Harry is (quite rightfully, I think!) concerned about what it means for the future trajectory of the iPad, especially since the Mac already exists. I think iPadOS 26 is looking great, but asking what this means for Apple’s overall product philosophy is an absolutely fair question—and I’m not sure even Apple is entirely sure where it’s going.


The iPhone Upgrade Program is compatible with AppleCare One

2025-07-25 21:49:18

One of the questions raised by the recent announcement of AppleCare One came from Six Colors member Jono, who—when I mentioned that the only AppleCare I had was via the iPhone Upgrade Program—wondered “if there is any interaction with that at all.”

It was a good question, and honestly, one I should have thought about more myself, being an iPhone Upgrade Program subscriber.

The answer, perhaps surprisingly, is yes! The two programs are compatible, though combining them requires a bit of extra work.

Apple spokesperson Anna Mitchell told me that iPhone Upgrade Program subscribers can contact Apple Support and unbundle their coverage from the iPhone financing, then upgrade to AppleCare One. Keep in mind that AppleCare One is only available in the U.S. at present, and requires a U.S.-based Apple Account in good standing.

Of course, my personal experience with the iPhone Upgrade Program—while it has gotten better every year that I’ve been a subscriber—is that it sometimes struggles with unusual cases, so if you do go this route, be prepared for some potential extra challenges if you choose to upgrade your phone this fall.

Samsung’s antioxidant sensor was fooled by a cheese cracker ↦

2025-07-25 21:20:24

The Verge’s Victoria Song dives into the “antioxidant sensor” on Samsung’s Galaxy Watch 8 with some surprising (or not) results:

I colored my thumb with a yellow-orange marker. Wouldn’t you know it? My Antioxidant Index shot up to 100. Next, I colored it with a blue marker. My score dropped to zero. Unfortunately, my color-based hypothesis was foiled by a piece of roasted broccoli. It, too, scored 100 and is, in fact, rich in carotenoids.

Perhaps the blackberry had failed because, when pressed against the sensor, it exploded in a mess of purple juice that was subsequently difficult to clean from the watch. Perhaps I was deficient in my antioxidant consumption. Or so I thought, until the Cheez-It.

This piece is a great example of one of the issues facing the health wearable field—and it’s hardly unique to Samsung: the sensor arms race.

The truth is there are health metrics that would be genuinely interesting and helpful to people—things like blood pressure and glucose levels—that are simply very hard to implement (perhaps even impossible by current technological standards) in a smartwatch.

But the market keeps moving forward, and that never-ending arms race encourages companies to keep adding new sensors and features of questionable usefulness. Even Apple’s been plagued by this in the past, which is one reason features like the Apple Watch’s temperature sensor are described with very careful language: “The temperature sensing feature is not a medical device and is not intended for use in medical diagnosis, treatment, or for any other medical purpose.”1

Is Samsung’s antioxidant sensor, which seems to be basing more of its readings on the color of your skin than any actual scientific data, the pinnacle of this movement? Personally, I doubt we’ve reached peak ridiculousness quite yet, but I think it’s coming.

Song’s overall point here is well taken—that you should take most of these sensors and metrics with a grain of salt2:

Even if a bunch of science went into developing detection algorithms using high-tech sensors, there’s always going to be errors and room for misinterpretation. This seems obvious, but it’s easy to get sucked into the quantified rat race toward perfection. If tracking a specific metric makes you feel worse about yourself, you’re allowed to take a break from it — or even decide it’s not worth paying attention to. None of this is meant to be taken that seriously.

Likewise, a good reminder that you always need to consider that these sensors aren’t altruistic productions; they’re features on a product that a company wants to sell you.


  1. I also love how the feature is described, with a straight face, as the perfectly natural “nightly wrist temperature.” Who among us has not been concerned about the temperature of their wrists? This is a case of doing exactly what it says on the tin. 
  2. Though not too much salt, because you don’t want to raise your blood pressure. 


First impressions: the iOS 26 public beta

2025-07-25 01:20:00

Three iPhone screenshots showing different home screens and a lock screen. The first screenshot displays the time, date, and weather. The second screenshot shows the home screen with app icons. The third screenshot features a lock screen with a superhero image.

iOS 26! It feels like just last year we were here discussing iOS 18. How time flies.

After a year that saw the debut of Apple Intelligence and the subsequent controversy over the features that it didn’t manage to ship, Apple seems to have taken a different tack with iOS 26. In addition to the expansive new Liquid Glass design that spans all of its platforms, Apple has largely focused on smaller, “quality of life” improvements rather than marquee new features. That’s not a bad thing, either—these are often the types of things that Apple does best, and which actually make a meaningful impact on the lives of their customers: saving them time waiting on hold on the phone, helping them avoid dealing with spam, and improving their driving features.

It’s also worth noting that, with very few exceptions, all of the iOS 26 features that Apple demoed during its WWDC keynote this year are available, right now, in the public beta. The exceptions include the digital ID feature in Wallet that uses info from your passport and the age rating/content restriction updates in the App Store. That’s it. Everything else has been there since the earliest beta builds.

I’ve spent the last few weeks running those initial developer betas of iOS 26 so you don’t have to. As the public beta arrives, you may be tempted to dive in, so allow me to run down the biggest changes to your phone. And, as per our usual reminder, this is the beta period, so everything is still subject to change and the final version, when it arrives this fall, might look or work differently from the way it does today.

With that disclaimer out of the way, let’s take a look at what might convince you to take the plunge.

Liquid Glass half-full

Apple’s new design language, dubbed Liquid Glass, applies across all of its platforms, but unsurprisingly, it feels most at home on the iPhone and iPad. That’s in part because of the touch interface; the literal hands-on nature makes the elements feel responsive, more like physical objects that you’re interacting with. For example, dragging the new magnifying loupe across the screen, watching the way it magnifies and distorts text and images as it passes over them—this interaction has always been unique to iOS for practical reasons, but the way it feels here doesn’t have a direct analogue on other platforms.

Perhaps the truest “liquid glass” interaction: the loupe deforms like a water droplet as it moves back and forth.

Controls now overlay content rather than sitting in designated toolbars or areas of the screen reserved for those controls, and are rendered in transparent glass that refracts and distorts the colors of whatever passes behind it. That’s impressive but also, at times, distracting: sometimes you see a distortion of text from what you’re reading within the UI, which is odd. Or, when scrolling past content that goes abruptly from light to dark, the buttons might similarly flip appearance from, say, black icons to white icons in a way that can feel jarring.

App icons are built in transparent layers; interestingly, if developers adopted the design changes to better support iOS 18’s “tinted” theme, their icons already get some of the benefits baked in. That tinted theme has been expanded with both light and dark options and there’s also a new clear theme that turns all your app icons ghostly, which is a great way of testing your muscle memory for where you put your apps. I’m not sure it’s for me—everything looks a bit too same-y—but it is definitely a look, and I’m equally certain there will be folks who love it.

The Liquid Glass design has undergone perhaps the most substantial tweaks during the beta period to date. But those changes have been kind of all over the place. At times, Apple has seemed to dial back the transparency in an effort to help the legibility of UI controls, but in the most recent build, the company seems to have ramped the transparency back up, once again at the expense of readability. The challenge of design is finding things that both look and work great, and the company seems to be continuing to wrestle with that balance during the beta period.

Screenshot showing the popover menus in iOS 26. The left hand shows a small popover with Copy, Apple Intelligence, and Find Selection, followed by a small arrow. The right hand shows that menu expanded into a full popover menu, including more options like Translate, Search Web, Speak, and more.
iOS 26’s new popover menu makes it easier to find the option you want, rather than having to endlessly scroll.

The redesign is more than skin deep, however. Apple has rethought the way some of its most fundamental interactions work. For example, the increasingly long horizontal popover menus that hid options behind an interminable scroll have morphed into a dual-stage design. Tapping and holding on the screen brings up a popover with a few common options, but it no longer makes you scroll; instead, there’s an arrow indicating more options. Tap that, and you’ll get a big pop-up panel of all the available commands in a format that’s much easier to read and use. As someone who frequently finds himself swiping through a very long list to find the one command I want (and somehow, it’s always the last one), this is a tangible improvement.

It would be nice if that first menu were more customizable, though. For example, I can imagine someone who’d want Translate or Speak higher up. And although the menu varies from app to app, some of the organizational choices are puzzling. In the Safari screenshot above, I’m not sure why Writing Tools is visible. After all, I’m looking at uneditable text on a web page. Am I rewriting the web now? This feels less like a feature focused on user needs and more like a reflexive promotion for Apple’s AI tools.

Other system-level features have been expanded as well. For example, while you used to be able to swipe from the left side of the screen to go back or up a level in a hierarchy, that gesture now works anywhere on the display, making it both more discoverable and easier to use.

You paid for the whole lock screen but you’ll only need the edge!

As with any change this sweeping, it’s always going to take some time to adjust. There are some who will decry it as change for change’s sake, but as undesirable as that might be, the countervailing argument is that you shouldn’t keep things the same just because it’s the way you’ve always done them. My experience with Liquid Glass has had its ups and downs, ranging from interactions that feel interesting and dynamic to others that are downright frustrating.

Lock step

After the last few years of Lock Screen customization options, this year’s additions are more muted, and mostly in step with updating the look for Liquid Glass.

The biggest addition—literally—is the new clock, which can expand to fit the empty space on your screen. If you have a rotating set of lock screen photos, it will dynamically adjust for each one, trying not to obscure people’s faces; while you can manually adjust the size of the clock in the lock screen customization screen, it seems as though it still alters dynamically, so I’m not entirely sure of the point of that exercise.

Two iPhone lock screens. The left screen displays a large clock reading 2:34 PM with a beach background. The right screen shows a small clock reading 2:35 PM with a cathedral background.
The clock on the lock screen now dynamically adjusts from very very large to what used to be normal.

I’m also happy to say that one of my favorite features of last year—the rainbow-hued clock available in some lock screen styles, like Live Photo—still exists—you just have to change the clock style to solid rather than glass in order to see it. There’s also an option to move the widgets that used to sit below the clock all the way to the bottom of the screen, right above the notification stack, so as to not block the subject of your photo. I kind of prefer this location—I find it easier to tap a widget and open the app if I want, and I find the data from them don’t get as lost. (I was, however, able to overlay the widgets on the clock, which feels like a bug.)

Your Lock Screen photos themselves can also be more dynamic now, with the addition of spatial scenes. That’s a feature imported from the Apple Vision Pro, where iOS will apply a three-dimensional effect to an image, animating depth as you move the phone around. How effective that is varies from photo to photo, though it feels less compelling here than viewing true spatial scenes on a Vision Pro; the animation of the spatial versions is sometimes a little jerky, and some people with motion sensitivity might find them off-putting. Apple’s attempting to identify what makes a “good” spatial scene, and whatever system is making that determination can be hit or miss.

Speaking of images that move, the lock screen also now has an animated artwork option for music apps—note that I said “apps” not “the Music app” since it’s an API available to developers of third-party apps. But it will need adoption from the producers of albums in order to take full effect. When it shows up, it takes over the entire lock screen rather than being constrained within a little album square. It’s an interesting approach, although one that you may not notice depending on how often you actually visit the lock screen while music is playing. So, while it’s a cool idea, I’m not sure it does much for me. Maybe it’s time to commission some animated artwork for the Six Colors podcast?

Point and shoot

I’d venture a guess that Camera is the most used app on the iPhone, though I’ve got no real numbers to back that up. But given the amount of time Apple has spent upgrading the camera hardware on the iPhone over the years, I feel pretty confident in my assessment.

As a result, redesigning the Camera app—hot on the heels of last year’s redesign of Photos—is a bold choice. But it’s not surprising that the company’s alterations here are focused on the minimal, reinforcing the way that most people already use the app. (And if anybody’s got the metrics to know how people use its apps, it’s obviously Apple.)

For example, controls for more advanced features like aperture, exposure, and photographic styles are now buried in a separate menu, available by tapping a button at the top of the screen or by swiping up from the bottom. Given that I’ve definitely ended up in these controls by accident over the years—and I suspect I’m not alone—that’s not a bad thing.

Screenshot of a smartphone camera interface with a photo of a framed picture of a computer. The interface shows options for flash, live, timer, exposure, styles, aspect ratio, and night mode.
The simplified Camera interface makes it easier to point, shoot, and get out.

Likewise, what used to be an at times overwhelming carousel of modes—panorama, portrait, photo, video, slo-mo, time lapse, etc.—has now been visually reduced, by default, to just the two most popular: photo and video. The others are still there if you scroll left or right, but you’re less likely to accidentally find yourself shooting spatial video at a lower resolution when you don’t mean to. Similarly, those resolution and format choices are also now squirreled away behind a separate button, there if you need them without being omnipresent.

The redesign reflects the fact that most people want to get in, snap a picture or shoot a video, and get out. Not to mention that Apple has spent a lot of time designing its phones so that they take great photos without having to tweak those details. Those advanced features are still there—and, arguably, more accessible using something like the Camera Control button on the latest iPhones—and for those who long for more than Apple’s Camera app offers, there are an assortment of popular and powerful third-party camera apps to fill in the gaps.

The counterargument, of course, is that by hiding those features away, they are less discoverable. This is the eternal battle in interfaces, especially in someplace as constrained as an iPhone screen. In other places, Apple has done its part to pop up hints about features you might not see at first glance, including here in the Camera app. Personally, I think this redesign walks a solid line—the new interface is not so different from the old that I had any trouble with it, and I appreciate that there are fewer distractions.

There are also a couple of AirPods-related features in Camera: first, if you’ve got the latest models with H2 chips, the microphones should be improved. Apple touts them as “studio quality”, a meaningless qualifier that could mean anything from “suitable for a recording studio” to “you can use these in your studio apartment,” but at least it doesn’t sound like you’re in a wind tunnel anymore. In one of my test calls, my wife was genuinely impressed when I asked, at the end, how I’d sounded. “I wouldn’t have known you were on your AirPods if you hadn’t told me.”

And you can now use the AirPods’ stem controls to take a picture or start recording a video: handy for people using a selfie stick or tripod, or even just a quick way to snap a group photo (as long as you don’t mind having an AirPod in your ear in said photo). Bear in mind, this is a feature you’ll have to turn on in Settings under your AirPods, though it does let you choose between a simple press or a press-and-hold.

Calling cards

An update to the Phone app? Are we sure iOS 26 doesn’t stand for 1926? People knock the Phone app, but, well, I still make phone calls.1 In addition to a couple of handy features, there are also some substantial design changes afoot.

Screenshot of iPhone call settings menu with options for Classic and Unified call styles, and tabs for Calls, Missed, Voicemail, Unknown Callers, and Spam. Manage Filtering option is also visible.
The new filtering menu in Phone works hand in hand with call screening and spam filtering features.

A redesign strikes again! The new Unified view pins your favorites to the top, then shows you your recent calls, missed calls, and voicemails all in a single very long list on the Calls tab, with separate tabs for Contacts and the Keypad. Some might not care for this approach, but I find it kind of a no-brainer. It did encourage me to pare my Favorites list down a bit to the one line of people I actually call as well as finally update their contact pictures to the more recent poster format. I don’t mind having voicemails mixed in; I don’t get very many. But if you hate this new interface, don’t worry: Apple will let you switch back.

Unquestionably good is the new set of Filtering features available in the menu at the top right. By default, this includes options to view just Missed calls or Voicemails, but there’s also now, praise the heavens, a Spam section for calls that are recognized as such. Apple’s using a combination of carrier tagging (those calls that you’ve seen flagged as “Spam Risk”) and its own analysis. You can manually mark a call as spam by swiping left on it in your recents list and choosing Block and Report Spam.

Three buttons with icons and text: 'Block and Report Spam' (orange), 'Reminder' (blue), and 'Delete' (red).

The real challenge, as always, is the calls that fall in between your contacts and out-and-out spam. For this there’s the new Screen Unknown Callers feature. You might remember that Apple previously added a Silence Unknown Callers feature in iOS 13 that would mute calls from numbers that weren’t recognized—with the challenge that if you got a call from a doctor’s office, tradesperson, or even food delivery, you might not see it. That was followed by Live Voicemail in iOS 17, which helped mitigate the issue, but Screen Unknown Callers goes a step further: when activated, which you can do in the Phone app or in Settings > Phone, callers not in your contacts will be asked to provide more information before the call rings through. You can also choose to leave unknown calls totally silenced, or turn screening off entirely to have all calls ring your phone.

There’s a separate but connected feature in iOS 26 called Call Filtering. Once you turn this on, you’ll see an Unknown Callers category in the filter list in Phone, not dissimilar from the Messages filters that have existed for a few versions. From there, you can choose to mark the numbers as known, at which point they will ring through—without having to be added to your contact list, which is nice. However, I’m not sure how you move a number back to “unknown” if you accidentally mark it as known—you can delete it from the list or block it, but I’m not sure what to do if you want to simply move it back to the “Unknown Callers” section. You can also choose to have calls detected as spam by your carrier simply not ring at all, which seems like a real no-brainer.

Overall, I’ve got mixed feelings about the Screen Unknown Callers feature. On the one hand, it will undeniably help weed out potential spam calls. On the other, some part of my upbringing feels embarrassed about the idea that someone—especially a likely underpaid person in a service industry—is going to have to justify their call to a robot. I’ve gotten calls from AI assistants from my dentist’s office recently, and frankly…I just hang up. I’m not going to spend my time chatting with a computer, and I don’t blame anybody else for feeling the same. That said, I have turned it on, though I haven’t actually seen it in use yet.

A smartphone screen shows a notification for a call from (248) 434-5508. The notification states, 'You'll be notified to pick up the call.'

Along similar lines, Apple’s also added a feature called Hold Assist that automates the oft-annoying task of waiting on hold. I did get a chance to try this out, and it worked fine except for one caveat. The idea behind the feature is that when you’re put on hold with some cheesy hold music or deafening silence, you can trigger this feature and be notified when somebody comes back on the line.

In my experience, however, one problem I encountered was that it registered the occasional recorded message while I was on hold with the Massachusetts Department of Revenue—“Your call is important to us!” or “Did you know you can go to our website?”—as a human coming back, and notified me, leaving me to scramble for the phone only to find that I wasn’t talking to a live person after all. My understanding is the feature should be able to distinguish between a regular recorded message and a human, but that was not my experience in one of the earlier betas—I haven’t yet had a chance to put the feature through its paces in the more recent builds.

Just browsing

Three screenshots of a podcast app with a menu and a search bar. The first screenshot shows the app's logo and a list of podcast episodes. The second screenshot shows a menu with options like 'Share,' 'Copy,' 'Voice Search,' 'Switch Tab Groups,' 'Move to Tab Group,' 'Close All 30 Tabs,' and 'Close Tab.' The third screenshot shows a podcast episode with a play button, a progress bar, and options like 'Share,' 'Add Bookmark to...,' 'New Tab,' and 'New Private Tab.'
Safari’s reduced interface hides its commands in a plethora of pop-up menus, which leads to some oddities like two Share buttons.

While Safari may not have gotten quite the expansive overhaul of some of Apple’s other built-in apps this year, it’s still worth mentioning, if only because, like Camera, it remains one of the most used apps on iOS.

Apple’s taken a variety of stabs at UI minimalism in Safari over the years, both on iOS and macOS. Often those first, more substantial changes, get dialed back. In iOS 26, these changes aren’t quite as radical, but they’re more than just a coat of Liquid Glass. Gone is the longstanding toolbar with its back/forward arrows, Share icon, bookmarks, and tab menus beneath a location bar. In its place, by default, is a more reduced UI with a back button2, location bar, and now seemingly ubiquitous “More” button denoted with three dots.

You’ll find many of the previous controls under that More button, including both bookmark and tab management, as well as Share. But some controls are still accessed by tapping on the icon in the location bar—including Reader mode, if available, translation, extension management, and so on—and others are instead squirreled away under a long press on the location bar, including closing tabs, tab groups, and…another Share button. The button so nice they included it twice!

As with the Phone app you can, if you so wish, revert to classic Safari—either with the location bar at the top or bottom. In a few weeks of usage, I’ve elected to stay with Apple’s new design, though I still struggle to remember whether the control I want is accessed via the location bar or the More button. Or…both? At least some common gestures, like swiping left and right on the location bar to switch tabs or flicking upwards on the URL to see your tab overview, have remained.

I never really felt like the old toolbar style was getting in the way of my content, so I’m not sure if this change is anything but an attempt to mix things up. I’ve largely gotten used to the look, though at times the effects of a non-uniform website background on Liquid Glass can lead to disparate effects like one pop-up menu being a light color while another is dark.

Beyond the design changes, most of Safari’s other updates are under the hood. Developers of web extensions don’t need to use Xcode or even a Mac anymore; they can just upload their extensions to the App Store in a ZIP file. Hopefully that’s another step closer to being able to bring some of the myriad of extensions out there to Safari. And any web page can be opened as a full web app from the home screen now, rather than just essentially being a bookmark.
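For context, Safari web extensions use the same WebExtensions manifest format as other browsers, so the ZIP in question is essentially a folder of web assets. Here’s a rough sketch of a minimal `manifest.json` (the extension name and script file are hypothetical):

```json
{
  "manifest_version": 3,
  "name": "Example Extension",
  "version": "1.0",
  "description": "A hypothetical content-script extension.",
  "content_scripts": [
    {
      "matches": ["*://*.example.com/*"],
      "js": ["content.js"]
    }
  ]
}
```

Zip that up alongside `content.js` and any icons, and under the new workflow that archive is what gets uploaded—no Xcode project, and no Mac, required.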

Let’s get visual…visual

Apple Intelligence may have been the big news in iOS 18, but this year its new features are somewhat more muted. While those capabilities that didn’t end up shipping in 2025—Personal Context and a smarter Siri among them—are still expected to arrive in the future, Apple has with this release focused on some smaller capabilities, like integrating ChatGPT into Image Playground, the ability to combine two emoji in Genmoji, and summaries of voicemails. It’s also brought back summaries for notifications with more granular controls over what kinds of apps you want it to apply to—plus more stringent warnings about the fact that said notifications may be inaccurate, which certainly raises questions about whether they are useful.

Perhaps the most significant of these Apple Intelligence-powered features in iOS 26, though, is an expansion of the Visual Intelligence feature launched last year. Instead of being confined to pictures taken with the camera, Visual Intelligence in iOS 26 now offers the same capabilities with screenshots. In fact, the feature is built right into the existing screenshot interface, so now whenever you squeeze the sleep/wake button and volume up button to take a picture of what’s on your screen, you’ll see two new controls at the bottom: Ask and Image Search.

Two iPhone screenshots show a webpage from 512 Pixels. The first screenshot displays the article title 'Beta 3 Brings New Wallpaper: 'Tahoe Day'' with a preview image of a landscape. The second screenshot shows a ChatGPT response explaining that the blog is written by Stephen Hackett, covering Apple history, technology, and software updates. Both screenshots include the website's URL and options to share or download the content.
Story checks out.

The former lets you ask ChatGPT questions about the image, while the latter brings up Google results. You can even highlight a portion of the image by running your finger over it if you only want to search on a subset of the picture. It’s a shot over the bow of Google’s longstanding Lens feature, with a dash of AI thrown in. I’ve barely used Visual Intelligence on my iPhone 16 Pro since its debut; I’m not sure if screenshot integration is enough to get me to change my ways, but it does open up some new possibilities for extracting information from your screen, in the same way that Live Text has done.

Speaking of Live Text, in case answers from two different tech giants aren’t enough, Apple is also using a bit of that same machine learning technology to pull out relevant details from the image, whether it be a URL or a calendar event, and present them in a little floating lozenge at the bottom of the screen. That can be handy, though it’s also at the whims of whatever information is captured in the screenshot.

It is a little odd that Visual Intelligence is offered in two different places with two different interfaces, but given that there is a distinction between screenshots and taking photos, perhaps it’s not as jarring as it seems at first blush.

Bits and bobs

As with any major platform update, there’s simply too much here to cover absolutely everything. Here, then, are a few other features that I’ve noticed in my time with the iOS 26 beta.

The Battery section of Settings has been redone, providing a quick glance at when you last charged or, if your phone is plugged in, how long until it’s charged. The main graph now compares your battery usage to your average daily usage—including in the app-by-app breakdown—rather than providing the somewhat less useful hour-over-hour view. There’s also a new Adaptive Power mode that supposedly helps prolong battery life if you’re using more than usual by toning down some things like display brightness.

Two screenshots of a smartphone's battery usage screen. The first screenshot shows the battery level at 53% with a last charge time of 11:22 AM and a daily usage of 52%. The second screenshot shows the battery level at 52% while charging, with an estimated time remaining of 47 minutes and 1 hour 49 minutes. Both screenshots include a bar graph comparing daily usage to the average.

As on the iPad, you can record your audio locally on the iPhone with the new Local Capture feature, whether it’s via the built-in mic, AirPods, or a USB-C microphone (not to mention a new audio input picker that lets you choose which mic you want to use). While it still needs controls for audio input volume—some mics, including my ATR-2100x, which I would be most likely to use with this feature, are distorted because they’re simply too loud—this does make it feasible to record a podcast on your iPhone. I honestly never thought I’d see the day, but it’s here.

Notes may not support writing in native Markdown, but it does now let you export a note into the format. That includes any images that you’ve embedded in the note, which is handy. Despite being a Markdown fan, I’m not sure I’m likely to use this feature…I like Markdown because I want to write in it for the web, not have to take the extra step to export. But it’s nice that there’s at least an easy and accessible way to get your data out of the Notes app.

Screenshot of a password manager app showing a history of password creation. The app is named 'LAcon V' and has a 'Clear' button in the top right corner. The history section lists a password created 9 hours ago for 'lacons.org' with a note that it may not have been submitted.

The Passwords app adds a history of changes you’ve made to passwords (only, of course, for changes since installing iOS 26). That’s a nice feature because I have definitely ended up not realizing I’ve already got a password and then reset it. In fact, in one of my favorite moves, it will even tell you when it created a password for a site, even if that password may not have actually gotten submitted—something that’s happened to me more than a few times.

Remember how even iTunes had the ability to crossfade between songs maybe twenty years ago? Well, Music‘s new AutoMix feature takes that to eleven by trying to actually find the perfect DJ-style moment to segue into the next song. In my experience it does work, but it is definitely kind of trippy.3 You can also pin favorites to the top of your library, whether it’s a song, album, playlist, or artist.

Can’t remember the name of that café you stopped at on your most recent road trip? If you opt into the new Visited Places feature in Maps, you can search or scroll through places you visited—even by category or city. All the information is stored privately and securely, so nobody else can see it, not even Apple, and you can easily remove items from the list (or correct erroneous ones). It’s also a great way to retroactively build a guide of places for a trip you’ve taken. There’s also a Preferred Routes feature that supposedly learns about how you make regular trips, but as someone who works from home, I don’t expect to get too much use out of this.

I don’t generally use alarms, so an adjustable snooze time in the Clock app doesn’t really do much for me, but I know some people will be excited, so here you go. However, this does come with one interface criticism about the alarm screen, which now has equally sized Stop and Snooze buttons, leading to the possibility of sleep-addled people hitting the wrong button. Here’s hoping a future beta considers that and maybe makes some tweaks.

Screenshot of a notification on a mobile device. The notification reads 'Found in iCloud Inbox' and 'Siri Found an Order' with a 'Show' button. Below it is another notification from 'Gap' addressed to 'Dan Moren' with the day 'Wednesday.'

I do, however, order lots of stuff on the internet so I’m fascinated to see how Wallet‘s new Apple Intelligence-powered tracking of orders works. There have long been popular third-party apps that handle bringing all your orders and shipments into one spot, but if this really can do it all automatically, that’s worth the price of admission for me. You can also see this pop up in Mail, where a banner will tell you that it’s detected a shipment and prompt you to add it to Wallet.

Hallelujah, you can now select the text inside a bubble in Messages. I know it’s not the flashiest improvement, but it’s always seemed absurd that this was an all-or-nothing proposition. I mean, you can copy just some text out of an image these days, for heaven’s sake. A small but very meaningful improvement.

Last, but hardly least, CarPlay gets a handful of new features, including the new Liquid Glass design, a smaller call notification, tapbacks in Messages, and widgets. I really want to like widgets, but two things hold me back: first, my CarPlay screen is very small and can only show one widget at a time, and second, I’ve struggled to figure out which widgets are actually useful while I’m driving. Most of the time I really do just want my map and whatever media is playing. Maybe on a bigger screen they’d be more compelling. I’m a little worried that tapbacks will encourage people to interact with the screens too much, but at least it’s a quick action and not, say, typing out a reply.

Even in what seems like a modest update, there’s way more in iOS 26 than I can go through here. And as the beta period progresses, it will be interesting to see how the thinking on major elements like the design continues to evolve and change. Apple’s already shown that it’s receptive to feedback, and while not every complaint or criticism that people have is likely to result in a change, you never know. There’s plenty to dig into over the next few months before this fall’s release, so if you want to give it a shot, the public beta is out there and waiting for you.


  1. Largely to my mom, my wife, and my best friend from college who doesn’t really do texts. Hi Brian. 
  2. A forward button does appear next to it if you go back, however. 
  3. This morning, I had The Fratellis’ “Flathead” segue kind of seamlessly into “Down Under” by Men at Work, and I’m trying to imagine the DJ who would give that a go. 

First impressions: the iPadOS 26 public beta

2025-07-25 01:20:00

Screenshot of a tablet displaying a webpage, a note-taking app, and a music player.

iPadOS 26, now available as a public beta, is one of the biggest updates in iPad history. There’s a new design that changes the look and feel of the whole interface, yes, but also the introduction of a whole raft of productivity features that lift the iPad closer to the Mac—for those who want to use it that way.

It’s like a weight has been lifted from the soul of the iPad. It remains a very nice device to use in full-screen mode with all the simplicity attendant to that mode, or via a single tap it can turn into a multi-window, multitasking device that’s appropriate for the Mac-class hardware underpinning today’s iPads. The iPad no longer feels like it’s trying to live up to the promise of being the Future of Computing; with iPadOS 26, it’s more comfortable being itself.

An invasion of glass

A music app screenshot showing a library with playlists like '1980s,' 'Workbook,' 'So,' 'War,' and 'Deep Cuts.' The top menu includes options like 'Home,' 'New,' 'Radio,' and 'Library.'
Album art can slide under the frosted-glass sidebar in the Music app.

Apple is introducing a new design language across all its operating systems for the ’26 “model year,” and that means iPadOS is getting more rounded elements, a bunch of “Liquid Glass” elements that are translucent with glass distortion effects, and more.

In the time since it introduced these operating systems, Apple has kept tweaking the design, so it’s hard to judge it completely. Apple’s trying all sorts of different techniques to keep the cool look of transparent glass elements without sacrificing legibility, with varying degrees of success. Backgrounds dim or are faded in or out to enhance contrast; text colors sometimes invert; color schemes alter based on what you’re scrolling and how fast you’re scrolling it. And in the four developer betas that preceded the public beta, the sands seemed to be shifting with every single release.

On the iPad, it’s a real jumble. Some stuff looks cool, while other stuff is unreadable. For the most part, the new design didn’t hinder my use of iPadOS 26, and given those shifting sands I’m going to withhold my most withering design criticisms for a later time. But, yeah… Apple either needs to figure this thing out, and fast, or it should just frost all the glass for release and keep working on it in the background until it finds a more usable solution.

Embracing windows (and the Menu Bar)

stoplight animation
With window power comes window management, like the familiar “stoplight” buttons. You have to tap/click to enlarge them before using them, though.

iPadOS 26 will be remembered as the update where Apple declared bankruptcy on all its previous attempts to do windowing and multitasking on the iPad, and released an entirely new windowing system that has been unabashedly inspired by the Mac.

In earlier eras, Apple reluctantly accepted multitasking by introducing Split View and Slide Over, and then later Stage Manager, which created a windowing system that was not Mac-like at all. Windows couldn’t be resized freely, or placed freely, or overlap other windows in the wrong way.

Apple is over it. Go ahead, put those windows wherever you want (even hanging off the side of the screen), resize them to any size, put other windows on top, and even control them using the three familiar stoplight buttons in the top left corner. It works more or less the same as the Mac, and it works on all iPads that can run iPadOS 26, even the iPad mini. It also works on external displays, and I admit to forgetting more than once that I was using iPadOS when it was attached to my Studio Display.

There’s a limit to the total number of active windows you can have open at once, but I certainly never felt constrained in my use on a 13-inch iPad Pro, or even on that 27-inch Studio Display. If you’re an inveterate window-keeper-opener you might feel differently, but it all seemed perfectly normal to me. Apps that have been built for prior forms of iPad windowing work with the new system well.

There are plenty of ways that iPadOS stands ready to assist you with window management, too. Stage Manager is no longer a windowing system, but just an optional window-collection utility like it is on the Mac. You can click and hold on the green stoplight button (yep, those familiar Mac elements are now part of iPadOS) in any window to be offered a set of tiling options, just as on the Mac. And a full suite of keyboard shortcuts will move windows around, too, from Globe-F to toggle full screen to more esoteric ones like Globe-Control-Right Arrow to tile your window on the right half of the screen.

If you’re not using a keyboard, you can also flick windows with your finger to manage them. Flicking to the top makes them go full screen, and to the sides will tile them on half the screen. Windows remember their previous position and size, so if you drag a window back out from a tiled position, it goes back to its previous state. Double tapping on the top of the window also will toggle between full screen and a floating window.

A tablet screen displays multiple windows. The main window shows a webpage with text and an image of a desk setup. Other windows include a music player, a list of articles, and a settings panel. The dock at the bottom shows various app icons.
Exposé helps you keep track of many iPad windows.

And there’s more. Swipe up a little from the bottom (either on your trackpad or with your finger) and Exposé kicks in, showing all your current open windows so you can pick the right one to bring forward. Swipe a little further up and you get access to your home screen, including widgets. (You can also click on the wallpaper to do this.) The windows all slide to the sides of your screen, so you can bring them back with one tap, or swipe up again to hide them entirely. You can also tap on an app in the Dock to bring up all its open windows.

It’s really a flexible set of controls that works well whether you’re using a keyboard and trackpad or your fingers. Not only does this all work well, but it will be instantly familiar to Mac users. After a decade of Apple resisting the Mac as a model for iPad multitasking, it’s finally given in to the obvious: the Mac is great at this, and if Apple can’t come up with something better for the iPad, then it should implement the very best windowing interface in the world. The iPad isn’t becoming the Mac, but it’s built a windowing system that works really well, and that’s thanks to the Mac.

And if you don’t want to use windowing on your iPad? Well, the feature is turned on and off with a single button in Control Center. Just as I use my iPad with the Magic Keyboard only a small portion of the time, I use my iPad in multi-window mode a very similar portion of the time. The rest of the time, it’s in single-window mode and works just fine.

I’ve heard from some fans of Split View and Slide Over, two original iPad multitasking features that have been killed in iPadOS 26. While I understand their frustration, it’s quite easy to tile two windows in iPadOS 26, at which point you’ve basically got Split View. Having an app hiding on the side ready to slide in is an interesting use case, but it was always too easy to stick an app there and not be able to get it out, so I suspect Apple is making the iPad less susceptible to frustration by killing the feature entirely. Multitasking: you’re either in or you’re out. No middle ground.

A screenshot of a notes application showing a list of notes and folders. The active note contains text about books and movies, with options to create new notes, folders, and manage shared folders. The interface includes a toolbar with various icons.
Exploring the many menu items in Notes.

Speaking of the Mac’s influence on the iPad, it extends beyond windowing to… the Menu Bar. That’s right, after teasing us with the possibility for four years, Apple has finally decided that apps in multi-window mode can offer a Menu Bar. It only appears when you’re in multi-window mode, and only when you activate it by moving the pointer to the top of the screen, swiping down with a finger, or typing Command-Shift-Slash to search an app’s menus. And it’s centered, which is… not quite Mac-like, but close enough.

Currently, the contents of that Menu Bar are mostly the things that were previously on offer when you held down the Command key to see what keyboard shortcuts were available, but app developers now have access to an API that lets them set more complex sets of menu items.

I love this idea. For years I’ve been using iPad apps that are sometimes close to as dense and full-featured as their Mac equivalents, and frequently find myself baffled about just where a certain feature might be hidden: behind an icon, inside a tab, or down at the bottom of a scrollable pane. Organizing functions in menus is another classic Mac feature that really does make sense for complicated software—and even for relatively simple stuff! It just takes a little bit of a mental shift. But within an hour of installing the first iPadOS 26 beta, I found myself invoking a feature of an iPad app from the Menu Bar.

In fact, I think Apple should go one step further and let users opt to keep the Menu Bar visible all the time. The status bar items on the top left and right of the screen are basically the same height, so why not? Perhaps in the future, Apple could also give users more control of what appears in the status bar, following its own lead in allowing Control Center items on macOS to be arbitrarily added or removed from the Menu Bar itself.

As a fan of the original iPad pointer, I’m sad to report that it’s been replaced by a new, Mac-inspired one. The reason the old one died is a pretty good one: it was meant to represent the touch target of iPad software designed for fingers, and Apple is now accepting that sometimes pro users want more precise pointer control than that. (Also, those new stoplight buttons are smaller than the old pointer circle!) I’ll miss the morphing cursor because I think it might’ve been the strongest example of the iPad rethinking and outdoing an old Mac idea, but the new pointer fits like a comfortable old shoe.

Pro podcasts and Files

For years, those of us who do audio or video work have lamented that iPads (and iPhones) are unsuitable for a lot of podcasting/video work because they aren’t able to record audio or video locally while also participating in a call over an app like Zoom. In iPadOS 26, Apple has solved the issue by adding support for local background recording.

It’s a relatively simple interface: Choose your input microphone, turn on local recording in Control Center, and your call will be saved out when you’re done. It won’t work when you’re not actively using a microphone (and optionally, a camera), so the feature can’t be used for snooping. In all of my testing, it’s worked quite well, though I wish it supported the ability to manually adjust the gain of the microphone since some microphones are too loud and others are too quiet. But beyond that, it just works—and means that podcasters can just bring an iPad or iPhone and a USB microphone and get their jobs done.

A screenshot of a file management interface showing a list of options for sorting files by different attributes, such as kind, date modified, date created, date last opened, size, tags, and iCloud status.
You can choose what shows up in Files columns now.

Ever since the introduction of the Files app, Apple has been slowly tiptoeing toward the idea that some iPad (and iPhone) users might need to manage files in a filesystem the same way that Mac users do. The Files app in iPadOS 26 has been upgraded to feel more, shall we say, Finderesque. As someone who lives in List views, it’s a relief to see that Apple has let the columns in this view be customizable, so I can sort my list by creation or last-modified date, while seeing the type and size of the files, just the way I like it. Files also displays folders with a disclosure triangle, which you can click or tap on to expand and see the contents inside that folder. Again, not a revolutionary idea—but it wasn’t there, and now it is, and that’s progress.

Another huge feature that Mac users take for granted, but wasn’t really a part of iPadOS before, is the ability to open files in specific apps and to set default apps for file types. To assign all files of a given type to a specific app, just select one, choose Get Info, and choose a default from that panel. If you’re someone who is opening files from within Files, you will find it a major improvement.

There’s also a major improvement when it comes to long file copies, especially ones happening across a network: they generate a progress window that can be made into a Live Activity, allowing you to leave Files while you keep tabs on the progress of the operation.

This is part of a larger upgrade to iPadOS that allows apps that perform lengthy, finite tasks to do so in the background. Previously, apps with lengthy exports—Final Cut Pro and Logic Pro are two good examples—had to be kept in front until they were done with their jobs. That sort of single-mindedness would never fly on the Mac, where you can always switch to another app while you’re doing an export (or file copy). That ends in iPadOS 26, and I look forward to trying out apps that have been updated to support the new background tasks API.

Shortcuts get smarter

The screenshot shows a mobile app interface with actions like 'Get file from Shortcuts at path,' 'Split File by New Lines,' and 'Use Cloud model.' The description box offers instructions for generating filenames and describing images.
I rebuilt an iPad shortcut to use Apple’s Private Cloud Compute model.

With iPadOS 26, Shortcuts takes a huge leap forward with the introduction of the new Use Model action, which lets you tie workflows in to Apple’s on-device and Private Cloud Compute AI models, as well as ChatGPT itself. The Six Colors staff has already built several automations using this feature, and while the results can be somewhat random and take some effort to process, they can also enable the creation of workflows that were previously impossible using Apple’s existing tools. Shortcuts also now includes pre-baked Apple Intelligence features, like Writing Tools summaries.

These models will transform some automations. Let me give you an example: When I write on my iPad, I still need to upload images to my server to be displayed along with the article. I built a workflow in Shortcuts to do the job, but given the inscrutable hidden filenames of items in the Photos library, I had to build in a step where I see a preview of my image and give it a filename.

I rebuilt this workflow in iPadOS 26 using Use Model to pass the image to Private Cloud Compute and ask it for two responses: a text description of the contents of the image and an appropriate filename. While I had to add some validation to avoid weird responses, the results were instantaneous: I no longer need to name the files, because Private Cloud Compute does the job perfectly well. Uploading images from my iPad just became much easier, and that’s just my first step.
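For the curious, the logic of that rebuilt workflow looks roughly like the sketch below. The action names and prompt wording are my paraphrase of the general shape, not the literal shortcut; the validation step in particular will vary depending on what weird responses you need to guard against.

```
Receive image from the Share Sheet
Use Model (Private Cloud Compute)
    Prompt: "Describe this image in one sentence. On a second line,
    suggest a short, lowercase, hyphenated filename for it."
Split model output by new lines
    line 1 → alt-text description
    line 2 → candidate filename
If candidate filename contains only letters, digits, and hyphens
    Rename image using candidate filename
Otherwise
    Ask for Input (fall back to naming the file manually)
Upload renamed image to the server
```

The key change from the old workflow is that the manual “preview the image and type a name” step only appears as a fallback when the model’s suggestion fails validation.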

New apps and updates

There are a few notable app additions in iPadOS 26, the biggest of which is Preview, a longtime Mac favorite that’s finally making the move to the iPad. I’ve written a lot of this review in multi-window mode with both a text editor and Preview open, and it’s just a pleasure to use compared to using the previous stock choice, Quick Look mode inside the Files app.

Preview’s a great utility for viewing PDFs and other files, and on the iPad it gets access to mark-up features that are especially useful to users of the Apple Pencil. I’m sure dedicated PDF marker-upers have their own preferred apps to do the job, but as a person who very occasionally has to mark up a PDF, I’m looking forward to having access to Preview combined with my Pencil Pro. (I probably won’t use the new calligraphy pen style while marking up my PDFs, though.)

The Journal app, which was introduced by Apple in 2023 but only on the iPhone, has come to the iPad and the Mac this year. I never really used the Journal app on the iPhone, because for me the iPhone is the device I use when I’m out and about—and too busy for something like journal entries. The iPad, on the other hand, fits perfectly in my life for those more laid-back, contemplative moments. I’m glad the app has made the move… but seriously, what took it so long? It took two OS cycles to put an iPhone app on Apple’s other platforms?

One final note about a new feature in an old app—or is it an old feature? In any event, last year Apple tried to merge the main Photos interface into a split view to show off both the photo library and its rich Collections view. I thought it was a smart, if inelegant, design choice since so many people have no idea how much Apple has put into surfacing photos using Collections.

The inelegance (and a pretty vociferous user reaction to the change) seems to have prompted a re-think. Photos now defaults to the Library view, with Collections available from the sidebar. I hope people will tap over there and look at their Collections; there’s no point in having tens of thousands of photos in iCloud if you only ever look at the dozen most recent ones.

Big swings for the platform

This is just a public beta, but the vibes are good. Apple’s new windowing system is great, embracing all the things that make the Mac work without forcing it on people who don’t want it. The improvements to Files, support for background recording, and the new background tasks Live Activities are somewhat small changes on their own, but assembled together they create an iPad that just feels more ready for professional productivity tasks.

Not everyone wants to use the iPad in a professional productivity context, and that’s fine. But if Apple’s going to keep selling the iPad Pro at prices higher than a MacBook Air and approaching a MacBook Pro, with the hardware of those computers inside, it needs its software story to match up. I know it seems hard to believe, but iPadOS 26 may end that narrative for good.

First impressions: the watchOS 26 public beta

2025-07-25 01:20:00

watchOS 26 Public Beta

After a pretty big overhaul a few years back with watchOS 10 and a more modest update in watchOS 11, I’d describe this year’s update—now numbered 26 like the rest of Apple’s platforms, and available as a public beta—as more focused.

Sure, there’s a new Liquid Glass design that aligns with the rest of the company’s platforms, but the vast majority of big new features focus on a single app—Workout—which gets not only its own UI overhaul, but also a big new Apple Intelligence feature, Workout Buddy.

watchOS 26 isn’t without its tweaks and enhancements, though how much they help you may depend on both which version of the Apple Watch you’ve got and the ins and outs of how you use your Apple Watch every day. And, of course, there are a few features debuting across Apple’s platforms this year that show up on the Apple Watch too.

Through the looking glass

Like the rest of Apple’s platforms this year, watchOS 26 gets a new Liquid Glass look. You’ll see this most prominently on the Photos watchface, where the numerals of the digital clock are now refractive. It’s…a look. I’m not sure I love it on the Apple Watch, but bear in mind that my personal watch, on which I installed the beta, is a Series 7 that lacks both the nice wide-angle OLED display of the Series 10 and the larger display of the Ultra/Ultra 2.

Three Photos watch faces with Liquid Glass numerals
Crouching numbers, hidden time.

Given the size of the numerals on the screen, I often found them harder to read when against a bright or varied background. My usual watchface is a rotating set of photos of my wife and kid, which can have a lot of fine detail—I made a separate Photos face with nature pictures, which fared better (and definitely ended up with cooler layering effects for the numerals).1

One interesting note: as opposed to the iPhone, where the clock changes from Liquid Glass to a more solid look when the display is dimmed in its Always On mode, the Apple Watch retains the glass look when dimmed. I think I prefer the iOS approach here; it’s less attention-grabbing and more legible, which is what I want when the display is inactive.

Liquid Glass throughout watchOS 26
Besides in the Photos face, you’ll see Liquid Glass most prominently in watchOS 26’s system UI elements.

You’ll also see the new look in other places throughout watchOS’s user interface, such as notifications, the Smart Stack, and Control Center. Even the numeric keypad you use to unlock your Apple Watch has gotten a glassy overhaul. As with all of Liquid Glass, one of the challenges is that it can seem somewhat distracting: for a stated goal of getting the UI “out of your way,” it all too often seems to yell “look! look how cool I am!” And on the watch, where the UI is rarely overlaying actual content, I have questions about how much that stated goal really applies.

For all of that, Liquid Glass is generally less prominent on the Apple Watch, given the more limited screen size and content. If you don’t like it for the clock on the Photos watchface, good news: you can easily switch the tint of the colors to solid white or any color you like.

Workout, buddy?

Sometimes it feels a bit like Apple has one of those machines they pick lottery balls from, and every year it picks a ball to decide which app is going to be lavished with attention. This year it’s Workout’s turn. Not only does it get a big new Apple Intelligence-powered feature, Workout Buddy, but it gets an extensive redesign that reminds me of the overhauls seen by Fitness and Weather back in watchOS 10.

The main screen in the Workout app in watchOS 26
The interface for Workout has been redesigned to a more watchOS 10-style.

The stacked “cards” of workouts have been replaced by a full-screen model, though you can still cycle through the available options by using the Digital Crown or by swiping up and down. Rather than burying functions behind one of those three-dotted More buttons, as it previously did, Apple’s now divided them up into several different icons at the corners of the interface: in the top left you’ll find options to customize your workout view; in the top right, options for your workout such as goals or routes; in the bottom left, media options; and in the bottom right, notification settings, including Workout Buddy. And of course, there’s a big button right in the middle to start your workout.

While I may not be a die-hard exercise fanatic, I have long used the outdoor walk and outdoor cycling workouts, and lately I’ve been trying to run more regularly with the Nike Run Club app, so let’s say I dabble. I like some of the new features, in particular the ability to have a workout start playing a certain playlist—and keep in mind that this is specific to the type of workout. You can have it start playing your running playlist for outdoor runs, or your podcast queue—or nothing—for outdoor walks. It can choose music it thinks is appropriate for the workout you’re doing or you can specify the audio you want. And, in the latter case, you’re not limited to Music—media from the Apple Podcasts app is also available, as is audio from third-party apps that support the requisite API.

I do think the different types of workouts require a few more taps than they used to, but I do appreciate that you can create and store different workouts and then launch any of those at a tap, whether distance-, time-, or calorie-based, as well as more complicated workouts like intervals, pacers, or race routes.

And then there’s Workout Buddy. This feature compiles and analyzes fitness data from your previous workouts, compares them to your current workout, and then gives you feedback using one of three synthesized voices.2

Look, I may be a bit biased: as I said above, I’ve been using the Nike Run Club to get back into running recently, and it offers Guided Runs recorded by a real live human coach.3 I’m going to say, flat out, that a synthetic voice that keeps you updated on your progress is no substitute at all for a real person. While a human coach can offer thoughts and even emotional support, Workout Buddy is far more focused on metrics. It peppers those with occasional bits of encouragement, it’s true, but overall it’s closer to a spoken notification—there’s no soul there. Sorry, robots. You’re not quite ready to dream of electric sheep yet.

That said, I don’t want to discount that some people may find value in it. Perhaps it will help you avoid looking at your watch to see how far you’ve gone. Perhaps those little bits of encouragement are all you need. And, as always, Apple’s feature could be an on-ramp for people who might otherwise never try something like this, and could prompt them to check out other options.

But there are some limitations to the feature that might also impact people: for one, it requires an Apple Intelligence-capable iPhone be near your Apple Watch during use, so you’ll need at least an iPhone 15 Pro or later in order to use it. It also means you’ll have to carry your iPhone with you on your workout, which might be a non-starter for some folks. (Personally, I prefer to run with just my Apple Watch and AirPods Pro.)

Stacked roster

Introduced in watchOS 10, the Smart Stack redesigned one of the main aspects of the Apple Watch experience. No longer were you limited to either small complications on the watchface or a full-blown watch app. Instead, you could scroll down to view a variety of widgets that apps could offer, including Live Activities that showed up when appropriate.

A smart stack hint for the Camera app.
The Smart Stack hint for the Camera Remote is subtle, but then you have to tap again to actually use it.

In general, I’m a fan of the Smart Stack widgets. For me, they strike a nice balance of providing more detailed information than a complication without having to launch a whole app. watchOS 26 adds a few small improvements to the Smart Stack to try and make it more useful.

First, there are Smart Stack hints. These take the form of a little icon that pops up when watchOS detects you doing something where you might want to open an app. For example, if you open the Camera app on your iPhone, you’ll see a little icon prompting you to open Camera Remote on your Watch—no sound, no haptic, just an unobtrusive icon.

But here’s the first headscratcher. Tapping that icon does not open the Camera Remote. Instead, it scrolls you down into the Smart Stack to a full widget that you can tap to open the Camera Remote. I suppose that this is in line with it being a gentle hint, but it feels like if I’m being prompted to do a thing, it shouldn’t take two taps to do it.

The next frustration is that, in my time so far with the watchOS 26 beta, this is the only Smart Stack hint I’ve gotten to appear reliably. Apple says that the system uses “improved prediction algorithms that fuse on-device data and trends from your daily routine” to surface suggestions, but so far, I guess I don’t do anything regularly enough for the system to provide hints. I’ll be keeping an eye on this for the rest of the beta period to see if it ends up being more useful.

Configuring Smart Stack widgets
Glory be, you can finally configure the three-up widget style that Weather and other apps use with just the data you want.

The other tweak to the Smart Stack is equally small, but more welcome: some widgets with multiple types of data on them are now more configurable. The prime example here is the Weather widget, which defaults to three gauges showing temperature, wind speed, and air quality. Personally, I rarely care about wind speed, but I do care about the UV index; now I can just swap that in, rather than having to add an entirely separate Weather widget for it.

I’m not sure how many other widgets take advantage of this particular format and thus will offer these configurations, but even just offering it in the Weather widget is a tangible improvement for me.

Bits and bobs

In addition to a few of the features available across several Apple platforms this year—Call Screening and Hold Assist on the Apple Watch, Live Translation and backgrounds in Messages—there are a handful of other features that come to the Apple Watch.

A checklist in Notes on watchOS 26
Note to self: story checks out.

After a decade of absence, Notes finally comes to the Apple Watch. You can view existing notes, add a new note with Siri or via the keyboard, and trash or pin notes. But while you can interact with certain elements—most significantly checking off checklist items—you can’t edit the content of your notes.

There’s a new gesture, Wrist Flick, which lets you quickly dismiss a notification or mute calls. Handy, but unfortunately, it’s only available on the Series 9 and later (not including the SE) and the Ultra 2, so I wasn’t able to test it on my Series 7.

For those absolute monsters who don’t mute their Apple Watch, a new automatic volume adjustment feature promises to detect the noise level of your environment and tweak your watch’s volume so that you don’t get a loud DING when you’re someplace quiet. Or you could just leave your watch on silent like a good person.

Even Apple has realized that the watchface situation has gotten overwhelming, so it’s reorganized the Face Gallery (which you see when you add a new watchface) into categories, including new, health and fitness, photos, clean, data rich, and more.

If you use Live Listen, you can now view a live transcript of what’s being heard right on your Apple Watch, including the ability to jump back ten seconds if you missed something. It’s an impressive piece of technology, hopefully helpful to those who need it.

While this year’s Apple Watch update might be on the smaller side, there are definitely things to like about it. And not every year needs to be a blockbuster revision of everything that comes before. Sometimes quality of life improvements are worth it just for that: improving your life’s quality. And for a device that often goes everywhere with people, that they wear right on their body, improving the quality of life can have a meaningful impact.


  1. Interestingly, the closest precursor to the Liquid Glass design is the “soap bubble” style numerals of the handwashing timer that Apple added back during the height of the pandemic. I wonder if that’s where the idea started… 
  2. If you’re old, like me, you might remember way back when Apple and Nike teamed up to offer a running accessory that included a sensor to go in your shoe and a dongle for your iPod, and occasionally offered pre-recorded nuggets of encouragement from famous athletes. I’d argue even that was better than this, but your mileage may vary. Literally. 
  3. The very upbeat Chris Bennett, who’s been described as “a real-life Ted Lasso.” 