Six Colors

Six Colors provides daily coverage of Apple, other technology companies, and the intersection of technology and culture.

(Podcast) Clockwise 625: My House Has Reached Peak Smart Home

2025-10-02 03:41:33

Amazon’s new hardware rollouts, the health tech we use, our favorite third-party apps, and Google’s new home ecosystem.

Go to the podcast page.

visionOS 26 Review: Continuing Toward the Future

2025-10-02 02:34:56

A man with glasses looking up in a modern, glass-roofed corridor with plants and bright lighting.
The side of my head does look like that!

If visionOS and the Vision Pro are all about charting a course to the future of wearable devices in front of our eyes, Apple needs to keep pushing toward that future at every opportunity. Fortunately, visionOS keeps moving forward, with several substantial feature improvements that have rolled out in updates over the past year-plus.

The ultimate fate of Apple’s vision products remains unclear. I have to assume that the long-term goal is a pair of lightweight glasses that we can use to overlay software on top of our world. Everything between now and then is about developing the technology to make that possible. And with visionOS 26, Apple is doing what it needs to be doing: iterating, making everything better, and building out an entirely new operating system one block at a time.

Better Spatial Personas

Apple had already taken its most uncanny launch feature (the dead-eyed Personas) and made it shine with a software update. It could have declared the problem solved, but it didn’t.

Instead, visionOS 26 throws away the improved Personas entirely and replaces them with even better versions. The new Persona engine makes these digital replicas of their users look real in a way that the previous versions did not. Part of the trick is that more of the head is now captured when you’re setting up a Persona—which means that as a Persona turns its head to the side (or you move around), the Persona still looks like a person and not a face hanging off the front of a mannequin head like cheese on a pizza.

I’d say that Personas have exited the Uncanny Valley, but that’s not really true—the previous update already got them out of the valley. These are just better: more expressive and more human. Hair and complexion are improved, and there are more customization options for glasses. I absolutely despised the original implementation of Personas on visionOS, but Personas version 3.0 is excellent.

Maybe it’s the fact that they look better from the side now, but Apple has chucked the idea of Personas as two-dimensional objects, at least for FaceTime calls. They’re spatial by default now: your friends can simply drop into your space, or whatever virtual environment you’re in. It was already a good experience, and now it’s even better. I do occasional calls with my tech journalist friends who have Vision Pros, and it’s like we’re hanging out in person. There’s really nothing like it, and it just got way better in visionOS 26.

Unfortunately, while beards look better on visionOS 26, they still limit a Persona’s mouth movement.

Geographic persistence and widgets

A small kitchen with a washing machine, cabinets, and a digital clock showing 10:33. A screen displays weather and energy usage data. Lofted bed visible in the background.
I can attach widgets in my laundry area, and they’ll stay there!

In the long run, assuming AR glasses are a thing (which is what we’re all assuming here, because that’s why this whole project exists), you’ll want to be able to place a virtual item somewhere and have it still be there when you come back to it later. You know, like actual items in actual reality.

In previous versions of visionOS, there was basically none of that persistence across reboots. If the Vision Pro shut down (which happened to me most frequently during long breaks between sessions, when it would power off to preserve battery), all windows were closed. You could set up a whole careful world of windows and objects, but they were all temporary, so why bother?

visionOS 26 addresses that. Now you can leave items in one place and they’ll appear when you enter that space, even if the Vision Pro has rebooted or shut down in the interim. Windows are always where you left them. It’s great for short-term reusability, and a must if you take the long view. Now I can place a virtual object like a TV set from the Sandwich app Television and stick it on a table, and know that the next time I come back to that location, it’ll be there. (And you can always press and hold the Digital Crown to gather all your windows in front of you, in case you left your Safari window in another room.)

A big beneficiary of geographic persistence is the new ability for visionOS to use widgets from other apps. Widgets have spread across all of Apple’s other platforms, but this is their first appearance on visionOS, and they’re a great fit.

visionOS lets you place widgets on physical surfaces like walls, where they’ll remain anchored. (You can actually do this with just about any visionOS window: snap it to a wall, ceiling, or floor and it’ll stick there.)

In visionOS 2, I tried a bunch of different widgets from apps like Widgetsmith and Windora, but their lack of persistence made me just give up on using them. Now that shouldn’t be a problem. To add a widget, you open the new Widgets app and then browse through a list of widgets offered by various apps, including both Apple’s and those from third parties.

I was able to pop a couple of widgets onto the wall in my bedroom, right above the TV, and refer to them while watching a movie. They were the Clock widget from Apple and a weather widget I built for myself using JavaScript in Scriptable, which worked just fine as a visionOS widget docked to the wall.

The new Photos widgets are interesting—you can attach a panorama to a large widget and stick it on a wall, making it look like a window, very much in the style of the visionOS app Windora. Unfortunately, when I tried it, the panoramas were so zoomed in that most of what I wanted to see wasn’t visible. I also couldn’t make the window very large. I wish there were more ways to adjust panoramas, within reason. Other photo widgets featuring spatial images were a bit uncanny, popping out of the frame and moving whenever I moved my head. My best luck came from sticking plain old flat pictures into widgets on walls.
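A side note for developers: on Apple’s other platforms, widgets are built with the WidgetKit framework, and visionOS 26 appears to pick up those same widgets rather than requiring something new. As a rough, hypothetical sketch (the names are mine, not from any shipping app), a wall-mountable clock widget looks something like this in WidgetKit:

```swift
import WidgetKit
import SwiftUI

// Hypothetical example: a simple clock widget of the kind visionOS 26
// can anchor to a wall. All names here are illustrative.
struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: .now)
    }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // Ask the system to refresh the timeline roughly once a minute.
        let timeline = Timeline(entries: [ClockEntry(date: .now)],
                                policy: .after(.now.addingTimeInterval(60)))
        completion(timeline)
    }
}

struct WallClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "WallClock", provider: ClockProvider()) { entry in
            // A date rendered with the .time style keeps itself current
            // between timeline refreshes.
            Text(entry.date, style: .time)
                .font(.largeTitle)
        }
        .configurationDisplayName("Wall Clock")
        .description("A clock you can pin above the TV.")
        .supportedFamilies([.systemSmall, .systemMedium])
    }
}
```

If that assumption holds, it explains why the Widgets app could list third-party widgets right away: existing iPhone and iPad widgets mostly just carry over.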

Environments in a holding pattern

A digital interface overlay on a Jupiter background shows a music player with 'Jupiter from Amalthea' playing. Controls include play, pause, and volume, with options for 'Sunset' and 'Night' modes. Text describes the track's theme.

Immersive Environments (the desktop wallpaper of visionOS) are another favorite feature of mine. While it’s clear that Environments are on Apple’s mind in visionOS 26, the direction they’re heading is anything but clear.

There’s one new Environment in visionOS 26: Jupiter. This environment places you on the tiny moon Amalthea, looking out at the solar system’s largest planet. Previous environments have been more or less static—a swaying palm tree here, a stray rain shower there—but Jupiter is much more interactive. You can enter Control Center and adjust the time of day (in other words, the amount of Jupiter’s face that’s in sunlight) and the speed at which time passes. Embedded in all of this is the suggestion that future Environments may be more customizable.

Jupiter is spectacular to look at and play with. As a tech demo, it’s outstanding. But like the previous space-based Environment on the surface of the Moon, I found the Jupiter environment tough to work from. I love space stuff more than the average person, but I don’t enjoy working there.

While I’m glad that Apple is apparently investing in figuring out what an Environment should be when it grows up, I also have to admit that I’m disappointed in the lack of progress on any other front. Jupiter is the only new Environment added in visionOS 26, and the list wasn’t that big to begin with.

What’s worse, some really fun environments built into third-party apps remain locked inside those apps rather than being available globally. Recently, Disney+ added the creepy Alien: Earth containment room as an environment, and HBO Max added a Harry Potter-themed environment. Previous environments have included Star Wars, Avengers, and Game of Thrones-themed ones. But if you want to hang out and check your email atop Avengers Tower, you can’t. It’s only available in the Disney+ app.

I don’t get it. As with so much of visionOS 26, Apple has left me wanting more. A lot more.

Spatial scenes for everyone

In a welcome sign of rapid iteration, Apple has thrown out last year’s algorithm that turned flat photos into remarkably good 3-D ones and replaced it with the same multi-layered spatial scenes it’s featuring across the other 26-generation operating systems.

The result is an image that doesn’t just look 3-D but also shifts perspective as you move your head toward it or from side to side. It can’t reveal information that isn’t really there, of course—there’s some smudgy generative filling going on in the background—but the effect is still impressive.

Total Party Kill in 360.

I also got to live out an alternate life as an extreme sports enthusiast by watching some Insta360 footage directly on my Vision Pro, courtesy of visionOS 26’s new support for extremely wide field-of-view video from 180- and 360-degree cameras made by the likes of Insta360 and GoPro. I took some 360 footage I shot myself (of people playing Dungeons and Dragons at a table, and not an adventurer parachuting into snowy backcountry—c’mon, this is me!). After I opened the file in Files, visionOS asked if I wanted to convert it to a more Apple-friendly format, then began playing the video immersively. Just like that, I was standing atop a table surrounded by D&D players.

Spatial browsing

There’s a new Spatial Browsing mode in Safari that answers the question, “What if Safari Reader, but in three dimensions?” When you enable Spatial Browsing, spatial environments drop away, and the browser window resizes itself and is surrounded by a gray backdrop. Pages without Safari Reader support look like normal webpages; pages with Reader support are displayed in Reader format, with a little floating window off to the side offering an Apple Intelligence summary.

As you scroll across images in this mode, Safari will automatically use Apple’s new spatial image processing to convert them into 3-D images. It feels a little weird that Apple is altering images on other people’s websites. For news sites featuring stories about current events, it can be a little disconcerting.

The very nature of visionOS makes me perfectly happy to witness Apple deploying a few wacky features and asking us, “Is this a thing?” I’m not sure Spatial Browsing is really a thing. I think I’d rather just have Safari Reader and maybe the option to convert any image I’m looking at into a 3-D one.

The rest is visionary

There are a bunch of other features hiding at the edges of visionOS 26 that deserve a mention.

A new “look to scroll” feature frees you from needing to make hand gestures to scroll content in apps like Mail and Safari. It’s a little awkward, since I also look to read, which makes me very aware that I’m looking at words I’m not reading yet in order to scroll the page so I can see more words to read. In general, I don’t mind using a couple of fingers to scroll, but it’s nice to have another, less obtrusive option.

Apple has added passthrough support for more devices, notably iPhones and game controllers. This is important when you want to see something—like your hands, a keyboard, or a laptop—in an immersive environment. The new feature worked spottily for me: sometimes it revealed my iPhone (which, thanks to a new auto-unlock feature, is much more usable when I’ve got the Vision Pro on), but other times it revealed only the phone’s edge while the screen itself stayed invisible. Frustrating. I’d also love it if Apple would add passthrough support for keyboards other than its own.

While native visionOS apps are best, the ability to use iPadOS apps in visionOS is a major boost to productivity. Unfortunately, visionOS’s iPad app support is stuck in a previous era. As of iPadOS 26, iPad app windows can be resized to just about any shape in multi-window mode—but visionOS still forces them into a locked portrait or landscape aspect ratio. This needs to be addressed; resizable iPad apps would fit in much better on visionOS.

visionOS 26 is looking to the future by adding support for shared viewing of items when two people are using Vision Pros in the same room. Right now, it’s awfully unlikely that you and a friend are going to bring your combined $7,000 in Vision Pro hardware together just to watch a movie or play checkers, but as more people get devices like this, you’ll need the ability to share widgets, objects, and whatever else in person, not just remotely. Apple has presumably implemented this feature by combining its existing SharePlay technology with the same machinery that powers geographic persistence.

At an Apple demo back in June, I was able to manipulate a shared 3-D model of an astronaut in a space suit in collaboration with an Apple representative who was wearing his own Vision Pro, and we walked around it and gestured to it as if it were a real thing, because we both saw the very same VR object. I don’t know if I’ll use this feature any time soon, but it shows how Apple continues to build out features that it’ll need in its AR platform of the future.
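Apple hasn’t said how shared viewing is implemented, but SharePlay experiences are built on the GroupActivities framework, so it’s a fair guess that the in-person version starts the same way. Here’s a minimal, hypothetical sketch of declaring and activating such a shared activity; the spatial-anchoring half (everyone seeing the object in the same physical spot) would be the part Apple layers on top:

```swift
import GroupActivities

// Hypothetical activity for inspecting a 3-D model together.
// All names here are illustrative; Apple hasn't published its implementation.
struct ModelInspection: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Inspect 3-D Model"
        meta.type = .generic
        return meta
    }
}

func startSharedInspection() async {
    let activity = ModelInspection()
    // Ask the system whether a shared session should begin.
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        // Starts the SharePlay session for everyone in the FaceTime call
        // (or, presumably, for the Vision Pro standing next to you).
        _ = try? await activity.activate()
    case .activationDisabled, .cancelled:
        break
    @unknown default:
        break
    }
}
```

From there, participants would presumably exchange state, such as the model’s orientation, over SharePlay’s messaging channel, while visionOS supplies the shared coordinate system.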

Unfortunately, I haven’t yet gotten a chance to try one of the visionOS 26 features that excites me the most: support for hand controllers, namely the PlayStation VR2 Sense controller. I’ve felt strongly for a while that Apple needs to expand what’s possible on Vision Pro by adding support for games. No, the $3,500 Vision Pro is never going to be a game console, but by far the weakest thing about the platform is a lack of content—and there are plenty of VR games out there that could make the platform more appealing, especially if more affordable versions of the hardware are eventually coming. I hope Apple adds this promised feature to visionOS 26 soon.

Forward to the future

visionOS 26 shows that Apple isn’t asleep at the switch when it comes to developing features for its newest platform. If this thing is heading somewhere, it needs to keep showing forward progress, and I feel that progress with this update. Widgets and broad support for geographic persistence are important points of growth.

But the pace of change is sometimes frustrating. Apple has made great strides with some of its best features, like Spatial Personas, while others seem to be moving more slowly. The new Jupiter Environment is technically impressive, but the feature needs more attention.

As always with visionOS, it comes back to the long game. As long as Apple keeps pushing forward and building out its AR platform of the future, I’ll be confident that the company is on the right track. visionOS 26 offers robust evidence that the work is ongoing—but the to-do list remains a mile long.

Two iPhones, an iPad, and a Podcast: Adventures with Final Cut Camera

2025-10-02 01:14:58

A man with a beard and glasses wearing headphones sits at a desk with a microphone and a water bottle. A smartphone screen shows a video recording interface with settings like '30 FPS,' '4K,' and a red record button. '26 MM' is displayed.
Myke framed by Final Cut Camera on an iPhone.

A bit lost in the hubbub around the announcement of the M4 iPad Pro back in May 2024 was Apple’s announcement of Final Cut Pro 2.0 for iPad and the accompanying Final Cut Camera app.

I know it was a bit lost because I used both of those apps last week to record the YouTube version of the Upgrade podcast, and when I mentioned it to a bunch of people, they all seemed… surprised? I guess this one didn’t land, or at least didn’t land beyond a small group of video-oriented people.

So, in the interest of reminding you: the Final Cut Pro for iPad/Final Cut Camera thing (note to Apple: you couldn’t give this system a clever brand name?) is pretty amazing. You can connect up to four iPhones to an iPad running Final Cut Pro and use the iPhones as remote cameras to record a live event, adjusting settings on the fly and ending up with a full-resolution multicam project ready to be edited.

Myke Hurley and I were in Memphis for the St. Jude Podcastathon and needed to record Upgrade in person before we checked out and headed to our respective homes. Setting up an audio recording in a hotel room is easy, but recording video adds a bit of complexity. Last year, I used a 360-degree camera to shoot our episode, but not only was the result ugly, it also took hours of file conversion and uploading to get something usable.

Then I thought about using iPhones on tripods[1] with Final Cut Camera. Given that it was new iPhone season, we knew we’d have plenty of iPhones, and I made sure we had three iPhone-compatible tripods. I figured we’d connect three iPhones to my iPad Pro, which I’d keep next to me to make sure everything was working properly. The system uses direct Wi-Fi for transfers (i.e., the devices talk to one another directly), so you don’t need to have a fast Wi-Fi network… which is good, because our hotel didn’t.

After moving around a lot of hotel furniture to try and get proper angles for a shot of me, a shot of Myke, and a two-shot of us both, we connected the iPhones to my iPad. While two of them connected immediately, the third iPhone wouldn’t connect no matter what we tried. We rebooted things, we disconnected and reconnected from both sides… nothing worked.

In the end, we decided to just use the iPad itself as a third camera. We didn’t have a tripod that fit it, so I put my suitcase on top of Myke’s bed, perched the iPad Pro on top of that, started the recording, and hoped for the best.[2]

Screenshot of video editing software with four video clips of a man speaking into a microphone.
The result of the shoot was a three-camera multicam clip ready for editing. (Shown here in Final Cut Pro for Mac.)

It turned out pretty well, all things considered. (The video settings of the cameras were mismatched, which I’m kicking myself about now—that’s my fault. I forgot to change the iPad’s capture settings as I had for the two phones.) I stopped our recording when we ended the regular show, and by the time we were done with the members-only portion of the audio podcast, the full-quality video files had streamed from the iPhones to the iPad.

When that was done, Myke used AirDrop to send me the audio files from our microphones. When I got to the Memphis airport, I sat in my favorite spot, ordered a drink, and got to editing.

Having our individual audio files helped this process a lot: With a glance at the waveform, I could tell which one of us was talking at any given moment. That meant I could judge, entirely from the waveforms, whether to use a single-camera view or whether there was enough back-and-forth to call for the two-shot.

Final Cut’s Multicam Clip feature made switching easy. On the timeline, it appears as a single clip, but you can bring up a view displaying all the different angles and tap to switch between them. Taps are immediately reflected on the timeline as cuts, which you can, of course, undo or switch as needed.

The edit went so smoothly that the entire 107-minute-long podcast was edited before I was halfway to Dallas. (I had expected that I’d need to export the project and edit it on my MacBook Pro, but I didn’t.)

Connectivity glitches aside, the entire process worked remarkably well. The next time I’m recording video of an in-person event, I’ll be sure to bring an old iPhone or two and possibly invest in more iPhone tripods.


  [1] I used one of these, which is cheap but does the trick. Your mileage may vary.
  [2] It worked, though the iPad’s framing wasn’t ideal so I had to crop it later. And if something had gone wrong during the shoot, I had no way to know. (Fortunately the shooting iPhones save their video clips locally, so if there had been a disaster, we would’ve had a fallback.)

Rebound 566: You’re Doing Everything Wrong

2025-10-01 21:37:37

We talk bags, notebooks and upgrade complaints and get mad at streaming services.

Go to the podcast page.

Upgrade 583: It Feels Good to Be Right

2025-09-30 06:24:35

Jason’s spent a week with the iPhone Air and Myke’s been using his iPhone 17 Pro, so it’s time to discuss deeper thoughts about this year’s models. Also: internal chatbot chat, Intel investments, and miniature Macs!

Go to the podcast page.

Sponsor: Clic for Sonos

2025-09-30 00:00:51

If you use Sonos hardware you deserve the best. Clic for Sonos is the fastest native Sonos client for iPhone, iPad, Mac, Apple Watch, Apple TV, and Vision Pro. There’s no lag, just seamless Sonos playback. It’s easy to get set up, giving you smooth control, whether you’re playing to a single device or grouping multiple speakers together.

Clic for Sonos offers deep integration with native Apple technologies, including Widgets, Live Activities, Shortcuts, Control Center, and a Mac Menu Bar app. It works with your Sonos library, Apple Music, Spotify, Plex, Tidal, and TuneIn, and supports lossless and Dolby Atmos audio. And Scenes can now play music, so it’s one tap to group speakers, set the volume, and start a playlist.

Try it for yourself and you’ll see. Six Colors readers can get one year for just $9.99 (30% off) or lifetime updates for $30 (50% off). Go to clic.dance/sixcolors for all the details.

No lag. No hassle. Just Clic.