Six Colors

Six Colors provides daily coverage of Apple, other technology companies, and the intersection of technology and culture.

Apple discontinues the Mac Pro ↦

2026-03-27 05:19:01

Mac Pro

Chance Miller calls the time of death at 9to5Mac:

It’s the end of an era: Apple has confirmed to 9to5Mac that the Mac Pro is being discontinued. It has been removed from Apple’s website as of Thursday afternoon. The “buy” page on Apple’s website for the Mac Pro now redirects to the Mac’s homepage, where all references have been removed.

Apple has also confirmed to 9to5Mac that it has no plans to offer future Mac Pro hardware.

A quiet end to what was once the flagship of the Mac product line. But time comes for us all.

Over the years, as laptops rose in prominence and other Mac desktops added power, the Mac Pro increasingly became a niche, high-end device. After the disastrous trash-can Mac Pro design, Apple made good on a promise to rethink the Mac Pro and shipped a new take on the “cheese grater” enclosure. But the move to Apple silicon really killed the product dead, since Apple’s modern chip architecture doesn’t support external GPUs, which was one of the last reasons to buy a Mac Pro.

In the interim, the Mac Studio has become the top-of-the-line desktop. It’s great. RIP to a real one, but it’s time for us all to move on.

Go to the linked site.

Read on Six Colors.

Vision Pro and Cosm: Two of a kind?

2026-03-27 00:30:45

Basketball game streaming live in a Cosm.
Public spaces like Cosm might be a good content fit with Vision Pro.

The Apple Vision Pro feels like a product that’s waiting for the world to catch up, but the reality is closer to the opposite. The world is waiting for a reason to use it and that reason hasn’t quite shown up yet.

There’s very little wrong with the hardware. Apple built something that works in a way first-generation devices rarely do (says the guy old enough to have bought a Newton at launch) with displays that feel natural rather than novel and an interface that disappears quickly enough to let you focus on what you’re seeing.

The problem comes the moment you take it off. There isn’t a strong pull to put it back on. It’s impressive, even remarkable in bursts, but it doesn’t yet fit into a daily rhythm. That’s not a hardware problem. It’s a content problem, and more specifically, a cadence problem. Apple has treated immersive content like a prestige release schedule, carefully curated and spaced out, which works for television but not for behavior. If you want people to build a habit around something, you need volume and consistency, not occasional brilliance. Right now, Vision Pro feels like something you check in on rather than something you live inside, and that distinction matters more than anything on the spec sheet.

Neal Stephenson’s skepticism lands because it recognizes that gap. If the content never reaches a point where it becomes necessary, the headset remains optional, and optional devices rarely scale. What’s interesting is that the missing piece isn’t hypothetical. It already exists in a different form, outside of Apple’s ecosystem, and it’s showing up in a place that Apple understands better than most companies: people paying for experience.

Cosm is the cleanest example of that. It’s easy to dismiss it as a high-end gimmick, a giant dome with a better screen, but that misses what’s actually happening inside those venues. People are buying tickets, planning nights around it, treating it as something closer to attending a game than watching one. The technology matters, but the behavior matters more.

Cosm is already generating meaningful revenue and drawing repeat customers, which tells you this isn’t just novelty value. It’s tapping into something real, the idea that proximity, or at least the feeling of it, has value even when the event is happening somewhere else.

The challenge for Cosm is that scaling that experience is difficult. These are expensive builds that require the right locations, the right partnerships, and enough capital to expand without diluting the quality that makes them work in the first place.

That is exactly the kind of problem Apple has solved before. It’s not just about having the cash, though Apple certainly has that. It’s about having the discipline to build a system that can expand without losing its identity and the distribution to make it visible at scale. If Apple owned something like Cosm, it wouldn’t just be a set of venues. It would be a front door. You could put an Apple Store in the lobby and it wouldn’t feel forced. It would feel like a natural extension of the experience, a place where people encounter the hardware in the context of something they already understand.

From there, the path to the home becomes clearer. Vision Pro, or whatever lower-cost version follows, doesn’t need to stand on its own as a category. It becomes an extension of something people have already bought into. The idea of watching a game “from somewhere else” is no longer abstract because they’ve already felt it in a room with other people. At home, it becomes a different version of the same experience, missing the crowd and the waiter, but gaining convenience and access.

The critical shift is in how Apple approaches rights. Trying to own sports outright is a losing strategy. The costs are too high, the competition too entrenched, and the fragmentation too deep. Apple has made smart moves with MLS, F1, and selective partnerships, but doubling down on exclusivity won’t unlock this. The better path is to work alongside the existing ecosystem. Install Cosm camera systems at major events, not as replacements for the broadcast but as an additional layer. Let networks and leagues sell that immersive feed as a premium product, with Apple taking a share for the technology and distribution. It’s additive rather than competitive, which makes it easier to scale and harder for partners to resist.

Apple has always been at its best when it connects behavior to technology in a way that feels inevitable in hindsight. Right now, Vision Pro still feels like a solution looking for a problem. The problem, or more accurately the opportunity, is already there in how people respond to immersive sports experiences. Cosm has shown that people will pay for that feeling. The hardware is close enough to deliver it at home. The gap is building the bridge between those two things in a way that feels continuous rather than experimental.

If Apple gets that right, the conversation around Vision Pro changes quickly. It stops being about whether people want to wear a headset and starts being about what they’re missing when they don’t. That’s the point where adoption tends to take care of itself.

The earliest days of Apple ↦

2026-03-26 22:50:39

Harry McCracken has put together an amazing oral history of Apple’s earliest days. You should read the whole thing, but this anecdote from Chris Espinosa, who still works at Apple after all these years, is the part that made me laugh the most:

I was sitting there in the Byte Shop in Palo Alto on an Apple-1 writing BASIC programs, and this guy with a scraggly beard and no shoes came in and looked at me and conducted what I later understood to be the standard interview, which was “Who are you?” I said, “I’m Chris.” … Steve Jobs’s idea back then of recruiting was to grab a random-ass 14-year-old off the streets.

The rest is history!


“For All Mankind” returns with more Mars drama

2026-03-26 16:00:16

Mireille Enos in “For All Mankind.”

The fifth season of Apple TV’s “For All Mankind” premieres March 27—really, the evening of March 26 for those of us on the West Coast. For the last few years, Dan and I have been reviewing episodes on our “NASA Vending Machine” podcast and I’m excited to have the show back.

As always, “For All Mankind” is about taking big swings. There’s always a dramatic, history-changing moment or shocking twist that’s not too far away. Set in an alternative past where the Space Race kept going after the Soviets landed on the moon (yep!), season four took us to a 2003 where Mars colonists sought more autonomy by hijacking an asteroid.

This season, which takes place in 2012, is still primarily set on Mars, though there’s also some space adventure in the offing. Apple tech fans will enjoy that we’ve finally reached the iPhone era, though the iPhones on “For All Mankind” are a little thicker than the ones we remember, and they might actually be Newtons. There are also a lot of early-2010s iMacs on display.

While the first episode has to do a lot of work reminding you of what’s happened recently and setting up the new power dynamics at play this season, subsequent episodes get pretty intense, pretty fast. At times the show plays with police procedural, mystery story, even car-chase adventure… familiar TV genre stuff, except it’s all on Mars! Mireille Enos of “The Killing” plays an important new role as an investigator for the Mars Peacekeeping force who is suspicious that several different crimes might have been committed out on the surface. There are also a bunch of returning faces, some expected and some quite surprising. (And also, yes, Joel Kinnaman is still in the show even though Ed is now basically in his eighties.)

I’ve seen the first six episodes thus far, so I don’t know where it’s all going, but I’ve sure enjoyed the ride. “For All Mankind” continues to use its alt-history setting to tell dramatic, almost operatic stories that can also disturbingly have relevance to current events in our own world.

How can Siri automate Shortcuts when it’s so opaque?

2026-03-26 06:18:02

Screenshot of Python code editing software with image scaling script.
Claude Code takes advantage of a real development environment.

I’m pretty skeptical that Apple’s new Siri-wrapped Gemini will be able to accurately and reliably assist with automation. Gemini will be the foundation of Apple’s foundation models, but there’s no there there: Apple has no well-documented, debuggable, inspectable system to execute automation with, unless you count ancient and inscrutable AppleScript, and you shouldn’t.

Sure, LLM chatbots will spit out code (even AppleScript!) if you ask them to, but it might not work. It gets substantially worse when you’re asking LLMs questions about Shortcuts.

Go ahead and ask any chatbot to describe how to make a Shortcut to perform some automation that you’ve been wanting to do and then try to assemble what it suggests. It’s extremely tedious, prone to user error, and isn’t in any way guaranteed to work even when it’s all put together.

Agents that hook into development environments are much better than a bare chatbot because they can inspect, run, and debug the code they are generating. They aren’t perfect, but if you have an agent like Claude Code hooked up to a development tool like VS Code and start describing some Python script you want, it’ll execute and iterate until the output is what you asked for.
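
For instance, here’s a minimal sketch of the kind of deterministic script such an agent might converge on; the function name and behavior are illustrative assumptions, not taken from any real session:

```python
def scale_dimensions(width, height, max_side=1024):
    """Scale (width, height) so the longer side equals max_side,
    preserving the aspect ratio. Returns integer dimensions."""
    if width <= 0 or height <= 0:
        raise ValueError("dimensions must be positive")
    ratio = max_side / max(width, height)
    return round(width * ratio), round(height * ratio)

# An agent can actually run this, look at the result, and iterate
# until the output matches what you asked for:
print(scale_dimensions(4000, 3000))  # → (1024, 768)
```

Because the agent can execute the script and see (1024, 768) for itself, it can catch rounding and off-by-one mistakes on its own, which a bare chatbot emitting Shortcuts instructions cannot.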

If humans don’t have access to documentation, actionable debug output, logging, the ability to bypass or ignore actions as part of testing, and the ability to copy and paste snippets of code, then how can the new Siri do it?

Right now, Shortcuts works with AI models by passing some input and then receiving the output. When something goes to the model, the model transforms the data and delivers a result back to Shortcuts. That’s a non-deterministic workflow: any change to the model, or even just randomness in general, can produce different output. This means you can’t reliably troubleshoot or adjust it without introducing uncertainty about what outputs you’ll get.

When working with an agent to assemble automation in an IDE, the code it builds is deterministic, so it will keep working even if the model changes. Not everything you want to automate requires LLM functionality when it runs, but producing the deterministic version of a workflow shouldn’t require hours of labor, either.
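
One way to picture the difference: a model can be involved when the automation is built, but the artifact that actually runs is plain, deterministic code. A hypothetical example of such an artifact:

```python
import re

# A deterministic step an agent might generate once, in place of
# sending the text through a model on every single run.
DATE_PATTERN = re.compile(r"\d{4}-\d{2}-\d{2}")

def extract_dates(text):
    """Return all ISO-style dates in text, in order of appearance."""
    return DATE_PATTERN.findall(text)

# Same input, same output, every time — no matter how the model
# that originally wrote this code changes later.
extract_dates("Posted 2026-03-27, updated 2026-03-26")
# → ['2026-03-27', '2026-03-26']
```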

I really hope that the magic of new Siri isn’t going to be that it will just do things with bare actions and App Intents, magically, without any user-accessible process, or as a blob inside of a Shortcut you need to make. If I ask Siri to reorder a list, and it doesn’t do it correctly, I want to be able to access the scaffolding it created to see what went wrong, not just keep asking Siri to do it again in slightly different ways until I get output I like.
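
To make “accessible scaffolding” concrete, here’s a hypothetical sketch of what an inspectable list-reordering step could look like — my illustration, not anything Siri actually produces:

```python
def reorder_list(items, reverse=False):
    # The sort key is right here where you can see and change it.
    # casefold makes the sort case-insensitive; if "Apple" lands in
    # the wrong spot, this is the line you'd inspect and edit.
    return sorted(items, key=str.casefold, reverse=reverse)

reorder_list(["banana", "Apple", "cherry"])
# → ['Apple', 'banana', 'cherry']
```

With something like this, “it sorted wrong” becomes a one-line fix instead of another round of rephrasing the request.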

If Siri doesn’t produce anything inspectable, or if all it produces is a Shortcut, then there’s not much work humans or AI can do to fix things.

AI cut below the rest

The problem the Shortcuts app is supposed to solve has never been solved, because no one really knows how to use Shortcuts unless they become a Shortcuts expert. Shortcuts is user-friendly in appearance, but not in practice. It’s meant to welcome people who don’t know anything about programming with its friendly drag-and-drop interface and searchable actions panel.

Unfortunately, the names for actions don’t always say what they do, and the documentation is often a vague piece of filler that’s frequently reused for more than one Shortcut action. Even experienced programmers can get flummoxed when they try to search the available actions for seemingly standard functions, like reversing a list.

Magic connections are magic, until your script gets any longer than the length of your screen and you need to start dragging actions around, inevitably breaking connections and making unintended ones. With a text-based script you’d have to keep track of the names and spelling of your variables, but they don’t change out from under you if you add more lines of code above or below them.

You can’t do one of the most simple, and useful things in scripting, which is commenting out (ignoring/bypassing) something to test or evaluate alternatives.
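
In any text-based language, that bypass costs one character. A sketch in Python (the step names are invented for illustration):

```python
def strip_whitespace(text):
    return text.strip()

def capitalize_words(text):
    return text.title()

# To test without a step, comment out one line — nothing else moves,
# and no connections between the remaining steps break.
steps = [
    strip_whitespace,
    capitalize_words,
    # add_signature,   # bypassed while evaluating alternatives
]

def run_pipeline(text):
    for step in steps:
        text = step(text)
    return text

run_pipeline("  hello world ")  # → 'Hello World'
```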

A lot of the time, when people are using Shortcuts, they’re relying heavily on the Run Shell Script action to do actual programming that lets them write normal, vanilla code, or ssh’ing into a server from iOS to do the same thing. It’s nice that Shortcuts can do that, but shell scripts aren’t cross-platform, and ssh’ing into a server is in no way accomplishing Shortcuts’ mission.

Without logging, you can’t ask Siri why your automation that was supposed to run in the middle of the night didn’t run. Maybe it was a permissions issue that was never raised when the shortcut was created. You, and Siri, just don’t know.

AI rising tide lifts all boats

Again, Apple doesn’t have to do these things just for humans, or just for Siri. They are in no way mutually exclusive.

If the concern is that Shortcuts shouldn’t be like a programming language, with tracebacks and logs that would put off “normal people,” then just remember that “normal people” don’t really use Shortcuts. They ask a chatbot to just do it, and Siri, as Apple’s chatbot, could take advantage of those fiddly programming bits and perform its role better, in a way that was auditable.

I have seen people make frantic posts on Mastodon about how AI is deskilling programmers, but the beauty of Shortcuts is that Apple already applies the deskilling at the factory.

(Podcast) Clockwise 649: All Vocation, No Avocation

2026-03-26 06:04:53

Our latest personal tech projects, twenty-five years of macOS, our networking setups, and where we turn for up-to-date information.

Go to the podcast page.