Six Colors

Six Colors provides daily coverage of Apple, other technology companies, and the intersection of technology and culture.

My Story with the Mac, Apple, and Macworld (Macworld/Jason Snell)

2026-04-02 21:00:38

Apple has turned 50, and this week I realized that I’ve been writing professionally about the company for two-thirds of its existence. (Excuse me while I try not to turn into dust and blow away in the gentle spring breeze.)

Continue reading on Macworld ↦

Apple releases an iOS 18 security update for iOS 26 holdouts

2026-04-02 04:56:55

Last December I complained that Apple was withholding iOS 18 security updates from iPhones capable of running iOS 26, leaving users who didn’t want to upgrade to Apple’s latest OS version yet in some security peril.

Well, I have good news and bad news. The good news: As of Wednesday April 1, Apple is pushing out iOS 18.7.7 to all devices running iOS 18. This update, released last month for devices that were not capable of running iOS 26, is now available even for compatible devices. If you’ve got auto-update turned on but have not gone through the steps to do a full upgrade to iOS 26, this update can be automatically pushed and applied. This is good news, as those who have opted not to run iOS 26 will get to take advantage of several sets of security releases.

Now the bad news: This is happening because of some really bad security breaches like DarkSword and Coruna. As Apple noted in a security update:

We enabled the availability of iOS 18.7.7 for more devices on April 1, 2026, so users with Automatic Updates turned on can automatically receive important security protections from web attacks called DarkSword. The fixes associated with the DarkSword exploit first shipped in 2025.

Now, to be clear, security patches on an older operating system are not as effective as they are on an entirely new system, since a new OS like iOS 26 has all sorts of structural changes made for security reasons. As a new Apple security note says, iOS 26 “contains the strongest security protections.” If you’re very concerned about your iPhone being secure, updating to iOS 26 is going to make it more secure than updating to 18.7.7.

But this does mean that Apple’s patches, which seek to break the chain of bugs that led to serious security exploits, are available to many more people.

Bottom line: If you’re an iOS 26 holdout, and you’re not ready to update your iPhone, at the very least you should update to 18.7.7 and protect yourself from some seriously ugly malicious software.

(Podcast) Clockwise 650: The Mildest Guest Lineup in the World

2026-04-02 04:26:49

In this April 1st edition of the show, Philip Michaels returns to steal the show from Dan and Mikah (and Jason!) and force them to compete for points for their punditry.

Go to the podcast page.

Apple at 50: It Will, It Will Be Glorious

2026-04-02 03:17:40

A man poses next to a vintage computer with a green Matrix-style screen, a PlayStation controller, and a Pikachu figurine on top. The setup is on a wooden desk against a speckled wall.
The author, slightly more than half of Apple’s lifetime ago.

A 50th anniversary is a good time to reflect on your relationships, and it seems lots of people have thoughts about their time with Apple today. I would definitely not be where I am in life without the company, for both good and bad, so here are mine.

Technically, my days with Apple started by playing games on my next-door neighbor’s Apple II in the late 70s or early 80s. When enough time has passed, the exact memories naturally become a little bit fuzzy. It was certainly before I got my own Commodore 64 in 1983, I know that much, but I don’t think I can exactly claim to have been there from the very beginning. Anyway, little did I know back then that I would actually get to house sit for the guy who designed the thing. Foreshadowing. 

My best friend’s dad was a university professor from California, and he had brought over an Apple II of some flavor. I don’t remember them being common over here otherwise—the UK had a weird home computer industry all of its own, but this was probably just the perspective of a little kid who only wanted to play video games.

I eventually graduated from my C64 to an Atari STe around 1989, which had many better games than a Mac, and built-in MIDI ports as well. It was also way cheaper than a Mac, and it was totally fine. There was a GUI and a mouse, and those are all the same anyway, right?

Then, just a year later, I started a degree in Computing Science at the University of Glasgow, and back then all the computers in the labs were Macs. Generally, Mac Pluses or SE/30s, with the occasional brand new LC in the second-year labs. And so I used them, and I realized quite quickly that Atari had completely ripped off the Mac GUI, and not exactly done an amazing job of doing so. 

I’ve had this experience two or three times in my life with technology, using something and realizing that it’s an inflection point for everything else going forward. The first was those early days with the Mac. Okay, so I was six years late to the party, so you are entirely right to question my definition of “early days.” Still, the user interface was so well designed and thought out, and it just made sense to me in a way that no computer had really done before. System 7 came out shortly afterward and improved everything even more.

At this point, we’d been doing most of the development work for our coursework in THINK Pascal, and I quickly realized I could use that to make my own applications. This history has been covered well, but I wrote the first version of my calculator PCalc in 1992 on my brand new Mac Classic. (Sorry, Atari.) I bought an LC II some time later, probably a month before the LC III was announced. I even fitted it with a maths co-processor! I started working on my application launcher DragThing in 1994.

Again, this is well documented, but I was soon determined to work for Apple. And a few years later, I got my wish, working in Apple’s software engineering group in Cork, Ireland. It was a lot easier to get a job with Apple in late 1996 than it is today, but my existing apps certainly helped me get a foot in the door. However, as I discovered after joining the company and moving to a different country, Apple was actually on the verge of complete bankruptcy.

I’m told that, like with having kids, you block out a lot of the difficult times of your life, and generally remember the good bits. Well, I remember a hell of a lot of bad stuff from those years, so who knows how bad it really was. 

Gil Amelio appeared and fired so many people across the company that soon our little engineering group all fit around the one table for lunch. It was an extremely stressful time to be at Apple, but also probably one of the most interesting – I got to witness the return of Steve Jobs first hand, after all. Another inflection point, really.

I worked on a bunch of things while I was there, including the only two things that actually shipped from my less than four years on that team – the Disney 101 Dalmatians and Hercules Print Studios that were bundled with Macintosh Performas. That was enjoyable and relatively low-stakes work. I learned how to program in C++, use a UI framework (Metrowerks PowerPlant), and generally work as part of a team. I was even co-team lead on the Hercules one. However, staring at pegasi all day meant that I did not see the film, and I still haven't to this day.

I was then placed on the iMac project somewhat unknowingly, and ultimately the Dock and Finder – the source of all my best Apple anecdotes. Then Steve Jobs happened, I resigned, etc etc. You know how this story goes, I assume. In any case, I met a lot of good people at Apple, some of whom I am still in touch with today. Companies do not care for you, but at least some people do.

In all, it was a relatively short time working there. I was not important in the least, and I did not really do anything of note. I worked on lots of cool stuff that didn’t ultimately ship, sure. Put it this way: I am unlikely to be an entry in any Apple history book. 

And I was so relieved when I left. I was 27, and I was young enough then that I didn’t really know how stressed I had been working in that environment. The weight off my shoulders was enormous, even if being an indie developer came with its own set of slightly different artisanal weights. You know how some people have stress dreams about doing exams? I still have stress dreams that I’m back working at Apple.

I did not part from my ex on the best of terms, but it has remained a big part of my life, and we kept uncomfortably meeting up at parties.

I rewrote PCalc again after I left, and through a random pressing of my business card into the hand of one Phil Schiller at a WWDC, it ended up getting bundled with the iMac G4s in the U.S. I probably made more money from that deal (and a weekend’s work to change the app into U.S. English) than I did from all my years of salary at Apple.

I am definitely still in Apple’s orbit, or perhaps just past their event horizon. I am forgetting many things now, including Widgetgate and Lodsys.

And yes, I also ended up getting to know Woz, and stayed with him for many years in a row during WWDC time. We’re still in touch occasionally to this day. It is absolutely wild to me that I know one of the founders of Apple, who basically invented the personal computer. I got to chat to Douglas Adams because of Apple as well – he used DragThing, and I added several features to it just because he asked. Frontier scripting? Absolutely, Mr. Adams, right away, sir.

I’ve also known Jason Snell for something like 32 years at this point, since he was a youth at MacUser, and nowadays have the pleasure of doing podcasts with him at The Incomparable and Relay. So many good friends in my life have happened because of Apple, directly or indirectly.

I’m also, I will admit, doing reasonably well because of them. Then again, Apple is doing pretty well because of me. If I add up the 30% or 15% cut Apple has taken of all the sales of PCalc in the App Store, I’ve probably easily paid back my entire Apple salary and all the PCalc licensing fees. But the Apple stock I got in the late 90s is worth a little bit more these days, too, so ultimately I can’t really complain.

Whenever I purchase a new Mac with the money I have made from selling things on the App Store, it does at least make me think how ridiculously circular these things are. A disturbing amount of my lifespan has consisted of moving money slowly back and forth between Apple and me, whether I’m working for them or not. I think I’m currently ahead, but who knows what the future holds. I do sometimes wonder if I never actually stopped working for Apple.

Anyway, DragThing lasted nearly half an Apple, at 25 years. PCalc is still doing well, some 34 years later (I just need to hold on for another eight.) Dice by PCalc is a recent addition, based on my return to playing D&D, but it at least constantly amuses me. I suspect I will still be doing this long after I retire.

So here’s to the next 50, Apple. I do still miss you sometimes.

Apple at 50: From Rebel to Empire?

2026-04-02 01:06:21

As Apple hits its half-century milestone, it seems like we’re all of us waxing a bit rhapsodic about the company, its products, and their effects on our lives. So who am I to skip out on a trip down memory lane?1

Thirteen-year-old Dan sitting at a Macintosh LC with a book open on his lap.
Portrait of the author as a young man.

Weirdly, I was born almost perfectly in between the founding of Apple on April 1, 1976, and the release of the first Macintosh on January 24, 1984. But the former was only one of two events that occurred around that time that would go on to have a profound impact on my life. Because just over a year after Apple was founded, on May 25, 1977, came the release of the original Star Wars.

Oddly, those two events are intertwined at various points, not only with my life, but with each other. That’s true both in time and in space, where ultimately, these two influences would effectively bracket the San Francisco Bay Area, with Lucas’s Skywalker Ranch just north of the city and Cupertino to its south.

And the connection extends even further—the interplay between the rise of computer technology and its effect on modern moviemaking. John Knoll, the co-creator of Photoshop, would go on to work for Lucas’s groundbreaking visual effects firm, Industrial Light and Magic. A group within Lucasfilm would later evolve, with funding from Steve Jobs, into the animation studio Pixar (which, along with Lucasfilm, would be eventually acquired by Disney). I definitely had a wallpaper on my Mac in college photoshopped with Steve Jobs and George Lucas in it—what can I say, I know who I am.2

There are thematic ties, too. I wasn’t the only Mac fan amongst my friend group, but in the 1990s we were engaged in pitched battle with the behemoth that was Windows. It lent something to our identity, then—we were no less scrappy underdogs than the Rebel Alliance fighting back against the evil Empire.

(I can admit, from this later date, that I cast envious glances at my friends’ PCs, able to run games like TIE Fighter and Might and Magic, while I had to wait for those to come to my platform—if they ever did. As the years went on, I persevered, reading my monthly issues of Macworld cover to cover, devouring books like the Macintosh Bible and digging up weird shareware, as though I could keep the company going through my sheer persistence.)

For a large part of my childhood, both Apple and Star Wars struggled, falling upon hard times. After 1983’s Return of the Jedi, there were no more Star Wars movies. Meanwhile, Apple nearly tumbled into oblivion.

I vividly remember sitting in our kitchen one morning, listening to the news on the radio while my dad made his coffee, and hearing a dire story about Apple. My dad, knowing my enthusiasm for the company, asked if I thought it would survive—maybe the first time I felt like he’d ever asked me a real opinion on something happening in the world.

I won’t say that it had never occurred to me that it was possible Apple would cease to exist, but it was something I didn’t really have the tools to process. So, naturally, I assumed it would survive somehow, as unlikely as that seemed—as sure as there would be new Star Wars movies someday. The narrative’s stronger when you’re a kid, when you don’t really understand how the world works and your only real templates are stories.

Dave Filoni on stage with a Star Wars presentation at WWDC.
A talk by now-Lucasfilm president Dave Filoni at WWDC 2014.

So I closely followed all the developments of those dark times: the transition to the Power Macs, the attempts to create a modern successor to Mac OS, devouring every tidbit of information with no less fervor than I digested every new Star Wars novel. Any port in a storm.

And then in another close coincidence that is too strange for fiction, dual lights at the end of the tunnel: just as Steve Jobs returned to the company he’d founded, George Lucas announced that a new trilogy of Star Wars movies was on the way. It seemed that faith had been rewarded and hope was once again on the horizon.3

Staying foolish

My life has always been kind of a push and pull between these two influences—forces, if you will4—of technology and storytelling: Venn diagram circles with an overlap sometimes larger or smaller. As a teenager, I both wrote and distributed some really terrible shareware on local BBSes and, for several years, collaborated with one of my best friends to publish an online magazine for sci-fi and fantasy.5

In college, I majored in English because I loved writing stories, but almost all my work experience, starting in late high school, was in tech: a nascent web company, IT work at a university library during summers and vacations, teaching fellow students about technology at my college. Freshman year, I got a reputation as the English major who would fix all the computers of the engineers on our floor—even though I was only one of a handful who had brought a Mac to college amidst the sea of beige—or, increasingly, translucent blue plastic6—PCs.

Dan at 13 in a blue armchair reading Macworld magazine.
The Force is strong with this one?

Even after college, I worked in IT and web development while toiling away on my first novel. The first piece I ever had published was about Star Wars and it led to the conviction that I could get a job writing—and it just so happened that job was writing about Apple. The rest, as they say, is history.

Always in motion is the future

As this milestone has approached, I’ve wrestled with my own feelings about Apple. Last year, as I wrapped up my ten-year stretch as a columnist at Macworld, I wondered whether we should even be fans of a company. A year on, I feel even more confident in my conclusion that it’s probably unwise to allow your identity to be dictated in any small part by a for-profit corporation whose needs will not ultimately be aligned with yours.

Frankly, it’s a conversation I’ve had to have about Star Wars over the years—more than once.

The truth is I still view myself as an enthusiast of Apple and of Star Wars, even today. Without the former, I wouldn’t be here talking to you. I’m not sure I could have devoted this many years of my life to writing and talking about something for which I don’t have strong feelings. And without the latter, I don’t think I would constantly be writing stories that try to capture the way Star Wars enthralled me as a kid.

Dan with a stormtrooper at WWDC.
Hopefully this stormtrooper at WWDC 2014 wasn’t an omen.

But being an enthusiast certainly doesn’t mean being uncritical—honestly, none are so critical as those who view themselves as the true enthusiasts. Amidst the recent years’ resurgence of both Star Wars and Apple, there’s been no end of criticism—some certainly less well-founded than others—from those who profess themselves the most ardent enthusiasts.

However, if I can trot out another old trope, you either die the hero or live long enough to see yourself become the villain. That’s the knife edge Apple is poised at now; some might argue that it’s too late, that Apple has already tipped itself over onto the side of full-blown villainy.

But maybe there’s one more lesson to take away from Star Wars here: even Darth Vader managed to redeem himself in the end. You don’t have to be the scrappy underdog to make the right decision. It’s never too late to hoist the pirate flag and think different.


  1. Although, have you seen RAM prices? Memory lane is pretty expensive real estate these days… 
  2. I assume the two of them must have met at some point, but I’m frankly shocked that I can’t find any direct evidence of it. As far as I can tell, not a single photo of the two of them together exists. And isn’t that suspic—no, no it’s not. 
  3. Unfortunately, sometimes the light at the tunnel is a Death Star superlaser firing. 
  4. AND EVEN IF YOU WON’T. 
  5. Spurred on, in large part, because West End Games wouldn’t accept my submission for the Star Wars Adventure Journal since I was too young. 
  6. The year was 1998, after all. 

Fifty years on, Apple still controls its own destiny

2026-04-02 00:00:07

Vintage Apple II computer with a beige monitor, keyboard, and floppy disk drive in a glass display case.
Museum piece. Photo: Alejandro Linares Garcia, CC BY-SA 3.0.

I am usually so focused on Apple’s present and future that I don’t spend a lot of time ruminating about its past. And yet, as its 50th birthday has approached, it’s been impossible not to think Big Thoughts about the Big Picture.

So here’s one: Apple has been remarkably consistent — across 50 years and numerous CEOs and the vast sweep of late-20th- and early-21st-century history — in a few key areas. The people change (except Chris Espinosa!), but some of the ideas have managed to stay the same. And I think that’s meaningful.

Here’s what it boils down to: Apple is a company that chooses to build the whole product, while controlling its own destiny. That was true in the 1970s, it’s still true today, and it’s perhaps the company’s definitive trait.

In the olden days…

The early personal computer market was a hodgepodge. Different companies rose and fell, all offering different devices that were essentially self-contained and proprietary—compatibility across devices was almost nonexistent. Even programs written in the same language might not run across different systems, since each system might implement the language differently.

During those days, Apple was playing the game that pretty much everyone else does. Sure, there were some computers using the standardized CP/M operating system—you could install a card on an Apple II to let it run CP/M, even!—but mostly you got what you got when you bought the box. Apple IIs ran Apple stuff, TRS-80s ran TRS-80 stuff, the Atari 400 ran Atari stuff, Commodore PETs ran Commodore stuff… that was it.

But in the early 80s, almost the entire computer industry got flattened, and the reason was the IBM PC. Not that IBM did the flattening itself, but it had that effect: Since the IBM PC had been created using standard computer parts in order to get it out quickly, it became relatively easy for any other company to build equivalents. Its operating system was not actually owned by IBM, but was created by an upstart software company called Microsoft.

What happened next changed the entire computer market: Dozens of companies began making IBM PC compatible computers running MS-DOS from Microsoft. The generic Microsoft/Intel PC was born, and almost every other competitor was ruined. Atari and Commodore hung on for a while, but by the early ’90s, there were pretty much only two kinds of personal computers anyone would seriously consider buying: IBM PC compatibles running Microsoft software, or the Mac.

That was it. The rest of the market had capitulated. Only Apple hung on. And as someone who started writing about Apple during that time, I can tell you that nobody expected Apple to make it. Analysts either wrote that Apple should become like the other PC makers and just license Microsoft Windows, or that Apple should become like Microsoft and just license Mac OS to PC makers. Those were the choices.

Apple, to its immense credit, stayed true to itself. (Let’s not mention that brief dalliance with Mac clones.)

The whole widget

A man in a dark sweater sits at a desk with a blue plush toy, a white mug, and a computer. Papers and a red box are nearby. He appears thoughtful, resting his chin on his hand.
Portrait of the author as a college editor. Super Grover’s crimes are redacted.

To me, this is the core of what Apple is as a company: It makes the whole product. It is not a licensee adding value, like so many of its competitors. This is an attitude that started with Woz designing the hardware and software to work together, leaving a deep impression on Steve Jobs. That impression combined with Jobs’s innate focus on creating a complete product (in an era where most computers were still sold as assemble-it-yourself “kits”) and created an enduring legacy.

People often call Apple’s obsession with owning and controlling the primary technologies behind its products the Cook Doctrine, after current CEO Tim Cook, but that’s a value that goes back to Steve Jobs. Among the more modern examples of this approach:

  • Safari came to be because, as the Web rose to prominence, the Mac was increasingly judged based on its performance at Web browsing, and the default Mac browser was Microsoft’s Internet Explorer. Microsoft’s allocation of Mac development resources helped determine the success of Apple’s key product. That was a no-go.
  • iWork (Pages, Numbers, and Keynote) exists because it means that every Mac, iPhone, and iPad can work with Microsoft Office apps and documents right out of the box, without any extra purchase required. In releasing its own productivity suite, Apple provided instant Office compatibility and no longer needed to rely on Microsoft to do the right thing with its Mac software releases.

  • Apple silicon itself is Apple’s reaction to being held hostage by the long-term plans of chip suppliers who didn’t have Apple’s interests at heart. Every Intel chip that appeared in a Mac came from an Intel road map that was built based on the overall needs of the computer market, of which Apple was a tiny part. Every Apple silicon chip in a Mac comes from Apple’s own product road map, and the chip improvements are based entirely on Apple’s needs and synchronized with Apple’s software-development road map.

  • The C1/C1X chips that serve as the cellular connection in the iPhone 16e, iPhone 17e, iPhone Air, M4 iPad Air, and M5 iPad Pro—and will eventually power every new Apple device with cellular connectivity—are a reaction to Apple’s frustration with the dominant cellular radio provider, Qualcomm. Apple can now tune its own cellular chips to its own specific needs rather than relying on the parts Qualcomm builds for the entire market.

(Are AI models a primary technology? Who knows. Apple tried to build some, failed, and has decided to pivot to use Google’s AI models… for now. But if Apple ever feels that it absolutely has to have its own AI models running on its devices and in its data centers, I have no doubt that it will spend whatever it costs to make that happen. It’s just in the company’s DNA.)

You may have your own favorite examples of Apple going its own way, and counter-examples of Apple going with the crowd. Certainly, Apple has chosen to pick its battles. The G3 iMac, for example, dumped all the proprietary connectivity that Macs used to have, and just supported the industry-standard USB. Compatibility can be valuable to Apple, to a point. But beyond that point, the company knows it must go it alone—or it’ll end up being just another face in the crowd.

Over 50 years, that’s one thing that has remained true about Apple: You never forget that you’re using an Apple product. It doesn’t do generic—not in 1976, and not in 2026.