Six Colors

Six Colors provides daily coverage of Apple, other technology companies, and the intersection of technology and culture.

Apple at 50: From rebel to empire?

2026-04-02 01:06:21

As Apple hits its half-century milestone, it seems like we’re all of us waxing a bit rhapsodic about the company, its products, and their effects on our lives. So who am I to skip out on a trip down memory lane?1

Thirteen-year-old Dan sitting at a Macintosh LC with a book open on his lap.
Portrait of the author as a young man.

Weirdly, I was born almost perfectly in between the founding of Apple on April 1, 1976, and the release of the first Macintosh on January 24, 1984. But the former was only one of two events that occurred around that time that would go on to have a profound impact on my life. Because just over a year after Apple was founded, on May 25, 1977, came the release of the original Star Wars.

Oddly, those two events are intertwined at various points, not only with my life, but with each other. That’s true both in time and in space, where ultimately, these two influences would effectively bracket the San Francisco Bay Area, with Lucas’s Skywalker Ranch just north of the city and Cupertino to its south.

And the connection extends even further—the interplay between the rise of computer technology and its effect on modern moviemaking. John Knoll, the co-creator of Photoshop, worked for Lucas’s groundbreaking visual effects firm, Industrial Light and Magic. A group within Lucasfilm would later evolve, with funding from Steve Jobs, into the animation studio Pixar (which, along with Lucasfilm, would eventually be acquired by Disney). I definitely had a wallpaper on my Mac in college photoshopped with Steve Jobs and George Lucas in it—what can I say, I know who I am.2

There are thematic ties, too. I wasn’t the only Mac fan amongst my friend group, but in the 1990s we were engaged in pitched battle with the behemoth that was Windows. It lent something to our identity, then—we were no less scrappy underdogs than the Rebel Alliance fighting back against the evil Empire.

(I can admit, from this later date, that I cast envious glances at my friends’ PCs, able to run games like TIE Fighter and Might and Magic, while I had to wait for those to come to my platform—if they ever did. As the years went on, I persevered, reading my monthly issues of Macworld cover to cover, devouring books like the Macintosh Bible and digging up weird shareware, as though I could keep the company going through my sheer persistence.)

For a large part of my childhood, both Apple and Star Wars struggled, falling upon hard times. After 1983’s Return of the Jedi, there were no more Star Wars movies. Meanwhile, Apple nearly tumbled into oblivion.

I vividly remember sitting in our kitchen one morning, listening to the news on the radio while my dad made his coffee, and hearing a dire story about Apple. My dad, knowing my enthusiasm for the company, asked if I thought it would survive—maybe the first time I felt like he’d ever asked me a real opinion on something happening in the world.

I won’t say that it had never occurred to me that it was possible Apple would cease to exist, but it was something I didn’t really have the tools to process. So, naturally, I assumed it would survive somehow, as unlikely as that seemed—as sure as there would be new Star Wars movies someday. The narrative’s stronger when you’re a kid, when you don’t really understand how the world works and your only real templates are stories.

Dave Filoni on stage with a Star Wars presentation at WWDC.
A talk by now-Lucasfilm president Dave Filoni at WWDC 2014.

So I closely followed all the developments of those dark times: the transition to the Power Macs, the attempts to create a modern successor to Mac OS, devouring every tidbit of information with no less fervor than I digested every new Star Wars novel. Any port in a storm.

And then, in another close coincidence too strange for fiction, came dual lights at the end of the tunnel: just as Steve Jobs returned to the company he’d founded, George Lucas announced that a new trilogy of Star Wars movies was in the works. It seemed that faith had been rewarded, and hope was once again on the horizon.3

Staying foolish

My life has always been kind of a push and pull between these two influences—forces, if you will4—of technology and storytelling: Venn diagram circles with an overlap sometimes larger or smaller. As a teenager, I both wrote and distributed some really terrible shareware on local BBSes and, for several years, collaborated with one of my best friends to publish an online magazine for sci-fi and fantasy.5

In college, I majored in English because I loved writing stories, but almost all my work experience, starting in late high school, was in tech: a nascent web company, IT work at a university library during summers and vacations, teaching fellow students about technology at my college. Freshman year, I got a reputation as the English major who would fix all the computers of the engineers on our floor—even though I was only one of a handful who had brought a Mac to college amidst the sea of beige—or, increasingly, translucent blue plastic6—PCs.

Dan at 13 in a blue armchair reading Macworld magazine.
The Force is strong with this one?

Even after college, I worked in IT and web development while toiling away on my first novel. The first piece I ever had published was about Star Wars, and it led to the conviction that I could get a job writing—and it just so happened that job was writing about Apple. The rest, as they say, is history.

Always in motion is the future

As this milestone has approached, I’ve wrestled with my own feelings about Apple. Last year, as I wrapped up my ten-year stretch as a columnist at Macworld, I wondered whether we should even be fans of a company. A year on, I feel even more confident in my conclusion that it’s probably unwise to allow your identity to be dictated in any part by a for-profit corporation whose needs will not ultimately be aligned with yours.

Frankly, it’s a conversation I’ve had to have about Star Wars over the years—more than once.

The truth is I still view myself as an enthusiast of Apple and of Star Wars, even today. Without the former, I wouldn’t be here talking to you. I’m not sure I could have devoted this many years of my life to writing and talking about something for which I don’t have strong feelings. And without the latter, I don’t think I would constantly be writing stories that try to capture the way Star Wars enthralled me as a kid.

Dan with a stormtrooper at WWDC.
Hopefully this stormtrooper at WWDC 2014 wasn’t an omen.

But being an enthusiast certainly doesn’t mean being uncritical—honestly, none are so critical as those who view themselves as the true enthusiasts. Amidst the recent years’ resurgence of both Star Wars and Apple, there’s been no end of criticism—some certainly less well-founded than others—from those who profess themselves the most ardent enthusiasts.

However, if I can trot out another old trope, you either die the hero or live long enough to see yourself become the villain. That’s the knife edge Apple is poised at now; some might argue that it’s too late, that Apple has already tipped itself over onto the side of full-blown villainy.

But maybe there’s one more lesson to take away from Star Wars here: even Darth Vader managed to redeem himself in the end. You don’t have to be the scrappy underdog to make the right decision. It’s never too late to hoist the pirate flag and think different.


  1. Although, have you seen RAM prices? Memory lane is pretty expensive real estate these days… 
  2. I assume the two of them must have met at some point, but I’m frankly shocked that I can’t find any direct evidence of it. As far as I can tell, not a single photo of the two of them together exists. And isn’t that suspic—no, no it’s not. 
  3. Unfortunately, sometimes the light at the tunnel is a Death Star superlaser firing. 
  4. AND EVEN IF YOU WON’T. 
  5. Spurred on, in large part, because West End Games wouldn’t accept my submission for the Star Wars Adventure Journal since I was too young. 
  6. The year was 1998, after all. 

50 years later, Apple still controls its destiny

2026-04-02 00:00:07

Vintage Apple II computer with a beige monitor, keyboard, and floppy disk drive in a glass display case.
Museum piece. Photo: Alejandro Linares Garcia, CC BY-SA 3.0.

I am usually so focused on Apple’s present and future that I don’t spend a lot of time ruminating about its past. And yet, as its 50th birthday has approached, it’s been impossible not to think Big Thoughts about the Big Picture.

So here’s one: Apple has been remarkably consistent — across 50 years and numerous CEOs and the vast sweep of late-20th- and early-21st-century history — in a few key areas. The people change (except Chris Espinosa!), but some of the ideas have managed to stay the same. And I think that’s meaningful.

Here’s what it boils down to: Apple is a company that chooses to build the whole product, while controlling its own destiny. That was true in the 1970s, it’s still true today, and it’s perhaps the company’s definitive trait.

In the olden days…

The early personal computer market was a hodgepodge. Different companies rose and fell, all offering different devices that were essentially self-contained and proprietary—compatibility across devices was almost nonexistent. Even programs written in the same language might not run across different systems, since each system might implement the language differently.

During those days, Apple was playing the game that pretty much everyone else does. Sure, there were some computers using the standardized CP/M operating system—you could install a card on an Apple II to let it run CP/M, even!—but mostly you got what you got when you bought the box. Apple IIs ran Apple stuff, TRS-80s ran TRS-80 stuff, the Atari 400 ran Atari stuff, Commodore PETs ran Commodore stuff… that was it.

But in the early 80s, almost the entire computer industry got flattened, and the reason was the IBM PC. Not that IBM did the flattening itself, but it had that effect: Since the IBM PC had been created using standard computer parts in order to get it out quickly, it became relatively easy for any other company to build equivalents. Its operating system was not actually owned by IBM, but was created by an upstart software company called Microsoft.

What happened next changed the entire computer market: Dozens of companies began making IBM PC compatible computers running MS-DOS from Microsoft. The generic Microsoft/Intel PC was born, and almost every other competitor was ruined. Atari and Commodore hung on for a while, but by the early ’90s, there were pretty much only two kinds of personal computers anyone would seriously consider buying: IBM PC compatibles running Microsoft software, or the Mac.

That was it. The rest of the market had capitulated. Only Apple hung on. And as someone who started writing about Apple during that time, I can tell you that nobody expected Apple to make it. Analysts either wrote that Apple should become like the other PC makers and just license Microsoft Windows, or that Apple should become like Microsoft and just license Mac OS to PC makers. Those were the choices.

Apple, to its immense credit, stayed true to itself. (Let’s not mention that brief dalliance with Mac clones.)

The whole widget

A man in a dark sweater sits at a desk with a blue plush toy, a white mug, and a computer. Papers and a red box are nearby. He appears thoughtful, resting his chin on his hand.
Portrait of the author as a college editor. Super Grover’s crimes are redacted.

To me, this is the core of what Apple is as a company: It makes the whole product. It is not a licensee adding value, like so many of its competitors. This is an attitude that started with Woz designing the hardware and software to work together, leaving a deep impression on Steve Jobs. That impression combined with Jobs’s innate focus on creating a complete product (in an era where most computers were still sold as assemble-it-yourself “kits”) and created an enduring legacy.

People often call Apple’s obsession with owning and controlling the primary technologies behind its products the Cook Doctrine, after current CEO Tim Cook, but that’s a value that goes back to Steve Jobs. Among the more modern examples of this approach:

  • Safari came to be because, as the Web rose to prominence, the Mac was increasingly judged based on its performance at Web browsing, and the default Mac browser was Microsoft’s Internet Explorer. Microsoft’s allocation of Mac development resources helped determine the success of Apple’s key product. That was a no-go.
  • iWork (Pages, Numbers, and Keynote) exists because it means that every Mac, iPhone, and iPad can work with Microsoft Office apps and documents right out of the box, without any extra purchase required. In releasing its own productivity suite, Apple provided instant Office compatibility and no longer needed to rely on Microsoft to do the right thing with its Mac software releases.

  • Apple silicon itself is Apple’s reaction to being held hostage by the long-term plans of chip suppliers who didn’t have Apple’s interests at heart. Every Intel chip that appeared in a Mac came from an Intel road map that was built based on the overall needs of the computer market, of which Apple was a tiny part. Every Apple silicon chip in a Mac comes from Apple’s own product road map, and the chip improvements are based entirely on Apple’s needs and synchronized with Apple’s software-development road map.

  • The C1/C1X chips that serve as the cellular connection in the iPhone 16e, iPhone 17e, iPhone Air, M4 iPad Air, and M5 iPad Pro—and that will eventually power every new Apple device with cellular connectivity—are a reaction to Apple’s frustration with the dominant cellular radio provider, Qualcomm. Apple can now tune its own cellular chips to its own specific needs rather than relying on the parts Qualcomm builds for the entire market.

(Are AI models a primary technology? Who knows. Apple tried to build some, failed, and has decided to pivot to use Google’s AI models… for now. But if Apple ever feels that it absolutely has to have its own AI models running on its devices and in its data centers, I have no doubt that it will spend whatever it costs to make that happen. It’s just in the company’s DNA.)

You may have your own favorite examples of Apple going its own way, and counter-examples of Apple going with the crowd. Certainly, Apple has chosen to pick its battles. The iMac G3, for example, dumped all the proprietary connectivity that Macs used to have, and just supported the industry-standard USB. Compatibility can be valuable to Apple, to a point. But beyond that point, the company knows it must go it alone—or it’ll end up being just another face in the crowd.

Over 50 years, that’s one thing that has remained true about Apple: You never forget that you’re using an Apple product. It doesn’t do generic—not in 1976, and not in 2026.

Apple at 50: My 10 most memorable moments

2026-04-01 22:00:08

A group of people sitting in rows, looking attentively to the right. They appear to be in a conference or lecture setting.
The author (far right) at a certain Apple event 25 years ago.

It’s Apple’s 50th anniversary — you might have read something about that lately. And I’ve been writing about the company for more than half of that time, roughly 27 years if my math is correct. Companies may last a good long while, particularly when they have a track record of great products, but the writers who report on them invariably crumble to dust.

Still, my bones haven’t entirely blown away in the lightest of breezes just yet, so I figured I would weigh in with a few insights gleaned from chronicling Cupertino’s comings and goings for half my existence on this planet. Honestly, I might as well get something out of the deal.

The challenge is, you’ve probably had your fill of listicles chronicling Apple’s Best Products of All Time or the Most Memorable TV Commercials or Steve Jobs’s Most Viral Moments or what have you. I know that I have. Besides, while I know my onions when it comes to Apple, my opinion on the most significant Apple product (the iPhone 3G) or the best commercial (the sage iMac G3 serenaded by Kermit the Frog, naturally) or the most memorable thing Steve Jobs ever said (“Just avoid holding it that way”) carries no more weight than anyone else’s. In fact, there are folks whose Apple knowledge is far more encyclopedic than my own who are better equipped to weigh in on all that.

But what I can do is empty out my reporter’s notebook, with some random stories, stray observations and items I’ve largely kept to myself over the last 27 years. With tech reporting seemingly done with me, there’s no reason to keep this stuff under my hat any longer.

The occasion may call for 50 of these — one for each year of Apple’s existence — but let’s be honest: you’d stop reading after around 17, and I’d be scraping the bottom of the tank long before we got to the last item or two. (“No. 33: Didja ever notice that Apple employed both a guy called Woz and a guy called Joz? That’s pretty weird, huh?”) So let’s stick with 10 random thoughts about Apple as the company celebrates its golden anniversary.

My Most Awkward Encounter with Apple

Back in 2001, I was handed an original iPod, not long after Apple’s press event to show off its new music player. It’s probably forgotten with time, but the MP3 players of that era weren’t very durable, and if you were foolhardy enough to take one on a run, you ran the risk of skips caused by mechanical shock. And heaven help you if you accidentally dropped one of those things.

The iPod was going to be different, Apple told us. Not only would Apple’s music player have more storage, it was going to be durable enough to survive real world use in a way that rival devices simply could not. So I decided to put that to the test, probably ill-advisedly.

I commissioned a more physically active colleague to go work out with that iPod in tow, along with one very specific instruction: be especially brutal with the device. “Let’s find out just what kind of a licking this thing can take,” I remember saying at the time.

It turns out the iPod was pretty durable, though not indestructible. We did manage to damage the device, but only after deliberately tossing it from a moving bicycle. Otherwise, for a 2001-era piece of tech, it withstood a fair amount of abuse before finally succumbing to our more violent impulses. I patted myself on the back for conceiving of a handy piece of consumer tech journalism that would give readers insight into just what they could expect from an iPod in terms of durability and went about my business without giving the story another thought.

At least until Apple asked us to return the iPod.

Companies don’t always do that, as they’re happy to leave review units in the hands of publications for use as reference devices when subsequent updates come along. But occasionally, you do get asked to return the equipment, Q-from-James-Bond-style, and this was one of the occasions. But I held out hope that Apple would agree that proving just how much punishment an iPod could take was enough of a service to more than make up for the non-operable loaner.

Apple did not agree. I don’t remember the poor soul who was tasked with explaining to Apple why their once-pristine iPod was coming back in such a decidedly scuffed-up state, but whoever it was made certain to let the company know the name of the dastard who so recklessly ordered the iPod beaten to a pulp. It would be many years before Apple ever trusted me with a loaner device again, and even on those occasions, the hand-off was made with decidedly sideways glances.

The part of the Apple campus I’ve never seen

I’m not a frequent visitor to worldwide Apple HQ, but I’ve been around the place a bit. I’ve even gone inside a building or two, though never uninvited, I hasten to add. I’ve had lunch at one of Apple’s on-campus cafeterias, and let me tell you, after also dining at Google’s campus, your tech industry workers are being fed very well.

I have not, however, been inside the Steve Jobs Theater, which seems odd since Apple has been holding events there for the better part of a decade. Part of that’s the nature of my role in covering Apple events — I’m usually coordinating coverage and editing people’s work, and it’s easier for me to do that watching the live stream from the comfort of my office.

The closest I’ve come was in 2017, the very first time in fact that the Steve Jobs Theater hosted any product launch. I was a late addition to the coverage team on hand to look at the iPhone 8 models and the new iPhone X, and as a consequence, I was directed to watch the event from an outdoor overflow area on a nearby TV. Which is how I normally cover such product launches, only without the 90-minute commute.

I don’t know what you remember about that 2017 event — the Apple Watch Series 3 maybe or the Apple TV 4K or one of the trio of aforementioned phones. For me, it’s the smell of fertilizer baking in the warm Bay Area sun on the freshly landscaped area surrounding the Steve Jobs Theater. On the bright side, at tech events for other companies, the smell of manure typically originates from the stage, so Apple has that going for it at least.

Watching an event on a TV outside of the closed doors where the products in question are actually being launched is hardly my most traumatic Apple press event experience, though. That’s a close tie between the iPhone 6s launch, held inside the kiln-like Bill Graham Civic Auditorium, and the 2014 Apple event where I covered the iPhone 6 and Apple Watch preview announcements only to be laid off from my job 24 hours later. Good times.

My favorite Apple launch event

Look, over the course of 27 years, Apple events are going to blend together, particularly when you’ve stopped attending them in person. Nevertheless, a few stand out, especially since I was in the room where it happened.

At my very first Macworld Expo in January 2000, Steve Jobs announced he was dropping the “i” from his iCEO title — basically, no longer an interim title, which seemed like a big deal at the time. I was also at the WWDC keynote where Apple held a funeral for Mac OS 9, marking the complete transition to OS X.

But c’mon — there’s only one logical choice here, and it’s the iPhone’s unveiling in 2007. Seeing Apple take the wraps off a completely new product is going to stick in the brain pan, especially since it’s one that’s subsequently stood the test of time. (Folks who were there for the Apple Vision Pro unveiling: I do not think time will be as kind to that moment.) Jobs’ pitch of a combination communication device/music player/mobile phone still resonates. Even AT&T’s Stan Sigman reading his contribution to the presentation off of index cards couldn’t dull the occasion.

My favorite Apple-inspired road trip

If you weren’t around for Apple’s pre-OS X era, it’s easy to forget what a significant shift it was away from the old Mac operating systems to the more modern design and capabilities of OS X — especially after previous efforts to update the OS went nowhere. (For us old timers, “Copland” is more than just a 1997 Sylvester Stallone vehicle or the misspelled last name of The Police’s drummer.) Apple had been working on a new OS for a while, and finally, in the fall of 2000, Mac users were going to get a chance to give it a try.

In fact, the public beta of Mac OS X was going to be revealed at that year’s Apple Expo in Paris, and I jokingly suggested to Macworld’s then-editor that it would be a hoot to send me to cover it.

“I don’t speak a lick of French,” I told him. “I don’t even have a passport. Wouldn’t it be hilarious to fly me over there and watch me flail my way through covering the event?”

“It would be hilarious,” the editor unexpectedly agreed. And that’s how I wound up getting an expedited passport, hopping on a flight to Paris and wandering about an indifferent metropolis without anything resembling a concrete game plan.

The turn-of-the-century tech boom was a hell of a time, kids.

Anyhow, I managed, covering both the OS X news and the surprise launch of the key lime iBook. That said, there was one moment of pure jet lag-induced panic that occurred moments before Steve Jobs stepped on stage to make his assorted announcements: What if, I thought, he delivers this entire speech in French, and I’ve come all this way only to not understand a blessed word he’s saying? Fortunately, whatever multilingual capabilities Apple’s CEO possessed were not on display that day, and I was able to fulfill my journalistic obligations.

My least favorite Apple keynote

Jason Snell and I used to have a running gag back in the days when print, not online, was king and we would reserve a sizable chunk of Macworld’s print edition for last-second coverage of all the Macworld Expo keynote announcements Apple was sure to make. But what would happen, we wondered, if Apple didn’t announce much of anything, leaving us with all those pages to fill and very little to write about?

Our joking Plan B: Run an article called “What Went Wrong?” featuring a picture of various Apple executives shrugging.

We came dangerously close to having to do that at the New York edition of Macworld Expo 2001, where Apple announced… well, some stuff. We got a recap of the recent Apple Store openings — hey, they were new at the time — and a lot of talk from developers showing off OS X native apps for the still-nascent operating system. The lone hardware announcement centered around new Power Mac G4 towers, punctuated by a lengthy discussion of what Apple called the “megahertz myth” to address differences in performance between Macs and PCs. Put another way, Apple’s big product announcement at that Expo was capped by an 8-minute deep dive on processor pipelines.

We managed to produce the necessary copy to fill those empty magazine pages that night. But it took some doing.

Apple event celebrity sightings

Attend enough Apple-hosted or -adjacent events, and you’re going to run into famous people. For example, if you walked the show floor of a Macworld Expo in San Francisco any time between 2000 and 2009 and didn’t see comedian Sinbad at some point, I’m guessing you were just popping into Moscone Center to use the restroom.

I’m notoriously bad at recognizing people, but even I can recount a couple celebrity encounters. Once, I waited in line to get in for an Expo keynote standing directly behind Adam Savage of Mythbusters fame. And during the iPhone 6s launch held in the hotbox that was the Bill Graham Civic Auditorium, I stood patiently waiting for a demo of one announcement or another — memory tells me it was gameplay on the Apple TV — when Charlie Rose big-footed his way in front of me and took my turn. Definitely the worst thing Charlie Rose has ever been accused of.

(Glances at Charlie Rose’s Wikipedia page.)

Oh. Um. Scratch that.

I’m told Gwen Stefani was at the 2014 iPhone launch, though I never ran into her or her apparently sizable entourage. But while U2 was busy surreptitiously downloading their Songs of Innocence album to the rest of your iPhones, they were also blowing out my ears at the same event.

Most awkward encounter with an Apple executive

Celebrity encounters are all well and good, but who’s a bigger name star than the men and women who run Apple? I don’t often rate face time with the higher-ups at the company, but there was one time where Tim Cook and I had the briefest of interactions. You will be surprised to learn it did not reflect well on me.

I was leaning against a wall in San Jose’s McEnery Convention Center, waiting for a colleague to wrap up a product briefing, when a gaggle of people strolled by, with Tim Cook at the center of the throng. For some reason, he looked over in my general direction at the same time I was watching him pass by, and that’s how I found myself in a staring contest with Apple’s CEO.

I don’t exactly have the friendliest appearance. My resting face makes it appear as if I’m trying to recall how you’ve wronged me, and if ever I try smiling, it looks like I’ve suddenly remembered. So I decided to offer some sort of gesture to convey a spirit of collegiality — I gave Tim Cook what I hoped passed for an amiable nod of acknowledgement. Judging by the mix of confusion and apprehension that flashed across his face, I don’t think I was entirely successful.

So, Tim Cook, if you’re reading this, and you’re still wondering why that glaring fellow nodded at you at that one WWDC many years ago, rest assured that there’s no ill will on my part.

My favorite portrayal of Apple in a movie

I saw 2013’s Jobs twice, which is probably two times more than anyone outside of Ashton Kutcher saw it. Both times were press screenings for a review I was commissioned to write about the movie. The first screening happened well before the movie’s release and Act Three of the picture felt so haphazard to me that I thought for sure that Jobs would be recut prior to arriving in theaters. Hence, the second screening right before the premiere, in which I discovered, nope, the movie was going to wind up exactly the same.

So Jobs isn’t my favorite picture about Apple, and I have to confess that the 2015 Steve Jobs biopic didn’t resonate with me either. No, for big-screen Apple thrills, I suggest turning to the small screen in the form of 1999’s Pirates of Silicon Valley, a made-for-TV movie starring Noah Wyle as Steve Jobs and Anthony Michael Hall as Bill Gates. (John DiMaggio — TV’s Bender — plays Steve Ballmer, and sadly, we do not get to hear “Developers, developers, developers” in the Bender voice.) Pirates of Silicon Valley isn’t the least bit accurate, but it’s a good character study that has something to say about ambition and our impulses to create.

If there’s a runner-up, I’d steer you toward Golden Dreams, a short video that used to run in the part of Disney’s California Adventure that now houses the Little Mermaid ride. There, you can look in as two seemingly random guys named Steve assemble a rudimentary computer while Whoopi Goldberg looks on, pointedly taking a bite out of an apple.

Goofiest Apple product of the last half-century

By this point, it’s probably clear that I find the off-beat aspects of a company’s history to be just as vital as the landmark hits that everyone talks about. I think we all should be serious about our work without being too serious about ourselves, so the things that are going to stand out to me about Apple’s first 50 years are going to reflect that. And occasionally, Apple has had some fun, too.

How else to explain the moment in 2004 when Steve Jobs — co-founder of the company, lauded visionary, subject of many a profile attesting to his business savvy — stood up in front of a packed house and introduced the world to iPod Socks? Jobs is fully committed to the bit, hailing the socks as a “revolutionary new product.” A hint of a smile flashes on his face as he tries to convince the world that, yes indeed, they need to swaddle their music players in brightly colored socks. “They keep your iPod warm,” Jobs insists, and you might for a moment feel like he actually means it.

We can talk about great Apple products and shake our heads at the few missteps. But life is about fun, and there’s no other way to describe iPod socks.

Most symbolic photo of my time covering Apple

Let’s end by circling back to the original iPod — the launch event, specifically. There’s a photo that makes the rounds in my circle of associates, pulled from the launch event video where the cameras have cut to the crowd. And there, you can clearly see Jason Snell watching as the iPod is unveiled. Seated next to him is Rick LePage, Macworld’s editor in chief at the time, and Jon Seff, another Macworld editor.

I’m there, too, though you wouldn’t know it from that shot. For a long time, I assumed I had been sitting next to Jason and had simply been cropped out of the photo. It’s a bit like that Nathan Fielder meme (“Out on the town having the time of my life with a bunch of friends. They’re all just out of frame, laughing too.”), only in reverse. Here, it’s just me who’s been cropped out of the shot, having the time of my life.

And that seemed like a fitting way to sum up my time covering Apple. The company announces something significant, and I’m right there, if only slightly out of the shot.

Of course, that’s actually not the case. In fact-checking this article, we discovered that I am not seated next to Jason, but rather in the row behind him. And yes, we have the photos to prove it.

A group of people sitting in rows.
Jonathan Seff, Rick LePage, Jason Snell, Kristina De Nike, and Philip Michaels, among others, at the iPod launch event in 2001.

So as it turns out, I’m not as peripheral to this Big Moment in Apple History as memory had dictated. Apple can still surprise us after 50 years, even those of us who think we’ve seen it all.

Another life changed by the Mac

2026-04-01 21:00:46

Vintage Apple Macintosh computer with a beige monitor displaying 'hello,' a keyboard, and a mouse on a white surface.

When I saw my friend Antony Johnston’s post on Six Colors, I instantly thought, “yeah, me too.” And as it happens, the very Mac model that changed Antony’s life put me on an entirely new road, too.

Just before I got my journalism degree in 1984, a professor named Jim Haynes sat me down and warned me that I would have more trouble finding a job than almost anyone in my class because I have low vision. I choose to believe that he meant it kindly, a warning to get ahead of any potential employers’ doubts, rather than as a pessimistic prediction about my future.

But he was right. My job search was painfully long, and I realized that at least part of the struggle had to do with the expectation that young communications specialists working for non-profits or government – a niche I thought I could play in – needed to physically paste up newsletters, brochures and other typeset publications. I’d already learned how unsuited I was for that during a college internship, what with the need to cut straight lines of galley copy and wield an X-acto knife on rubylith. I simply wasn’t equipped to do that sort of visual work.

Somewhere along the way, I went to an Apple demo of something called “desktop publishing.” With a Macintosh computer and a high-resolution printer called a LaserWriter, you could design, lay out and print a complete publication — no knives required. When I arrived for the demo, I was intrigued. By the time I left, I would have sold a kidney for a Mac-LaserWriter combo.

In my unemployed state, the only available source of funds was my parents. Ever the practical sort, they suggested that I learn more about what I now knew as DTP before they would be willing to hand over more than $6,000 for my pipe dream.

So I rented my first Mac (a 512Ke), a copy of PageMaker 1.2, and an external floppy drive. The guy I rented it from, Robert Jagitsch, would go on to found PowerLogix, a company that sold Mac processor accelerators, and I used to run into him at Macworld Expo in the 90s. But back then, his stock of Mac stuff for sale or rent appeared to live in the trunk of his car.

Without a LaserWriter, I couldn’t do much more than teach myself PageMaker. But my local AlphaGraphics offered laser prints for $1 a page. It didn’t take me long to realize I might be able to make desktop publishing work as a freelance business.

Pretty soon, my mom – who had given my sister a used VW Rabbit during college – agreed to fund a brand-new Mac Plus. It was my equivalent of a “welcome to adulthood” gift. I added PageMaker and a SuperMac DataFrame hard drive that cost an eyewatering $625 for 20 megabytes.

I launched the publishing business, creating everything from brochures to fancy reports for graduate students to newsletters for a city council member. AlphaGraphics was still my source for laser prints, but I quickly fell in with a group of interlocking businesses that offered scanning, full-service printing and access to Linotype typesetters that offered 1200 dpi output, versus the LaserWriter’s 300 dpi.

Eventually – four years out of college – I landed my first full-time professional job. With a Mac Plus on my desk, I edited and laid out monthly trade magazines for enthusiasts of supercomputers, DEC minicomputers and various UNIX systems. Despite a solid portfolio of published writing, I could never have talked my way into that gig without my Apple desktop publishing skills. Those years I spent at home cranking out newsletters had also made me a pretty good Mac system administrator and troubleshooter – skills that have followed me throughout my career.

The Vergecast: Apple at 50 ↦

2026-04-01 00:05:45

In addition to my two pieces on The Verge this week, I’m also on the Vergecast talking to David Pierce about Apple’s past, present and future:

On this episode of The Vergecast, we begin by stepping back a bit to ask a big question: How is Apple doing right now? Obviously, by many measures, Apple’s doing great — trillion-dollar company and whatnot — but this is a company that has long taken pride in building better software, better hardware, better everything, and doing it in a better and cooler and more responsible way. Jason Snell, a longtime chronicler of all things Apple, joins the show to do a modified version of the annual Six Colors report card about where Apple stands right now.

It was a great conversation, and nice to talk about where Apple is going, given all the history that I’ve been writing about for the last few weeks.


Between Jobs: The triumphs and failures of Apple without Steve Jobs (The Verge/Jason Snell)

2026-03-31 22:13:18

It’s a famous story on its way to becoming legendary: Apple cofounder Steve Jobs was pushed out of Apple in 1985, spent more than a decade in the wilderness, and then returned to Apple in 1997 to save it from bankruptcy and transform it into one of the world’s most valuable companies.

That’s true, so far as it goes, but this interregnum is too often simplified as when Apple CEO John Sculley got rid of Steve and ruined the company. And that’s really not true. Not only was the Jobs who was ejected from Apple completely unprepared to run the company (as his disastrous but educational years at NeXT would prove), but the Apple of this period had some real accomplishments.

From making necessary changes to the Mac to the creation of the PowerBook, Apple didn’t simply weather the 12 years without Jobs. The company made shifts, adaptations, and decisions that would become foundational to its future. Were there missteps? Most definitely. But ignoring Apple’s successes over those dozen years undermines the truer, deeper story of how Apple survived to become the behemoth it is today.

Continue reading on The Verge ↦