2026-01-23 21:43:09

Yesterday brought two bits of reporting about Apple that are seemingly unrelated, and yet also sort of related. The first was a report by Aaron Tilley and Wayne Ma for The Information, diving into how Apple SVP Craig Federighi was recently tasked with overseeing the company's AI efforts. Later came a report from Mark Gurman at Bloomberg noting how another Apple SVP, John Ternus, had been quietly tasked with overseeing the company's design efforts.
Over the past year, these have been the two areas of Apple under the most fire. And they're arguably the two most important areas of Apple going forward. Now being overseen by perhaps the two most important figures within Apple going forward...
Let's start with design. It has obviously been a focal point of the company since inception, as Steve Jobs famously pushed the mantra of design not just being about how something looks, but how it works. The attention to detail on the back of cabinets and circuit boards and all that. And while it seemed to take a backseat in the industry with the rise of Microsoft and the PC in the 1990s, Apple came roaring back to become the most valuable company on Earth thanks in no small part to this continued focus.
But in the past year or so, Apple has been under quite a bit of fire for some design choices, almost all of which are on the software side of the house. Most notably, with the transition to "Liquid Glass" as the overarching UI theme. To be fair, much of this criticism started after the person overseeing such efforts, Alan Dye, had announced his departure to Meta – a big win for the rival that Mark Zuckerberg clearly hates the most. But the design complaints certainly started before that as well. Anecdotally, I know quite a few people – people with no idea who runs design at Apple – who refuse to install the latest versions of the OSes because of the way they look. And there's data to suggest they're not alone.
That is not why Ternus is now overseeing the group. In fact, if anything, it may make it harder for him to oversee such things because he has no formal background in design beyond his work helping to steer it within the hardware group he has overseen (a group which, I should note, gets no such criticism). But he could be a fresh set of eyes to help right the ship, as it were. And it's not only a good challenge for him, it's a vital one, as he's the person believed to be next in line to succeed Tim Cook as CEO of Apple.
Granted, Cook had no such design talent or aspirations either. But he didn't need to – he had Jony Ive in place when he took over the reins from Steve Jobs. In many eyes – and certainly with hindsight – even Ive became problematic on the software side of design when he was thrown into that role (having also previously focused on hardware design) after Cook ousted Scott Forstall. Most people will know this as the time when Jobs' beloved skeuomorphism vanished in favor of flat. Apple's software design really hasn't been the same since.
Anyway, fast forward a few more years and Ive too was gone from Apple.1 Losing someone of his stature put the company in a tricky spot. How do you replace Jony Ive? The answer is that you don't. That led to design ultimately reporting up to Jeff Williams, Cook's COO. Though he also lacked a design background, Williams seemed like a good steward as a sort of jack-of-many-trades executive (while obviously mainly focused on supply chain, like Cook before him, he also oversaw Apple Watch, etc).
Fast forward a few more years and here we are at the present with Williams having just retired at the end of last year. With that move, design shifted, at least functionally, to reporting up to Cook. Two problems there. First, again, this is not Cook's world. But second, and more pressing, Cook is on the verge of retiring himself. Perhaps not imminently, but clearly soon. At which point design will have to be handed off yet again...
This is what led me to a sort of casual prediction/thought last July when the news first came out that Williams would be retiring:
Design is a far trickier topic. Once Williams actually retires to start 2026, that team will transition to noted design expert Tim Cook.
I kid, I kid. But really, it's weird. Then again, it also feels like they don't really have another option here. Though I might note that if Ternus really is in line to be the next CEO now – something that obviously seems more likely than ever with this news – perhaps it would be good to get him that experience? Perhaps especially while Williams is still there to oversee that transition?
Well, sure enough, as Gurman reports:
Apple Inc. has expanded the job of hardware chief John Ternus to include design work, solidifying his status as a leading contender to eventually succeed Chief Executive Officer Tim Cook.
Cook, who has led Apple since 2011 and turned 65 in November, quietly tapped Ternus to manage the company’s design teams at the end of last year, according to people with knowledge of the matter. That widens Ternus’ role to add one of the company’s most critical functions.
If that timetable is correct, it sounds like Williams handed off oversight of the design groups right to Ternus. And it makes sense for exactly the reason I suggested last year: it's one of the – if not the – most critical areas within Apple. And that area has floundered a bit in recent years because it has been passed around, in particular with awkward transitions as people left and there were no other real options. Better to try to set Ternus up for success here with an orderly transition.
And he'll have more time because, again, Cook's departure doesn't seem imminent. And per Gurman's reporting, design is still technically reporting up to Cook on paper. This also gives the company some cover in not tipping their hand that Ternus is the heir apparent, even though it's widely reported now. Certainly if Ternus were announced to be formally taking oversight of design, that would be the absolute indication he was the next CEO of Apple – and for whatever reason, Apple clearly isn't ready to do that. So instead we'll have to just go with the reporting. But yes, it's the most clear-cut sign yet that Ternus is the guy.
If it's not fully settled yet, this could be a final "test" of sorts for Ternus. Again, design is in a sort of state of disarray post-Dye. Can he stabilize that ship and set them back on the right path? And if he can't in the next few months, does it in some way encumber his path to CEO?

Honestly, probably not, but humor me. What if Craig Federighi is on a similar path with AI? It's interesting, given how forward-facing Federighi has been for years on stage at Apple's keynotes, and how vital he has been to the company, that he hasn't been the front-runner for the CEO job. The only obvious explanation may be that he simply doesn't want it. And if so, fair enough. But are there many people who rise to the level of SVP at large companies and don't ultimately aspire to the top job? And especially at Apple, one of the most powerful and important companies in the world?2
Federighi is older than Ternus – 58 versus 50 – but he's seemingly still in that "Goldilocks" zone for corporate CEOs: not too old, not too young. And given his aforementioned stage presence, he seems to be a young 58. Certainly he can give Apple another decade of work, if not more.
Despite the age difference, Ternus has actually been at Apple longer – about 25 years – versus Federighi, who first came over (with Steve Jobs) in 1996 with the NeXT acquisition. But he left in 1999 before coming back a decade later, and he's been at Apple ever since – 17 straight years, and about 20 in total.
Anyway, again, Ternus seems to be the chosen one, and it's hard to see that changing. And interestingly enough, in Gurman's latest report, he notes that Sabih Khan, Apple's new COO (though a 31-year Apple veteran himself) after Williams, is "the other internal CEO candidate" – no mention of Federighi, despite other recent reports still naming him as a would-be candidate.3
All I'm saying is that if Federighi still has any shot – and again, if he even wants any shot – he's getting the other, and arguably far more challenging and daunting task at Apple right now: fixing AI.
The Information report is full of tidbits, notably including Federighi's initial reluctance to lead such an effort for Apple, which seemingly led directly to John Giannandrea coming on board. And how Mike Rockwell, then overseeing early Vision Pro efforts, went to Federighi to implore him to use AI more within iOS – which Federighi apparently rejected, worried about the unpredictable and chaotic nature of the (still early) technology. That changed, of course, with ChatGPT, as has been widely cited by now. Once Federighi had his come-to-Jesus moment there, he seemingly had a fire lit under him, like Rockwell, to get Siri straightened out. And now, of course, the two are working hand-in-hand post-JG, with Rockwell leading the Siri efforts directly, reporting to Federighi.
But the report also notes that it was Federighi who ultimately made the call to outsource Apple's AI efforts – which likely sealed the fate of much of Apple's current AI team, including JG. And the resulting internal bake-off led to Google being crowned the winner. Now it's on him to make it actually all work. For real this time. And the stakes could not be any higher, any faster. The first revamp to Siri is due in a couple short months with iOS 26.4.
Meanwhile, Federighi is also overseeing the broader AI push across all of Apple's products – including the new 'HomePad' device, which has been delayed due to Apple's AI struggles – which should culminate, once again, with announcements at WWDC. Seemingly without many new features planned for iOS 27 (and the other new OSes), and with "Liquid Glass" likely in place for at least the next iteration – your move, Ternus – all eyes will be on AI. And not just the eyes of developers, but the public's, and yes, Wall Street's. Will we finally get the true "AiPhone" this Fall?
There's obviously not a lot of time, but if Federighi can pull this AI course correction off and show Apple to be on the right path... It will be a big mark in his favor, is all I'm saying. Yes, even if they're outsourcing much of the heavy-lifting to Google. It still took someone with, dare I say, courage, to make that call.
Plus the knowledge that it's a temporary fix until Apple's AI house can be truly put in order. And there's still very much a world in which Apple benefits from not being in the middle of the current LLM arms race, depending how things shake out over the next year or two...
So that's two tall tasks and two new taskmasters. Maybe it all has some bearing on the CEO job or maybe not. But both paths will certainly be key to where Apple goes from here. A true new era for the company.

1 Yes, there were a couple years there where Ive had stepped back into a less operational – or certainly less managerial – role and Cook was technically overseeing the design teams then too. ↩
2 And with Tim Cook likely stepping into the Chairman role eventually, the next CEO can hopefully avoid giving out golden trinkets to you-know-who... ↩
3 Khan could very well be the "backup plan", sort of the role Williams seemingly filled under Cook – someone who could step in at a moment's notice. Something Apple learned the hard way during Jobs' unfortunate illness. ↩
2026-01-22 21:04:40

AI wearables, so hot right now. Of course, they were also so hot two years ago when a rush of startups started to push such wares into the world. Yeah, that didn't go so well. Humane crashed and burned and the team is now running HP printers. Limitless found some limits after all and was forced to sell to Meta – not even a "hackquisition", for shame – with their product no more. Rabbit couldn't find its way out of the hat, and now seems to be more focused on AI software instead. Friend is far better known for their divisive marketing than their product. There were others, none achieving any real scale or success, of course...
2026-01-22 07:18:37

By my count, this is my fifth headline trying to make 'AiPhone' happen. I should probably stop. But tonight is not the night, dammit. Because there is once again hope. Hope that Apple has finally taken their collective head out of the sand. That they're going to rise to meet the moment that is AI.
There have been signs. After the complete and utter debacle of Apple's first attempt to launch the AiPhone in 2024, the embarrassment stretched into 2025 with almost nothing to show for it. Heads finally rolled, but that still wasn't enough because it still wasn't clear that, even with new people in place, Apple would be on the right path with AI. After all, many of the key players who had spent the past several years downplaying the importance of LLMs and NVIDIA chips, or belittling chatbots, were still in place.
Well, that finally changed a couple months ago and look at that: it now feels like Apple is being honest with themselves, at least. Siri wasn't just a few tweaks away, Siri needed a lobotomy. And after a new brain bake-off, they got one in the form of Gemini. Finally, a world-class back-end to power Siri. But she was still sort of stuck in 2016 mode. Basically a smart speaker version of AI. Well, now Siri may be on the verge of getting a new front-end to match the new back-end too, it seems.
As Mark Gurman reports for Bloomberg:
Apple Inc. plans to revamp Siri later this year by turning the digital assistant into the company’s first artificial intelligence chatbot, thrusting the iPhone maker into a generative AI race dominated by OpenAI and Google.
The chatbot — code-named Campos — will be embedded deeply into the iPhone, iPad and Mac operating systems and replace the current Siri interface, according to people familiar with the plan. Users will be able to summon the new service the same way they open Siri now, by speaking the “Siri” command or holding down the side button on their iPhone or iPad.
The keyword there, of course, is "chatbot". On the surface, that may not sound much different from current Siri, but again, instead of being the sort of one-off voice assistant, Siri should now be a full-on, LLM-powered AI.
The difference should be even more apparent because it sounds like we're still first going to get "old" Siri just with that new Gemini-powered brain in iOS 26.4 this Spring, before we get new Siri in iOS 27 this Fall. The first version will likely just be competent (finally), while the second version should actually be conversational. An actual AI product like we're used to from ChatGPT or yes, Gemini itself.
The previously promised, non-chatbot update to Siri — retaining the current interface — is planned for iOS 26.4, due in the coming months. The idea behind that upgrade is to add features unveiled in 2024, including the ability to analyze on-screen content and tap into personal data. It also will be better at searching the web.
The chatbot capabilities will come later in the year, according to the people, who asked not to be identified because the plans are private. The company aims to unveil that technology in June at its Worldwide Developers Conference and release it in September.
Campos, which will have both voice- and typing-based modes, will be the primary new addition to Apple’s upcoming operating systems. The company is integrating it into iOS 27 and iPadOS 27, both code-named Rave, as well as macOS 27, internally known as Fizz.
Gurman notes that this change may be the biggest revamp of iOS 27, after last year's big "Liquid Glass" UI changes. So it will be especially important to get this right. To take the iPhone and make it the AiPhone.
Internally, Apple is testing the chatbot technology as a standalone Siri app, similar to the ChatGPT and Gemini options available in the App Store. The company doesn’t plan to offer that version to customers, though. Instead, it will integrate the software across its operating systems, like the Siri of today.
Interesting that Siri still won't have an app in this new iOS 27 build. That may be a mistake, simply because a standalone app is how people – iPhone users in particular – know to interact with AI chatbots right now. Sure, you still want Siri running underneath, always on, as it were. But it would probably be helpful for users to have a hub – a Siri app – to load up. We'll see, I guess!
Like ChatGPT and Google Gemini, Apple’s chatbot will allow users to search the web for information, create content, generate images, summarize information and analyze uploaded files. It also will draw on personal data to complete tasks, being able to more easily locate specific files, songs, calendar events and text messages.
Unlike third-party chatbots running on Apple devices, the planned offering is designed to analyze open windows and on-screen content in order to take actions and suggest commands. It will also be able to control device features and settings, allowing it to make phone calls, set timers and launch the camera.
Yes, this all sounds great – if it actually works this time. It sounds like... the AiPhone.
One issue under discussion is how much the chatbot will be allowed to remember about its users. ChatGPT and other conversational AI tools can retain an extensive memory of past interactions, allowing them to draw on conversations and personal details when fulfilling requests. Apple is considering sharply limiting this capability in the interest of privacy.
Uh oh, this may be Apple falling back to their old ways of thinking again. We all understand why – and I've written about the potential AI memory "problems" in the past – but this may be another situation where the market has spoken. If all the other popular AI services have great memory features that remember a lot about you but Apple limits this, it could hurt the product.
Beyond that, Gurman goes into the weeds about the differences in foundation model terminology for the first new version of Siri we're going to see – seemingly in just a matter of weeks – versus the one due this Fall. One big thing of note:
In a potential policy shift for Apple, the two partners are discussing hosting the chatbot directly on Google servers running powerful chips known as TPUs, or tensor processing units. The more immediate Siri update, in contrast, will operate on Apple’s own Private Cloud Compute servers, which rely on high-end Mac chips for processing.
It's fairly well known that Apple already uses Google Cloud for some of their infrastructure – and TPUs for much of their AI work (as they still have that pesky NVIDIA aversion) – so this shouldn't be too shocking. Still, I'd be surprised if Apple doesn't tout this as running some variation of Private Cloud Compute in Google Cloud.
Then again, Apple is clearly building this new Siri to be more extensible. The early talking point will be that this lets them operate in China, which will obviously not allow Google models to power Chinese iPhones. But eventually, this will let Apple swap out Gemini for other AI brains as well – ideally, one day, their own. That will be the true AiPhone, I suppose. For now, this will have to do.
One more thing: is there a world in which Apple uses the big, new AI upgrade in iOS 27 to also rebrand Siri? It sounds like that's not in the cards right now, but it obviously also might not be the worst idea given Siri's past stumbles...

2026-01-20 23:09:19

When you think about the central piece of technology in your home, it's probably not the computer, or the tablet, or even the smartphone. It's the TV. This was true 50 years ago, and it's still true today. Arguably, it's more true today thanks to the rise of streaming services and video games and yes, increasingly even YouTube.
And so it's wild that despite this key focal point in everyone's lives, the market for those actual televisions, well, sucks. I'm reminded of this every single time I turn on my television set. Because every time I turn on my television set I see one of two things: either a pop-up asking me to install a software update or a main screen featuring a bunch of bloatware and ads.
It's just a shitty, shitty experience. But it wasn't always like this. When I was a kid in the 1980s and 1990s, TVs were morphing from massive boxes with tiny rounded screens into, well, even more massive boxes with increasingly flat, larger screens. This is when Sony was at the height of their power, and the "Trinitron" brand – amazingly started in the late 1960s – reigned supreme as the premium player in the market. I distinctly recall frequently going to electronics stores and lusting over the latest and greatest under that brand. It was sort of the Apple of the day, in a way.
And so the little kid in me is shocked by the news today that Sony is exiting the TV business. Technically, they're spinning the business off into a joint venture, but it's one TCL, the Chinese brand, will control. Wild. A true end of an era.
Granted, it has been a long time since Sony dominated in television. By revenue, they're now the number five player in the space, behind Samsung, TCL, LG, and Hisense. TCL was already the number two player, but when Sony is added to their mix, they'll be much closer to Samsung, by far the leader. It will also make for a top 5 that is all Korean or Chinese players (with Xiaomi taking over Sony's spot). Japan, which once utterly dominated television sets, has all but vanished from the market.

Meanwhile, America, which started the whole industry with RCA in the late 1930s, and completely dominated the first few decades, has basically only one player now, the super affordable (read: cheap) brand Vizio (now owned by Walmart, with the sets themselves obviously made in Asia). Of course, the US hasn't really been a major player since the 1960s, when Japan started to take over. Why? Well, it's complicated, but essentially RCA gave away the keys to the kingdom in exchange for a licensing fee and this started a race to the bottom. To the point where now, famously, it's a sort of crappy business to be in.
Despite the aforementioned growing importance of the technology in homes around the world over that time – and again, even to this day! – it's an extremely low-margin business. With margins continuing to get squeezed even further over time.
Which leads me, naturally, to Apple.
Apple is not in the TV business. Well, they are, but famously not in the television set business. This despite years and years of analyst predictions – namely from Gene Munster – that they were about to enter the space. To be fair to Munster, they were clearly thinking about and even working in that world. It's easy to forget, but the Apple TV set-top box was another key part of the keynote where Steve Jobs unveiled the iPhone! He famously labeled it as more of a "hobby" for Apple, but clearly also thought that eventually it could become a leg of Apple's "stool" – a key pillar of Apple's overall business. That never really happened, in part because Apple and Jobs were also clearly working behind the scenes on broader ambitions in television. To the point where some of Jobs' last recorded words were the famous "I finally cracked it" comment to his biographer Walter Isaacson, discussing the company's television ambitions.
Jobs' meaning there remains vague, in no small part because, all these years later, Apple still hasn't really cracked much of anything with regard to television. Sure, the set-top box has continued to evolve, but it's still largely the same product, just smaller and faster, as it was when Jobs introduced it. And yes, Apple is full-on making content now (both television and film), but that, along with the broader streaming service, now also confusingly named 'Apple TV', isn't a massive business for Apple, more just a part of the Services stool, as it were. Jobs and Apple undoubtedly wanted to unify all television content around a product they offered, just as they did with music around the iPod, but they failed time and time again to make the all-important iTunes equivalent for this market. That's not their fault as much as it's the other major players – most notably, Netflix – not wanting to play ball. And you can sort of understand why!
Still, this sucks for consumers. Not only because of the comical number of streaming services you must subscribe to, manage, and search between separately now, but also because this probably ensured Apple would not enter the actual television set space. Apple TV remained a set-top box, a hobby.
Now it's certainly possible that Apple would have never entered the TV set space given those aforementioned margins. Having great margins has long been a north star of Apple, and certainly in the Tim Cook era. Was he really going to move away from 40%+ margin products to television sets that are closer to 2% margins? Um, no. But Apple also would have undoubtedly done it differently. Yes, charging more, but also focusing on quality, not that race to the bottom.

Still, there's a long way between 2% and 40%. But even if Apple could figure out a way to do a premium television set with, say, 15%–20% margins, it still might be worth it for other reasons. Again, the technology remains the key focal point of the home. And as Apple starts to make their first full-on push into that home with a rumored new 'HomePad' device and perhaps security cameras and other peripherals, you could certainly make a case here.
Of course, I've been making such a case for a long time, noting years ago that it was a mistake for Apple to cancel the AirPort product which was many people's hub for internet in their home and could have evolved into more. And these days you could make a case that the iPad – and soon, perhaps the 'HomePad' – is sort of Apple's version of a television set, albeit a small one! For a big one, there's the Vision Pro! That device is clearly best suited as an entertainment consumption device right now, as their first foray into sports content makes clear, but it's also pretty much the opposite of a central piece of convening technology in the home. Maybe Apple finds a path to make it more social eventually, but we're years away from that.
Instead, we live in a world where everyone convenes around the television in their living room and that television, again, largely sucks. Sure, the actual technology powering the thing may be impressive – still a focal point of CES every year, obviously – but the experience gets worse every year. And it's because the aforementioned television set players want to turn their crappy margin hardware businesses into better margin services businesses. Or because they've outsourced the software to the likes of Google and/or Amazon, both of which are trying to use the television sets to grow their own services (including ads) businesses.
The result is just awful. I have an LG TV, a nice, big OLED one. It wasn't cheap, but the experience of using it feels cheap. Again, every time I turn it on, I get a massive pop-up asking me to update some software that I neither want nor need. This shows up even though I mainly control the TV through my Apple TV box and I really just want the LG TV to be a dumb screen. Just something to display pixels. But it fights against that, for the mentioned reasons.
And my last few TVs have been from LG because they run webOS – yes, the old Palm mobile operating system lives on in LG TVs! – which seemed better than what Samsung had to offer. Recently, an old Samsung TV we had in a secondary room broke and I opted to try a TV running Amazon's Fire OS. It's somehow even worse than what runs on Samsung TVs. It's so comically slow that I had to get another Apple TV box to bypass their software and turn it into a dumb screen.
I also recently bought a new Google TV set-top box – not to be confused with the half dozen other products that have been called "Google TV" over the years – to try it out, and it's a bit better, but still way too promotional for my tastes. And don't even get me started on Roku... At least that Telly ad-focused TV set thing is upfront about shoving ads in your face!
The situation just sucks. And it's getting worse.1
Given all the talk about tariffs and a return to homegrown American technology, you'd think this would be a focal point for the US government. We care so much about where the chips are made, but clearly less so about the technology central to everyone's living room, it seems. Honestly, I don't really care about American-made TV sets, I just care about the experience. And I would really only trust one company, which happens to be American, to care about that as well...
So come on Apple. I know the margins are brutal, but you're already making monitors that are insanely expensive versus the rest of the market. And those other monitors don't even have a bad user experience like the current television sets in market do. There's a window here to do something, in particular if you truly want to own the home. An Apple television set could be the main hub not just in the living room, but for all the technology in all the rooms in the home. And it could be the ultimate showcase for Services. And may even make it possible to one day unify the experience of all content.
Otherwise my next TV may have to be a Samsung set since Sony is now gone. Yuck.

1 Don't even get me started on the remote controls included with television sets. I've been writing on this topic for well over a decade... ↩
2026-01-19 21:16:44

This week was supposed to be the deadline for Warner Bros Discovery shareholders to tender their shares in support of Paramount Skydance's (hostile) bid for the company. Paramount clearly doesn't feel great about the proposal's chances (after losing a fast-track bid to get more information on Netflix's offer for Warner Bros), so they're likely pushing out the date (again). At the same time, Netflix reports their Q4 earnings tomorrow and, alongside that, they could formally switch their bid to be an all-cash offer, more closely matching Paramount's structure. The fight continues...
But in some ways, the fight is almost aside from the larger point. That is, Hollywood hates both of these deals. Because they don't want to see one of their iconic studios acquired. That is a foolish stance that would ultimately hasten the decline of Warner Bros and eventually lead to its demise.
Those are the real stakes here. Which everyone outside of Hollywood seems to realize, and yet few inside of Hollywood do.
I've made this point a number of times around this deal, but I felt the need to more clearly spell it out. Because every day brings a new headline or interview which makes it clear that Hollywood is living in a dream world. Regardless of who Warner Bros ultimately moves forward with, they're hoping for regulators to step in to block the deal. Again, it's silly. No, it's stupid.
If no deal happens and Warner Bros Discovery moves forward with splitting off their cable assets later this year, it will leave the studio extremely vulnerable. While the newly spun-out Discovery assets would assume a lot of the debt that lingers from the previous deals the company made – notably the tie-up with Discovery itself, unburdening previous parent AT&T – it would also take about half of the profit that the currently combined company makes. Warner Bros, the studio, is coming off a banner year. If next year is less "banner", there could be some real problems there...
But even if the studio and HBO Max keep performing, this remains a relatively small company that's going to have vultures circling to swoop in. And, in fact, an acquisition was the entire point of the spin-out. CEO David Zaslav wanted to make Warner Bros a more attractive target to deep-pocketed buyers. Knowing that, Paramount, also a relatively small company – far smaller than WBD, actually – pre-empted the spin-out. It was simply the only way they could compete for the company. But they undoubtedly didn't anticipate this level of interest from Netflix. Or that WBD would agree to a deal for just the studio ahead of the spin-out.
Anyway, the point is that someone is going to be buying Warner Bros. This was always the plan, and killing these deals just removes two would-be suitors – suitors who, by the way, actually have interest in making movies (and television). Without these two around the table, the remaining would-be acquirers are going to be either Big Tech or private equity. Is that really what Hollywood is hoping for here? A private equity deal that reeks of "synergies" and cost-cutting galore, to set the studio up to likely flip again down the road? A real Hollywood ending.
Of course not. But they're living in this fictional land where they think Warner Bros will continue as a stand-alone entity. That's not reality. That's not happening. So to beat the dead horse: if these deals don't happen, someone else buys Warner Bros. And it will likely be a buyer far less interested in the future of Hollywood...
But, but, but Hollywood will retort: Netflix is killing us! Are they? When I look at Netflix, I see a company that is always one step ahead of inevitable curves in media creation and consumption. Hollywood won't like that framing because filmmaking is an art form. I don't disagree, but it's also a business. That's really what this is all about. And it is a business that is always morphing. And it is an industry that is always filled with existential dread about those changes.
What Hollywood should be worried about is finding the right type of buyer for one of their major movie studios. And it sure feels like Netflix checks just about all the boxes to be the best possible buyer here. They're big, popular, profitable, and innovative. But this last bit scares Hollywood because again, they're worried that Netflix will turn all movies into streaming content. The dreaded 'C' word.
That's why Netflix CEO Ted Sarandos is on his charm offensive, making increasingly declarative statements that this will not happen. That they're committed to making Warner Bros work as-is – including with theatrical releases. No one in Hollywood believes him, and that's almost reasonable given his previous statements and Netflix's previous stances. But as I predicted well before this deal, Netflix was always going to change their tune there. Again, they're both smart and innovative. This isn't a $400B media company – twice the market cap of Disney, by the way – on their way to becoming the first $1T media company because they don't know what they're doing.
You cannot say the same of every other entity that has bought a movie studio.
And that's the thing: this is not some new existential threat for Hollywood. The studios started consolidating or getting scooped up almost a century ago. To the point where only Disney remains a stand-alone entity – and that's largely because they've created (or bought) other businesses to augment (and support) the studio. Many of the studios have changed hands, usually amongst conglomerates, multiple times.
With that in mind, it's rather insane that Hollywood opposes a deal here.
Is Netflix (or Paramount, for that matter) going to continue doing things exactly as Warner Bros has in the past? Undoubtedly not, but other parents of Warner Bros have changed strategies multiple times as well. With Netflix, there's at least a shot at some innovative new approaches and strategies.
This past weekend, I watched The Rip, the new Matt Damon/Ben Affleck movie that debuted on Netflix. Yes, on the streaming service, not in a movie theater. But that was undoubtedly the right call because the movie would not have been a huge hit at the box office. It may have done okay on the back of its stars, but it was never going to be a blockbuster. It's just not the type of movie that fits that mold these days.
Hollywood hates that too, but again, it's reality. And it has long been the reality of the theatrical business. It's a business mostly driven by the big-budget giant tentpole movies and augmented by some breakout hits with far more modest budgets. The Rip is neither. With a budget reported to be around $100M, it would need to make around $200M at the box office just to break even (after the theaters take their cut).
That was not going to happen.
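As a quick sanity check on that break-even math, here's a minimal sketch. Note the 50% studio share is the common industry rule of thumb, not a figure stated in the piece, and marketing spend (which would push the bar even higher) is ignored:

```python
# Rule-of-thumb break-even: theaters keep roughly half of the gross,
# so a studio needs about 2x the production budget at the box office
# just for its cut to cover the budget.
def breakeven_gross(budget: float, studio_share: float = 0.5) -> float:
    """Box-office gross needed for the studio's cut to equal the budget."""
    return budget / studio_share

# A reported ~$100M budget implies ~$200M at the box office to break even.
print(f"${breakeven_gross(100_000_000):,.0f}")  # prints "$200,000,000"
```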
But on Netflix, a movie like this can be a hit. It's a movie with known stars that the service can serve up to those members who are most likely interested in such fare. And, in fact, for the first time, Netflix is financially aligning a movie's compensation with it becoming a hit. Damon's and Affleck's Artists Equity production company got Netflix to agree to pay a bonus to everyone working on the movie if it hits for the service.1 We don't know the exact metrics and terms, but it's a big deal for Netflix because historically, they've not done any sort of "backend" deal – in part because those have traditionally been based around box office results, which again, Netflix hasn't had in any meaningful way. That probably changes with the Warner Bros deal, but even this deal slightly changes the stakes and structure.
And that should lead to more movies on Netflix taking chances – and hopefully to better content.2 Which, yes, is subjective, but middling quality has long been hung around Netflix's neck with their films in particular.
Now, the counter to the above would be the other bit of news coming out of Damon/Affleck with regard to Netflix: that the studio pushed them to make structural changes to The Rip meant to hook viewers early (i.e. not burying the big action in the second and third acts), but also to keep reiterating the plot in dialogue for the viewers who may be watching while on their phones.
I'll pause while Hollywood collectively pukes in their popcorn here.
I tend to think the emphasis should be on making movies (or shows) that are gripping enough to rip people off of their phones – a point which Affleck makes as well in their interview with Joe Rogan, where the above idea is discussed.3 But this is also about the current reality: a lot of people who watched The Rip this weekend did so while on their phones. Netflix is simply aiming to play the game on the field, as it were...
Anyway, I'm too in the weeds now. The point is that Netflix is the one company best poised to succeed in the future of Hollywood. At least right now. And the fact that they're trying to buy one of the major studios should be viewed as a very good thing for the industry, not a bad thing, let alone the end of the world. Because the alternative is probably private equity buying the studio if one of these deals doesn't happen.
Good luck with that, Hollywood.





1 Coincidentally, one of the founding partners at Artists Equity is RedBird Capital – the PE firm run by Gerry Cardinale, which is also the key partner on Paramount Skydance's bid for Warner Bros Discovery... ↩
2 I thought The Rip was decent. A bit better than the typical bland "okay" movies on Netflix, but still not great. In particular, there are some real plot problems in the third act. To me, it would not have been worth the time and money to go see it in a theater, but it was a solid Netflix watch! ↩
3 Alongside Affleck's interesting points and thoughts about AI! ↩
2026-01-19 06:38:46

Wait a minute. You mean to tell me that Chuckie Sullivan might be Will Hunting when it comes to AI? I mean, can we just dive a bit into Ben Affleck's comments on The Joe Rogan Experience – I know, I know! – last week?
Rogan tees up the question around films potentially being written by AI – "it gets weird, right?" To which Matt Damon defers to Affleck as "an area of expertise for him." Okay, I'm intrigued, let's see what you got, Ben!
After a comparison to electricity – not the worst analogy, and one that others have used in talking points before,1 though not particularly useful here – we get to the goods:
What I see is if, for example, you try to get ChatGPT or Claude or Gemini to write you something, it's really shitty. And it's shitty because by its nature it goes to the mean, to the average. It's not reliable. I mean, I just can't even stand to see what it writes.
Rattling off the three key AI products at the moment? Check out the big brain on Ben! The point about writing to the "mean" is fine. It's too dismissive of the interesting things such tools do produce, but for the most part, as a writer, I don't disagree. The point about being "not reliable" – presumably he's referring to hallucinations here – seems less relevant to writing a movie script, but we'll allow it.
It's a useful tool if you're a writer and you're going, "What's the—I'm trying to set something up where somebody sends someone a letter but it's delayed two days..." and it can give you some examples of that. I actually don't think it's very likely that it's going to be able to write anything meaningful, and in particular that it's going to be making movies from whole cloth. Like Tilly Norwood. That's bullshit—I don't think that's going to happen.
Yes, it's a tool that can be useful. Agreed. I disagree about being unable to write "anything meaningful" but that's obviously subjective. But I do agree with his bigger notion about not being able to make "movies from whole cloth". I think technically it will happen – obviously – but I highly doubt they'll be things people want to watch beyond the novelty value. Also, I appreciate the passion.
I think it turns out the technology is not progressing in exactly the same way they presented it. Really what it is going to be is a tool. Just like visual effects. And yeah, it needs to have language around it. You need to protect your name and likeness. You can do that. You can watermark it. Those laws already exist. I can't sell your fucking picture for money—I can't. You can sue me. Period. I might have the ability to draw you, to make you in a very realistic way, but that's already against the law.
Mostly agree. Again, I think it's going to be one tool in the tool belt of future filmmakers, and I wish more would recognize this rather than fearing it outright. Overall, Affleck seemingly has the right framework here, though he's brushing aside the idea that some people might want to leverage AI to use their likeness. Maybe not the Ben Afflecks and Matt Damons – or Matthew McConaugheys – of the world, but those who are trying to break through and/or want to get paid. This will eventually include stars as well, just as has become the norm with television commercials – which used to be relegated to markets in Asia. There will be a sort of NIL marketplace for Hollywood.
And the unions—I think the guilds are going to manage this. It's like, "Okay, look, if this is a tool that actually helps us—for example, we don't have to go to the North Pole, right? We can shoot the scene here in our parkas and then make it appear very realistically as if we're in the North Pole. Save us a lot of money, a lot of time. We're going to focus on the performances and not be freezing our asses out there and running back inside." That's useful.
Just like Spencer Tracy and Katharine Hepburn used to be driving their car and there's wind blowing a painting behind them and it looked goofy. Now people use a lot of computer-generated stuff. Some of it is going to replace just that—instead of 500 guys in Singapore making $2 an hour to render all the graphics for a superhero movie, they're going to be able to do that a lot easier.
Yes, that is useful and pragmatic. Not just to save money – though that's obviously a huge part of it – but also to save time and perhaps do things with film that just aren't feasible. How much the unions have to do with that... we'll see. But again, I appreciate this rational stance in a time of irrational stances!
There's already laws and guild guidelines around how many union extras you have to use. But also, we've been tiling extras. There weren't a million orcs in Middle Earth, you know what I mean? In Invictus, there weren't all those people in the stadium. That's something we've been doing.
Yep. Good. Appeal to the nerds. Appeal to Matt Damon. Check. Check.
It kind of feels to me like the thing we were talking about earlier where there's a lot more fear because we have this sense of existential dread—it's going to wipe everything out.
But that actually runs counter, in my view, to what history seems to show, which is that a) adoption is slow. It's incremental...
Yes. Very good. Displacement of jobs will (unfortunately) happen, but it's not going to happen overnight and it's not going to happen across the board in Hollywood. And the new technology will ultimately lead to new jobs that currently don't exist. As Affleck notes, this is the same story as with all technological revolutions throughout history.
I think a lot of that rhetoric comes from people who are trying to justify valuations around companies where they go, "We're going to change everything in two years. There's going to be no more work."
Well, the reason they're saying that is because they need to ascribe a valuation for investment that can warrant the capex spend they're going to make on these data centers, with the argument that, "Oh, as soon as we do the next model, it's going to scale up, it's going to be three times as good."
"CapEx"? Be still my heart! Data center spend? This is an actor, right?
The point is a bit too dismissive/flippant about what the tech companies are trying to do. This isn't all some Ponzi scheme – though some of it undoubtedly is! – to try to trick people (mainly investors given what Affleck is referring to here), it's insanely fast-moving (and growing) technology, and everyone is trying to put themselves in the position to "win". From the perspective of most of these companies, at least given our current moment in time, it's rational. Unprecedented, but still rational.
Except that actually ChatGPT-5 is about 25 percent better than ChatGPT-4 and costs about four times as much in electricity and data. So that's plateauing. The early AI—the line went up very steeply and it's now sort of leveling off. I think it's because, yes, it'll get better, but it's going to be really expensive to get better.
And a lot of people were like, "Fuck this, we want ChatGPT-4." Because it turned out the vast majority of people who use AI are using it as companion bots to chat with at night and stuff. There's no work, there's no productivity, there's no value to it. I would also argue there's also not a lot of social value to getting people to focus on an AI friend who's telling you that you're great and listening to everything you say and being sycophantic. But that's sort of a side issue.
Technically, it's 'GPT-5' and 'GPT-4', which are models running in ChatGPT, the service, but we'll let it stand. I'm more impressed he knew the version numbers and the high-level complaints OpenAI faced in switching between those two models.
As for "the vast majority of people" using AI as companions, this feels a bit too headline-y. As does the "no productivity" and "no value" stuff. It's too dismissive. But he is generally correct in the notion that more compute (and more money) will be needed to keep scaling as the easy gains get picked off and returns taper.
For this particular purpose, the way I see the technology and what it's good at and what it's not—it's going to be good at filling in all the places that are expensive and burdensome and make it harder to do things. And it's always going to rely fundamentally on the human artistic aspects of it.
Okay, good, we're back on track. Agreed.
At this point, Joe Rogan jumps in:
Well, I think the more it becomes ubiquitous, the more people are going to appreciate real things that are made by real people.
Holy shit, I fully agree with Joe Rogan on something! I've written about exactly this notion a number of times now. He talks about tables and Claire Danes, okay sure, but yes! In a way, we're going to shift from caring more about the output, to caring more about the input. The time someone – a human – put into doing something.
One more thing: as some folks have pointed out in replies on social media, this isn't the first time Affleck has waxed poetic on this topic. I like "craftsman is knowing how to work, art is knowing when to stop" – a variation of a quote attributed to Toni Morrison (and others). But I also like his immediate and intimate knowledge of Succession!
Also, how about 20+ years ago when he laid out what Spotify and Netflix would become...





1 Of course that's your contention, you're a first-year AI student. You just got finished listening to Andrew Ng's lecture on why "AI is the new electricity"... I kid, I kid, but come on, this is a scene in a movie you wrote – a scene you're in! You know, the very scene my title comes from... ↩