2026-02-04 04:00:22

For the past decade and change, I’ve tested and reviewed a large number of Android phones, augmenting the iPhone expertise I’ve built up since seeing the very first iPhone live and in person at Macworld Expo 2007. And while that parade of phones has included some winning devices, my overall impression of the Android experience falls somewhere along the lines of “How do people live like this?”
That’s largely a reflection of the haphazard way Android phones receive their software updates. Some, like Google’s own Pixel devices, get new features right away, while others see updates once phone makers and wireless carriers are good and ready to release them. As someone used to downloading iOS updates the moment they’re available, that throws me. Android partisans tell me I’m being silly — sometimes politely, sometimes less so — and they may well have a point.
But even if some elements of the Android experience don’t land with me, I’d have to be a pretty narrow-minded person not to appreciate the features that do deliver. Android phones get a lot of things right — and some of those are missing in action when it comes to the iPhone.
Look, Apple didn’t sell more than $201 billion in iPhones during its 2024 fiscal year by listening to my advice, and I certainly don’t expect the company to start casting sideways glances toward Android phone makers to surreptitiously gather ideas on how to spruce up the next iPhones. But I do think there’s some merit in looking at areas where Android phones excel and how adopting something similar might give the iPhone a boost.
After all, we have some evidence that this already happens to some degree. Google added a rather distinctive horizontal camera bar to the Pixel 6 back in 2021. And while I don’t think the iPhone 17 Pro’s extended camera array is a direct copy, it certainly seems to draw some inspiration from what the Pixel has offered for years. Bringing the new look to the iPhone also allowed Apple to shift around internal components so that the current iPhone can benefit from a bigger battery and a vapor chamber, so there are benefits to adopting, adapting and improving.
So here’s what I’d flag from my time testing Android phones: features I’d like to see on the iPhone.
One of the best rivalries going right now involves Apple and Google battling it out to see whose phones can produce the best photos. It’s a pretty even match-up that seems to shift every time one of the companies rolls out a new flagship device, but I do think Google’s use of computational photography to produce high-quality images gives it an edge over Apple.
Some of this involves long-standing capabilities like the Super Res Zoom feature that’s been a part of Pixel phones for years. It cleans up digital zooms so that they don’t have the noise and fuzziness that can creep into a shot the closer you zoom in. I’ve also been impressed by features like Best Take, where the Pixel uses multiple exposures in group shots to make sure everyone’s looking their best.
I’d especially like to see Apple try its hand at a version of Pixel camera features such as Add Me or Camera Coach. Add Me, which debuted with 2024’s Pixel 9 release, lets you insert yourself into group photos you take, using AR overlays to show you where to stand and then tapping into AI to stitch the photos together. My results with Add Me on Pixel phones tended to be hit-or-miss — often, the final output depends heavily on who you hand the phone off to — but I bet Apple could make the process a little more foolproof.
Similarly, the Pixel 10’s Camera Coach feature isn’t flawless. But I like the concept behind it, as Camera Coach uses the phone’s Gemini assistant to make real-time suggestions on how to take a better shot. Once Apple gets its act together with Siri — and more on that in a bit — this is something the iPhone could easily offer.
I take a lot of screenshots on my phone — sometimes to preserve information I want to remember, sometimes to chronicle how-to steps for an article. And when I use a Pixel phone, I like that there’s an app that lets me stay on top of those screenshots, rather than letting them live unsorted on my camera roll.
I’m speaking of the Pixel Screenshots app, introduced with the Pixel 9, that collects all the screenshots you’ve captured. But it also lets you do more than that, like set reminders if there’s a specific action you want to take that’s related to that screenshot (even if it’s something as basic as “remind me to enter this recipe I’ve found into my database of recipes when I have more time.”) You can group screenshots into collections, too — handy for keeping those how-to screenshots in one place.
My favorite part of the Screenshots app is that those screenshots are now searchable — as in, the Pixel’s on-board smarts allow you to search for screenshots based on the information they contain. It saves me the trouble of having to remember when I captured a particular screenshot by just bringing up what I’m looking for, when I want it.
I feel like Apple is taking steps in this direction by expanding Visual Intelligence in iOS 26 to work with screenshots — adding calendar entries based on times and dates in a screenshot, translating words in a screenshot and running web searches on objects included in a screenshot. Cataloging the contents of those screenshots in a single app feels like the next natural step in Visual Intelligence’s evolution.
I don’t have to tell you that Siri needs to get a lot smarter, a lot faster. But if you’ve had a chance to use the Gemini Assistant on any recent Android flagship, you know how far behind Apple is when it comes to a truly intelligent assistant.
Hopefully, the iOS 26.4 update and its rumored updates to Siri will get Apple back in the game, though I don’t think it would be reasonable to expect Apple to catch up with Android with just one software update. If Apple is truly serious about making a go of Apple Intelligence and its suite of AI tools, it would be wise to work toward what Samsung currently offers with its cross-app actions through Galaxy AI.
Samsung introduced cross-app actions just about a year ago, and they really show off what on-device AI can offer, even to an AI skeptic like myself. With this feature, you can issue one command to your on-device assistant — “find me a nearby BBQ joint and text the address to my good pal Jason,” for example — and your assistant carries out that task across multiple apps. There’s no need for repeated commands or clarifications — just tell the assistant what you want done, and it goes and carries out the multi-step task.
Outside of some of the camera features mentioned above, this is probably the Android capability I miss the most when I’m back in my familiar iPhone terrain. And it’s something Apple needs to adapt on its own sooner rather than later.
2026-02-04 02:00:41

Despite some bumps in the road for its AI-driven features on the consumer side, Apple’s not slowing down on integrating the technology into its products. Today the company announced the latest update to its Xcode developer tool, which brings support for agentic AI coding.
Adding agentic support gives these AI models deeper access to your projects and more power over them. For example, agents can examine and parse your project’s file structure to gather more information, or even build and test the project on their own. They’ll also have access to the latest documentation, allowing them to take advantage of the most recent APIs. Perhaps most impressively, these tools continue to iterate, repeatedly testing, verifying, and fixing errors until the project builds successfully.
This feature builds on top of the existing intelligence-powered tools and integrations that Apple introduced in Xcode 26 last year. Out of the box, Xcode 26.3 has built-in support for Claude Agent and ChatGPT’s Codex, allowing users to log in with their accounts or API tokens. But because this system is underpinned by Model Context Protocol (MCP), any other agent that supports the open standard can be integrated as well.
Xcode 26.3’s release candidate is available for download today.
2026-02-03 02:50:18
It’s Super Bowl week and the start of the Olympics, so Will Carroll joins Jason to discuss Peacock’s almost-make-or-break moment, streaming fights and wrestling, and the fate of a clutch of Regional Sports Networks and other cable channels.
2026-02-03 01:30:30

Six Colors subscriber Mihir writes in with a Photos question:
How do I delete just the RAW file in a RAW+JPEG pair from my photos library on my iPad or my iPhone?
The short answer: you can’t. Not directly, anyway. And it’s not just an iOS or iPadOS limitation—macOS won’t let you do it either.
I can understand why Mihir asks. An image in RAW format can occupy several times as much storage as its JPEG equivalent. This has to do with the nature of the data being stored, as I explain below.
There are good reasons to capture as RAW and good reasons to discard those formats later. I’ll go through the background of RAW, and then provide a workaround to Apple’s missing piece.
The RAW format used by digital cameras is often capitalized as RAW, but it’s not an acronym, nor is it a format in the traditional sense.
RAW means the file contains “raw,” or unprocessed, sensor data from your camera. To produce a JPEG, TIFF, or other format, a digital camera—including your iPhone—performs post-processing to produce an image that’s immediately usable. This can involve making significant changes to dynamic range and white balance, or even combining multiple images as a form of computational photography, as Apple does with iPhone photos.
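As a toy illustration of what one step of that post-processing involves — applying white-balance gains to raw per-channel sensor values — here’s a minimal sketch. The pixel values and gains are invented, not taken from any real camera pipeline:

```python
# Toy example of one RAW "development" step: applying white-balance gains
# to raw sensor channel values. All numbers here are invented for illustration.

def apply_white_balance(rgb, gains):
    """Scale each raw channel reading by its gain, clipping to 8-bit output."""
    return tuple(min(255, round(v * g)) for v, g in zip(rgb, gains))

raw_pixel = (120, 100, 60)   # raw R, G, B readings under warm indoor light
wb_gains = (1.0, 1.1, 1.8)   # boost blue to neutralize the tungsten cast

print(apply_white_balance(raw_pixel, wb_gains))  # → (120, 110, 108)
```

A real camera (or Photos) chains many such steps — demosaicing, tone mapping, noise reduction — before writing the finished JPEG; the point is that each step discards flexibility the RAW file still has.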

This makes RAW the digital equivalent of a film negative: it’s typically larger than a post-processed file, and contains information that hasn’t yet been shaved down or squeezed into a presentable output. This gives you more flexibility when editing, but it requires processing to be usable for design, printing, or sharing.
Many cameras let you set a RAW export mode that includes a JPEG preview usable on its own. The JPEG is the camera’s best post-processed output from the RAW, and was originally provided because desktop (and later mobile) software either didn’t support RAW formats or didn’t keep up to date with new ones. Without the JPEG, importing the RAW file by itself would have been much less useful.
There’s no single RAW standard—Canon, Nikon, Sony, and others each have their own proprietary versions. (The all-caps spelling likely caught on to distinguish the format name from the ordinary adjective.)
Because the information comes more or less directly from the sensor without intermediate steps, it contains much more data that reads as noise: the variation between adjacent photosites is retained rather than smoothed away. So even for RAW formats that compress data—not all do—the files will be larger than final images intended for viewing or printing. RAW will always be much larger than a corresponding JPEG, as JPEG is lossy by nature and discards some information even at the maximum quality setting.
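For a rough sense of the gap, here’s a back-of-the-envelope sketch. The sensor resolution, bit depth, and JPEG compression figure are illustrative assumptions, not specs from any particular camera:

```python
# Back-of-the-envelope comparison of uncompressed RAW size vs. a typical
# high-quality JPEG, for a hypothetical 24-megapixel, 14-bit sensor.
# Real cameras (and compressed RAW variants) vary widely.

def raw_size_mb(megapixels: float, bits_per_photosite: int) -> float:
    """Uncompressed RAW: one full-bit-depth sample per photosite."""
    return megapixels * 1_000_000 * bits_per_photosite / 8 / 1_000_000

def jpeg_size_mb(megapixels: float, bits_per_pixel: float = 2.5) -> float:
    """High-quality JPEG often lands around 2-3 bits per pixel after compression."""
    return megapixels * 1_000_000 * bits_per_pixel / 8 / 1_000_000

raw = raw_size_mb(24, 14)
jpeg = jpeg_size_mb(24)
print(f"RAW ≈ {raw:.0f} MB, JPEG ≈ {jpeg:.1f} MB, ratio ≈ {raw / jpeg:.1f}x")
```

With those assumed numbers the RAW works out to about 42 MB against roughly 7.5 MB for the JPEG — which is why a library full of RAW+JPEG pairs fills a volume so quickly.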
After camera makers began offering RAW output—at first often supported only by their own bundled apps—photo-management and image-editing tools added RAW processing to meet the needs of digital photographers. Every professional app supports importing RAW, including Photoshop, Lightroom, Pixelmator Pro, Capture One, and DxO PhotoLab. And Photos!
Image-editing apps generally treat RAW as an import format: you view a preview and apply changes as the file is brought into an editing environment, where you can work on the resulting image. Photo-management apps with built-in editing tools, like Photos and Lightroom, typically retain the original RAW image and let you apply modifications on top. That approach offers much more flexibility in achieving your desired outcome.
When you import RAW+JPEG pairs into Photos, Apple treats them as a single, indivisible unit. You can choose which version to use as the basis for editing (Image: Use RAW as Original or Use JPEG as Original), but you cannot discard one half of the pair while keeping the other. Delete the image, and both files are thrown away.
Apple built Photos around a lossless workflow: the original file you import is never modified—changes are layered on top and previewed, and can always be reverted back to the source image. You’d think Apple might engineer an override in a case like this, but apparently not.
If you need to reclaim the storage space those RAW files occupy, it’s only possible on a Mac, and it requires exporting, deleting, and re-importing.
Follow these steps if you haven’t made any modifications that you want to keep for any or all of your RAW+JPEG pairs:

1. In Photos for Mac, select the affected RAW+JPEG images.
2. Choose File > Export > Export Unmodified Originals, and export them to a folder in the Finder.
3. In that folder, delete the RAW files and keep the JPEGs.
4. Back in Photos, delete the selected images.
5. Re-import the remaining JPEGs by dragging them into Photos (or choosing File > Import).

Of course, you can use the same process to jettison the JPEG and retain the RAW-formatted file.
When you delete files, if you’re sure that you have all the backups you need, you can click the Recently Deleted album in the Photos sidebar, authenticate if prompted, and click Delete All. (Or select images and click Delete X Items.) This removes the images from your Mac, iCloud Photos, and all linked devices immediately and forever. Use wisely!
Now, I noted above that this works for images that you haven’t modified in Photos. Because exporting unmodified originals bypasses Photos’ lossless editing layer, you lose any edits you’ve made—unless you also export the edited versions (choose File > Export > Export X Photos) and re-import those instead.
Because of how iCloud Photos syncs images, you may want to delete all the images you intend to remove first, and make sure they have moved to the Recently Deleted album on all your devices, before you re-import anything.
While Mihir specifically asked about iOS and iPadOS, the export-delete-reimport workflow requires the Finder and file management capabilities that only macOS provides.
For more expert advice on Photos, you should obtain a copy of Jason Snell’s Take Control of Photos, which addresses all of the app’s features and vagaries.
People have been asking Apple to add a “split RAW+JPEG pair” or “delete RAW only” feature for many years, and the company hasn’t budged, likely because of its focus on lossless workflows.
In the meantime, if you find storing RAW+JPEG is taking you too close to a full volume, you could shoot RAW only on your camera, and let Photos generate a JPEG preview. If you want to convert to JPEG, you can export it from the RAW file and re-import it. Or you might switch between RAW+JPEG, RAW, and JPEG shooting profiles on your camera, as many support user-defined modes that include output formats.
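If you do go the export-delete-reimport route, separating the RAW halves from the JPEGs in the exported folder is easy to script. A minimal sketch — the folder path is hypothetical, and the extension list is just the common ones noted in the footnote below:

```python
from pathlib import Path

# Common RAW extensions (Canon, Nikon, Fujifilm, Sony, Adobe DNG);
# extend this set for other camera brands.
RAW_EXTENSIONS = {".cr2", ".cr3", ".nef", ".raf", ".arw", ".dng"}

def split_raw_and_jpeg(folder: Path) -> tuple[list[Path], list[Path]]:
    """Return (raw_files, jpeg_files) found directly inside `folder`."""
    raws, jpegs = [], []
    for path in sorted(folder.iterdir()):
        ext = path.suffix.lower()
        if ext in RAW_EXTENSIONS:
            raws.append(path)
        elif ext in {".jpg", ".jpeg"}:
            jpegs.append(path)
    return raws, jpegs

# Usage (hypothetical export folder): review the list before deleting anything.
# raws, jpegs = split_raw_and_jpeg(Path("~/Desktop/Exported").expanduser())
# for r in raws:
#     print("RAW half:", r.name)
```

The sketch only lists files; actually deleting the RAW halves (and confirming your backups first) is deliberately left as a manual step.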
[Got a question for the column? You can email [email protected] or use /glenn in our subscriber-only Discord community.]
Common RAW file extensions include .cr2, .cr3, .nef, .raf, .arw, and .dng. Failing that, look for any file that doesn’t end with .jpg/.jpeg or .xmp. ↩
2026-02-03 00:49:42
We break down Apple’s latest financial results (including the potential supply-chain storm brewing on the horizon) and then discuss the difficult roll-out of Apple’s new Creator Studio bundle.
2026-01-31 01:00:09
My thanks to Magic Lasso Adblock for sponsoring Six Colors this week.
With over 5,000 five-star reviews, Magic Lasso Adblock is simply the best Safari ad blocker for your iPhone, iPad, and Mac.
And with the new App Ad Blocking feature in v5.0, it extends the powerful Safari and YouTube ad blocking protection to all apps including news apps, social media, games, and other browsers like Chrome and Firefox.
So, join the community of over 350,000 users and download Magic Lasso Adblock today from the App Store, Mac App Store or via the Magic Lasso website.