After many years of non-pro iPhones, Apple’s new 48-megapixel iPhone 14 Pro was enough to convince me to pull the trigger. I was ready to fall into the embrace of Apple’s AI-assisted ProRAW format. While it has its uses, in its current state it’s still kind of a mixed bag.
When Apple first launched ProRAW with the iPhone 12 Pro series, I wondered how much AI-assisted processing was baked into these supposedly "raw" files. After a couple of years and a few hardware revisions, it's clear the answer is: quite a lot.
In good light, the slight boost from AI is hardly noticeable, and the quality of the main iPhone 14 camera shot in ProRAW shines:
In these photos of Quinnipiac University’s campus, I pitted a Canon EOS R5 with the RF 24-105mm f/4L IS USM lens against the best the iPhone 14 Pro has to offer: its 1x (24mm-equivalent) lens paired with the 48-megapixel main sensor. While it’s hard to tell exactly what the AI is doing in any photo, as Apple’s not very transparent about it, both shots were pretty much equal in quality. The corners were slightly sharper on the iPhone (an AI assist, maybe?), but the trees were a touch less detailed. Both photos were processed from each camera's raw files in Photoshop to match as closely as possible.
But when I pushed the iPhone in low light against the R5 fitted with a geriatric, almost 20-year-old Canon EF 17-40mm f/4L USM lens, even that older glass pulled away from the iPhone, easily.
Here’s a look at Newport, Rhode Island’s Castle Hill Lighthouse with both the R5 and the iPhone at the same focal length.
The differences are obvious even at small sizes, and it’s clear, to my eyes at least, what the AI is doing. The shadow detail in the rocks is basically gone, so the iPhone filled it in with its best guess, resulting in the “waxy, painterly patterns” that I described two years ago. All that time and several hardware revisions have passed, and it's still that bad. Even simpler patterns, such as the brickwork on the lighthouse itself, showed that “Vaseline smeared over the lens” look I described back then.
Much of this also comes down to the ability to change settings. If I want to use ProRAW, I’m stuck using Apple’s default camera app, which chose a shutter speed of 1/19 second at ISO 1000 with the main lens’s f/1.8 aperture. By comparison, I shot a 0.6-second, ISO 100, f/8 exposure on the Canon.
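For context on how far apart those two exposures are, the standard ISO-normalized exposure value (EV100) formula can put them on one scale. A quick sketch, using the shutter speeds, apertures, and ISOs listed above (the function name is my own, not from any camera software):

```python
import math

def ev100(f_number: float, shutter_s: float, iso: float) -> float:
    """ISO-normalized exposure value: EV100 = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# iPhone: f/1.8, 1/19 s, ISO 1000 -- the settings the default camera app chose
iphone = ev100(1.8, 1 / 19, 1000)
# Canon R5: f/8, 0.6 s, ISO 100
canon = ev100(8.0, 0.6, 100)

print(f"iPhone EV100: {iphone:.1f}")          # ≈ 2.6
print(f"Canon  EV100: {canon:.1f}")           # ≈ 6.7
print(f"Spread: {canon - iphone:.1f} stops")  # ≈ 4.1
```

A roughly four-stop spread is consistent with the iPhone shot being taken in noticeably dimmer light, which matters for the comparison that follows.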
The strange part, though, is that my initial reaction to seeing these photos on the small phone screen was awe at how good they looked. In blind (unscientific) tests of both sets of photos among my social media following of mostly photographers and media professionals, even seasoned veterans guessed wrong about both, with many preferring the iPhone’s take on things.
And I think that’s what Apple’s betting on here: that AI will make the photo look better to most folks rather than producing what’s accurate.
Where this should have most photographers worried is that Apple calls this mode “ProRAW,” when, with this type of heavy-handed editing of what’s supposed to be a digital negative, it’s anything but.
All that said, I'll probably take some flak for the light levels being a bit lower in the iPhone photo of the lighthouse above. Indeed, the photos were shot 9 minutes apart at sunset, so for comparison, here's an iPhone shot taken at the same time as the R5's, and while it's better, the same issues remain to an extent:
While the issues of what's happening in the ProRAW files are still there in the same spots (the bricks, the shadows of the rocks), this photo does track much closer to the R5 photo.
All of this leads me to the question: How did camera manufacturers let it get this close? When a phone can go almost toe-to-toe with a professional camera and lens, or at least come close enough for most people, what's the point of it all?
What are your thoughts on Apple’s approach to raw files? Leave your thoughts in the comments below.