This story is part of Focal Point iPhone 2022, CNET’s collection of news, tips and advice around Apple’s most popular product.
Apple’s iPhone 14 and iPhone 14 Plus get better main and selfie cameras, but serious smartphone photographers will be more interested in the iPhone 14 Pro and Pro Max announced Wednesday. These higher-end phones come with a 48-megapixel main camera designed to capture more detail.
The main camera on the $999 iPhone 14 Pro and $1,099 iPhone 14 Pro Max has an image sensor that’s 65% larger than last year’s, helping to double its low-light performance, Apple said. Low-light performance, a critical shortcoming of smartphone cameras, triples on both the ultrawide-angle and telephoto cameras.
Apple has stuck with 12-megapixel main cameras for years. The approach offered a reasonable balance of detail and low-light performance without breaking the bank or overtaxing the phone processors that handle image data. But rivals, most notably Samsung, have added image sensors with 48 megapixels and even 108 megapixels.
Image sensors with more megapixels mean each pixel is physically smaller, and that can hurt image quality unless there’s lots of light. But by joining 2×2 or 3×3 pixel groups together into a single virtual pixel, an approach called pixel binning, camera makers get more flexibility. When there’s abundant light, the camera can take a 48-megapixel image that lets you dive into the photo’s details better. But if it’s dim, the camera will use the virtual pixels to take a 12-megapixel shot that suffers less from image noise and other problems.
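The arithmetic behind pixel binning is simple enough to sketch. This toy example (my own illustration, not Apple’s pipeline, which bins on the raw Bayer mosaic in the sensor hardware) shows how averaging each 2×2 block of a high-resolution grid yields one virtual pixel at a quarter of the resolution:

```python
# Toy sketch of 2x2 pixel binning: average each 2x2 block of raw sensor
# values into a single "virtual" pixel. A 48-megapixel readout treated
# this way yields a 12-megapixel image with less per-pixel noise.

def bin_pixels(image, factor=2):
    """Downsample a 2D list of pixel values by averaging factor x factor blocks."""
    height, width = len(image), len(image[0])
    binned = []
    for y in range(0, height, factor):
        row = []
        for x in range(0, width, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))
        binned.append(row)
    return binned

# A 4x4 grid of raw values becomes a 2x2 grid of virtual pixels.
raw = [
    [10, 12, 20, 22],
    [14, 16, 24, 26],
    [30, 32, 40, 42],
    [34, 36, 44, 46],
]
print(bin_pixels(raw))  # → [[13.0, 23.0], [33.0, 43.0]]
```

Because random sensor noise tends to cancel out across the four averaged readings, each virtual pixel is cleaner than any of its components, which is why the binned mode holds up better in dim light.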
Apple unveiled the new camera technology at its fall product launch event, a major moment on the annual technology calendar. The iPhone itself is an enormous business, but it’s also the foundation of a huge technology ecosystem deeply embedded into millions of people’s lives, including services like iCloud and Apple Arcade and accessories like AirPods and Apple Watches.
Cameras are one of the most noticeable changes in smartphone models from one year to the next, especially since engineers have embraced thicker, protruding lenses as a signature design element. Customers who might not notice a faster processor do notice the arrival of new camera modules, like the ultrawide-angle and telephoto options that are now common on high-end phones.
Still, much of the improvement in smartphone photography relies on changes that are less visible. Faster processors, including graphics processing units, image processors and AI accelerators, are critical to the computational photography software that’s core to the smartphone photography revolution. In the new iPhone 14 models, Apple calls its latest processing system the Photonic Engine.
This technology is an advance over Apple’s previous Deep Fusion technology for merging multiple frames into one shot, preserving detail and texture when lighting is modest or dim. With the Photonic Engine, Deep Fusion begins earlier in the image processing pipeline, working on the raw image data to better preserve detail and color, said Caron Thor, Apple’s senior manager of camera image quality.
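The core idea behind merging multiple frames can be illustrated with a small simulation. This is only a sketch of the averaging principle, not Deep Fusion itself, which aligns frames and merges them per pixel with machine learning; the scene values and noise level here are made up for the demonstration:

```python
# Toy illustration of multi-frame merging: averaging several noisy
# exposures of the same scene suppresses random noise. Averaging N
# frames cuts random noise by roughly the square root of N.
import random

random.seed(0)  # fixed seed so the demo is repeatable
TRUE_VALUE = 100.0  # the "real" brightness of every pixel in this toy scene

# Simulate 9 noisy exposures of a 1,000-pixel scene (noise std dev = 10).
frames = [[TRUE_VALUE + random.gauss(0, 10) for _ in range(1000)]
          for _ in range(9)]

# Merge the frames by averaging each pixel position across all exposures.
merged = [sum(frame[i] for frame in frames) / len(frames)
          for i in range(1000)]

def rms_error(pixels):
    """Root-mean-square deviation from the true scene value."""
    return (sum((p - TRUE_VALUE) ** 2 for p in pixels) / len(pixels)) ** 0.5

# The merged result deviates far less from the true scene than any
# single frame does (about 3x less for 9 frames).
print(rms_error(frames[0]), rms_error(merged))
```

Real pipelines are far more involved, since handheld frames must be aligned and moving subjects handled, but this noise-averaging effect is why merging several short exposures beats one frame in dim light.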
iPhone 14 and 14 Plus camera upgrades
Apple’s lower-end iPhone 14 and iPhone 14 Plus get a new main camera that gathers 49% more light, thanks to a larger sensor and a wider f1.5 aperture, Thor said.
The Photonic Engine technology improves low-light photography on all the new phones’ cameras, though, including the new selfie camera and the ultrawide back camera, she said.
The new selfie camera on the iPhone 14 and 14 Plus offers a 38% improvement in low-light performance. And for the first time, it also has autofocus to avoid blurry faces.