With iPhone 11's 'Deep Fusion' camera mode, Apple takes your photos to the next level
Deep Fusion, Apple's new image-processing technology for the iPhone 11, 11 Pro and 11 Pro Max, will be available to developers as part of the next iOS beta. When the latest iPhones were first announced in September, Apple showed off the new ultra-wide camera, Night Mode and an improved selfie camera, all of which represented a significant step forward for iPhone photography and video. And now that the new iPhones are out in the wild, we've tested their cameras and can confirm those improvements, as well as the sheer joy of that ultra-wide camera. But there's one camera feature Apple teased at this fall's iPhone event that nobody has tried: Deep Fusion.
Although it sounds like the name of a jazz fusion band, Apple claims the brand-new photo-processing technique will make your photos pop with detail while keeping image noise relatively low. The best way to think about Deep Fusion is that you're not meant to think about it at all. Apple wants you to rely on the new technology without giving it a second thought. There's no button to turn it on or off, nor any real indication that you're even using the mode.
Right now, any time you take a photo on an iPhone 11, 11 Pro or 11 Pro Max, the default mode is Smart HDR, which captures a series of frames before and after the shot and blends them together to enhance dynamic range and detail. If the scene is too dark, the camera automatically switches to Night Mode to improve brightness and reduce image noise. With Deep Fusion, when you shoot in medium to low light, such as indoors, the camera will automatically switch to the new mode to reduce image noise and optimize detail. Unlike Smart HDR, Deep Fusion works at the pixel level. If you use the telephoto lens on the iPhone 11 Pro or 11 Pro Max, the camera will default to Deep Fusion in all but the brightest light.
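The mode switching described above can be sketched in a few lines of code. To be clear, this is a hypothetical illustration of the behavior Apple describes, not its actual implementation: the light thresholds, mode names and the idea of measuring the scene in lux are all assumptions made for demonstration.

```python
# Hypothetical sketch of how the iPhone 11 camera might pick a processing
# mode, based on Apple's public description. The lux thresholds and the
# lens-specific cutoff are illustrative assumptions, not Apple's real logic.

def select_capture_mode(scene_lux: float, lens: str = "wide") -> str:
    """Pick a processing mode from ambient light (lux) and the active lens."""
    if scene_lux < 10:             # very dark scene: brighten and denoise
        return "Night Mode"
    if lens == "telephoto":        # the telephoto lens gathers less light,
        if scene_lux < 2000:       # so Deep Fusion would kick in sooner
            return "Deep Fusion"
        return "Smart HDR"
    if scene_lux < 800:            # medium light, e.g. indoors
        return "Deep Fusion"
    return "Smart HDR"             # bright light: the existing default
```

The key takeaway is simply that the choice is automatic: the user never picks a mode, the camera does, based on light level and lens.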
This means the iPhone 11, 11 Pro and 11 Pro Max now have an optimized mode for bright light, low light and medium light. And I would argue that most pictures are taken in medium to low light, like indoors. The effect Deep Fusion will have on your photos is huge. It's as if Apple changed the recipe for Coke.
At the iPhone event, Apple's Phil Schiller described Deep Fusion as "computational photography mad science." And when you hear how it works, you'll probably agree.
When you take a photo, the camera has actually captured several frames; again, Smart HDR does something similar. The iPhone grabs a reference image that's meant to stop motion blur as much as possible. It then combines three standard exposures and one long exposure into a single "synthetic long" image. Deep Fusion then breaks the reference image and the synthetic long image down into multiple regions, identifying things like the sky, walls, textures and fine details (such as hair). The software then does a pixel-by-pixel analysis of the two images – that's 24 million pixels in total. The results of that analysis determine which pixels to use, and how to optimize them, to build the final image.
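The steps above can be sketched as a toy program. Everything here is a simplified assumption for illustration: real Deep Fusion operates on 12-megapixel frames with machine-learned, per-region weighting, while this sketch uses a four-pixel "image" and a hand-written blend. The function names (`synthetic_long`, `deep_fusion`) and the `detail_weight` map are invented for the example.

```python
# Toy illustration of the Deep Fusion pipeline as described publicly:
# 1) merge several exposures into one "synthetic long" frame, then
# 2) blend it per pixel with a sharp reference frame, favoring the
#    reference in detailed regions and the synthetic frame in flat ones.
# All weights and values are made up for demonstration.

def synthetic_long(standard_frames, long_frame):
    """Average three standard exposures and one long exposure per pixel."""
    n = len(standard_frames) + 1
    return [
        (sum(frame[i] for frame in standard_frames) + long_frame[i]) / n
        for i in range(len(long_frame))
    ]

def deep_fusion(reference, synthetic, detail_weight):
    """Per-pixel blend: weights near 1 keep the sharp reference pixel
    (textures, hair); weights near 0 keep the cleaner synthetic pixel
    (sky, walls)."""
    return [
        detail_weight[i] * reference[i] + (1 - detail_weight[i]) * synthetic[i]
        for i in range(len(reference))
    ]

# A 4-pixel "image": brightness values in [0, 1].
ref = [0.9, 0.2, 0.8, 0.1]
synth = synthetic_long(
    [[0.8, 0.3, 0.7, 0.2],
     [0.7, 0.2, 0.6, 0.1],
     [0.9, 0.4, 0.8, 0.3]],   # three standard exposures
    [0.6, 0.3, 0.7, 0.2],     # one long exposure
)
fused = deep_fusion(ref, synth, detail_weight=[1.0, 0.0, 0.8, 0.2])
```

On a real iPhone this per-pixel pass runs across 24 million pixels, which is why it needs the A13's horsepower, but the shape of the computation is the same.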
Apple says the whole process takes about a second. But so you can keep shooting, the image data is captured right away and processed whenever the iPhone's A13 Bionic chip has a chance. The idea is that you'll never be stuck waiting on Deep Fusion before taking the next picture.
The release of Deep Fusion comes just a few weeks before Google is expected to officially announce the Pixel 4, its latest flagship phone in a line known for its camera processing.
I should note that Deep Fusion will only be available on the iPhone 11, 11 Pro and 11 Pro Max, because it requires the A13 Bionic chip to work. I'm excited to try it out and share the results once the developer beta is out.
CNET may get a commission from retail offers.