
What is Deep Fusion? How it works, and what photos look like without it

Apple’s Deep Fusion camera feature generated a lot of buzz ahead of its official release in iOS 13.2. Now that it’s live, we’re taking a deeper look at what’s being fused, and how.

It’s not that con-Fusing

Much like Apple’s Smart HDR, Deep Fusion relies on object and scene recognition, as well as a series of eight images captured before you tap the shutter button.


Of the eight images, four are taken with standard exposure and four with short exposure. A ninth picture is then taken with a long exposure when the shutter button is triggered. The short-exposure shots are meant to freeze motion and bolster high-frequency details like grass blades or the stubble on a person’s face, so the sharpest image of this series is the one chosen to move on to the next step.

The three standard-exposure shots that display the best color, tone, and other low-frequency data are then fused with the long-exposure frame to compose a single image. This image and the sharpest short-exposure frame are then sent through neural networks, which choose between the two, pixel by pixel, to find the best representation for the final photo. This granular, pixel-by-pixel analysis minimizes noise, sharpens details, and renders color more accurately across the image.
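To make that flow of frames concrete, here’s a rough sketch of the pipeline in Swift. Apple’s actual implementation is private and runs on the Neural Engine, so every type, function, and fusion rule below is a hypothetical stand-in for illustration, not real Apple code:

```swift
// Conceptual sketch of the Deep Fusion pipeline described above.
// Apple's real pipeline is private; every type and rule here is a
// hypothetical stand-in meant only to show the shape of the process.

struct Frame {
    let pixels: [Float]   // simplified single-channel image
    let sharpness: Float  // stand-in for a real sharpness metric
}

// Step 1: pick the sharpest of the four short-exposure frames,
// which carry the high-frequency detail (grass blades, stubble).
// Assumes the capture buffer is never empty.
func sharpestFrame(of shortExposures: [Frame]) -> Frame {
    shortExposures.max { $0.sharpness < $1.sharpness }!
}

// Step 2: fuse the three best standard-exposure frames with the
// long exposure into one reference image with good color and tone.
// A plain average stands in for the real, far smarter merge.
func fuseReference(standard: [Frame], long: Frame) -> Frame {
    let stack = standard.prefix(3).map(\.pixels) + [long.pixels]
    let fused = (0..<long.pixels.count).map { i in
        stack.reduce(0) { $0 + $1[i] } / Float(stack.count)
    }
    return Frame(pixels: fused, sharpness: 0)
}

// Step 3: per-pixel choice between the detail frame and the
// reference frame. A neural network makes this call in practice;
// a toy "keep the brighter pixel" rule is the placeholder here.
func fuse(detail: Frame, reference: Frame) -> [Float] {
    zip(detail.pixels, reference.pixels).map { max($0, $1) }
}
```

The real selection step is a learned neural network rather than a simple brightness rule, but the overall shape (pick the sharpest detail frame, build a tonal reference, merge pixel by pixel) matches Apple’s description.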

[Illustration: How Deep Fusion works. Genevieve Poblano/Digital Trends]

All of the post-shutter processing happens behind the scenes, so it won’t impact your photo capture time. In other words, you can still snap back-to-back photos just as quickly as ever on the iPhone; if they all use Deep Fusion, they’ll simply be queued up in the camera roll and processed in order. Apple says you could open your camera roll and briefly see an image still processing for a half-second, though I’ve yet to encounter this. Deep Fusion won’t work in burst mode, however, and it’s only available on the iPhone 11 and iPhone 11 Pro models.
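That “shoot now, process later” behavior is easy to picture as a serial background queue. This is only a conceptual sketch with hypothetical names, assuming a simple dispatch queue, not Apple’s camera code:

```swift
import Foundation

// Minimal sketch of the "shoot now, process later" behavior: the
// shutter returns immediately, and the heavy fusion work is queued
// serially in the background. All names here are hypothetical.

let processingQueue = DispatchQueue(label: "deep-fusion.demo")

func capturePhoto(id: Int) {
    // Frames were already buffered before the shutter, so this is instant.
    print("Photo \(id) captured")

    // Fusion happens off the shutter path; photos process in order.
    processingQueue.async {
        Thread.sleep(forTimeInterval: 0.5) // stand-in for ~1s of fusion
        print("Photo \(id) finished processing")
    }
}

// Back-to-back shots queue up instead of blocking the shutter.
for id in 1...3 { capturePhoto(id: id) }

// Demo only: wait for the queue to drain before the program exits.
processingQueue.sync {}
```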

It just works

Apple’s iconic mantra is the guiding principle for Deep Fusion’s nonintrusiveness. There’s no toggle to flip it on; it enables itself whenever possible, in lighting situations that vary by the lens you’re shooting with.

The ultrawide-angle lens, for instance, cannot take advantage of Deep Fusion at all. On the main lens, Deep Fusion kicks in for what Apple describes as “indoor lighting” or anything below twilight in outdoor settings — that is, if the iPhone doesn’t explicitly offer night mode. The telephoto lens uses Deep Fusion for anything that isn’t very dark or exceedingly bright, but keep in mind that darker situations usually disable the telephoto camera and kick over the responsibilities to the main sensor, which will then determine what to do — be it Deep Fusion, night mode, or Smart HDR.
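Putting those rules together, the mode selection reads like a small decision tree. Apple doesn’t publish exact thresholds, so the lighting categories and branches in this Swift sketch are illustrative assumptions drawn from the behavior described above:

```swift
// Illustrative sketch of the per-lens mode selection described above.
// The lighting buckets and branches are assumptions, not Apple values.

enum Lens { case ultraWide, wide, telephoto }
enum Light { case veryDark, dim, indoor, bright }
enum CaptureMode { case smartHDR, deepFusion, nightMode }

func captureMode(for lens: Lens, in light: Light) -> CaptureMode {
    switch lens {
    case .ultraWide:
        // The ultrawide lens never gets Deep Fusion (or night mode).
        return .smartHDR
    case .wide:
        switch light {
        case .veryDark:     return .nightMode  // night mode takes priority
        case .dim, .indoor: return .deepFusion
        case .bright:       return .smartHDR
        }
    case .telephoto:
        switch light {
        // In darker scenes the phone quietly falls back to the main
        // (wide) sensor, which then makes its own choice.
        case .veryDark: return captureMode(for: .wide, in: light)
        case .bright:   return .smartHDR
        default:        return .deepFusion
        }
    }
}
```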

The results

So far, Deep Fusion’s impact has been mostly subtle in our testing. We pitted an iPhone 11 Pro Max running iOS 13.2 against an iPhone 11 Pro running iOS 13.1, which lacks Deep Fusion, and at first glance it’s hard to see Deep Fusion’s influence. Zooming in, however, did reveal areas where finer details were more defined and less smoothed-over. It’s not something that will jump out at you, especially when viewed on a phone. That said, having Deep Fusion in the equation never produced a worse image than going without it.

We did have a couple of instances where, if you zoomed in a little, you could appreciate the difference Deep Fusion makes, particularly in an image’s finer details. So while it may not be as magical as night mode, it’s still better to have than not.

Corey Gaskin