
Anamorphic app review

Anamorphic app unveils the magic behind the iPhone 7 Plus Portrait Mode

One of the many new features Apple is rolling out with iOS 11 is the ability for third-party apps to make use of depth data gathered by the dual cameras on the iPhone 7 Plus (and, presumably, on the upcoming iPhone 8 and iPhone X). Anamorphic, a new iOS app from visual effects software developer BrainFeverMedia, is one of the first to take advantage of this feature. The app is currently in beta (along with iOS 11 itself), and Digital Trends has been testing it. Beyond offering insight into the magic of how Portrait Mode works, we discovered in our Anamorphic app review that it opens new creative doors for iPhone photographers.

How depth information is gathered

The iPhone 7 Plus is the first iPhone to offer two camera modules: a standard wide-angle lens plus a telephoto lens. In addition to two unique angles of view, the iPhone 7 Plus introduced users to Portrait Mode, which uses computational photography to create a faux shallow depth-of-field effect, where the subject is in focus and the background is blurry.

iPhone 7 vs. iPhone 7 Plus camera

Portrait Mode compares the two images captured by the two cameras and uses the differences between them to determine depth within the photograph, much the way your two eyes help you perceive depth in the real world. Essentially, with some AI assistance, the iPhone can tell which objects are in the foreground and which are in the background. A selective blur can then be applied to areas of the frame, and the amount of blur can even increase with distance from the subject for a more realistic effect. Combined with facial recognition, the mode is especially useful for portraits, hence the name.
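The geometry behind this stereo approach can be sketched in a few lines: under a simple pinhole-camera model, depth is inversely proportional to the disparity, the pixel shift of a feature between the two views. The focal length and lens-spacing values below are illustrative placeholders, not Apple's actual numbers.

```python
# Stereo depth from disparity: depth = focal_length_px * baseline / disparity.
# Both constants are illustrative assumptions, not the iPhone's real specs.

FOCAL_LENGTH_PX = 3000.0   # focal length expressed in pixels (assumed)
BASELINE_M = 0.01          # spacing between the two lenses in meters (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Return estimated depth in meters for a given pixel disparity."""
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A nearby object shifts more between the two views than a distant one:
near = depth_from_disparity(30.0)   # large shift -> close subject (1.0 m)
far = depth_from_disparity(10.0)    # small shift -> distant subject (3.0 m)
```

The real pipeline matches thousands of such features across the two frames to build a full depth map, but each sample reduces to this one relationship.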


However, until iOS 11, Portrait Mode was only available through the built-in camera app, and users had no control over the strength of the effect. With the new depth APIs (application programming interfaces) in iOS 11, third-party developers now have the opportunity to take advantage of the same computational photography used in Portrait Mode.

Seeing is believing

From a purely technical perspective, Anamorphic offers a glimpse behind the curtain of how Portrait Mode works by actually displaying a live depth map next to the camera preview image. This lets you see exactly what the iPhone is seeing in terms of depth, and for those of us on the nerdier side, it’s a welcome bit of information.


For anyone just out to take pretty pictures, the visualization of the depth map may not matter as much, but it can still provide useful information. For one, as good as the iPhone 7 Plus is in determining depth, it is not perfect. By seeing the actual depth map, you can locate errors before you take the picture. Sometimes, just adjusting your distance or angle to the subject can help clean things up a bit.

But the depth map also comes into play after the fact. Once a photo is taken, you can actually adjust the depth map within Anamorphic, effectively shortening the available depth and determining where the blur will begin to set in. You can also control the amount of blur itself, akin to adjusting the depth of field by opening or closing the aperture on a DSLR or mirrorless camera lens.

And, true to its name, the app even gives an option for the style of blur: Regular or anamorphic, the latter being an imitation of anamorphic cinema lenses. All of this provides much more control than the built-in Portrait Mode (which is a simple, binary decision of “on” or “off”).
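The visual difference between the two blur styles comes down to the shape of the blur kernel: regular bokeh uses a round disc, while anamorphic lenses render out-of-focus highlights as tall ovals. A minimal sketch of generating such kernels (illustrative only; the app's actual rendering is unknown to us):

```python
import numpy as np

def bokeh_kernel(radius: int, squeeze: float = 1.0) -> np.ndarray:
    """Normalized disc kernel for simulated bokeh. squeeze > 1 compresses
    the horizontal axis, producing the tall oval highlights associated
    with anamorphic cinema lenses. An illustrative sketch only."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    mask = (x * squeeze) ** 2 + y ** 2 <= radius ** 2
    return mask.astype(float) / mask.sum()

round_k = bokeh_kernel(5)             # circular blur ("Regular" style)
oval_k = bokeh_kernel(5, squeeze=2)   # oval blur ("Anamorphic" style)
```

Convolving the out-of-focus regions with the oval kernel instead of the round one is what gives the stretched, cinematic look.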

In addition to interacting with the depth data, Anamorphic offers a number of Instagram-esque filters as well as some basic editing options that let you adjust exposure or add film grain or a vignette.

Don’t throw away your DSLR yet

For as much as Anamorphic offers, it also makes clear the iPhone’s shortcomings. Basic, two-lens computational photography has some advantages over traditional cameras, such as the ability to adjust the amount of blur after the shot. However, there are still many limitations.

Portrait Mode users are undoubtedly familiar with the "Place subject within 8 feet" warning that displays when the camera is too far from the subject for Portrait Mode to work correctly. This is a result of the two camera modules sitting so close together that, beyond a certain distance (8 feet, apparently), there is no longer enough difference between the two images to determine depth.
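That falloff is easy to see numerically: disparity shrinks in inverse proportion to distance, so past a few meters the remaining pixel shift becomes too small to distinguish reliably from noise. Reusing the same illustrative focal length and lens spacing as before (assumed values, not Apple's real figures):

```python
FOCAL_LENGTH_PX = 3000.0  # assumed focal length in pixels
BASELINE_M = 0.01         # assumed 1 cm spacing between the two lenses

def disparity_px(depth_m: float) -> float:
    """Pixel shift between the two views for a subject at a given depth."""
    return FOCAL_LENGTH_PX * BASELINE_M / depth_m

# Disparity collapses quickly as the subject moves away:
# 1 m -> 30 px, 2.4 m (about 8 ft) -> 12.5 px, 10 m -> 3 px
```

With such a short baseline, the only cures are a wider lens spacing (impractical on a phone) or smarter inference, which is why the 8-foot limit exists.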

Also, while adjusting the depth map and blur after the shot is a novel feature, it is not the same as refocusing an image. Anamorphic does have the option to invert the depth map (blurring the foreground instead of the background), but this only goes so far. The iPhone's lenses natively have a very deep depth of field (meaning most of the scene is in focus), but it is not infinite. If you focus on something close to the lens, you won't be able to dramatically alter the image to make it look as if you had focused on the background.

Anamorphic also takes the artistic liberty of adding chromatic aberration (purple and green fringing) into the blurred part of the photo, which, at least in its current pre-release form, is not user-controllable. This applies to both images shot with the Anamorphic camera as well as those captured via the built-in camera app and edited in Anamorphic. While the effect is not inherently unattractive, we would like to see an option to toggle it on and off in a future release of the app.

An exciting look at what’s to come

While Anamorphic (and iOS 11, for that matter) is still in development, it’s exciting to see the potential of what it offers. It provides a much more robust version of Apple’s Portrait Mode. Frankly, we feel like Anamorphic’s depth map and blur controls should be part of the default camera experience, although we can also appreciate Apple’s desire for simplicity.

Anamorphic is the first of what will likely be numerous apps taking advantage of the new depth APIs in iOS 11, and while it’s not yet perfect, it is certainly promising. We look forward to trying out a final version after iOS 11 is officially available.

Daven Mathies
Former Digital Trends Contributor
