Check Out the AI Powered Depth Camera System in iPhone SE

  • AUTHOR: isbah
  • POSTED ON: April 30, 2020

Apple released the iPhone SE quite recently, and as shipments have finally started, we are receiving more and more reviews of the device. It turns out that the iPhone SE has only one camera, unlike the other phones the company has released recently. Even so, the results are pretty good, and Dieter Bohn, The Verge’s tech expert, says the camera performs well when the scene is well-lit.

For portrait photos, the iPhone SE’s camera uses machine learning to estimate the depth of field, and Ben Sandofsky, one of the developers of the mobile photography app Halide, took a closer look at how portrait photos taken by the SE’s single camera actually work.

The key feature of the iPhone SE’s portrait mode is that it can do something called “monocular depth estimation,” which is enabled by the iPhone SE’s A13 Bionic processor (the same processor in the iPhone 11 and 11 Pro). That processor and the depth estimation allow the phone to capture depth maps for photos differently than, say, the iPhone XR, which also has a single lens. Sandofsky found that the SE could even estimate depth for flat photos.
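In concrete terms, a depth map is just a per-pixel estimate of how far each point in the image is from the camera, produced by a neural network from a single RGB frame. Here is a toy sketch of that input/output shape, with a fake heuristic standing in for the real on-device model (the function name and the brightness-as-depth rule are purely illustrative, not Apple’s actual method):

```python
import numpy as np

# Toy stand-in for a monocular depth estimator. A real model (like the one
# running on the iPhone SE's A13) maps an RGB image to a same-sized grid of
# per-pixel depth estimates; here we fake it with a brightness heuristic.
def estimate_depth(rgb: np.ndarray) -> np.ndarray:
    h, w, _ = rgb.shape
    luma = rgb.mean(axis=2)            # crude per-pixel brightness
    depth = 1.0 - luma / 255.0         # fake rule: brighter = nearer
    return depth                       # shape (h, w), 0.0 = near, 1.0 = far

rgb = np.random.randint(0, 256, size=(4, 6, 3), dtype=np.uint8)
depth = estimate_depth(rgb)
assert depth.shape == (4, 6)           # one depth value per pixel
```

The point is only the shape of the problem: one ordinary photo in, one dense depth map out, with no second lens or depth sensor involved.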

Sandofsky demonstrated this by taking pictures of the same photo with an iPhone XR and an iPhone SE:


Source: Halide

Source: The Verge

In the images above, you can see how each phone estimated depth.


Even though the iPhone SE’s depth map is not a perfect representation of the scene’s actual depth, it is remarkable that the phone can estimate depth from a flat photo at all using machine learning.
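Once such a depth map exists, portrait mode uses it to decide which pixels to blur: pixels the map marks as far get a blurred copy of the image, while near pixels stay sharp. A minimal sketch of that idea (not Apple’s actual pipeline; `portrait_blur`, the box blur, and the hard threshold are all simplifications of what a real renderer does):

```python
import numpy as np

def portrait_blur(img: np.ndarray, depth: np.ndarray,
                  threshold: float = 0.5) -> np.ndarray:
    """Blend a blurred copy of img into regions the depth map marks as far.

    img   : float array of shape (h, w, 3)
    depth : float array of shape (h, w), 0.0 = near, 1.0 = far
    """
    # Crude 3x3 box blur: average the image with its eight shifted copies.
    blurred = sum(np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    mask = (depth > threshold)[..., None]   # far pixels get the blur
    return np.where(mask, blurred, img)
```

A real implementation would use a smooth, depth-weighted blur rather than a hard cutoff, which is why portrait shots show a gradual falloff instead of a sharp edge between subject and background.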
