The AI-powered portrait mode can intelligently differentiate foreground subjects from their background to deliver a professional-grade shallow depth of field.
Whatever subject you are photographing, try to visualize what you want to emphasize in the photo (the subject's size, the background, the foreground, etc.) so you can capture the moment from a distinctive viewpoint.
The result is a stencil-like cutout, a "segmentation mask," that separates the in-focus foreground subject from the out-of-focus background.
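As a rough sketch of how such a mask might be applied (the mask itself would come from a segmentation model upstream; apply_portrait_blur and its parameters are illustrative, not any vendor's actual API):

```python
import cv2
import numpy as np

def apply_portrait_blur(image, mask, blur_ksize=31):
    """Composite a sharp foreground over a blurred background.

    image: HxWx3 uint8 photo.
    mask:  HxW float32 in [0, 1], where 1 marks the foreground subject
           (assumed to come from a segmentation model).
    """
    # Blur the whole frame; this becomes the out-of-focus background layer.
    background = cv2.GaussianBlur(image, (blur_ksize, blur_ksize), 0)

    # Feather the mask edge so the subject doesn't look scissor-cut.
    soft = cv2.GaussianBlur(mask, (15, 15), 0)[..., None]  # HxWx1 for broadcasting

    # Per-pixel blend: sharp where the mask is 1, blurred where it is 0.
    blended = soft * image.astype(np.float32) + (1.0 - soft) * background.astype(np.float32)
    return blended.astype(np.uint8)
```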
This will allow for a true 2x optical zoom along with Apple's new beta Portrait Lighting photo mode, which creates a depth map to separate the subject from the background and then adjusts the lighting between foreground and midground to get the best results, all in real time.
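Apple has not published how Portrait Lighting actually works, but the depth-banding idea described above can be sketched crudely (relight_from_depth, the thresholds, and the gains are all made-up illustrative values):

```python
import numpy as np

def relight_from_depth(image, depth, near=0.33, far=0.66,
                       fg_gain=1.15, mid_gain=1.0, bg_gain=0.85):
    """Split the frame into depth bands and apply a different exposure
    gain to each, a crude stand-in for depth-aware relighting.

    image: HxWx3 float32 in [0, 1].
    depth: HxW float32 in [0, 1], with 0 nearest the camera.
    """
    gain = np.where(depth < near, fg_gain,             # foreground: brighten
                    np.where(depth < far, mid_gain,    # midground: leave alone
                             bg_gain))                 # background: darken
    return np.clip(image * gain[..., None], 0.0, 1.0)
```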
But compared with the Pixel 2's Portrait Mode, the iPhone X has horrendous separation between the subject, foreground, and background; even worse, because it primarily uses the zoom lens, you have to step far back from your subject.
This creates a "bokeh" effect, which makes the subject more prominent by differentiating the foreground from the background.
It's far pickier about how far away you are from your subject (stated as "3 to 5 feet"), and its processing often struggles to handle irregularities in the subject matter, producing jarring aberrations in areas it incorrectly thinks are in the foreground or background.
Portrait mode, which debuted on the iPhone 7 Plus, uses the phone's rear cameras (two 12-megapixel sensors, one with a 56 mm "telephoto" lens) to separate foreground subjects (e.g., a person) from the background (everything else) with a subtle blur.
Live Focus is a new mode that takes photos from both rear cameras at the same time and lets you not only save each of these photos but also use a depth-of-field technique to strengthen or weaken the background blur that helps foreground subjects stand out.
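Because the sharp frame and the mask are both retained, the blur can be re-rendered at any strength after capture. A minimal sketch of such a slider-driven re-render, reusing the same mask compositing as above (adjustable_background_blur and its kernel mapping are assumptions for illustration):

```python
import cv2
import numpy as np

def adjustable_background_blur(image, mask, strength):
    """Re-render a shot at a user-chosen blur strength, slider-style.

    image:    HxWx3 uint8 photo (kept sharp at capture time).
    mask:     HxW float32 in [0, 1], 1 = foreground subject.
    strength: 0.0 (no blur) to 1.0 (maximum blur).
    """
    if strength <= 0:
        return image.copy()
    # Map the slider onto an odd Gaussian kernel size (OpenCV requires odd).
    ksize = 2 * int(strength * 15) + 3
    background = cv2.GaussianBlur(image, (ksize, ksize), 0).astype(np.float32)
    soft = cv2.GaussianBlur(mask, (15, 15), 0)[..., None]
    out = soft * image.astype(np.float32) + (1.0 - soft) * background
    return out.astype(np.uint8)
```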