Everyone knows the feeling of unexpectedly seeing their own face appear on their phone's screen, because they opened the camera app with the selfie camera enabled. Not pretty. Well, today I learned that this is due to an effect called perspective distortion.
According to Wikipedia, when shooting a portrait photo and fitting the same area inside the frame:
The wide-angle will be used from closer, making the nose larger compared to the rest of the photo, and the telephoto will be used from farther, making the nose smaller compared to the rest of the photo.
Photographers have known this for ages. That’s why professional portraits are usually shot from a distance, using a telephoto lens to fit the subject’s face in the frame. But we civilians mostly capture our faces with a selfie camera, which uses a wide-angle lens. Maybe if we had 4 m long selfie sticks, we could do something about it. But that doesn’t seem very practical to me.
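The distance effect is easy to see with a little pinhole-camera arithmetic. In a pinhole model, a feature at depth z is magnified in proportion to 1/z, so the nose-to-ear magnification ratio depends only on how much closer the nose is to the camera relative to the total camera distance. The 3 cm nose offset and the distances below are my own assumed numbers, just for illustration:

```python
# A minimal pinhole-camera sketch of perspective distortion.
# Assumption: the nose tip sits ~3 cm closer to the camera than
# the rest of the face. Projected size scales as f / z, so the
# focal length cancels out of the magnification ratio.

def magnification_ratio(camera_distance_m, nose_offset_m=0.03):
    """How much larger the nose appears relative to features
    at the full camera distance."""
    z_nose = camera_distance_m - nose_offset_m
    z_rest = camera_distance_m
    return z_rest / z_nose

# Selfie distance (~0.4 m) vs. telephoto portrait distance (~4 m):
print(round(magnification_ratio(0.4), 3))  # → 1.081 (nose ~8% larger)
print(round(magnification_ratio(4.0), 3))  # → 1.008 (nose <1% larger)
```

At arm's length the nose is magnified by roughly 8% relative to the rest of the face; from 4 m away the effect all but disappears, which is exactly why the telephoto portrait looks more flattering.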
Researchers from Princeton and Adobe have developed an algorithm that can adjust camera distance in post-production. They achieve this by estimating a 3D model of the face, which includes camera position and orientation, and fitting it to the 2D image. If you then manually change one variable in this model, the algorithm calculates the expected changes in the remaining properties, and the result is projected onto a 2D image corresponding to the new camera position.
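The reprojection step can be sketched in a few lines. This is not the authors' implementation, just a toy pinhole projection over hypothetical 3D landmark coordinates that I made up: move the virtual camera back, scale the focal length to keep the face the same size in the frame, and project again. The nose shrinks relative to the ears:

```python
import numpy as np

def project(points_3d, camera_distance, focal):
    """Pinhole projection with the camera on the z-axis, looking
    at the origin. Each point's depth is camera_distance minus its
    z-coordinate (positive z protrudes toward the camera)."""
    z = camera_distance - points_3d[:, 2]
    return focal * points_3d[:, :2] / z[:, None]

# Hypothetical landmarks in metres: a protruding nose tip and two ears.
face = np.array([
    [ 0.00, 0.02,  0.03],   # nose tip (closest to the camera)
    [ 0.08, 0.00, -0.05],   # right ear
    [-0.08, 0.00, -0.05],   # left ear
])

near = project(face, camera_distance=0.4, focal=1.0)
# Move the camera 10x farther and scale the focal length 10x so the
# face still roughly fills the frame, as a telephoto lens would.
far = project(face, camera_distance=4.0, focal=10.0)

# Nose height relative to ear-to-ear width, near vs. far:
def nose_to_ear_ratio(pts):
    ear_width = pts[1, 0] - pts[2, 0]
    return pts[0, 1] / ear_width

print(round(nose_to_ear_ratio(near), 3))  # selfie: nose relatively large
print(round(nose_to_ear_ratio(far), 3))   # telephoto: nose relatively small
```

The point of the sketch is that once you have any 3D estimate of the face, re-rendering it from a different camera distance is just a change of projection parameters; the hard part the paper solves is fitting that 3D model to a single 2D photo in the first place.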
The representation of the face is obtained by automatically locating 66 facial landmarks around the eyes, nose and chin. For this step, the researchers employ existing technology by Saragih et al. (2009). Because the detector they use doesn’t find key points on the ears and top of the head, these points have to be added manually. Those points are necessary to incorporate the ears and hair into the model. Without them, warping would produce an uncanny result where the perspective of the face changes, but that of the hair and ears stays the same.