Martha Dillon investigates the technology behind living photographs
If you ever worried that your online avatars were too static, or that online banking bots were a little cold, a new collaboration between Facebook and Tel-Aviv University could be the answer. The project, published in November, has managed not only to animate the Mona Lisa and a variety of emojis, but also to bring a series of real human photographs to smiling, frowning and ‘idle’ life.

While previous moving-image technologies required videos, multiple images or manual user interaction to create their manipulations, the new technique introduces an automated process that can be applied to a single still headshot. Standard expressions are mapped from a ‘driving video’ of a different subject, and the target portrait imitates them via 2D warps (animations of slight rotations and tilts). Fine-scale dynamic details, such as creases and wrinkles, are then added to create a photo-realistic effect, and regions hidden in the target face (such as teeth) are ‘hallucinated’.

The result is a moving, breathing, expressive photograph, and even a mocked-up Facebook profile picture that reacts to triggers from a viewer. The authors say the next stage is to develop methods for selecting ‘driver’ videos that more closely match the target image, and to introduce compatibility with 3D and possibly even AI technology.
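To get a feel for the warping step, here is a minimal toy sketch (not the authors’ actual method): a handful of facial landmarks are given target positions taken from a driving frame, the sparse offsets are spread into a dense displacement field by inverse-distance weighting, and the photo is backward-sampled along that field. All names and the weighting scheme are illustrative assumptions.

```python
import numpy as np

def animate_frame(image, src_pts, dst_pts):
    """Backward-warp `image` so landmarks at src_pts move toward dst_pts.

    A toy stand-in for landmark-driven 2D warping: sparse landmark
    offsets are spread into a dense per-pixel field by inverse-distance
    weighting, then each output pixel samples the input image with a
    nearest-neighbour lookup.
    """
    h, w = image.shape
    rows, cols = np.mgrid[0:h, 0:w]
    grid = np.stack([rows.ravel(), cols.ravel()], axis=1).astype(float)

    offsets = src_pts - dst_pts           # each output pixel looks *back* to the input
    d = np.linalg.norm(grid[:, None, :] - dst_pts[None, :, :], axis=2)
    weights = 1.0 / (d + 1e-6) ** 2       # nearer landmarks dominate
    weights /= weights.sum(axis=1, keepdims=True)
    flow = weights @ offsets              # dense displacement field

    sample = np.clip(np.rint(grid + flow), [0, 0], [h - 1, w - 1]).astype(int)
    return image[sample[:, 0], sample[:, 1]].reshape(h, w)

# Toy "portrait": a bright bar standing in for a mouth, nudged 4 rows down
face = np.zeros((64, 64))
face[30:34, 28:36] = 1.0
src = np.array([[32.0, 32.0]])            # landmark in the still photo
dst = np.array([[36.0, 32.0]])            # where the driving frame puts it
smiled = animate_frame(face, src, dst)
```

The real system goes far beyond this, adding the fine-scale creases and hallucinating the hidden regions the article describes, but the core idea of pulling a single still image around by a set of driven control points is the same.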
Header image: Alex Hahn