Is the face in that photo or video you are eyeing on the web real or fake? Adobe, the creator of Photoshop, may soon have tools to help you spot an altered face, and even show you what the original image likely looked like. Researchers at Adobe and the University of California, Berkeley recently developed an artificial intelligence program that recognizes when Photoshop's Face Aware Liquify tool, which can alter facial expressions, has been used.
The team trained a convolutional neural network (CNN), a form of artificial intelligence, by feeding it pairs of images: one original and one altered. From that data, the software learned to recognize when the faces in a photograph had been manipulated, looking for several clues, from warping artifacts to the layout of the face.
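The article doesn't spell out the researchers' architecture or training details, but the basic recipe it describes is a binary classifier trained on labeled original/altered examples. The sketch below illustrates that setup in PyTorch; the network layers, hyperparameters, and random stand-in data are placeholders for illustration, not the Adobe/Berkeley model.

```python
# Minimal sketch of a CNN trained to flag warped faces. The architecture and
# data below are illustrative placeholders, not the researchers' actual setup.
import torch
import torch.nn as nn

class FakeFaceDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, 1)  # one logit: altered vs. original

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = FakeFaceDetector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()

# Each training pair contributes a negative (original) and a positive (warped)
# example. Random tensors stand in for real image batches here.
originals = torch.rand(8, 3, 256, 256)
altered = torch.rand(8, 3, 256, 256)
images = torch.cat([originals, altered])
labels = torch.cat([torch.zeros(8), torch.ones(8)]).unsqueeze(1)

loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```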
While an untrained person could spot the fake only 53% of the time, the software reached accuracy as high as 99% in picking out the faked photo. The software could also go one step further and roughly estimate what the original image looked like, reverse engineering it from the artifacts and other traces left by the manipulation. Adobe says the researchers were surprised at how accurately the software could reconstruct the original.
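One way to picture the "undo" step: if a model estimates, for each pixel, how far the Liquify warp displaced it, that displacement field can be used to resample the manipulated image back toward its original geometry. The sketch below shows only the resampling half of that idea, with a placeholder displacement field standing in for a real prediction; it is an assumption about how such an undo could work, not the researchers' published method.

```python
# Illustrative "undo": resample the manipulated image along an estimated
# displacement field to approximate the original. The flow field here is a
# placeholder standing in for a network's prediction.
import torch
import torch.nn.functional as F

def unwarp(image, flow):
    """image: (N, C, H, W); flow: (N, H, W, 2) displacements in normalized grid units."""
    n, _, h, w = image.shape
    # Base sampling grid covering the image in normalized [-1, 1] coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij"
    )
    base_grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(n, -1, -1, -1)
    # Shift each sample point by the estimated displacement and resample.
    return F.grid_sample(image, base_grid + flow, align_corners=True)

manipulated = torch.rand(1, 3, 256, 256)
predicted_flow = torch.zeros(1, 256, 256, 2)  # placeholder for a model's output
restored = unwarp(manipulated, predicted_flow)
```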
The research builds on Adobe’s earlier work on spotting images faked with cloning techniques. Adobe suggests that continued research into software that detects image manipulation could help democratize image forensics, making it easier for the average person scrolling through social media or a webpage to spot a manipulated photograph.
Manipulating the emotion in an image can be used to create misleading photos and memes. In video, altering facial expressions is often part of creating deepfakes, in which the speaker’s mouth is manipulated to match fabricated speech, as in the recent fake video of Mark Zuckerberg.
“This is an important step in being able to detect certain types of image editing, and the undo capability works surprisingly well,” Adobe’s head of research, Gavin Miller, said in a statement. “Beyond technologies like this, the best defense will be a sophisticated public who know that content can be manipulated — often to delight them, but sometimes to mislead them.”
Adobe says its research team will continue to explore the topic of authenticity, including how to balance safeguards against misuse with tools for creativity and storytelling.