With deepfake videos making headlines recently and campaigns against the over-Photoshopping of models picking up steam in the last few years, people are more aware than ever of how images can be digitally manipulated. Now Adobe, the company behind Photoshop, wants to give users tools to spot faked images themselves.
“While we are proud of the impact that Photoshop and Adobe’s other creative tools have made on the world, we also recognize the ethical implications of our technology,” Adobe wrote in a blog post. That’s why it has developed a method for identifying edits made to an image with tools like Photoshop’s Face Aware Liquify feature. That feature was chosen because it is frequently used to change facial expressions, making it a useful test case for detecting image manipulation.
A team of researchers used deep learning to train an A.I. to recognize images of faces that had been altered. The system was shown pairs of images, one original and one altered, so it could learn the telltale signs of manipulation. By the end of training, the tool could identify manipulated images up to 99% of the time, compared with a 53% identification rate for humans.
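The article doesn’t detail the model itself, so the snippet below is only a rough sketch of the kind of training setup described: a small convolutional classifier fed batches of original and warped face crops with binary “manipulated or not” labels. The architecture, hyperparameters, and the random tensors standing in for real face data are illustrative assumptions, not Adobe’s actual system.

```python
# Minimal sketch (not Adobe's model): a binary CNN classifier trained on
# paired original/warped face crops. Labels encode whether a Liquify-style
# warp was applied. All sizes and hyperparameters are assumptions.
import torch
import torch.nn as nn

class ManipulationDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 1)  # single logit: manipulated or not

    def forward(self, x):
        h = self.features(x).flatten(1)
        return self.classifier(h)

def train_step(model, optimizer, originals, warped):
    """One training step on a batch of paired original/warped face crops."""
    images = torch.cat([originals, warped], dim=0)
    labels = torch.cat([torch.zeros(len(originals)), torch.ones(len(warped))])
    logits = model(images).squeeze(1)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: random tensors stand in for real face crops.
model = ManipulationDetector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
originals = torch.rand(8, 3, 128, 128)  # unedited faces
warped = torch.rand(8, 3, 128, 128)     # warped versions of the same faces
print(train_step(model, optimizer, originals, warped))
```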
The tool could even revert altered images back to how they looked before editing. “It might sound impossible because there are so many variations of facial geometry possible,” Professor Alexei A. Efros of UC Berkeley said in the statement. “But, in this case, because deep learning can look at a combination of low-level image data, such as warping artifacts, as well as higher level cues such as layout, it seems to work.”
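To make the “undo” idea concrete, the sketch below shows one plausible mechanism: if a network can estimate the per-pixel displacement that a warp applied, resampling the image along that field approximately restores the original geometry. The `unwarp` helper, the normalized-coordinate convention, and the zero-flow placeholder are assumptions for illustration; the network that would actually predict the flow is omitted.

```python
# Illustrative sketch of reversing a facial warp: given a predicted
# displacement field, resample the image along it to undo the edit.
# The flow here is a hand-made placeholder, not a model's output.
import torch
import torch.nn.functional as F

def unwarp(image, flow):
    """Resample `image` (1, C, H, W) using a displacement field `flow`
    (1, 2, H, W) expressed in normalized [-1, 1] coordinates."""
    _, _, h, w = image.shape
    # Base sampling grid covering the image in normalized coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij"
    )
    base_grid = torch.stack([xs, ys], dim=-1).unsqueeze(0)  # (1, H, W, 2)
    # Shift each sampling location by the predicted displacement.
    grid = base_grid + flow.permute(0, 2, 3, 1)
    return F.grid_sample(image, grid, align_corners=True)

image = torch.rand(1, 3, 128, 128)   # warped input face
flow = torch.zeros(1, 2, 128, 128)   # placeholder for a predicted flow field
restored = unwarp(image, flow)       # approximation of the pre-edit image
print(restored.shape)
```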
The tool isn’t ready for the mainstream yet, however. The researchers say they will need more time before they can offer customers a direct way to identify faked images for themselves. “The idea of a magic universal ‘undo’ button to revert image edits is still far from reality,” Adobe researcher Richard Zhang said. “But we live in a world where it’s becoming harder to trust the digital information we consume, and I look forward to further exploring this area of research.”