A.I. is getting scarily good at lying to us. No, we’re not talking about willfully misleading people for nefarious ends, but rather about creating sounds and images that appear real yet don’t exist in the real world.
In the past, we’ve covered artificial intelligence that’s able to create terrifyingly real-looking “deep fakes” in the form of faces, synthetic voices and even, er, Airbnb listings. Now, researchers from Japan are going one step further by creating photorealistic, high-resolution videos of people (complete with clothing) who have only ever existed in the fevered imagination of a neural network.

The company responsible for this jaw-dropping tech demo is DataGrid, a startup based on the campus of Japan’s Kyoto University. As the video up top shows, the A.I. algorithm can dream up an endless parade of realistic-looking humans who constantly shape-shift from one form to another, courtesy of some dazzling morphing effects.
Like many generative artificial intelligence tools (including the A.I. artwork that sold for big bucks at a Christie’s auction last year), this latest demonstration was created using something called a Generative Adversarial Network (GAN). A GAN pits two artificial neural networks against one another: one network generates new images, while the other attempts to work out which images are computer-generated and which are real. Over time, this adversarial back-and-forth pushes the “generator” network to become good enough at creating images that the “discriminator” can no longer reliably tell them apart from the real thing.
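For the curious, that adversarial tug-of-war is simple enough to sketch in a few dozen lines of PyTorch. DataGrid hasn’t published its code, so the toy example below is purely our own illustration: instead of photos of people, the generator learns to mimic points drawn from a simple 2-D distribution, but the real-versus-fake training loop is the same basic idea.

```python
# Minimal GAN sketch (PyTorch). Illustrative only -- DataGrid's actual model
# and training details are not public.
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Real" data: points from a fixed 2-D Gaussian the generator must imitate.
def real_batch(n):
    return torch.randn(n, 2) * 0.5 + torch.tensor([2.0, 2.0])

generator = nn.Sequential(        # maps random noise -> fake 2-D points
    nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2)
)
discriminator = nn.Sequential(    # scores a point: real (1) or fake (0)
    nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1)
)

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(2000):
    # Train the discriminator: label real data 1, generated data 0.
    real = real_batch(64)
    fake = generator(torch.randn(64, 8)).detach()  # no gradients into G here
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
              loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the generator: try to make the discriminator say "real."
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

print(generator(torch.randn(5, 8)))  # samples should cluster near (2, 2)
```

Scale the same loop up from 2-D points to megapixel images, with far bigger networks and far more training, and you arrive in the neighborhood of what demos like DataGrid’s are doing.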
As can be seen from the video, the results are impressive. They don’t appear to suffer from the image artifacts or strange glitches that have marred many previous attempts at generating photorealistic images. However, it’s also likely no coincidence that the video shows humans posed against plain white backdrops, minimizing the risk of busy backgrounds confusing the image-generation process.
Provided all is as it seems, this is a fascinating (albeit more than a little disconcerting) advance. If we were employed as movie extras or catalog models for clothing brands, we’d probably be feeling a little nervous right now. At the very least, the tools for creating next-level fake news just got a whole lot better.