WHY THIS MATTERS IN BRIEF
DeepFakes are increasingly realistic, cheap, and easy to make, and in two years' time you won't know the difference between real humans and fake ones online.
DeepFakes keep making the news, as well as reading it, and helping you improve your Zoom calls. And Donald Trump's deepfake now has its very own satirical news show on YouTube. But here are two deepfake adverts that, ironically enough, didn't make the news – on purpose. Two political ads broadcast on social media during the recent US elections featured deepfake versions of Russian president Vladimir Putin and North Korean leader Kim Jong-un, and as you can see from the videos below, both deepfake leaders delivered the same message – that America doesn't need any election interference from them because it will ruin its democracy all by itself.
Yes, the ads, if you saw them, were creepy, but according to the organisation behind them they were made for a good cause. They were part of a campaign from the nonpartisan advocacy group RepresentUs, which aimed to protect voting rights during the presidential election amid President Trump's attacks on mail-in voting and his suggestions that he might refuse a peaceful transition of power.
Kim Jong-Un delivers his message
The goal was to shock Americans into understanding the fragility of democracy, as well as to provoke people into taking action, including checking their voter registration and volunteering at the polls. It also flipped the script on the typical narrative around political deepfakes, which experts often worry could be abused to confuse voters and disrupt elections for far more nefarious reasons.
RepresentUs worked with the creative agency Mischief at No Fixed Address, which came up with the idea of using dictators to deliver the message. They filmed two actors with the right face shapes and authentic accents reciting the script, then worked with a deepfake artist who used an open-source AI algorithm to swap in Putin's and Kim's faces.
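The article doesn't name the open-source tool the artist used, but face-swap pipelines of this kind generally end with a compositing step: a generated face patch is blended onto the original frame through a feathered mask, and the seams are then cleaned up in post (as described below). The toy sketch here shows only that blending step on tiny grayscale "frames"; the function name and data are hypothetical, and real tools add face detection, alignment, an autoencoder that generates the face, and colour correction.

```python
def blend_patch(frame, patch, mask):
    """Alpha-blend a generated face `patch` into a video `frame`.

    frame, patch: 2D lists of grayscale pixel values (same shape).
    mask: 2D list of floats in 0..1; 1.0 = fully swapped face,
          0.0 = untouched original frame, values between = feathered edge.
    """
    return [
        [round(m * p + (1.0 - m) * f)
         for f, p, m in zip(frow, prow, mrow)]
        for frow, prow, mrow in zip(frame, patch, mask)
    ]

# Tiny 2x2 example: the mask keeps one original pixel, fully swaps two,
# and feathers one halfway between frame and patch.
frame = [[100, 100], [100, 100]]   # original footage of the actor
patch = [[200, 200], [200, 200]]   # algorithm's generated face
mask  = [[1.0, 0.5], [0.0, 1.0]]   # feathered blend mask
print(blend_patch(frame, patch, mask))  # [[200, 150], [100, 200]]
```

The feathered mask edge is what the post-production crew's artifact cleanup targets: hard mask boundaries are exactly where a deepfake looks "off".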
Putin delivers his message
A post-production crew then cleaned up the algorithm's leftover artifacts to make the videos look more realistic. All in all the process took only 10 days, which is remarkable given that even lousy deepfakes used to take months to make – and even today, attempting the equivalent with CGI would likely have taken months, the team says. It also could have been prohibitively expensive.
The ads were originally supposed to be broadcast on Fox, CNN, and MSNBC in their Washington, DC, markets, but the stations pulled them from airing at the last minute. A spokesperson for the campaign said they were still waiting on an explanation. The ads did, however, include a disclaimer at the end, stating: "The footage is not real, but the threat is." But given the sensitive nature of using deepfakes in a political context, it's possible the networks felt the American public just wasn't ready – and then there could have been a whole lot of consumer and legal fallout to boot, which no doubt figured in their decisions not to run them.