After deepfake technology was used to realistically impose celebrities’ faces onto porn actors’ bodies, a new video by Jordan Peele is a sobering reminder that the problem hasn’t gone away. But maybe startups could help.
"Who controls the past controls the future. Who controls the present controls the past.” The words from George Orwell’s 1984 rings more true in the age of fake news than ever before. And as a new viral deepfake video showed this week, manipulating past, present and future has never been easier.
The video in question was released by BuzzFeed on Tuesday April 17 and at first appears to depict the former US president Barack Obama addressing the nation. He seemingly starts on a sombre note, warning that technology can be used to distort truth by creating videos that make people appear to say “anything at any point in time — even if they would never say those things.”
However, the video suddenly takes a surreal turn when the former president suggests that the villain of the Black Panther movie was right, implies that Ben Carson, the US secretary of housing and urban development, is in the Sunken Place and ends by calling Donald Trump “a total and complete dipshit.”
It’s only then that it’s revealed that it isn’t actually Obama doing the talking but Jordan Peele, the comedian and Oscar-winning director of Get Out. The video had been manipulated to make it appear as if the former president had uttered words he in fact never said. Peele, speaking as Obama, continued: “This is a dangerous time. Moving forward, we need to be more vigilant with what we trust from the internet.”
Commenting on the video for Elite Business, Lyric Jain, founder and CEO of Logically, the intelligent-newsfeed startup, said: “The Trump-Obama video appears to be using a fairly primitive deepfake module to make that video. Recent research has enabled more realistic lip movement for any person after we have a few hours of video footage for any person. They also [used] a professional familiar with Obama's voice and speech patterns but an equally convincing video could be created by modifying a recording of an unskilled impersonator.”
This is not the first time deepfake videos have made headlines. In February Pornhub, the pornography-streaming service, banned videos where the technology had been used to impose the faces of actresses onto the bodies of performers in porn videos with unsettling realism. However, the ban seems to have had little effect, with several deepfake clips still easy to find on the site.
But Lee Munson, security researcher at Comparitech, the tech-testing platform, adds that just like any other technology, deepfake tech isn’t inherently evil. “[It] is the person creating them that decides which direction they go in,” he told Elite Business. “There are genuine uses such as in the mainstream film industry where the ability to recreate characters previously portrayed by deceased actors is concerned.” A great example can be seen in the Star Wars movie Rogue One, where the late actor Peter Cushing was brought back to life with clever use of CGI to reprise his role as Grand Moff Tarkin.
Moreover, Munson argued that the technology’s arrival in the mainstream is also part of the problem, as it has put it in the hands of more malicious individuals. “If that’s not bad enough, the plethora of free apps around now could soon lead to even more nefarious uses of the technology, such as creating videos of world leaders or trendy celebrities spouting idealistic monologues,” he said.
Ending on a more encouraging note, Jain pointed out that there is space for startups to help fight fake news and manipulated videos. “Technology companies can analyse images and videos and detect whether they have been manipulated or are AI-generated,” he said. “But we need to figure out how this information would be passed onto end users. Perhaps a warning or label for any video that has been modified? Only political videos? Significant edits or even aesthetic edits?”
And this service won’t be cheap. “The added layer of complexity with video compared to text is that it is very expensive to analyse and it remains unclear who will foot the bill - verifying videos on YouTube alone would cost tens of millions a day,” Jain concluded.
Just like fake news in general, deepfake videos won’t go away anytime soon. But they also present innovative entrepreneurs with an opportunity to both make a difference and earn quite a few bucks.