(ColdFusion, 2018)
Oh, the joy of seeing Nic Cage’s face swapped with Harrison Ford’s face in a scene from the movie Indiana Jones. Or seeing Steve Buscemi’s face replacing Jennifer Lawrence’s face at the Golden Globes. Harmless, hilarious fun, right?
Perhaps. But is the entertainment value of this hilarity worth tolerating a sophisticated video technique that could lead to extortion, revenge porn, and disruption of political discourse in our country?
That is a First Amendment question raised in Professor Nina Brown’s lecture at Syracuse University entitled, “Deepfakes & the Law.” As a First Amendment scholar, Professor Brown is studying the effect of this emerging technology on the legal and political systems in our country and abroad.
According to Professor Brown, “Deepfakes” is not a synonym for “fake news” or fake videos generally. It is a term that describes a new, highly advanced technology that gives practically anyone the ability to create fabricated videos that distort reality and may have dire consequences.
This technology relies on “neural networks” that collect facial-tracking data and use machine-learning algorithms to replace one face with another. The more data, the better. So a person with a plethora of video or facial data available, say a president or an actor, makes a better target than someone with a small footprint in the digital world.
(ColdFusion, 2018)
And this technology, which was once very expensive, time-consuming, and used mainly by Hollywood film producers, is now very cheap and available to almost anybody. The recent release of “FakeApp” allows a layperson with minimal technical expertise to create fabricated videos similar in quality to those only Hollywood could afford to produce just a few years ago.
In her talk, Professor Brown weighed the “good” uses of deepfake technology against the “bad” ones. The good include, as discussed, the entertainment value and the ability of anyone to create funny fabricated videos to entertain friends or even to promote brands. As a creative director in advertising, I can see the creative value of having this tool in my toolbox to create scenes resembling those of Forrest Gump talking to Presidents Kennedy and Johnson in the movie. The possibilities are endless for a brand or an ad campaign.
However, balance the good against the bad uses, which include the ability to create revenge porn by placing the face of an ex on the body of an adult-film actor. Or consider using such a fabricated video to extort a public figure who may find its release, although fake, damaging to his or her reputation.
Or, even worse, altering images of political candidates to make them appear to be doing or saying something that they are not. Professor Brown presented a convincing video created by Jordan Peele that put Peele’s words into President Obama’s mouth. Literally. The intent of such videos would be to disrupt political discourse in America, and they could lead to the public’s inability to trust anything it sees or hears in the news.
What makes this situation even worse is the public’s power to share just about anything with the click of a mouse, and its desire to share quickly without verifying the authenticity of the material. The general public’s low media literacy when it comes to questioning what is real and what is fake only amplifies the danger. (To understand my POV on media literacy, see my blog post, “Breaking News: This Exclusive Report Is Completely Fabricated.”)
So, who is to blame for this situation? Hollywood? The news media? Digital hackers? Foreign actors who seek to manipulate our system? Nic Cage? Possibly.
But how about looking in the mirror? Is it a citizen’s duty to think before sharing? Would you spread a lie that you knew was a lie? Not likely. But too many citizens readily share fake videos without taking a moment to think about whether they may have been fabricated or not. If it’s funny, or supports your political views, it’s easier to share than to investigate. Perhaps a new Ad Council campaign needs to be created with the theme line, “Think before you click.”
Possible solutions to this problem fall into the technological and legal arenas. As technology grows more and more sophisticated, Deepfakes keep getting better and better. Soon the only way to tell a real video from a fake may be to compare the digital data of the original source material with that of the video in question. Blockchain-based provenance solutions are also being explored.
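To give a sense of what “comparing the digital data” can mean in its simplest form: if a publisher releases a cryptographic fingerprint (hash) of the original file, anyone can recompute it on a copy they receive and see whether even one byte has been altered. This is a minimal sketch, not any specific vendor’s system; the file contents below are hypothetical placeholders.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of a video file's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical example: the publisher's original vs. a circulating copy.
original = b"...original video bytes..."
tampered = b"...same video with one face swapped..."

# The publisher releases this hash alongside the authentic video.
published_hash = fingerprint(original)

# A viewer recomputes the hash of whatever copy reached them.
print(fingerprint(original) == published_hash)  # True: untouched copy
print(fingerprint(tampered) == published_hash)  # False: altered copy
```

Of course, this only works if the authentic hash is distributed through a trusted channel, which is exactly the gap that the blockchain provenance efforts mentioned above aim to fill.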
The legal landscape is a little trickier. Since the First Amendment protects free speech in just about any form, it would take an act of Congress to challenge Deepfakes and hold their creators responsible. And even if such a law were to pass, foreign actors and anonymous creators on the dark web would still be able to create and propagate these videos.
For now, internet service providers (ISPs) are not liable for spreading information uploaded to their sites. But that may change if Congress acts against Deepfakes. Social media sites like Facebook are now incentivized to police themselves before laws are passed that may not be in their best interests.
But there is a “bright side,” according to an article by CNET: “Videos with your face on somebody else’s body aren’t as tantalizing to bad guys as, say, creating political chaos” (Solsman, 2019).
True, but ultimately the general public has to be aware of the dangers.
According to CNET, “the ultimate threat of deepfakes isn’t how sophisticated they can get. It’s how willingly the public will accept what’s fake for the truth — or believe somebody’s false denial because who even knows what’s true anymore?” (Solsman, 2019).
Considering the current state of Media Literacy and our willingness to share anything without verifying, I’m afraid Deepfakes will thrive. And we may all be in deeptrouble.
Brown, N. (2019, March 29). Deepfakes & the Law. Lecture presented at Syracuse University Immersion, Syracuse, New York.
ColdFusion. (2018, April 28). Deepfakes – Real Consequences. Retrieved April 3, 2019, from https://www.youtube.com/watch?v=dMF2i3A9Lzw&t=48s
Solsman, J. E. (2019, April 3). Deepfakes may ruin the world. And they can come for you, too. Retrieved April 3, 2019, from https://www.cnet.com/news/deepfakes-may-ruin-the-world-and-they-can-come-for-you-too/