A video clip from a niche YouTube channel circulating online illustrates a very dangerous problem. The video, which fakes former President Donald Trump delivering a speech from the movie Independence Day, features a nearly convincing Trump, with almost-perfect facial expressions and a realistic imitation of the former President's voice. The video covers nothing controversial; it merely puts a film speech in Trump's mouth.
This channel is packed to the brim with these videos, each featuring different politicians and celebrities. Almost all of them are gags, but they still illustrate how dangerously easy such clips are to manufacture. If this channel is run by an amateur animator or computer enthusiast, it is frightening to imagine how convincing a clip an intelligence agency or other government bureaucracy could produce.
The political advantages deepfakes offer authoritarian governments and politicians targeting foreign and domestic enemies could create a unique difficulty for the public: a world in which nothing can be trusted (as cliché and conspiratorial as that sounds).
Any piece of media could be manipulated to gaslight an opponent. A person with malicious intentions could fabricate a video clip and leak it to news organizations or foreign governments to damage those involved, or even to spark a global conflict.
Even worse, the possibilities for blackmail are concerningly limitless. Inserting somebody into a compromising situation without their consent and demanding a ransom could leave a victim with little recourse. Their life could be destroyed either way: through public embarrassment if the video is released, or through monetary loss from trying to satisfy the blackmailer.
Victims could also fall prey to phishing scams. Videos could feature famous influencers and businesspeople such as Elon Musk or Jeff Bezos promising money in exchange for a small payment from the victim. As explained by Norton, an online security and technology company, “Many of these scams rely on an audio deepfake. Audio deepfakes create what are known as ‘voice skins’ or ‘clones’ that enable them to pose as a prominent figure. If you believe that voice on the other line is a partner or client asking for money, it’s a good idea to do your due diligence.”

And this has been done before. As reported by The Washington Post, “Thieves used voice-mimicking software to imitate a company executive’s speech and dupe his subordinate into sending hundreds of thousands of dollars to a secret account, the company’s insurer said, in a remarkable case that some researchers are calling one of the world’s first publicly reported artificial-intelligence heists.”
As deepfakes become easier to produce, making the public not only aware of them but also of how to combat them will be necessary. Tech giants, and possibly governments, will need to investigate ways to counter these malicious forms of media. Ignoring what is clearly on the horizon will only leave humanity unprepared for what will follow the age of information: the age of disinformation.