Imagine someone in your billing department gets an urgent call. The CEO is requesting an immediate transfer of funds into a bank account due to an emergency with a very important client. The voice is convincing, and the client is real. Accounting does what it does best—transfers hundreds of thousands of dollars in record time. Shortly after, Accounting gets another call from the CEO demanding to know why so much money is missing. The CEO claims that he or she never requested the transfer, and by now, that money is long gone.
This terrifying scenario is more than imagination; it happens all the time thanks to an easy-to-use, widely available technology known as deepfakes.
Deepfakes allow relatively unskilled hackers, disgruntled employees, or even misguided managers to create convincing, but fake, audio and video for whatever nefarious purposes they can imagine. I recently spoke with Natalie A. Pierce, Shareholder, Co-Chair, Robotics, AI and Automation Practice Group at Littler Mendelson P.C., to shed some more light on this topic.
What Are Deepfakes?
Pierce defines deepfakes as “computer generated and manipulated images and video … that is allowing AI to engage in unguided and realistic extemporaneous written dialogue and neural language content generation.” In other words, deepfake technology can produce frighteningly realistic audio and video of events that never happened at all. Take a look at this video to see the terrible potential of this technology for yourself (the first “transformation” takes place around 53 seconds in).
Watching an actor transform into other people seamlessly is very unsettling. Pierce agreed, stating, “It is very disturbing, and it is absolutely getting better and better and more difficult to detect.” Experts on the cutting edge of AI and machine learning have been developing tools like these for a long time, and many of them are open sourced and released on the Internet. Creators of deepfakes can simply train the software with images, audio, and video of the impersonator and the impersonated, and the software turns one into the other.
Perhaps the most concerning aspect of this technology is that virtually anyone can create deepfakes with little more than a few hours spent installing and configuring software, plus access to decent computing power (the kind most gaming PCs provide). Pierce cautions us, “It’s really important to understand how easy and available this insidious technology is.” With a little bit of digging, I was able to locate a guide on exactly how to create deepfakes, complete with links to all the necessary software.
Security Leaders Should Be Very Concerned
The potential of this software should not be lost on security professionals. Consider these scenarios.
- Jealous Employee. An employee acts on jealousy of a coworker by creating a convincing deepfake video of the coworker engaging in illegal activity, like doing drugs at work. HR fires that coworker immediately. Then, that coworker takes the company to court for wrongful termination by proving that the video is fake.
- Angry Manager. A manager is unhappy that one of his or her employees left for another job. The manager creates audio of that employee badmouthing the new employer and sends it to that company. Imagine the legal fees the new employer would rack up hiring experts to prove the audio was doctored. Who would foot that bill?
- Miffed Ex-Employee. An employee is fired for completely legitimate reasons. The employee is upset and turns to deepfakes, creating a realistic video of the firing manager making racist remarks. Without knowledge of deepfakes, the courts have no choice but to, at best, approve unemployment payments and, at worst, reinstate the employee.
Unfortunately, the extent to which bad actors can use this technology to create nightmare situations for security professionals is limited only by their imagination.
Pierce reinforces the concern when she says, “From an employer’s standpoint, deepfakes are already costing companies millions of dollars. And this technology is just proliferating.”
The Destructive Power Can Last Forever
The unfortunate truth is that, sometimes, even the specter of a false allegation can permanently destroy people’s lives. The fake news that stormed the Internet during the last election provides a real example. Remember how #Pizzagate began when reports emerged that Hillary Clinton led a child-trafficking ring out of Comet Ping Pong? The story was never true, but the damage had already been done. Not long after, a young man fired a gun into that location. Luckily, no one was hurt, but people could have been, and the life of at least one individual (the shooter) changed forever.
I brought up this point to Pierce, and she agreed, stating, “Once something goes viral you can’t undo the damage in time … this insidious tool threatens both our democratic society and our system of justice because people may choose to accept or reject what they see based on their belief system. And I think that’s very dangerous. Not only for employers, obviously, but for the entire society.”
What Is the Remedy?
Awareness is the first defense. Pierce says, “You certainly have to have someone standing by and thinking of real ways you can really tell your employees that there is fake content out there and that the deepfakes exist.” Getting that message out can help employees scrutinize emergency requests from CEOs and be wary of audio or video that seems accusatory. Similar approaches have helped many organizations combat phishing, and they can help here, too. A good place to start is showing your employees videos like the one above.
As for how to detect such deepfakes, Pierce says, “We’re telling people you have got to have a data analytics team that is prepared to help authenticate. There are things you can do to help with verification like cryptographic key signing or call logs or other means of verification to ensure that you can trust what you are seeing—and this is where a data analytics team can really help.”
A team of experts can do many things to help with verification. For example, says Pierce, they can “use fingerprinting by embedding cryptographic key signings in video communications in the form of an imperceptible watermark on a potential video or audio.” Experts could then analyze new video or audio for those fingerprints to help authenticate or debunk them.
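The verification idea behind this kind of fingerprinting can be sketched in a few lines. This is a minimal illustration, not an imperceptible watermark: it signs the raw media bytes with an HMAC and stores the fingerprint in a sidecar file, so later tampering is detectable. The key name and sidecar convention here are hypothetical, assumed for the example.

```python
import hmac
import hashlib
from pathlib import Path

# Hypothetical key; in practice this would live in a secrets manager.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def fingerprint(media_path: str) -> str:
    """Compute an HMAC-SHA256 fingerprint over the raw media bytes."""
    data = Path(media_path).read_bytes()
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def sign(media_path: str) -> None:
    """Record the fingerprint in a sidecar .sig file next to the media."""
    Path(media_path + ".sig").write_text(fingerprint(media_path))

def verify(media_path: str) -> bool:
    """Return True only if the media bytes still match the recorded fingerprint."""
    sig_file = Path(media_path + ".sig")
    if not sig_file.exists():
        return False  # unsigned media cannot be authenticated
    return hmac.compare_digest(sig_file.read_text(), fingerprint(media_path))
```

A real deployment would embed the signature inside the video stream itself (the “imperceptible watermark” Pierce describes) rather than alongside it, but the verification logic follows the same compute-and-compare pattern.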
Not every organization can afford to employ an entire team of experts. What can they do? Pierce offers some hope for these organizations. She says, “The good news is that as we see the greater proliferation of these deepfakes, you are also going to see companies coming up with solutions that are scalable even for smaller employers. I think that the technology … is going to become more affordable and more easily attainable. So your smaller employers are going to be able to have a team of data analysts available to them.”