Who is this person?
If you had to place her, you might say she was probably American. Perhaps she looks a little like someone you know.
Look again. There's something unusual, right?
Is it that her eyes are different colors? Maybe, but so were David Bowie's. But now that you're looking at her eyes, there's definitely something strange about her eyelashes.
They don't quite seem to join up.
And her top set of pearly white teeth is let down by what looks like a strange, perhaps decayed, lower set.
No, I'm not trying to bully or shame some innocent woman. The truth is, this person does not exist.
The image was generated by a form of machine learning called a generative adversarial network (GAN). You can see hundreds of examples on the website This Person Does Not Exist, and even help to train the AI by providing text descriptions of the images.
But, be honest. Would you have given this image a second thought? No, right? And that's at the heart of a new challenge for anyone hoping to verify identity on the internet.
Convincing AI fakes don't stop at still images. Tools such as Deep Fake make it possible to swap faces in real-time between two videos, have it appear that someone said things they never did, and even replace every actor in the sitcom Friends with Nicolas Cage … or put Nicolas Cage in just about any movie you like.
Actually, there's a lot of putting Nic Cage's face on other people.
So, what does this mean for verifying the identity of people we do business with online?
Where We Use Video in Delivering Services
Video is becoming a standard part of how companies deliver services. Let's take three industries:
Healthcare: telemedicine can make healthcare available to many more people
Finance: video calling means that getting expert help filing taxes or applying for financial products can happen without making a special trip downtown
Education: techniques such as AR, along with straightforward video, can make location irrelevant when it comes to delivering education.
In each case, authentication is crucial. Patients must be able to trust that they're really speaking to their doctor. Financial institutions have strict Know-Your-Customer obligations. In face-to-face education, impostors taking exams on behalf of students is already a problem; technologies like Deep Fake could make that an even greater problem for remote education.
Tackling the threat of fake images and video takes a three-part plan, covering human, technical, and process aspects.
The Human Role in Fighting Fakes
The AI-generated image at the top of this article is somewhat easy to spot as a fake for anyone who knows what to look for. Similarly, in the faked video of President Obama, there's something off about the way his mouth moves and how his body language doesn't quite tie up with what he appears to be saying.
Today, the technology is new enough that humans can spot its rough edges. So, the first step is to train contact center agents and other staff whose role involves video calls with outside parties.
This isn't a full solution, though. Early Deep Fake videos shared a tell-tale sign: the faked people never blinked. Not long after a researcher pointed this out, Deep Fake video creators ensured their videos included blinking. Just as with email spam, there's a risk that this will turn into an arms race that will make it much harder to tell fake from legitimate.
When it comes to video used in customer communication, the good news is that organizations can make technology choices to reduce the risk.
How Technology Can Fight Fakes
Many fake videos seen online are uploaded to video sharing sites such as YouTube. However, when it comes to customer communication, video takes place through specialist video APIs such as our own OpenTok.
A key element in reducing fakes is being sure that no one is interfering with the video stream between the customer and your organization's systems. OpenTok uses end-to-end encryption of the video, meaning that a so-called "man in the middle" attack can't hijack the video stream and insert faked content. For additional certainty, your developers can insert unique metadata into the video stream using OpenTok's Frame Metadata API, which your developers can verify on arrival to make sure it hasn't been tampered with.
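To make the frame-metadata idea concrete, here's a minimal sketch, independent of any particular SDK, of one way a per-frame payload could be built and verified: the sender embeds a frame counter plus a truncated HMAC tag, and the receiver checks the tag and rejects replayed frames. The shared secret, the 24-byte payload size, and the function names are illustrative assumptions, not OpenTok's actual Frame Metadata API; check the SDK documentation for the real interface and size limit.

```python
import hashlib
import hmac
import struct

# Assumption: a per-session secret shared between your app and backend,
# provisioned out-of-band (how you distribute it is a deployment decision).
SHARED_SECRET = b"replace-with-a-real-per-session-secret"

def make_frame_metadata(frame_counter: int) -> bytes:
    """Build a small metadata blob: an 8-byte counter plus a 16-byte
    truncated HMAC-SHA256 tag over that counter (24 bytes total)."""
    counter = struct.pack(">Q", frame_counter)
    tag = hmac.new(SHARED_SECRET, counter, hashlib.sha256).digest()[:16]
    return counter + tag

def verify_frame_metadata(blob: bytes, last_counter: int) -> bool:
    """Verify the tag and reject replayed or out-of-order frames."""
    if len(blob) != 24:
        return False
    counter_bytes, tag = blob[:8], blob[8:]
    expected = hmac.new(SHARED_SECRET, counter_bytes, hashlib.sha256).digest()[:16]
    if not hmac.compare_digest(tag, expected):
        return False
    (counter,) = struct.unpack(">Q", counter_bytes)
    return counter > last_counter
```

A strictly increasing counter means an attacker can't splice previously captured, legitimately tagged frames back into the stream: each blob verifies only once.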
In the near future, we can also expect to see commercial products that use machine learning to spot tell-tale signs of faked video that might be too subtle for humans to see. Researchers in Germany and Italy have already begun the groundwork.
The third strand in fighting fakes lies in strong authentication at registration and sign-in.
Process Changes Necessary to Fight Fakes
It's not enough to rely only on text passwords for authenticating users when they log into your systems.
Passwords are all too easily compromised, whether because someone reused the same password across multiple sites and one of them was breached, or because people can be tricked into revealing their passwords to anyone who asks.
Adding two-factor authentication (2FA) to your user sign-in and registration process makes it more secure by reducing reliance on the password alone. Using the Verify API from Nexmo, the Vonage API platform, your developers can build 2FA into your systems with minimal effort.
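As a sketch of what that two-step flow looks like from a developer's point of view — request a one-time code, then check what the user types back — here's a minimal Python example. The endpoints follow the Verify API's documented pattern, but treat the exact request and response fields as assumptions to confirm against the current reference; the injectable `post` function is just a convenience so the flow can be exercised without network access.

```python
import json
import urllib.parse
import urllib.request

NEXMO_API_KEY = "your-api-key"        # assumption: from your Nexmo dashboard
NEXMO_API_SECRET = "your-api-secret"

def _post_json(url, data):
    """POST form-encoded data and parse the JSON response."""
    body = urllib.parse.urlencode(data).encode()
    with urllib.request.urlopen(url, body) as resp:
        return json.load(resp)

def start_verification(phone_number, brand, post=_post_json):
    """Step 1: ask the Verify API to send a one-time code to the user."""
    result = post(
        "https://api.nexmo.com/verify/json",
        {
            "api_key": NEXMO_API_KEY,
            "api_secret": NEXMO_API_SECRET,
            "number": phone_number,
            "brand": brand,  # name shown in the message, e.g. "AcmeBank"
        },
    )
    return result["request_id"]

def check_code(request_id, code, post=_post_json):
    """Step 2: confirm the code the user typed back in."""
    result = post(
        "https://api.nexmo.com/verify/check/json",
        {
            "api_key": NEXMO_API_KEY,
            "api_secret": NEXMO_API_SECRET,
            "request_id": request_id,
            "code": code,
        },
    )
    return result["status"] == "0"  # status "0" means the code matched
```

The `request_id` returned by the first call ties the later check to the original request, so the code a user enters can only confirm the verification it was sent for.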
Similarly, you can strengthen your sign-up process by verifying that the information people give you checks out. Using Nexmo's Number Insight API, you can check whether the phone number a new customer provides connects to a cell phone, landline, or VoIP endpoint. You can even check whether the number was ported from one provider to another and get other data that will help you build a picture of how genuine the rest of their information is.
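A minimal sketch of that kind of check might look like the following, which queries the Number Insight Standard endpoint and pulls out a couple of risk signals. The response field names follow the Number Insight documentation's general shape, but treat the exact schema as an assumption to confirm against the current reference; the "flag VoIP numbers" rule is only an example policy, and the injectable `fetch` function is there so the logic can be tested without network access.

```python
import json
import urllib.parse
import urllib.request

NEXMO_API_KEY = "your-api-key"        # assumption: from your Nexmo dashboard
NEXMO_API_SECRET = "your-api-secret"

def _get_json(url, params):
    """GET a URL with query parameters and parse the JSON response."""
    query = urllib.parse.urlencode(params)
    with urllib.request.urlopen(f"{url}?{query}") as resp:
        return json.load(resp)

def assess_number(phone_number, fetch=_get_json):
    """Query Number Insight and summarize signals for a sign-up review."""
    data = fetch(
        "https://api.nexmo.com/ni/standard/json",
        {
            "api_key": NEXMO_API_KEY,
            "api_secret": NEXMO_API_SECRET,
            "number": phone_number,
        },
    )
    network_type = data.get("current_carrier", {}).get("network_type")
    return {
        "network_type": network_type,          # e.g. "mobile", "landline", "voip"
        "ported": data.get("ported"),          # was the number moved between carriers?
        "suspicious": network_type == "voip",  # example policy: flag VoIP numbers
    }
```

In practice you'd feed signals like these into a broader risk score rather than rejecting sign-ups outright, since plenty of legitimate customers use VoIP or ported numbers.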
You Can Fight Fakes
Depending on your business, you might feel that faked video is unlikely to affect your staff.
However, many of the steps involved in fighting fakes are actually just good data security hygiene. For example, two-factor authentication should be part of your process regardless of whether you use video calling or never plan to. And training your team to be vigilant about faked video can become just another part of your standard security awareness training.
Whatever your plan to address the potential of faked video in customer communication, one thing is for sure: next time you're thinking, "I didn't know Nicolas Cage was in this movie," maybe the answer is that he never was.