McAfee Predictions: Deepfakes may bypass facial recognition

While new technologies are intended to make life easier for everyone, they can also be used to threaten security, both digital and physical. Computer security software company McAfee, in its "2020 Threats Predictions Report," highlights the potential rise of deepfakes as a tool for threat actors.

“What’s changed with the advances in artificial intelligence (AI) is you can now build a very convincing deepfake without being an expert in technology,” writes Raj Samani, chief scientist and McAfee Fellow, Advanced Threat Research, in a blog post.

What people should watch out for is the democratization of these technologies, with websites now offering them as "services." All an ordinary internet user needs to do is upload a video and download the finished deepfake.


“Deepfake video or text can be weaponized to enhance information warfare,” said Samani. “Attackers can now create automated, targeted content to increase the probability that an individual or group falls for a campaign. In this way, AI and machine learning (ML) can be combined to create massive chaos.”

The threats are directed not only at politics but may also target corporate figures.

“Adversaries will try to create wedges and divides in society,” said Samani, “or a cybercriminal could have a CEO make what appears to be a compelling statement that a company missed earnings or that there’s a fatal flaw in a product that will require a massive recall. Such a video can be distributed to manipulate a stock price or enable other financial crimes.”

Steve Povolny, head of McAfee Advanced Threat Research, stressed how deepfakes can bypass facial recognition, which is often used in unlocking devices and other use cases.

“Generative Adversarial Networks (GANs) are a recent analytic technology that, on the downside, can create fake but incredibly realistic images, text, and videos,” writes Povolny in a blog post. “Enhanced computers can rapidly process numerous biometrics of a face, and mathematically build or classify human features, among many other applications. While the technical benefits are impressive, underlying flaws inherent in all types of models represent a rapidly growing threat, which cybercriminals will look to exploit.”
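For readers unfamiliar with the adversarial training that Povolny describes, the core idea can be sketched in a few lines. This is not code from McAfee or the report; it is a minimal one-dimensional toy, where a one-parameter "generator" learns to shift noise until it matches real data, while a logistic-regression "discriminator" tries to tell real from fake. All names and numbers here are invented for illustration.

```python
# Toy 1-D GAN: generator produces fake = theta + noise, discriminator is
# D(x) = sigmoid(w*x + b). The two are trained adversarially, so theta
# drifts toward the real data's mean. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)

REAL_MEAN, NOISE_STD = 4.0, 0.5   # "real" samples come from N(4, 0.5)
theta = 0.0                        # generator parameter
w, b = 0.0, 0.0                    # discriminator parameters
lr, batch = 0.05, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(2000):
    real = rng.normal(REAL_MEAN, NOISE_STD, batch)
    fake = theta + rng.normal(0.0, NOISE_STD, batch)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    grad_real = d_real - 1.0           # dLoss/dlogit for label 1
    grad_fake = d_fake                 # dLoss/dlogit for label 0
    w -= lr * np.mean(grad_real * real + grad_fake * fake)
    b -= lr * np.mean(grad_real + grad_fake)

    # Generator step: move theta so D mistakes fakes for real.
    d_fake = sigmoid(w * fake + b)
    theta -= lr * np.mean((d_fake - 1.0) * w)

print(theta)  # should have drifted from 0 toward the real mean of 4.0
```

Real deepfake GANs replace this single parameter with deep convolutional networks and images instead of scalars, but the adversarial loop, a generator and discriminator improving against each other, is the same principle.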