FBI: Bad Actors Weaponizing Deepfakes for Remote Job Hires
Deepfakes (AI-generated or AI-assisted videos that falsify real people) are a known quantity even in mainstream media – particularly since writer and director Jordan Peele’s 2018 showcase of how believable the technology had become, animating a digital likeness of former president Barack Obama. While the tech made its first forays in the Internet’s underground, the increasing ease with which bad actors can weaponize it is raising alarms across most sectors – or at the very least, it should be.
In its warning about deepfake-assisted applications for remote work, the FBI said bad actors have been prioritizing positions related to IT, programming, database maintenance, and other software-related functions. The idea, it seems, is to gain access to customer information, corporate financial data, and proprietary information. Of course, one also has to consider the “old” technique of simply infecting a company’s infrastructure with malware (more likely ransomware, according to the latest trends) that could wreak havoc on a company’s prospects.
Deepfake technology still isn’t at the level required to fool everyone – especially the most attentive viewers, who look for de-synced lip movements between the “animation” and its audio, or odd inconsistencies in skin coloration – but the technology is sure to improve. Companies are already developing AI-chasing AIs that can detect the use of deepfakes. It’s a case of a dog chasing its own tail – and who knows which side will win in the end.