It’s been speculated that we’re living in the Golden Age of Fraud. Online banking and shopping are the norm, and millions of stolen credit card numbers float around the dark web. Workplace interactions are increasingly moving online as well, with a predictable result: Scam artists are exploring avenues to exploit the remote-work revolution.
The FBI issued a warning June 28 that complaints are rolling in about a scheme in which fraudulent candidates apply for remote-work positions and use deepfake technology to hide their identity during the video interview process. While the reported attempts so far have been foiled, the FBI said it presumes hackers are hoping to secure these positions, gain access to company logins and steal sensitive customer and client information.
Deepfake technology uses artificial intelligence to create convincing synthetic media. With it, a fake job applicant can overlay another person's face on their own and conduct the video interview under an assumed identity. The notion may sound far-fetched, but deepfakes can be alarmingly realistic.
Fraudsters have been targeting mainly tech-heavy jobs, Brian Blauser, a supervisory special agent with the FBI, told HR Dive. These include information technology, engineering, database management and other roles that would likely require access to private information.
Because most industries collect some kind of private data from users — be it financial or identity information — many organizations are likely to be targeted. “Really, any company that has [personally identifiable information] on its system is probably going to be a target at some point down the road,” Blauser said.
Still, there are a few ways HR pros can weed out the frauds and keep their companies safe.
Pay close attention during video interviews
As mentioned, deepfake media can be surprisingly convincing, but there are some tell-tale signs to look out for. In the interviews reported to the FBI, voice spoofing and deepfake audio appear to have been used as well, and something often seems slightly "off." The lip movements of the person on screen may not quite match the audio, like a bad lip sync, and the interviewer may hear a cough or sneeze without seeing it happen on camera, Blauser noted. If something feels a bit too uncanny, interviewers should follow their intuition, he said.
The MIT Media Lab created a website to help people practice deepfake identification. It also published a checklist of video elements to observe closely, drawing attention to fine details such as skin texture, shadows and whether glasses show glare. Companies like Microsoft have also released software that can detect deepfakes.
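For employers with engineering support, even a crude automated check can flag a recording that deserves closer scrutiny. The Python sketch below is a minimal illustration of the lip-sync mismatch Blauser describes: it correlates motion in the lower half of the detected face with the loudness of the speech. It assumes OpenCV and librosa are installed and that the recording's audio track has already been extracted to a WAV file (for example, with ffmpeg); the file names are hypothetical, and a low correlation is only a prompt for a human second look, not proof of a deepfake.

```python
# Naive lip-sync consistency check: a genuine speaker's mouth motion tends to
# track the loudness of their speech; a badly synced deepfake often does not.
# This is an illustrative sketch, not a production detector.
import cv2          # pip install opencv-python
import numpy as np
import librosa      # pip install librosa

VIDEO = "interview.mp4"   # hypothetical recording of the video interview
AUDIO = "interview.wav"   # audio track extracted beforehand, e.g. with ffmpeg

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0   # fall back if metadata is missing
motion, prev = [], None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        motion.append(0.0)
        prev = None
        continue
    x, y, w, h = faces[0]
    mouth = gray[y + h // 2 : y + h, x : x + w]   # lower half of the face
    if prev is not None and prev.shape == mouth.shape:
        motion.append(float(cv2.absdiff(mouth, prev).mean()))
    else:
        motion.append(0.0)
    prev = mouth
cap.release()

# Audio loudness per video frame: RMS energy, hop matched to the frame rate.
y, sr = librosa.load(AUDIO, sr=None, mono=True)
hop = int(sr / fps)
rms = librosa.feature.rms(y=y, frame_length=hop * 2, hop_length=hop)[0]

n = min(len(motion), len(rms))
corr = np.corrcoef(motion[:n], rms[:n])[0, 1]
print(f"mouth-motion / speech-energy correlation: {corr:.2f}")
# A very low or negative correlation over a long clip is one "off" signal
# worth a closer human look; on its own it proves nothing.
```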
Employers can do some additional research and invest in training to help spot deepfake technology, which may continue to have implications for the workplace as it becomes more sophisticated and widespread.
Exercise due diligence in checking documents
With HR sometimes under pressure to recruit quickly, it may be tempting to skip or delay some identity verification steps and rush a new hire through onboarding. That would be a mistake, Blauser said.
Follow through on background checks and the other identity verification processes the company has in place before granting access to secure company databases and client information. If the process turns up discrepancies, like a Social Security number or date of birth that doesn't match as expected, pause the hiring process and make sure the error is an honest one, and has been corrected, before moving forward, he recommended.
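That "pause on mismatch" rule is simple to encode in an onboarding workflow. The sketch below is hypothetical: the field names are made up and there is no real background-check vendor API behind it. It only illustrates the logic Blauser describes: if sensitive identity fields don't line up, hold onboarding until the discrepancy is explained and corrected.

```python
# Hypothetical sketch: cross-check fields a candidate submitted against what a
# background-check provider returned, and halt onboarding on any mismatch.
SENSITIVE_FIELDS = ("legal_name", "date_of_birth", "ssn_last4")

def verify_candidate(submitted: dict, background_check: dict) -> list[str]:
    """Return the fields that don't match; an empty list means proceed."""
    return [f for f in SENSITIVE_FIELDS
            if submitted.get(f) != background_check.get(f)]

candidate = {"legal_name": "Jane Doe", "date_of_birth": "1990-04-12",
             "ssn_last4": "1234"}
report = {"legal_name": "Jane Doe", "date_of_birth": "1990-04-21",
          "ssn_last4": "1234"}

issues = verify_candidate(candidate, report)
if issues:
    # Pause hiring until the discrepancy is explained and corrected.
    print("Hold onboarding; mismatched fields:", issues)
else:
    print("Identity fields consistent; continue the normal process.")
```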
Even paying attention to social media profiles can help. “Go look at their LinkedIn profile and see if it matches [the information provided],” Blauser said. “I know the bad guys can create LinkedIn profiles pretty easily. But you know, if you happen to know somebody in the same network or whatever, reach out to that person [and ask], ‘Hey, do you know this guy or gal?’ Don’t be afraid to ask questions.”
If possible, conduct at least one in-person interview
Finally, employers can sidestep the threat of an online fraudster by requiring at least one in-person interview, or by using in-person interviews as much as possible, Blauser suggested. If a candidate appears to be in the same city or area, requiring an in-person interview may be an easy lift. If the candidate is from outside the region, it may be worth weighing whether the budget allows flying them in. Even mentioning an in-person interview may scare off a potential fraudster.
“That’s obviously going to be problematic for a deepfake — they can’t come in in-person,” Blauser said. “They’re not going to jump to that level of obfuscation of their identity. So that’s going to take care of the problem.”
While the issue of deepfake job applicants is only just emerging, Blauser said, HR pros should be vigilant. “It’s hitting our radar a little more regularly now,” he said. “So we just wanted to try to get in front of it and make the industry aware of it.”