Is your coworker a North Korean hacker? How AI impersonation is compromising the workforce

Once they ace an interview using genAI, threat actors then use widely available deepfake tools to create fake ID documents and profile photos, often built on the personal information of real US citizens. In KnowBe4’s case, the attacker created a deepfake profile photo based on a stock image. By combining this deepfake with the stolen personal information of a real US citizen, the threat actor got past the company’s hiring and background check procedures, including four video interviews and visual confirmation of the candidate’s identity.

Enhancing FBI guidance for effective prevention

Clearly, existing hiring security procedures are insufficient to handle this threat. Some companies are experimenting with other methods, but even these fall short. For example, one executive interviewed by the Wall Street Journal said that they ask candidates to hold a photo ID up to the camera and confirm it matches their face. But deepfake counterfeit IDs, which are now capable of beating Know Your Customer (KYC) software, are more than good enough to pass as real on a grainy video call.

The FBI is continuing to evolve its guidance on North Korean IT workers. Notably, elements of it closely echo guidance issued against deepfake and social engineering threats in the healthcare industry. Organizations looking to implement the FBI, DOJ, and Department of Health’s guidance can go a step further by using automated identity verification tools to enhance visual identity checks.
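To make the idea concrete, here is a minimal sketch of the matching step such an automated verification tool might perform: compare a face embedding extracted from the submitted ID photo against one extracted from a live video frame, and flag the candidate when the two diverge. The embedding extraction itself (done by a face recognition model) is assumed and out of scope here; the vectors and the 0.8 threshold below are purely illustrative, not taken from any specific vendor's product.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(id_embedding, live_embedding, threshold=0.8):
    """True if the ID photo and the live capture appear to show
    the same face, under an illustrative similarity threshold."""
    return cosine_similarity(id_embedding, live_embedding) >= threshold

# Toy embeddings standing in for real model output (hypothetical values).
id_vec = [0.12, 0.80, 0.55, 0.07]
live_match = [0.10, 0.82, 0.53, 0.09]
live_other = [0.90, 0.05, 0.10, 0.70]

print(same_person(id_vec, live_match))  # near-identical vectors -> True
print(same_person(id_vec, live_other))  # dissimilar vectors -> False
```

In practice the comparison would be one signal among several (liveness detection, document forensics, database checks); the point of the sketch is that the visual check becomes a measurable score rather than a human glance at a grainy video call.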


