Is your coworker a North Korean hacker? How AI impersonation is compromising the workforce
Once they ace an interview using genAI, threat actors use widely available deepfake tools to create fake ID documents and profile photos, often built on the personal information of real US citizens. In KnowBe4's case, the attacker created a deepfake profile photo based on a stock image. By combining this deepfake with the stolen personal information of a real US citizen, the threat actor got past the company's hiring and background check procedures, including four video interviews and visual confirmation of his identity.
Enhance FBI guidance for effective prevention
Clearly, existing hiring security procedures are insufficient to handle this threat. Some companies are experimenting with other methods, but even these fall short. For example, one executive interviewed by the Wall Street Journal said that they ask candidates to show a photo ID on video, which needs to match their face. But deepfake counterfeit IDs, which are now capable of beating Know Your Customer (KYC) software, are more than good enough to pass as real on a grainy video call.
The FBI is continuing to evolve its guidance on North Korean IT workers. Interestingly, parts of it closely echo guidance issued against deepfake and social engineering threats in the healthcare industry. Organizations looking to implement the FBI, DOJ, and Department of Health's guidance can go a step further by using automated identity verification tools to enhance visual identity checks.
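To make the idea concrete, the layered approach described above can be sketched as a simple screening gate that aggregates red flags from automated checks before a human reviewer makes the final call. This is a minimal illustrative sketch, not any vendor's API: the `Candidate` fields and flag rules are assumptions standing in for real document-forensics, liveness-detection, and network-origin checks.

```python
# Hypothetical sketch of layering automated identity checks on top of a
# manual video-interview review. All names and rules are illustrative.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    id_document_verified: bool  # assumed output of a document-forensics tool
    liveness_passed: bool       # assumed output of a liveness/deepfake detector
    ip_country: str             # country the candidate actually connects from
    claimed_country: str        # country the candidate claims to be in

def identity_red_flags(c: Candidate) -> list[str]:
    """Return a list of red flags; an empty list means no automated flag.

    A non-empty list should trigger escalated human review, not an
    automatic rejection, since each check can produce false positives.
    """
    flags = []
    if not c.id_document_verified:
        flags.append("ID document failed automated forensics check")
    if not c.liveness_passed:
        flags.append("liveness check failed (possible deepfake on video)")
    if c.ip_country != c.claimed_country:
        flags.append(
            f"connects from {c.ip_country} but claims {c.claimed_country}"
        )
    return flags
```

The point of the aggregation is that no single check is decisive, as the KnowBe4 case showed; a deepfake ID that beats KYC software may still trip the liveness or network-origin checks when they are combined.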