Scammers Create “AI Hologram” of C-Suite Crypto Exec

Fraudsters used deepfake technology to impersonate a senior Binance official in online meetings with clients, the crypto exec has claimed.
Binance claims to be the world’s largest cryptocurrency exchange by daily trading volume, making it a popular target for threat actors.
Chief communications officer (CCO) Patrick Hillmann said in a new blog post that he was shocked at the “layers of security protocols” new starters had to navigate during onboarding.
“Despite having previously led one of the world’s largest cybersecurity teams and managed some of the largest data breaches in history (US OPM, Ashley Madison, etc.), I was not prepared for the onslaught of cyber-attacks, phishing attacks, and scams that regularly target the crypto community. Now I understand why Binance goes to the lengths it does,” he explained.
“However, criminals will almost always find a way to adapt to and circumvent even the most secure system. Over the past month, I’ve received several online messages thanking me for taking the time to meet with project teams regarding potential opportunities to list their assets on Binance.com. This was odd because I don’t have any oversight of or insight into Binance listings, nor had I met with any of these people before.”
It transpired that fraudsters had impersonated Hillmann using AI-based technology, he claimed.
“It turns out that a sophisticated hacking team used previous news interviews and TV appearances over the years to create a ‘deep fake’ of me,” he said.
“Other than the 15 pounds that I gained during COVID being noticeably absent, this deepfake was refined enough to fool several highly intelligent crypto community members.”
The scam appears to be one of the first reported attempts to use deepfake technology to impersonate an individual over live video. Until now, most such attacks have relied on audio to trick their victims.
The FBI warned earlier this year of scammers inviting enterprise employees to online collaboration meetings. When victims joined, they were met by an apparently "frozen" image of their host, accompanied by deepfake audio urging them to wire money to an external bank account.