AI-Powered Russian Network Pushes Fake Political News
Security researchers have discovered a major new Russian disinformation campaign using generative AI (GenAI) to “plagiarize and weaponize” content from major news organizations, in a bid to influence Western voters.
Dubbed “CopyCop” by Recorded Future, the network uses large language model (LLM)-powered GenAI to copy content from mainstream media and then introduce political bias as it tailors that content for specific audiences.
Media organizations including Al-Jazeera, Fox News, the BBC, La Croix and TV5Monde are among those impacted. Sometimes legitimate sites are spoofed and hosted on alternative domains such as bbc-uk[.]news, while on other occasions, stories are published on fictitious news sites like London Crier and GB Geopolitics, Recorded Future’s report claimed.
The narratives promoted by CopyCop are designed to support Russian influence objectives by sowing division over the Israel-Hamas conflict, undermining support for Ukraine and criticizing government policy.
Stories seen by Recorded Future include claims that the UK government has criminalized Islam, and that it is planning to introduce a ‘buffer zone’ of NATO countries around Ukraine. Others attempt to drive a wedge between the UK and US governments, the report claimed.
Concerns Over Election Disinformation
Recorded Future said it is particularly concerned about the potential for GenAI-powered content like this to influence voters ahead of key elections in the UK and US later this year.
Recorded Future threat intelligence analyst Clément Briens warned that CopyCop is using AI to achieve “unprecedented reach and effectiveness” in its efforts to influence public opinion.
“Media organizations and government entities should proactively identify and take down influence networks abusing their brand or intellectual property, work with naming authorities like ICANN and hosting providers to block infringing influence domains,” he told Infosecurity.
“Exposing influence operations and communicating accurately about their measured impact or engagement should also be a key priority.”
If CopyCop succeeds, it’s likely that other influence operations will follow a similar GenAI-powered model in future, with profound implications for Western democracy and trust in mainstream media, the report concluded.
Recorded Future emphasized a growing need for improved collaboration between governments, technology companies and civil society to combat the proliferation of disinformation online.