AI chatbots distort the news, BBC finds – see what they get wrong
![AI chatbots distort the news, BBC finds – see what they get wrong](https://www.zdnet.com/a/img/resize/8d1ec09dfe0a07ac412d23c1f0c5bc0a260e377b/2025/02/12/4dd471e9-dade-4d22-a39e-ba8ef49b7f3a/news5gettyimages-171588907.jpg?auto=webp&fit=crop&height=675&width=1200)
Four major AI chatbots are churning out “significant inaccuracies” and “distortions” when asked to summarize news stories, according to a BBC investigation.
OpenAI’s ChatGPT, Microsoft’s Copilot, Google’s Gemini, and Perplexity AI were each presented with news content from BBC’s website and then asked questions about the news.
The report details that the BBC asked chatbots to summarize 100 news stories, and journalists with relevant expertise rated the quality of each answer.
Also: Why Elon Musk’s $97 billion bid for OpenAI could disrupt Sam Altman’s plans
According to the findings, 51% of all AI-produced answers about the news had significant issues, while 19% of the AI-generated answers “introduced factual errors, such as incorrect factual statements, numbers, and dates.”
Additionally, the investigation found that 13% of the quotes attributed to BBC articles were either altered from the original source or not present in the cited article at all.
Last month, Apple was criticized for its AI feature, Apple Intelligence, which was found to be misrepresenting BBC news reports.
Deborah Turness, CEO of BBC News and Current Affairs, responded to the investigation’s findings in a blog post: “The price of AI’s extraordinary benefits must not be a world where people searching for answers are served distorted, defective content that presents itself as fact. In what can feel like a chaotic world, it surely cannot be right that consumers seeking clarity are met with yet more confusion.”
Errors highlighted in the report included the following:
- ChatGPT claimed that Hamas chairman Ismail Haniyeh was assassinated in December 2024 in Iran, when in fact he was killed in July 2024.
- Gemini stated that the National Health Service (NHS) “advises people not to start vaping and recommends that smokers who want to quit should use other methods.” This statement is incorrect. In fact, the NHS does recommend vaping as a method to quit smoking.
- Perplexity misquoted a statement from Liam Payne’s family after his death.
- ChatGPT and Copilot both misstated that Rishi Sunak, the former UK prime minister, and Nicola Sturgeon, the former Scottish first minister, were still in office.
Also: Crawl, then walk, before you run with AI agents, experts recommend
According to the BBC investigation, Copilot and Gemini had more inaccuracies and issues overall than OpenAI’s ChatGPT and Perplexity.
Furthermore, the report concluded that factual inaccuracies weren’t the only concern about the chatbots’ output; the AI assistants also “struggled to differentiate between opinion and fact, editorialized, and often failed to include essential context.”
“Publishers should have control over whether and how their content is used, and AI companies should show how assistants process news along with the scale and scope of errors and inaccuracies they produce,” Pete Archer, BBC’s program director for generative AI, explained in the report.
Also: Cerebras CEO on DeepSeek: Every time computing gets cheaper, the market gets bigger
A spokesperson for OpenAI emphasized the quality of ChatGPT’s output: “We support publishers and creators by helping 300 million weekly ChatGPT users discover quality content through summaries, quotes, clear links, and attribution.” The spokesperson added that OpenAI is working with partners “to improve in-line citation accuracy and respect publisher preferences to enhance search results.”