Apple AI falsely claims darts star Luke Littler won before final

And then declares Nadal gay


Apple's new AI-powered news summarisation feature, Apple Intelligence, has once again produced inaccuracies, this time falsely declaring darts player Luke Littler the winner of the PDC World Championship before he had even competed in the final.

The erroneous summary, based on a BBC report about Littler's semi-final victory, confused users of the BBC News app.

Hours later, the AI-driven system made another error, informing some BBC Sport app users that "Brazilian" tennis player Rafael Nadal had come out as gay.

Not only was Spanish tennis legend Rafael Nadal incorrectly identified as Brazilian, but the claim that he had come out as gay was entirely false.

The linked story actually focused on Brazilian tennis player Joao Lucas Reis da Silva and his impact as an openly gay athlete.

These are not isolated incidents.

Last month, Apple Intelligence generated a false headline about a high-profile US murder case, prompting the BBC to formally complain. The AI-generated summary falsely suggested that Luigi Mangione, the man accused of murdering health insurance CEO Brian Thompson in New York, had shot himself.

No such claim was made in any BBC reporting.

"It is essential that Apple fixes this problem urgently - as this has happened multiple times," a BBC spokesperson said.

"As the most trusted news media organisation in the world, it is crucial that audiences can trust any information or journalism published in our name and that includes notifications."

Apple has yet to publicly comment on these latest incidents.

While Apple CEO Tim Cook has previously acknowledged that Apple Intelligence would not be 100% accurate, these repeated errors suggest a serious gap in the system's reliability.

Last month, Reporters Without Borders (RSF) called on Apple to discontinue the feature, citing concerns about the automated production of false information attributed to media outlets.

"AIs are probability machines, and facts can't be decided by a roll of the dice," Vincent Berthier, the head of RSF's technology and journalism desk, said at that time.

"RSF calls on Apple to act responsibly by removing this feature. The automated production of false information attributed to a media outlet is a blow to the outlet's credibility and a danger to the public's right to reliable information on current affairs."

Apple Intelligence, currently available on select iPhones, iPads and Macs, aims to provide users with concise summaries of missed app notifications.

While the feature has accurately summarised some news stories, the recent string of errors highlights the challenges of relying on AI to interpret and condense complex information accurately.

The BBC is not alone in encountering issues with Apple Intelligence.

In November, a journalist from ProPublica shared a screenshot on Bluesky showing another misleading AI-generated summary. The notification inaccurately implied Israeli Prime Minister Benjamin Netanyahu had been arrested, a misinterpretation of an article about an International Criminal Court warrant.

The New York Times, whose articles were summarised, declined to comment.

Professor Petros Iosifidis, a media policy expert at City, University of London, criticised Apple for releasing what he called a "half-baked" product.

"I can see the pressure getting to the market first, but I am surprised that Apple put their name on such demonstrably half-baked product," he said.

"Yes, potential advantages are there - but the technology is not there yet and there is a real danger of spreading disinformation," Iosifidis added.