"Last month one of our journalists received an interesting email. A researcher
had come across mention of a Guardian
article, written by the journalist on a
specific subject from a few years before. But the piece was proving elusive on
our website and in search. Had the headline perhaps been changed since it was
published? Had it been removed intentionally from the website because of a
problem we’d identified? Or had we been forced to take it down by the subject
of the piece through legal means?
The reporter couldn’t remember writing the specific piece, but the headline
certainly sounded like something they would have written. It was a subject they
were identified with and had a record of covering. Worried that there might have
been some mistake at our end, they asked colleagues to go back through our
systems to track it down. Despite the detailed records we keep of all our
content, and especially around deletions or legal issues, they could find no
trace of its existence.
Why? Because it had never been written.
Luckily the researcher had told us that they had carried out their research
using ChatGPT. In response to being asked about articles on this subject, the
AI had simply made some up. Its fluency, and the vast training data it is built
on, meant that the existence of the invented piece even seemed believable to
the person who absolutely hadn’t written it.
Huge amounts have been written about generative AI’s tendency to manufacture
facts and events. But this specific wrinkle – the invention of sources – is
particularly troubling for trusted news organisations and journalists whose
inclusion adds legitimacy and weight to a persuasively written fantasy. And for
readers and the wider information ecosystem, it opens up whole new questions
about whether citations can be trusted in any way, and could well feed
conspiracy theories about the mysterious removal of articles on sensitive
issues that never existed in the first place."
Via Dave Farber.
*** Xanni ***
Chief Scientist, Xanadu
Partner, Glass Wings
Manager, Serious Cybernetics