UK Newspaper Discovers ChatGPT Has Been Referencing Non-Existent News Articles

Charles Oladimeji
References to Guardian news articles have been circulating despite the fact that no journalist affiliated with the news organization, anywhere in the world, ever wrote or published them; the citations were generated by ChatGPT and presented as real.
ChatGPT on OpenAI’s Website. Photo: Rolf Van Root | Unsplash

In recent weeks, ChatGPT and OpenAI have faced a range of challenges, including issues surrounding privacy, security, and parental guidance. Unfortunately, this is not the first time ChatGPT has referenced non-existent news articles, a growing problem for the journalism industry. Recently, reporters at the Guardian discovered that ChatGPT had invented several articles and bylines that were never published by the news organization or any of its journalists.

The discovery was made when a researcher came across a reference to a Guardian article on a specific topic, supposedly written by one of the paper's journalists several years earlier. The researcher emailed the journalist in question to ask about the piece. The journalist initially assumed the article could not be found either because its headline had been changed or because it had been removed from the internet for legal reasons.

Upon further investigation, it emerged that the article had never been published at all; ChatGPT had fabricated the reference and presented it as real news. The invented piece closely matched the topics and tone of the journalist's actual work, which added to the confusion and made the discovery all the more surprising.

Chatbot Hallucinations 

ChatGPT Generating Answers to a Question. Photo: Airam Datoon | Pexels

Chatbot hallucinations occur when an AI tool generates confident, plausible-sounding output, such as facts, quotes, or citations, that has no basis in any real source. The efforts of legitimate news sources are undermined when chatbots spread unconfirmed news. In most cases, chatbots like ChatGPT cannot reliably distinguish between real and fabricated material when producing answers to users' queries; they generate text that reads as plausible rather than text that has been verified.
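One basic safeguard against a fabricated citation is simply checking whether the cited page exists. The snippet below is a minimal sketch in Python, assuming the chatbot's citation included a full URL; the example URL is hypothetical, and a real newsroom check would also need to handle redirects, paywalls, and sites that block automated requests.

```python
import urllib.error
import urllib.request

def citation_exists(url: str, timeout: int = 10) -> bool:
    """Return True if the cited URL resolves to a live page, False otherwise."""
    # Some sites reject requests without a User-Agent, so send a simple one.
    request = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "citation-check/0.1"}
    )
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.HTTPError, urllib.error.URLError):
        # A 404, DNS failure, or timeout all count as "could not verify".
        return False

# Hypothetical example: a URL quoted back by a chatbot.
cited_url = "https://www.theguardian.com/technology/2021/jan/01/example-article"
print("Citation resolves:", citation_exists(cited_url))
```

A failed lookup does not prove the article never existed, but it flags the citation for a human to investigate before it is repeated.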

ChatGPT has fabricated citations to the Guardian more than once, and the problem persists because those citations lend an air of authority to articles that do not exist. In a statement, Chris Moran, the Guardian's head of editorial innovation, said, “If the misattributions continue, it could well fuel conspiracy theories about the mysterious removal of articles on sensitive issues that never existed in the first place.”

He further added, “The invention of sources is particularly troubling for trusted news organizations and journalists whose inclusion adds legitimacy and weight to a persuasively written fantasy.”

The Effects Of ChatGPT On The Media

Journalist Writing a News Article. Photo: Tran Mau Tri Tam | Unsplash

ChatGPT has been adopted by users around the world, yet generative AI remains opaque, and it is easy to lose sight of how any given answer was produced. Academics across multiple disciplines are studying the technology's concepts and implications, and the pace of development is rapid. Companies with large existing market shares, such as Microsoft, have integrated ChatGPT into some of their products to maintain a competitive edge.

Chatbots like ChatGPT are undeniably useful; however, there is still room for improvement in how they are used and how they present the information they surface. The Guardian has raised a crucial question: “What can this technology do right now, and how can it benefit responsible reporting at a time when the wider information ecosystem is already under pressure from misinformation, polarization, and bad actors?”
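One concrete form responsible use could take is checking a claimed citation against the publisher's own archive rather than taking the chatbot's word for it. The sketch below is a rough illustration only: it assumes the Guardian's public Content API as documented at open-platform.theguardian.com, with its search endpoint, `q` query parameter, and `api-key` parameter, and the headline being checked is hypothetical.

```python
import json
import urllib.parse
import urllib.request

GUARDIAN_SEARCH = "https://content.guardianapis.com/search"

def search_guardian(headline: str, api_key: str = "test") -> list:
    """Search the Guardian Content API for articles matching a claimed headline."""
    query = urllib.parse.urlencode({"q": headline, "api-key": api_key})
    with urllib.request.urlopen(f"{GUARDIAN_SEARCH}?{query}", timeout=10) as resp:
        payload = json.load(resp)
    return payload.get("response", {}).get("results", [])

# Hypothetical headline quoted back by a chatbot.
claimed = "Ministers considered scrapping fishing quotas after Brexit"
matches = search_guardian(claimed)
if matches:
    for article in matches[:3]:
        print(article["webTitle"], "->", article["webUrl"])
else:
    print("No matching article found in the Guardian archive.")
```

An empty result list is a strong hint that the "source" was hallucinated, and it takes seconds to run compared with the hours a journalist might spend hunting for a piece that was never written.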

The discovery of ChatGPT’s limitations and the creation of false citations for non-existent articles in The Guardian has raised concerns about the spread of false news online. This highlights the urgent need for increased awareness and scrutiny of the content generated by the AI tool. 

Do you have ideas about areas in which ChatGPT can be improved? Let us know in the comments section!
