A.I. Journalism and why writers need not fear ‘the algorithm’

What happens when computers write articles instead of humans?

In this way, we are witnessing a revolution in journalism. It is not a technological revolution; it is a social revolution. We are not just a product of a new era of digital media. In fact, digital journalism is an extension of the digital revolution that is being celebrated.

I did not write this paragraph. The passage was generated by the Honi Soit Article Generator, an artificial intelligence (A.I.) program developed by former University of Sydney student and software engineer Andrew Cain. The generator has been trained exclusively on Honi’s online database, using an A.I. model released by OpenAI in 2019 called GPT-2 [OpenAI later licensed its successor, GPT-3, exclusively to Microsoft, which has invested $1 billion in the company].

These deep learning models are called transformers, and they have proven especially powerful at processing human language: they can do everything from sentiment analysis to producing better search results. Yet despite their ability to synthesise paragraphs of realistic English text, there is no guarantee the content has any basis in reality, leaving material susceptible to defamation and/or copyright claims, as programs such as GitHub Copilot have recently found.
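The generation loop behind these models can be illustrated with a toy sketch. A real transformer learns its next-token probabilities from billions of words, but the decoding loop itself (sample a next token from a probability distribution, append it, repeat) is simple. The bigram table below is invented purely for illustration; it is not GPT-2’s actual model.

```python
import random

# Toy next-token probability table. A real language model computes these
# probabilities with a neural network; here they are hard-coded.
NEXT_TOKEN_PROBS = {
    "<start>": {"digital": 0.6, "journalism": 0.4},
    "digital": {"journalism": 0.7, "media": 0.3},
    "journalism": {"is": 1.0},
    "media": {"is": 1.0},
    "is": {"changing": 0.5, "evolving": 0.5},
    "changing": {"<end>": 1.0},
    "evolving": {"<end>": 1.0},
}

def generate(seed=0, max_tokens=10):
    """Sample a sentence one token at a time, mimicking an LM's decoding loop."""
    rng = random.Random(seed)
    token, out = "<start>", []
    for _ in range(max_tokens):
        probs = NEXT_TOKEN_PROBS[token]
        token = rng.choices(list(probs), weights=list(probs.values()))[0]
        if token == "<end>":
            break
        out.append(token)
    return " ".join(out)

print(generate(seed=0))
```

Because each step is a random draw, different seeds yield different sentences, which is also why nothing guarantees the output is factually true.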

Companies such as Jarvis.ai have championed these technologies to speed up the writing process, claiming to increase writing speeds fivefold. In 2014, the Los Angeles Times published a report about an earthquake three minutes after it happened, made possible by an in-house program called Quakebot that produced automated articles from data generated by the US Geological Survey. Models such as these are called Natural Language Generation (NLG) tools; other examples include the BBC’s Juicer, the Washington Post’s Heliograf, and Bloomberg’s Cyborg [which generates nearly a third of Bloomberg’s content].
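At their simplest, tools in the Quakebot mould work by pouring structured data into a sentence template. The sketch below shows the idea; the field names are hypothetical and do not reflect the USGS feed’s actual schema or Quakebot’s real code.

```python
# Minimal sketch of template-based Natural Language Generation:
# structured data in, readable sentence out.
def quake_article(event: dict) -> str:
    return (
        f"A magnitude {event['magnitude']} earthquake struck "
        f"{event['distance_km']} km from {event['place']} at "
        f"{event['time']}, according to the US Geological Survey."
    )

print(quake_article({
    "magnitude": 4.4,
    "distance_km": 9,
    "place": "Westwood, California",
    "time": "6:25 a.m.",
}))
```

The template never invents facts, which is exactly why this style of generation is trusted for breaking-news alerts in a way free-form models are not.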

Generally, these A.I.-written articles are limited to simple, formulaic topics such as stock-market updates and sporting results. While NLG tools do not possess the creativity of transformer models, they are far more widely used. Yet NLG algorithms have not rendered journalists redundant: they lack flair, imagination, and in-depth analytical skills, and they can only produce articles where structured data is available as input. Francesco Marconi, a professor of journalism at Columbia University, believes that only up to 12 percent of journalism will be taken over by computer programs, mainly saving time on tasks divorced from actual writing, such as analysing large databases or transcribing audio and video interviews.

In other words, A.I. generates plans for articles, which author Calum Chace argues may result in more niche articles being tailored at scale to individual readers. An example of A.I.’s ability to identify trends and developments worthy of investigation by journalists comes from Forbes’ content management system, released in 2018. Known as Bertie, the algorithm recommends possible article topics and headlines to writers based on their previous work. Not only has A.I. affected the news that gets written; it also shapes the articles people see. Facebook’s news algorithms suggest what they judge to be the most relevant content for each user. News outlets similarly keep track of the types of articles that subscribers read, and learn their preferences from the time spent on each article. Personalisation is a burgeoning trend online that aims to encourage user interaction; the New York Times, for example, prominently displays a “For You” section on its homepage.
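The time-spent signal described above can be sketched in a few lines: the reader’s seconds per topic become a score, and unread articles are ranked by the score of their topic. Real recommendation systems are vastly more elaborate; the data below is invented for illustration.

```python
from collections import defaultdict

def rank_articles(reading_log, candidates):
    """reading_log: [(topic, seconds_spent)]; candidates: [(title, topic)].
    Rank candidate articles by how long the reader dwelt on each topic."""
    topic_score = defaultdict(float)
    for topic, seconds in reading_log:
        topic_score[topic] += seconds
    return sorted(candidates, key=lambda a: topic_score[a[1]], reverse=True)

# A reader who lingers on politics gets politics pushed to the top.
log = [("politics", 300), ("sport", 40), ("politics", 180)]
feed = [("Match report", "sport"), ("Budget analysis", "politics")]
print(rank_articles(log, feed)[0][0])  # the politics story ranks first
```

Note that the ranking only ever reinforces past behaviour, a point the article returns to below.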

A.I. is symbolic of a new business model built on breaking down media trends. For journalism, this means close collaboration between editorial teams, computer scientists, and marketing staff. The survival of journalism depends on using databases to find stories that are relevant to readers.

Yet A.I. journalism is not without limitations. The 2019 JournalismAI report found persistent difficulties in adopting A.I., including limited financial resources, a lack of knowledge and skills, and cultural resistance. The last of these is the most topical. Although Australians’ trust in traditional media is much greater than the global average, 57 percent of Australians believe there is a ‘fair to great extent of fake news’ in newspapers and magazines, rising to 63 percent for online platforms.

The two concerns that mainly apply to A.I. journalism centre on the generation of content and the display of that content to the reader. In both areas, algorithmic bias is prevalent. Because data often reflects the inherent biases within society, and because the models themselves are designed by humans, their structure can skew data analysis and lead to serious consequences. For example, an A.I. model is more likely to predict that an Indigenous person is a criminal simply because a statistically higher percentage of Indigenous people are incarcerated. Furthermore, news companies that promote articles based only on user interactions will likely see a downward trend in the quality of writing, producing clickbait that promulgates purely emotionally driven content. User-driven news production also runs the risk of getting stuck in yesterday’s news: articles will lack genuine substance because they are based on previous publications.
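The bias mechanism can be made concrete with a toy sketch: a “predictor” that simply learns label frequencies per group will assign each group its majority label, regardless of the individual in front of it. The groups and counts below are invented for illustration and stand in for any skewed dataset.

```python
from collections import Counter

# Skewed "training data": group_a is mostly flagged, group_b mostly cleared.
training = (
    [("group_a", "flagged")] * 8 + [("group_a", "cleared")] * 2
    + [("group_b", "flagged")] * 2 + [("group_b", "cleared")] * 8
)

def majority_label(data, group):
    """Predict by the most common label seen for this group in training."""
    counts = Counter(label for g, label in data if g == group)
    return counts.most_common(1)[0][0]

# Otherwise-identical individuals receive different predictions based
# solely on group membership, because the data itself is unbalanced.
print(majority_label(training, "group_a"))  # flagged
print(majority_label(training, "group_b"))  # cleared
```

The model has not learned anything about individuals; it has only memorised the imbalance in its data, which is precisely the failure mode described above.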

Journalists therefore need not fear the algorithm. To ensure the news remains informative and trustworthy, companies must keep a human hand on the wheel. Although A.I. can produce hundreds of basic articles per day, those articles require supervision and fact-checking. Looking back to the A.I.-generated paragraph at the beginning of this article, while the wording may entice a reader to open the article, the words do not deliver any real news. Humans need to be well trained to oversee and verify algorithmic results and ensure their quality.

A.I. is not a trend but a new paradigm. A.I. models have shown their worth in clearing away the more tedious aspects of journalism. But when it comes to the results that news companies are truly renowned for – political commentary, opinion pieces, in-depth data analysis – humans are clearly an essential part of the equation.

In addition, Warren St. John, the chief executive of Patch, a US nationwide news organisation, told the New York Times that A.I. journalism comes with other benefits:

“One thing I’ve noticed,” Mr. St. John said, “is that our A.I.-written articles have zero typos.”

A special thank you and acknowledgement to Andrew Cain. Try his program, the Honi Soit Article Generator, yourself via the link below: https://newsgenerator.tklapp.com/