In a significant move towards modernizing journalism, the Worcester Journal has integrated AI-assisted reporting into its operations, a development hailed as blending historical journalistic integrity with cutting-edge technology.
This integration leverages the capabilities of AI models like ChatGPT to streamline routine tasks, such as transcribing minutes from local council meetings, and to transform raw data into news reports in the publisher’s signature style. The overarching goal is to improve time efficiency and ensure the accuracy of routine reporting, freeing human journalists to focus on more complex and investigative work.
AI complements, not replaces, human journalists
One crucial aspect emphasized by Stephanie Preece, the editor of the Worcester News, is that AI is not intended to replace human journalists but to enhance their roles. It is understood that AI cannot replicate the indispensable human elements of journalism, such as being physically present at events, attending interviews in person, and connecting with people on the ground. Instead, AI serves as a valuable tool that provides journalists with more time and resources to concentrate on these essential aspects, ultimately elevating the quality of journalism.
While integrating AI into journalism brings numerous advantages, it is not without challenges and ethical concerns. Newsquest, the organization behind the Worcester Journal, acknowledges the concerns surrounding AI’s reputation for inaccuracies. It has implemented safeguards to mitigate this issue, including extensive training and a new code of conduct. Notably, the technology does not operate independently: a trained journalist inputs information into the AI tool, and the output is then edited and refined by a news editor where necessary.
AI’s role in journalism’s future
Newsquest’s head of editorial AI, Jody Doherty-Cove, cites a notable achievement in using the technology: successfully generating a freedom of information request regarding mundane council expenses. Doherty-Cove envisions a future in which AI’s role in journalism becomes as commonplace as the internet is today, expanding the scope and depth of journalistic endeavors. This perspective is echoed by Henry Faure Walker, CEO of Newsquest, who highlights AI’s vital role in modern journalism. He points to a specific incident, the felling of the Sycamore Gap tree on Hadrian’s Wall, during which AI handled routine reporting for a week, allowing journalists to dedicate their efforts to deep investigative work and multimedia storytelling.
Despite the promising potential of AI in journalism, major publications like The Guardian and The New York Times are treading carefully in this domain. The Guardian has outlined principles for generative AI, emphasizing a cautious approach. Meanwhile, The New York Times has initiated legal action against OpenAI and Microsoft for scraping its content, underscoring concerns about the responsible use of AI in journalism.
Council of Europe’s guidelines for responsible AI integration
In a related development, the Council of Europe has established guidelines to govern the responsible use of AI in journalism. These guidelines aim to ensure that AI integration in newsrooms aligns with human rights, democracy, and the rule of law.
Serving as a practical tool for navigating the evolving landscape of AI in journalism, they address the potential effects of AI on audiences and society at large. The guidelines were developed by a specialized sub-committee, in coordination with broader efforts to establish a comprehensive framework convention on AI.