It's about trust. If we undermine trust, then we've lost our greatest value. There have been enough hits to trust in the news industry over the last decade. We don't need to exacerbate that.
- Troy Thibodeaux, Director of AI Products and Services at the AP
In a few short years, AI has gone from being a sci-fi trope to an everyday reality. For better or worse, AI is everywhere, and it's having a profound impact on the way we live. While AI opens up a host of new creative possibilities, it has also blurred the line between fact and fiction like never before. Bad actors are already using it to craft ever more plausible lies, while AI systems themselves can peddle misinformation when they hallucinate. The rise of AI has therefore proven particularly challenging for journalists, who must walk a tightrope between embracing efficiency and maintaining a high level of accuracy.
The AP's cautious framework
The Associated Press made headlines in August 2023 when it became one of the first major news organizations to establish comprehensive AI standards. Despite having a licensing agreement with OpenAI that allows ChatGPT to train on AP's news archive, the organization took a notably restrictive approach.
The AP's guidelines prohibit staff from using generative AI tools like ChatGPT to create publishable content. Instead, "any output from a generative AI tool should be treated as unvetted source material" that must undergo editorial evaluation. The guidelines also note that "we do not see AI as a replacement of journalists in any way."
The restrictions extend to visual content, with the AP refusing to use generative AI to alter photos, video, or audio, and avoiding AI-generated images suspected of being false depictions of reality.
Any output from a generative AI tool should be treated as unvetted source material.
- AP guidelines
The AP's caution is understandable. As Troy Thibodeaux, Director of AI Products and Services at the AP, put it in an interview with The Media Copilot, "It's about trust. If we undermine trust, then we've lost our greatest value. There have been enough hits to trust in the news industry over the last decade. We don't need to exacerbate that." At the same time, the AP can't take an ostrich-like approach and ignore AI entirely, given that customers increasingly expect a level of personalization and immediacy that is hard to achieve without it.
AI hype vs. AI reality
Recent research illustrates that even limited use of AI can cause difficulties. A paper by Nadja Schaetz of Hamburg University and Anna Schjøtt of the University of Amsterdam documents some of the challenges faced by the AP's Local News AI initiative. The goal was to see whether AI could help local newsrooms with their workflows, but even relatively straightforward tasks, such as producing automated transcripts, proved tricky because they required continuous human supervision. Similarly, an attempt to generate basic news stories from police blotters ran into difficulties due to a lack of standardization in the source material. Consequently, the scope of the tool was narrowed to cover just three specific law enforcement agencies.
Despite these limitations, AI still made journalists more efficient by letting them focus their efforts on reporting rather than data-gathering.
AI is neither panacea nor peril, but a tool that requires educated oversight.
The BBC's invisibility problem
A paper by Bronwyn Jones of the University of Edinburgh and Rhianne Jones of the BBC's Research and Development Division examined how the BBC used AI between 2020 and 2023. They found that most BBC journalists engaged with AI as "users of pre-made tools" rather than as contributors actively shaping their profession's future. A key barrier was "how invisible and abstract AI seems" to journalists, creating a disconnect between technical capabilities and practical understanding.
The paper found a need for better AI education in the newsroom, suggesting that this could help journalists make better use of AI in the future.
Managing expectations
Both papers underscore the challenges of using AI in the newsroom. The AP's local newsrooms reported that AI freed up time for original journalism and reduced burnout, but it was no silver bullet. Even so, it could prove especially beneficial to smaller newsrooms, whose capacity for data-gathering is more limited.
Similarly, the BBC’s experience shows the importance of education to help manage expectations.
Conclusion
As artificial intelligence continues to reshape the media landscape, the experiences of the AP and BBC offer an instructive glimpse into its nuanced impact. AI is neither panacea nor peril, but a tool that requires educated oversight. While early adopters have found both promise and pitfalls, the path forward lies not in resisting change, but in stewarding it wisely. By investing in education, refining standards, and keeping human judgment at the core, newsrooms can ensure that AI serves truth rather than distorts it.