How will AI affect journalism in 2024?

News organizations will use AI technologies to increase efficiency, but not to churn out generic, easily reproducible content, according to Journalism, Media, and Technology Trends and Predictions 2024, a new report from the Reuters Institute for the Study of Journalism at the University of Oxford. A summary and the full PDF are available on the institute’s website.

The 41-page report is based in part on a survey of digital news leaders in 56 countries and territories, 314 of whom responded in the final months of 2023.

Sixteen percent of respondents said their organization “already [has] a designated AI leader in the newsroom,” and 24 percent said they are “working on it.”

“Forward-thinking news organisations will be looking to build unique content and experiences that can’t be easily replicated by AI. These might include curating live news, deep analysis, human experiences that build connection, as well as longer audio and video formats that might be more defensible than text.”

—Journalism, Media, and Technology Trends and Predictions 2024, page 39

Concerns about deepfakes, and about false information generated and spread by AI-enabled bots, are especially strong around national election campaigns. Despite promises of vigilance from the big platforms (Google, Meta, TikTok), no one knows how bad the problem will be or how serious its effects. The EU remains the only region with legal requirements for platform oversight and accountability (the Digital Services Act). Labeling AI-generated content and deploying fact-checking routines are two defenses that might not be adequate for the task; news audiences might simply ignore the labels, for example.

Two examples: Full Fact, a U.K.-based fact-checking organization, is using various “AI techniques” in its work, and The Newsroom is an AI startup developing tools for journalists and news audiences.

Survey respondents ranked the following current and future uses of AI in newsrooms by importance:

  • “Back-end automation tasks (56%) such as transcription and copyediting … a top priority”;
  • “Recommender systems (37%)”;
  • “Creation of content (28%) with human oversight … e.g. summaries, headlines”;
  • “Commercial uses (27%)”;
  • “Coding (25%), where some publishers say they have seen very large productivity gains”;
  • “Newsgathering (22%) where AI may be used to support investigations or in fact-checking and verification.”

Some news organizations are using image-generation tools such as Midjourney “to create graphic illustrations around subjects like technology and cooking.”

Back-end automation and coding are considered relatively low-risk applications of AI, but content creation and newsgathering are seen as higher risk, potentially threatening the reputation of the news organization.

AI in Media and Society by Mindy McAdams is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Include the author’s name (Mindy McAdams) and a link to the original post in any reuse of this content.
