Media Matters: “Extra! Extra! AI just wrote this headline”
“I’m not worried about AI taking over the world. I can barely get my Roomba to clean under the couch.” – Rita Rudner, comedian.
So I asked AI for a humorous headline about artificial intelligence and journalism. ChatGPT suggested the above. First I laughed. I had my headline! Then I worried. AI has a sense of humor. Just how smart is it?
I use AI for research for many of my columns. It helps me quickly identify the articles and studies I often cite. I told ChatGPT I was worried about writing this column. I got this advice:
“That worry is actually useful – it means you’re taking the topic seriously. The worst AI columns are the ones that are either breathless hype or blanket doom.”
So here goes … I’ll try to avoid the extremes.
I heard about an early version of AI at a journalism conference in Chicago in 2009. Professors at Northwestern University had developed software that “fully automates the writing of baseball game recaps.” While this seemed a useful service for small local papers, I remember thinking: “Could this someday replace sports reporters?”
Fast forward to the AI chaos of 2026.
In January, Chris Quinn, editor of the Cleveland Plain Dealer, Ohio’s largest newspaper, hired an “AI writing specialist.” In an article titled “In This Cleveland Newsroom, AI Is Writing (But Not Reporting) the News,” the Columbia Journalism Review (CJR) described the plan: “reporters would spend more time gathering information, less time typing it up.” All the writing was fact-checked. The Plain Dealer reports its journalists now have an extra day each week to gather information and do interviews. An editor noted, “AI is the assistant, but it’s not the journalist.”
According to CJR, Quinn received a lot of criticism, including calls to resign. To critics, removing even some of the writing from reporters’ workload seemed a step too far.
I asked a couple of area editors about AI. In an email, Mandy Gambrell, editor of Hamilton’s Journal-News, offered this: “In local media, we have worked to integrate AI into our workflows, but we do not rely on it for any reporting. Our journalists remain at the forefront of storytelling and reporting the facts. Sometimes it may assist us with headline writing for search engine optimization, but we have major concerns with it in content. We have tested it by giving it a data set, telling it not to add or delete things – and then it spit back exactly what we asked it not to do. We asked AI, ‘why did you add data?’ And it responded that it simply did not know. AI should not be relied on for ethical and real journalism. I believe humans will remain the backbone of trusted information.”
Aidan Cornue, my editor at the OFP, also hit the trust theme: “AI ruins the trust that needs to be built between people and news organizations. AI, in its current, unregulated state, causes more harm than good for newsrooms. News needs to focus on people, not technology.”
Patti Newberry, an enterprise reporter at the Cincinnati Enquirer, taught journalism at Miami for 25 years. She was skeptical about AI at first but does use it in her work.
“Like all journalists,” Newberry wrote, “I conduct searches all day long as I report a story. When did that law pass? Where did that politician go to school? What previous charges did today's outlaw face in earlier years? In recent months, USA Today Co. has made Microsoft’s Copilot AI software available to … its news organizations. I've been using it daily to speed up searches.”
She emailed me ways she uses AI:
- “I’ve loaded attachments to legal proceedings and asked for bulleted summaries. Those have been pretty accurate reflections of the filings. (To be clear: We don’t and won’t publish such summaries – of a lawsuit or anything else. I read documents I need in full. But a Copilot summary can speed up my understanding and provide options for language.)”
- “I've asked for simple calculations – i.e., adding together a row of numbers from a spreadsheet.”
- “I’ve asked for links to records that can be tricky to find. Lately, I’ve been in search of government contracts with private businesses, and Copilot can find them more quickly than I can.”
Newberry, named SPJ’s Reporter of the Year in Ohio in 2024, also notes that Copilot makes mistakes and needs to be carefully fact-checked.
These journalists point out that we are in the early stages of AI. It remains experimental and unregulated. It makes mistakes. And the last thing we need is fewer local reporters.
Many journalists are not happy about the scarier aspects of AI’s encroachment. Another CJR article, titled “Fighting the Machine,” documents union activism pushing for contracts that give reporters a say over their bylines and the use of AI in their newsrooms. One major concern is AI-generated copy added to reporters’ stories without their approval.
“Unions for McClatchy papers – the Miami Herald, Sacramento Bee, Kansas City Star, and Idaho Statesman – as well as the Washington State News Guild,” CJR reports, “have filed grievances against the company,” suggesting that the company’s current use of AI “violates contract provisions that require advance notice for major technological changes.” McClatchy employees “have spoken out against the company’s use of a ‘content scaling agent,’ an AI tool powered by Anthropic’s Claude, to repackage reporters’ stories for specific audiences, while retaining their byline.”
In 2023, the Associated Press, our pre-eminent national news service since 1846, offered an early set of guidelines for journalists. Here are two:
- AP has a licensing agreement with OpenAI, the maker of ChatGPT, and while AP staff may experiment with ChatGPT with caution, they do not use it to create publishable content.
- Any output from a generative AI tool should be treated as unvetted source material. AP staff must apply their editorial judgment and AP’s sourcing standards when considering any information for publication.
So for my ending, I asked ChatGPT, “So what can human intelligence do that AI can’t?” After a substantial answer, it concluded: “The real takeaway isn’t ‘humans vs. AI,’ but that they’re complementary. Humans bring meaning, judgment, and direction; AI brings processing power and pattern detection.”
This reminded me of a Steve Jobs quote: “It is in Apple’s DNA that technology alone is not enough – it’s technology married with liberal arts, married with the humanities, that yields us the results that make our heart sing.”
Richard Campbell is a professor emeritus and founding chair of the Department of Media, Journalism & Film at Miami University. He is also a co-founder of the Oxford Free Press.