NNW op-ed: AI is still no replacement for local reporting

National Newspaper Week is coming up from October 6-12, 2024. Run this guest op-ed in your newspaper and visit www.nationalnewspaperweek.ca for more resources.

Did you read the one about the German court reporter who was convicted of abusing children, conning widowers and escaping from a psychiatric hospital? Likely not, as no human editor would ever confuse the writer of an article with the people he was writing about. However, Microsoft’s artificial intelligence system, Copilot, did just that.

Now, unless you’re the poor German reporter, these sorts of “hallucinations,” as they’re euphemistically called, may seem funny, or like the natural growing pains of a new technology. But how would you feel if an AI told lies like that about you, all the while touting how reliable it is?

The truth is that, like a politician reading a speech in a language he doesn’t speak, AI doesn’t understand what it’s saying. It’s just a program that looks for patterns based on scanning billions of words of text. To it, it was a perfectly reasonable assumption that the reporter had committed all those crimes, because his name was associated with all those stories. But should we really be trusting an algorithm to tell us what’s true and what isn’t when it has no way of knowing itself?

Misinformation aside, I’d never tested any of the generative AI programs to see if they could actually be useful for local news, so I asked three of them something I was asked many times during my reporting days: “What happened at last night’s council meeting?”

Copilot gave me a rundown on a vote that took place last night in Aurora … Colorado. After telling me it had no updates, ChatGPT suggested that I add the city I was looking for. I typed in my hometown, and it still returned no answers, but it suggested that I could “find summaries or highlights on the city’s official website or local news outlets.” Gemini did actually pull up some local news, mostly because I had to log in to Google to use it. However, the summary it spat out wasn’t for the most recent council meeting (which, it turns out, wasn’t actually last night), and after a quick search of a year’s worth of online minutes, I couldn’t find any council meeting that matched the AI’s result.

Now, if you asked a local journalist the same question (even a tricky one), they’d give you an answer. Why? Because they were likely one of the very few people sitting there in the gallery observing the goings-on. Was AI there? No. Did AI ask the difficult question that the mayor was hoping to avoid? Not a chance. Can you rely on what AI tells you about local news? Apparently not.

And you don’t need to be a German court reporter to see that.

Gordon Cameron is the executive director of the Ontario Community Newspapers Association.