Ken Paulson, director of the Free Speech Center at Middle Tennessee State University and a former editor-in-chief of USA Today, penned a wonderful piece published by The Monroe Times (small town paper in my home state of Wisconsin) on Sunday.
Paulson acknowledges that AI is going to transform everything we watch, hear, and read. You key a question into an AI search engine and get answers that often are not all that bad.
What Paulson points out, though, applies just as well to legal publishing by lawyers, whether blogs, articles, or other publications: AI-fueled news and commentary introduce some serious problems.
The first is that some of the information you receive is fictitious. We already hear about hallucinations in relation to the law. The second issue Paulson points out is that AI really doesn't know what the truth is. Yet many take its output at face value, sometimes making decisions based on it.
In the world of legal writing and legal publishing, we already have lawyers putting up content marketing not for the purpose of providing news or insight, but for the purpose of generating traffic. First through search engines, and now through gaming AI. Legal marketers are coming out with all sorts of quick guides on how to "be seen in AI."
The problem is that consumers of legal information, whether they’re peer lawyers, small businesses, or ordinary people in towns like Monroe, take this information from AI to heart. They might even take it more seriously because it’s coming from AI. To a layperson, it might appear to have more authority than a traditional newspaper.
But what Paulson points out is that local newspapers, and local publishing, are the original sources of accurate information. That's a huge point, because garbage in, garbage out. As lawyers fill the internet with content marketing that may be poorly written, created by marketing professionals rather than lawyers, and lacking insight rooted in niche experience or care, that same material becomes part of the AI training data. The result? Garbage out.
Paulson’s article really hit home for me, as a legal publisher holding to the belief that real publishing, real blogging by lawyers, still matters. It’s a lot like newspaper reporting. Rooted in truth, service, and the public good.
Per Paulson,
Local newspapers embrace the original AI: accurate information. How refreshing is that? Newspapers focus on your community, written by neighbors who shop at the same stores and send their kids to the same schools. Most can readily be reached by phone or email, and when they make an error, they correct it.
How quaint. How essential.
AI isn’t magic. When used for search, it offers an analysis and recasting of information about what’s already known, drawing on the vast resources of the web.
Any search about your hometown, though, depends on that information being captured and published in the first place. If your local newspaper doesn't report on a new transportation plan for your community, there's nothing for AI search to draw upon. AI is not sitting in the third row of the City Council meeting.
I recently shared that it's a little scary to think that AI could replace lawyers in writing online. But what Paulson's piece makes clear to me is that maybe that's not true. We still need real lawyers explaining the law. Sitting down, offering their insight and commentary on niches they've practiced in, and sharing what they've learned from helping businesses or consumers.
If lawyers provide that kind of authentic information, we can get it out into the world faster than ever and maybe even identify the most reliable sources more effectively through the use of AI. But it will take that “newspaper lawyer,” as I think of it, to put the real, human knowledge in so that what comes out of AI is accurate and trustworthy.
