The Age of Ultracrepidarians

Warning: I may have found my new favorite word to use in relation to various tech topics. At least my favorite since "anagnorisis".1 "Ultracrepidarian", beyond being an awesome word, is defined as follows: someone who has no special knowledge of a subject but who expresses an opinion about it. I mean, how is this not the main word for the entire internet? Certainly social media. And actually, in the age of AI, I think we can even bend the idea of “someone” to include machines.
I believe that social media, while still fun in ways, has gotten worse over time when it comes to sharing actual information. Some of this is intentional product and policy decisions, but some of it is also just the fact that charlatans have gotten more emboldened with time to spew nonsense to their followings. And this has bred more charlatans as a result. Because there’s no downside to spouting bullshit. The only thing that matters is the volume.
Right or wrong, say it loud. Time has revealed the ultimate truth: no one remembers when you’re wrong and you’ll remind everyone when you’re right. That mixed with the fact that news cycles move at lightspeed these days just exacerbates it all. People want bold stances that rise above the noise, no matter how ludicrous. In fact, the more ludicrous the better.
And that, in turn, has fueled the rise of ultracrepidarians. If someone is known to be knowledgeable (or at the very least sounds knowledgeable) talking about a topic, people will tend to listen to that person on other topics. There’s absolutely nothing to suggest they should be trusted for anything beyond their area of expertise, but it doesn’t matter. The press has long played to this notion by giving us various celebrity opinions on the big issues of the day. And politics has always dabbled in this idea. But now it has seemingly become the M.O. of the current administration.
We have a number of people now in positions of great power because they’re known to Donald Trump for other things they’ve done. Bonus points if they’re on TV. Extra points if they’re on Fox News.
Look, I’m all for thinking outside the box. And I think there’s some merit in looking at things with a fresh set of eyes. But putting unqualified people in important positions just seems more likely than not to end badly.
But again, it’s hardly surprising; this is the world in which we live. And social media fuels it because anyone can give their opinion on anything. That strength has become a weakness, thanks to our collective susceptibility to the Dunning-Kruger effect and, of course, cognitive bias.
At the same time, I’m reminded of AI hallucinations. While the topic has faded quite a bit overall as AI has matured, with all the players now rushing to get “Deep Research” products out the door, you have to wonder if it won’t come roaring back in certain ways...
To be clear, these offerings are great. And OpenAI’s version, in my testing thus far, has been incredible. Truly, this feels like another big watershed moment for AI. But it’s also probably a bit premature to outsource your brain to them. A couple of the ones I’ve “commissioned”, where I know a thing or two about the topic, seem to be around 95% right, but that final 5% may end up mattering at times, one suspects.
At the same time, one also suspects that most users will be happy to believe everything that’s being output 100% of the time. Obviously. This could be problematic. Obviously. Even when the stakes are relatively low – Siri, the ultimate ultracrepidarian.
I'm reminded of a post I wrote a decade ago. In "Wrong Positions, Strongly Held", I warned of this guy:
You undoubtedly know someone like this. They’re the person who always very matter-of-factly states something, when they’re often talking out of their ass. They have absolutely no idea what they’re talking about, but because they’re saying it so forcefully, people believe them.
Blame human nature. Almost all of us are ingrained with a level of trust so that when someone says something in a manner which it seems like they must know what they’re talking about, we naturally believe they in fact know what they’re talking about. We trust them — even if we don’t know them. Because who would spout bullshit as fact?
Well, a lot of people, actually. But again, most of the time people tend to do this in a meek manner. Equivocations are plentiful. Bullshit sensors immediately go berserk.
But the ‘wrong position, strongly held’ folks often evade these detectors. At least for a time. Often until it’s too late. Which is why this is so dangerous.
This was long before our era of AI – at least as we know it now. But doesn’t it sound like some of the AI outputs we’ve all seen? And it sounds even more like many of the charlatans we all now see online. In an odd way, perhaps that’s now making the AI seem more human! Maybe AI should run for office.



1 Anagnorisis, noun: the point in the plot, especially of a tragedy, at which the protagonist recognizes his or her or some other character's true identity or discovers the true nature of his or her own situation. See also: Wile E. Coyote chasing the Roadrunner right off a cliff... Almost a decade ago, I wrote about it in relation to the fall of BlackBerry -- which at first looked like it was succeeding in the face of the iPhone. By the time they realized they weren't, it was too late.