“…strategy researchers Michael McDonald and James Westphal, the worse companies performed, the more CEOs sought advice from friends and colleagues who shared their perspectives.”
from Adam Grant’s “Originals”
That is literally the grown-up version of the post title.
Humans are weird.
When we’re in trouble, under stress, or faced with a challenge, we don’t always reach out for new ideas. Often we look for safe havens: places where we’ve found comfort in the past. Executives are human too (really, they are) and do the same thing: they look for comfort. In the same marketing strategies. The same tactics (usually just more of them, with bigger deals). The same ideas. The same self-reinforcing network of friends and colleagues.
We do it because it makes us “feel” better.
We revert to the tools that have worked before.
Now let’s mash that idea into 2025 where we’ve got Artificial Intelligence for everything.
The problem? People are using AI like CEOs use old friends and tactics. As mirrors, not windows – or even binoculars.
AI Echo Chambers: Comfort over Clarity
Here’s the nasty little secret: AIs, by design and training, are meant to chase consensus. Feed them a million examples of one style or answer, and that style gets reinforced a million times over. It starts to sound like everyone’s on the same page.
But guess what?
That’s not how breakthroughs happen. That’s how you get groupthink on steroids.
When you ask an AI to help with “out of the box” thinking but feed it the same type of input or let it wander unsupervised, it’ll serve up what it *thinks* you want. Which (because it has the data and knows what you like) is more of the same.
The Danger of Consensus-Machines
Why does this matter? Because when things go sideways, the last thing you need is a yes-person (or a yes-AI?). You do not want an AI that reinforces whatever story you’re already telling yourself.
The issue isn’t whether you have enough information; it’s whether you have the *right* information. The unique perspectives that let you ask the questions your competitor, or ClaudeAI, or Grok won’t think to ask. And, ultimately, the courage and intelligence to listen to them.
More AI does not equal more wisdom if all it’s doing is showing you a fancier version of what you already think.
Enter the Human Outsider
A note that popped up in my LinkedIn feed today from Christian Hunt hits the nail on the head.

“Don’t hire me to say things you could say.
Hire me to say the things you *can’t*.
The perspectives, stories, and experiences your team doesn’t have.
The conversations that aren’t happening but should be.
The elephants in the room that everyone is pretending aren’t there.
I’m not shaped by your organization’s politics, history, or hierarchy… That gives me the freedom to be more direct, more provocative, and more challenging, while keeping people engaged instead of switching off.
If you want a safe echo of the status quo, I’m not your person.
That’s something you can get in-house, not something you should pay an external speaker to do.”
That’s it. That’s the antidote. Want to get *unstuck*? Call in someone who *doesn’t* sound like you, doesn’t see the world the same way, and doesn’t care about protecting your feelings or your status meetings.
What AI Can’t—And Humans Can
AI is great for sorting, summarizing, and spitting out *what’s already known*. It’s an archive, not an oracle. But experts—the *real* ones, the ones who refuse to play along or nod like a dashboard bobblehead—are there to challenge, provoke, and uncover what’s missing. They see the gaps, smell the weirdness, and break stuff so you can rebuild it stronger.
The Wheat, the Chaff, and the Courage
When you’re facing tough, messy, or unfamiliar problems, you don’t need another mirror or a smarter parrot. You need someone with the guts to point out what you’re missing, to help you sort the wheat from the chaff. Not reinforce the same old patterns but break them.
Bottom line? AI’s consensus magic is a blessing for organizing chaos, but a curse for true innovation.
The dirt doesn’t clean itself by piling up more dirt—ask any toddler with a toybox. Don’t “clean up” by asking a machine for more of the same; get the expert, the outsider, the uncomfortable question-asker.
So next time a crisis, opportunity, or banana peel lands in your path, don’t double-down on agreement—invite disruption. Bring in someone who *can’t* sound like you. If you want to change the world (or at least your company), go find a Christian Hunt, not another AI with reassuring charts.
Get dirty the right way. Challenge, provoke, rethink—because a clean answer isn’t always a right one, and the right answer is usually up to its elbows in the mess you’re scared to sift through.