When Optimisation Starts to Erase Value
For a long time, optimisation was treated as the answer.
If you could measure the right things, tune the system, and improve efficiency, better outcomes would follow. And for many organisations, that logic appeared to work. The numbers improved. Decisions felt cleaner. Progress felt visible.
But there was always a gap.
The things that actually made the work meaningful — and valuable — didn’t sit neatly in the data. Trust. Reputation. Word of mouth. The way people felt after engaging with us. None of that showed up cleanly in dashboards.
For us, experience was never optional.
It was always everything.
What took longer to become clear was that most systems simply couldn’t recognise it.
So while experience continued to do the real work — building demand, trust, and reputation — the tools used to evaluate performance lagged behind, focusing on what they could see rather than what actually mattered.
The hidden cost of measuring the wrong things
Optimisation systems are not malicious. They do exactly what they’re designed to do.
They optimise what’s easiest to measure.
Revenue. Conversion. Utilisation. Throughput.
All useful signals — but incomplete ones.
What they don’t capture well are the pre-economic forces that create demand in the first place:
trust built over time
shared experience
community endorsement
reputation earned through consistency
These forces don’t behave like metrics. They accumulate slowly. They compound invisibly. And for a long time, they existed outside the system’s field of view.
That didn’t make them less real.
It just made them harder to account for.
What changes when AI enters the picture
When AI is layered on top of these systems, something important shifts.
For the first time, systems begin to observe what actually holds up in the real world — not just what performs well in spreadsheets.
That doesn’t automatically mean better decisions.
But it does mean recognition is catching up.
AI systems increasingly learn what to trust by watching:
repeated use
human endorsement
consistent outcomes over time
In other words, lived experience.
What’s becoming clear in 2026 is that lived experience isn’t a soft, secondary input.
It’s a coherence signal.
This is why some AI-driven systems feel hollow: they’re still optimising against a reduced version of reality. Others are beginning to align with what people actually value.
The difference isn’t the technology.
It’s what the system is allowed to see.
Efficiency versus coherence
I’ve watched organisations become more efficient while still struggling to explain why pricing power erodes or loyalty thins.
The reason is rarely effort or intent.
It’s misalignment.
When the system’s definition of success drifts away from lived experience, efficiency and value start pulling in different directions.
Coherence is what keeps them aligned.
It’s the relationship between:
experience and value
short-term metrics and long-term trust
what people feel and what they’re ultimately willing to pay
When that coherence is protected, optimisation supports value rather than eroding it.
A recognition problem, not a technology problem
This isn’t an argument against AI. Or data. Or optimisation.
It’s an argument for recognising what has always been true.
Experience isn’t the output of good systems.
It’s the input they’ve been missing.
As systems improve, they’re finally starting to recognise what lived experience has been signalling all along.
That’s not a revolution in values.
It’s a catching-up.
And it’s why experience-led organisations like Elsewhere are now being understood, not because they changed, but because the systems around them can finally understand them.