The Bias Caused by Uncertainty
The forgotten concept that explains why your best investments disappoint you and why neither Hormuz nor AI should change your process
I am often confronted with this bias, which is why I am writing this article to help me remember it.
To be clear, I am not saying we should buy, sell, or hold our current positions. The market currently offers, and will probably continue to offer, better opportunities in the coming weeks and months. What matters is keeping this bias in mind, because it is little known among investors.
Why crises shouldn’t change your margin of safety
The Strait of Hormuz has been effectively closed for a month. Oil prices have surpassed $100. Nearly 2,000 ships are waiting on either side of the strait. Marine insurance premiums have skyrocketed. Analysts are talking about a global recession. Meanwhile, another alarmist narrative has been circulating for months: artificial intelligence is going to render SaaS software obsolete. “Vibe coding” tools make it possible to recreate in a single weekend what used to take a team of developers months to build. Barriers to entry are crumbling. The SaaS model is dead.
I am not saying that we should do nothing and keep all our positions in the portfolio. Recent events bring many opportunities that we must not miss just because uncertainty pushes us to inflate our margin of safety or become overly conservative about the future.
Constellation Software, the world’s largest acquirer of vertical-market software, has lost more than 50% of its value since its peak. Not because its revenue has fallen: it rose 16% in 2025. Not because its cash flow has collapsed: it surged 46%. Because the market has decided that AI will destroy the business model of its hundreds of subsidiaries. (In addition, Constellation’s valuation was far too high in 2025.)
Two narratives of panic. Two sources of uncertainty. The same fundamental question: Should I change my investment criteria?
The panic instinct
When a major event strikes-a war, a blockade, a technological disruption-our natural reflex is to reevaluate everything. We look at our portfolio and wonder if our margin of safety is still sufficient. We’re tempted to demand an even bigger discount before buying. We put our decisions on hold. We wait. This is a human reflex. But it is based on a fundamental mistake: believing that our analytical framework should fluctuate with the news cycle.
The truth is, if your investment process was good before the Hormuz crisis, it’s still good now. And if it wasn’t, the problem isn’t the war in Iran; it’s your process. The same logic applies word for word to the narrative that AI is killing SaaS.
To understand why, we need to discuss a concept that most individual investors have never encountered, but which should be at the heart of their thinking.
The Invisible Trap of Uncertainty
In the 1970s and 1980s, an economist named Edward Miller described an elegant yet devastating problem for anyone making investment decisions. The idea can be summed up as follows: even if your estimates are perfectly honest and unbiased, the mere act of choosing among several investment options creates an upward bias.
Imagine you’re evaluating ten investment opportunities. Your return estimates are generally accurate: sometimes you overestimate, sometimes you underestimate, but there’s no systematic bias in your analysis. You select the three best opportunities based on your estimates and make your investment.
Here’s the problem: among these three “best” opportunities, you’ll end up overrepresenting the ones whose returns you’ve mistakenly overestimated. It’s a matter of math. Investments whose value you’ve accidentally overestimated are more likely to end up in your top 3 than those whose value you’ve underestimated. The former move up in your rankings; the latter move down. The result: the investments you choose will, on average, underperform your expectations. Not because you’re incompetent, but because the selection process itself introduces a bias.
Miller called this the “uncertainty bias.” The greater the uncertainty in your estimates, the greater the bias.
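Miller’s selection effect is easy to demonstrate with a short simulation. The sketch below uses purely illustrative numbers (ten opportunities, an unbiased Gaussian estimation error, top three picked): even though every individual estimate is honest, the picks you select are systematically overestimated.

```python
import random

random.seed(42)

N_OPTIONS = 10      # opportunities evaluated each round
N_PICKED = 3        # how many we select
NOISE = 0.05        # std-dev of honest, zero-mean estimation error
TRUE_RETURN = 0.08  # every opportunity truly returns 8% (by construction)
ROUNDS = 100_000

gap_sum = 0.0
for _ in range(ROUNDS):
    # Honest, unbiased estimates: true return plus zero-mean noise.
    estimates = [TRUE_RETURN + random.gauss(0, NOISE) for _ in range(N_OPTIONS)]
    # Pick the top 3 by estimated return.
    picked = sorted(estimates, reverse=True)[:N_PICKED]
    # Each pick actually returns TRUE_RETURN, so the disappointment
    # is (estimate - truth), averaged over the picks.
    gap_sum += sum(e - TRUE_RETURN for e in picked) / N_PICKED

print(f"Average overestimation of chosen picks: {gap_sum / ROUNDS:.2%}")
```

Increasing `NOISE` widens the gap, which is exactly Miller’s point: the more uncertain the estimates, the larger the premium you need to demand before acting on them.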
The solution: demand an additional margin, an “uncertainty premium”, before accepting an investment. The greater the uncertainty, the higher the premium must be.
In practice, this is what good value investors have always done. When Benjamin Graham spoke of a margin of safety, he was essentially talking about the same thing: buying far enough below the estimated value to account for the inevitable errors in our estimates.
In practical terms, this means that I should:
Recognize that my estimates of intrinsic value for the investments I select are systematically too optimistic, even when the underlying work is rigorous.
Apply an uncertainty premium proportional to the degree of uncertainty in each situation.
Calibrate this premium over time by conducting “post-mortems” on past investments.
This third point is crucial and my favorite. If you never go back to see how your past investments have fared compared to your initial estimates, you have no basis for knowing whether your uncertainty premium is too high, too low, or about right.
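As a sketch of what such a post-mortem calibration might look like, here is a minimal version with entirely made-up tickers and numbers; the point is only the mechanic of measuring your own historical estimation gap.

```python
# Hypothetical post-mortem log: my intrinsic-value estimate at purchase
# versus the value the business actually turned out to support.
# (All tickers and figures are invented for illustration.)
post_mortems = [
    {"ticker": "AAA", "estimated": 100.0, "realized": 82.0},
    {"ticker": "BBB", "estimated": 55.0,  "realized": 51.0},
    {"ticker": "CCC", "estimated": 240.0, "realized": 210.0},
    {"ticker": "DDD", "estimated": 18.0,  "realized": 19.5},
]

# Average shortfall of reality versus my estimate, as a fraction of the estimate.
shortfalls = [(p["estimated"] - p["realized"]) / p["estimated"] for p in post_mortems]
avg_shortfall = sum(shortfalls) / len(shortfalls)

print(f"Average overestimation across past investments: {avg_shortfall:.1%}")
```

If this number comes out at, say, 7%, then my uncertainty premium should be at least that large in normal times, before any crisis-driven adjustment; if it is near zero or negative, my premium is probably already too conservative.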
As far as I am concerned, my estimates are often more pessimistic than average, and I have often demanded an excessive margin of safety.
Will AI really kill SaaS?
Since mid-2025, a prevailing narrative has dominated the tech sector: AI tools, particularly coding assistants, will eliminate barriers to entry in the software industry and erode the margins of existing SaaS companies. The logic is compelling: if anyone can code software in a matter of hours, why pay a monthly subscription for a SaaS product?
The market has spoken. Valuations for SaaS companies have plummeted. Constellation Software has fallen by more than 50% from its May 2025 highs, despite a 16% increase in revenue, a 46% rise in cash flow, and a ROIC above 19% in 2025. The stock has fallen from $5,300 to a low of nearly $2,200. This is the largest drawdown in the company’s history.
And yet, let’s apply the framework to this situation. The narrative that “AI is killing SaaS” starts from a true premise (AI lowers software development costs) to draw an exaggerated conclusion (all SaaS software will be replaced). It’s a bit like saying that the invention of the electric drill was going to put plumbers out of business. The tool is faster, yes. But a plumber’s job isn’t just about drilling holes: they know building codes, pipe layouts, and local standards. The tool doesn’t replace domain expertise.
That is exactly what Constellation Software CEO Mark Miller said on the company’s earnings call in March 2026:
But I want to be direct about something, building products and features faster will not be what differentiates us long term. That capability will become widely available. It's going to be table stakes. What will matter is what our businesses have spent many years developing, deep vertical knowledge, a genuine understanding of customer workflows and processes, the data inside their solutions and the trusted relationships they've built. I believe AI will help us do all of this better.
Vertical SaaS software is not a generic product. It is mission-critical software deeply integrated into the day-to-day operations of specific industry niches, such as fleet management, golf club software, and library reservation systems. The cost of replacement is not the cost of rewriting the software. It is the cost of migrating years of data, retraining employees, and risking operational downtime. AI can write code faster. It cannot reconstruct fifteen years of customer data integrated into a vertical ERP system.
But the point here isn’t to determine whether the AI optimists or the AI pessimists are right. The point is to realize that the narrative “AI is killing SaaS” is a source of uncertainty, not a certainty.
The uncertainty surrounding the impact of AI is real. But this uncertainty should already be factored into your premium. If your investment process cannot account for the possibility that a technological disruption might affect your positions, your problem isn’t AI; it’s your process.
Where does Hormuz fit into all this?
Let’s return to the war in Iran and the blockade of the Strait of Hormuz. When the strait was closed in late February, oil prices surged by 40%. The markets panicked. Brent crude surpassed $120 at its peak. Major oil companies declared force majeure. The Federal Reserve Bank of Dallas estimates that the closure could shave 2.9 percentage points off global growth in the second quarter of 2026. That’s serious.
Should I demand an even larger margin of safety now? The answer depends entirely on how you built your margin of safety before the crisis.
If your uncertainty premium was already calibrated to absorb the kind of geopolitical shocks that occur once or twice a decade, and such shocks do happen regularly, then no, you don’t need to change it. Geopolitical uncertainty didn’t just appear on February 28, 2026. It’s always been there: the Gulf Wars, the invasion of Ukraine, the oil crises of the 1970s. These events are part of the normal investment landscape.
On the other hand, if you’ve been investing up to this point without factoring in any uncertainty, treating your estimates of intrinsic value as certainties, then yes, Hormuz should serve as a wake-up call. But the problem isn’t Hormuz. The problem is that your process didn’t account for uncertainty from the start.
The same logic applies exactly to AI. If you own vertical SaaS companies and your intrinsic value analysis includes no scenario for technological disruption, no consideration of barriers to entry, no assessment of switching costs, and no allowance for the unpredictable, then AI is your wake-up call.
The real danger
It is this: becoming too conservative at the wrong time.
When everyone panics and demands huge safety margins, prices fall. And when prices fall enough, expected returns rise. That is precisely when the best opportunities arise-just when uncertainty is at its peak.
Constellation Software is a prime example. The company is generating more cash flow than ever before. Its disciplined acquisition strategy remains intact. And it is trading at one of the lowest multiples of the past decade, around 16x EV/NOPAT. All because the market is extrapolating a panic-driven narrative.
Will AI affect certain Constellation subsidiaries? Probably. Is this the end of the vertical SaaS model? That’s far from clear.
But here’s what’s certain: if you require a 50% margin of safety instead of 25% simply because “AI” is in the headlines, you’ll miss out on opportunities where the market offers exactly the kind of discounts your process is designed to capitalize on.
Yes, you need an uncertainty premium. But this premium should be stable and carefully calibrated, not driven by market sentiment. If you double your required margin of safety every time a crisis hits, you’ll consistently miss out on the best buying opportunities.
If you’re a value investor, here’s what I’ve taken away from all this amid the Hormuz crisis and the prevailing narrative that “AI is killing SaaS”:
Your margin of safety should already account for the unpredictable. If you buy a stock at a 20-50% discount to your estimated intrinsic value, that margin exists precisely to absorb surprises, including wars, blockades, oil shocks, and technological disruptions.
Don’t change it just because the media shows ships on fire or because ChatGPT can write Python.
Edward Miller: The Bias Induced by Uncertainty
Edward Miller published his work nearly 50 years ago. His ideas have remained relatively obscure outside academic circles, but they are remarkably practical.
The central message is simple: uncertainty creates a structural bias in our decisions, and the only defense is a calibrated margin of safety.
The war in Iran will be resolved through diplomacy, by force, or through a shaky compromise. Oil prices will fall, or they won’t. The narrative that “AI is killing SaaS” will turn out to be exaggerated, partially true, or prescient. The markets will calm down, or they won’t. None of this should change how you value a company or the discount you demand before buying.
What should change your process is the data. Your post-mortems. The historical gap between your estimates and reality. That’s what true calibration is all about.
The rest is just noise.
Max