Frequently, when I suggest goals to teams, they assume I've misunderstood the problem. I've been told the numbers are absurd. Not ambitious, not optimistic, absurd. The kind of target that makes people hesitate, as if writing it down were reckless.
There is, however, a logic to this: a reason for pushing targets to levels that feel unreasonable.
I’ve come to believe that the probability of achieving a meaningful goal should sit somewhere around 20–40%. If I’m more than 50% confident we’ll hit a target, I start to worry that the goal isn’t doing anything interesting.
On the surface this sounds irrational. It isn’t. We’ve just misunderstood what goals are for.
Goals as Instruments of Perception
We tend to think of goals as instruments of performance. They exist to make us work harder, measure progress, and allocate rewards and blame.
In complex systems (products, teams, companies, markets), goals function differently. They are better understood as instruments of perception. They determine what you can see. Telescopes didn’t create galaxies. They revealed that our model of the universe was wrong.
A reasonable goal preserves your existing worldview. An unreasonable goal destabilizes it. That destabilization is the point.
Changes to products fail not because people lack effort, but because they never escape their current mental models, their local maxima. Reasonable goals quietly reinforce those models.
Phase Transitions, Not Improvements
Most products try to improve continuously. They optimize funnels, tweak pricing, refine processes, hire slightly better, deploy slightly better. This feels responsible. It is also often irrelevant.
Complex systems rarely change smoothly. They change through phase transitions. Water doesn’t gradually become ice; it stays liquid until it suddenly isn’t. Products are the same. They remain structurally stable until something forces them into a new regime of behavior.
Unreasonable goals are one of the few tools that I’ve found reliably trigger these transitions.
If your goal is to grow revenue by 5%, your existing system remains valid. You adjust knobs and preserve architecture. If your goal is to grow revenue by 10×, most knobs stop working. Your current system becomes inadequate, and you are forced to confront assumptions you previously treated as laws of nature.
The unreasonable goal doesn’t demand more effort. It invalidates the current state of your product.
The Collapse of Choice
Reasonable goals leave you with too many options. Imagine your business makes $100 today and your goal is $101 tomorrow. There are thousands of plausible paths forward. None are obviously wrong, which means none are obviously right. You drown in optionality.
Now change the target to $10,000 tomorrow. Most options disappear instantly. They are not just insufficient; they are irrelevant. You are forced to consider actions that previously felt irresponsible, naive, or socially unacceptable.
The unreasonable goal collapses the problem space. It functions less like a target and more like a filter, deleting most ideas and leaving only the handful that might matter.
Absurdity can be a form of clarity.
Moderate Ambition as a Structural Anti-Pattern
We tend to assume the dangerous place is the edge: too much ambition, too much risk, too much change. In practice, the most fragile place is the middle.
Moderate ambition encourages you to stretch systems that were never designed to stretch. You keep the same assumptions, architecture, and incentives—just pushed harder. This is how organizations break quietly: not through dramatic failure, but through accumulated fragility.
Extreme ambition does something different. It forces discontinuity. It makes your current system obviously inadequate. You are compelled to abandon it rather than torture it. In that sense, wildly ambitious goals can be safer than moderately ambitious ones. Risk is not linear; it is discontinuous.
The danger is not always in leaping too far. Sometimes it lies in leaping just far enough that you never question the ground beneath you.
Compression of Truth
There is another effect that is harder to admit. Conservative goals feel safe because they minimize visible failure. But they also minimize information. You move slowly, learn slowly, and remain uncertain for a very long time.
Aggressive goals are brutal but epistemically efficient. If your goal is “make modest progress,” you can spend a year being gently wrong. If your goal is “prove this matters at scale,” reality is forced to respond.
Unreasonable goals compress time. They extract truth from complex systems faster than polite goals ever could. In this sense, they are not motivational tools so much as instruments for interrogating reality.
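One way to make "epistemically efficient" concrete is a toy model the essay doesn't use explicitly, so treat it as an illustration, not the author's argument: model a goal as a yes/no experiment with hit probability p, and measure how much observing the outcome can teach you using the Shannon entropy of that binary outcome. A goal you are 95% sure to hit carries little surprise either way; a goal in the 30–50% range carries close to the maximum of one bit.

```python
import math

def surprise_bits(p: float) -> float:
    """Shannon entropy of a binary outcome with hit probability p:
    the average information (in bits) gained by observing hit/miss."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome teaches you nothing
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A "safe" goal you hit 95% of the time vs. progressively riskier ones.
for p in (0.95, 0.70, 0.50, 0.30, 0.05):
    print(f"P(hit) = {p:.2f} -> {surprise_bits(p):.2f} bits learned")
```

On this toy measure, a 95%-confidence goal yields under 0.3 bits per outcome while anything between roughly 20% and 50% yields close to the full bit, which is at least consistent with the essay's 20–40% range: aggressive-but-uncertain goals are the ones whose outcomes actually tell you something.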
The Cultural Problem
There’s an uncomfortable truth here. Most teams are trained to equate “missing a goal” with “failure.” So when you deliberately choose goals you could well miss, it feels like sabotage. But that’s a cultural problem, not a strategic one.
If goals are treated as verdicts, probabilistic targets feel cruel. If goals are treated as probes, questions posed to reality, they feel necessary.
The alternative is deceptively comforting: goals you’ll definitely hit. Those goals rarely change anything. They confirm what you already believed about your capabilities and your system.
A Qualified Bet
I might be wrong about the numbers. Maybe the right hit rate is 15%, or 5%. Maybe it depends on the system, the people, or the moment in history.
But I’m convinced that the worst goals aren’t the ones you miss. They’re the ones you hit without having to change. Reasonable goals preserve today’s truths. Unreasonable goals create new ones.