One of my best mates regrets moving to Austin. The job didn’t work out, the culture wasn’t what he expected, and now he’s planning to move back to New York within the next year.
He’d had an offer for a role that paid 40% more than his New York position, the cost of living was dramatically lower, and he’d been complaining about New York winters for the better part of a decade. The job collapsed because of funding issues that nobody could have predicted. The culture mismatch was real, but how could he have known without trying?
The question he asked me: did he make a bad decision?
We conflate the quality of our decisions with the quality of their outcomes so automatically that we rarely notice we're doing it. A good decision that leads to a bad outcome gets reclassified in our memory as a bad decision. A terrible decision that happens to work out becomes evidence of our brilliant judgment. We are constantly running our own internal kangaroo court, retroactively convicting or acquitting our past selves based solely on how things turned out.
Thinking in bets offers a way out of this trap.
When you frame a decision as a bet, you're forced to explicitly acknowledge that you're operating under uncertainty. You have to assign some probability to different outcomes, however rough. You have to think about expected value rather than guaranteed results. My friend made a bet that had, let's say, a 70% chance of working out well and a 30% chance of not working out. He lost the bet. That doesn't retroactively make it a bad bet.
This might sound like motivated reasoning, a way to excuse bad judgment by claiming "well, it seemed like a good idea at the time." But the difference is that thinking in bets forces you to be honest about what you actually knew at the time versus what you learned afterward. If you claimed there was a 95% chance of success when any reasonable analysis would have suggested 50/50, you made a bad bet regardless of the outcome. If you carefully weighed the probabilities and made the choice that maximized your expected value, you made a good bet even if you got unlucky.
Professional poker players understand this viscerally. You can play a hand perfectly and still lose because someone hit their 8% draw on the river. You can play terribly and win because you got lucky. What separates good players from bad players over the long run is the quality of their betting decisions, not the outcomes of individual hands. Good players lose hands all the time. They just make sure they're making +EV (positive expected value) bets when they do.
Think about how differently we might approach major life decisions if we thought about them probabilistically. Instead of asking "should I take this job?" you'd ask "what's my estimate of this working out well, and given that probability, what's the expected value compared to my alternatives?" You might still take a risky job if it has high enough upside and you have a decent chance of success, even if failure is a real possibility.
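To make "expected value compared to my alternatives" concrete, here is a minimal sketch of the arithmetic. The probabilities and dollar figures are invented for illustration; the point is only the shape of the comparison:

```python
# A rough sketch of comparing two options by expected value.
# All numbers here are illustrative assumptions, not from the essay.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs summing to 1."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * payoff for p, payoff in outcomes)

# Option A: the risky move. Say 70% it works out (+$50k of upside),
# 30% it collapses (-$20k in moving costs and lost ground).
risky = expected_value([(0.7, 50_000), (0.3, -20_000)])

# Option B: stay put. Certain, but no upside.
stay = expected_value([(1.0, 0)])

print(risky > stay)  # the risky option can win on EV despite a real chance of loss
```

The same structure works with any payoffs you can roughly quantify; the hard part is being honest about the probabilities, which is what the rest of the essay is about.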
When you make a binary decision, admitting you were wrong feels like admitting you're bad at decisions. When you make a probabilistic bet, being wrong just means you were on the losing end of the odds this time. You can simultaneously acknowledge that the outcome was bad while maintaining that the decision was sound.
This has downstream effects on how you learn from experience. If every bad outcome is evidence of bad judgment, you'll be motivated to rationalize or minimize your failures. You'll be reluctant to take reasonable risks because any failure will be counted against you.
But if you can separate decision quality from outcome quality, you can look at your failures more honestly.
You can ask: did I misjudge the probabilities? Did I have information I should have weighted differently? Or did I actually make a good bet that happened not to pay off?
There's a trap here, though.
Isn’t there always…
You can't just assign whatever probabilities make you feel good and then claim you're thinking clearly. The probabilities have to be calibrated to reality, which means tracking your predictions and adjusting when you discover you're systematically over- or underconfident. If you keep saying things are 80% likely and they only happen 50% of the time, you're not thinking in bets so much as you're using probability as a fig leaf for wishful thinking.
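Tracking calibration doesn't require anything fancy. A minimal sketch, assuming you log each prediction as a stated probability plus whether it came true (the bucketing granularity is an arbitrary choice here):

```python
# A minimal calibration tracker: group predictions by stated probability
# and compare each group's stated confidence to its actual hit rate.
from collections import defaultdict

def calibration_report(predictions):
    """predictions: list of (stated_probability, came_true) pairs.
    Returns {bucket: (actual_frequency, count)} so you can spot
    systematic over- or underconfidence at a glance."""
    buckets = defaultdict(list)
    for prob, outcome in predictions:
        buckets[round(prob, 1)].append(outcome)
    return {
        p: (sum(outcomes) / len(outcomes), len(outcomes))
        for p, outcomes in sorted(buckets.items())
    }

# The essay's example: you said "80%" ten times and were right only five.
log = [(0.8, True)] * 5 + [(0.8, False)] * 5
print(calibration_report(log))  # {0.8: (0.5, 10)} — overconfident
```

A well-calibrated forecaster's report would show the actual frequency in each bucket roughly matching the bucket's label; large, consistent gaps are the fig leaf the paragraph above describes.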
When you apply this consistently over time, you start noticing patterns in the types of bets you make. Maybe you're too conservative about career moves but too aggressive about financial risks. Maybe you systematically overestimate your ability to finish projects quickly. These patterns are much harder to see when you're thinking in binary terms, where every decision is either vindicated or condemned by its outcome.
Some people resist thinking this way because it feels cold or calculating. Where's the room for intuition, for gut feelings, for the ineffable sense that something is right? But probabilistic thinking doesn't exclude intuition. Your intuition is providing you with information; you're just being more honest about the uncertainty inherent in that information. When something feels right, you might update your probability estimate upward. You're just not pretending that your intuition provides certainty when it doesn't.
Others worry that thinking in bets will paralyze them with overthinking. If you have to calculate expected values for every decision, won't you waste enormous amounts of time on analysis? But you can think in bets at whatever level of precision the decision warrants. Choosing where to get lunch doesn't require a spreadsheet. Choosing whether to go to graduate school might benefit from one. The framework scales to the stakes.
What happens when you extend this thinking to your beliefs more generally, beyond just decisions? You start prefacing claims with "I'm about 60% confident that…" instead of making definitive pronouncements. You become more comfortable with uncertainty. You can hold multiple contradictory hypotheses in your mind simultaneously, weighted by probability. When new evidence comes in, you update your probabilities rather than either ignoring the evidence or completely reversing your position.
The world contains more uncertainty than we're comfortable acknowledging. We want to believe that if we just think hard enough, gather enough information, and make the right choice, we can ensure success. We want to believe that good decision-making guarantees good outcomes. But it doesn't. Sometimes you make the best possible choice given the information available and things still go sideways. Sometimes you make a questionable choice and get lucky.
Thinking in bets lets you make peace with that uncertainty without surrendering to fatalism. You can take responsibility for the quality of your decisions while acknowledging that outcomes are probabilistic. You can learn from your failures without assuming that every failure represents a personal deficiency. You can take reasonable risks without being paralyzed by the possibility of loss.
My mate made a reasonable bet that didn't pay off. He’s not stupid for having moved, and he’s not stupid for moving back. He gathered information, updated his probabilities, and made a new bet based on what he learned. That's exactly what thinking in bets looks like in practice. It's messy, it involves backtracking, and it doesn't guarantee success.
But it gives you a fighting chance of making good decisions in a world that refuses to offer you certainty.

