Probabilistic thinking is a mental model that you likely use all the time. As Farnam Street writes in their book “The Great Mental Models,” it’s simply “…trying to estimate, using some tools of math or logic, the likelihood of any specific outcome coming to pass.”
That’s it.
But in a world defined by ever-increasing information, complexity, and a finite supply of time, probabilistic thinking is your best tool for identifying likely outcomes. It can sharpen how you make decisions and improve their accuracy. In a constantly changing economy, our ability to thrive is determined by the quality of our decisions. I wrote a bit about this in a thread on deep work.
In the FS book, there are four areas to understand:
1. Bayesian thinking
You might remember this from an old statistics course in high school or college.
The core idea is, “…given that we have limited but useful information about the world, and are constantly encountering new information, we should probably take into account what we already know when we learn something new.”
Example: You’re reading an article about how violent crime is on the rise. A Bayesian approach is to look at long-term trends. Say the article cites a statistic: your likelihood of being shot has recently gone from 1 in 100,000 to 2 in 100,000. The headlines scream, “Murder is up 100%!” But if you look at the long-term trend and find that in 1980 your chances were 100 in 100,000, you’ll notice the trend is still sharply downward. You need to take this prior information into account as you frame your thoughts on the headline.
The key to Bayesian thinking is estimating the probability of things being true. New evidence can change those estimates or even replace them entirely.
Jordan Ellenberg writes a lot about Bayesian inference in his great book “How Not To Be Wrong”: “[how] much you believe something after you see the evidence depends on what the evidence shows and how much you believed to begin with (a priori).”
Some questions to ask when making uncertain decisions:
What are the relevant priors?
What might I already know that I can use to better understand the reality of the situation?
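Bayes’ rule makes this updating concrete. Here’s a minimal sketch with hypothetical numbers (the disease rate and test accuracies below are invented for illustration, not from the book), showing how much the prior matters when new evidence arrives:

```python
# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)
# Hypothetical example: interpreting a positive medical test result.

def bayes_update(prior, true_positive_rate, false_positive_rate):
    """Posterior probability the hypothesis is true, given a positive result."""
    # P(E): total probability of seeing a positive result at all.
    evidence = true_positive_rate * prior + false_positive_rate * (1 - prior)
    return true_positive_rate * prior / evidence

# Suppose a condition affects 1 in 1,000 people (the prior),
# the test catches 99% of true cases, but also flags 5% of healthy people.
posterior = bayes_update(prior=0.001, true_positive_rate=0.99, false_positive_rate=0.05)
print(round(posterior, 4))  # ≈ 0.0194
```

Even with a “99% accurate” test, the posterior is under 2%, because the prior (only 1 in 1,000 people have the condition) dominates. That’s the a priori belief Ellenberg is talking about.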
2. Conditional Probability
This is similar to Bayesian thinking, but with a twist.
This states that, “When you use historical events to predict the future, you have to be mindful of the conditions that surrounded that event.”
This helps you define the likelihood of an event based on the occurrence of a previous outcome or event. It’s important to understand whether prior events are independent (e.g. tossing a coin, where the previous toss has no impact on the next) or dependent (e.g. not paying your power bill for three months and having your power cut off midway through your favorite episode of “The Wire”).
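The independent/dependent distinction can be sketched with two textbook cases, coin tosses versus drawing cards without replacement (these examples are mine, not from the book):

```python
from fractions import Fraction

# Independent events: a fair coin has no memory.
p_heads = Fraction(1, 2)
p_two_heads = p_heads * p_heads  # each toss is unaffected by the last

# Dependent events: drawing two aces from a deck WITHOUT replacement.
# P(first ace) = 4/52; once an ace is gone, P(second ace | first ace) = 3/51.
p_first_ace = Fraction(4, 52)
p_second_ace_given_first = Fraction(3, 51)
p_two_aces = p_first_ace * p_second_ace_given_first

print(p_two_heads)  # 1/4
print(p_two_aces)   # 1/221
```

The card probability changes after each draw because the condition (one ace already removed) changes the world the next event happens in. That’s exactly the warning about using historical events to predict the future: check whether the surrounding conditions still hold.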
3. Fat-tailed curves
This requires an understanding of bell curves, which capture the relative frequency of outcomes. A bell curve lets you see the boundaries of what’s likely and plan accordingly.
Fat-tailed curves ARE DIFFERENT. A normal bell curve is the kind you’ll see for weights and heights (e.g. “You’re in the 99th percentile for weight.” Gee, thank you!). As FS writes, in the fat-tailed scenario the extreme events at the edges (the tail) are more likely to occur.
In situations with a normal curve, extreme outcomes are unlikely. Take weight: you aren’t going to meet many people who are 20x the size of the average person. Those extreme scenarios are scarce. But if you’re measuring wealth, you’ll have a fat tail where extreme events are far more possible, so there are a lot more people with 100x (or more) the average person’s wealth.
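A quick simulation makes the difference visible. This is a rough sketch, assuming a normal distribution for the “height/weight” case and a Pareto distribution (a standard fat-tailed model) for the “wealth” case; the specific parameters are arbitrary:

```python
import random

random.seed(42)
N = 100_000

# Thin tail: a height/weight-like normal distribution, mean 100, sd 15.
normal_samples = [random.gauss(100, 15) for _ in range(N)]

# Fat tail: a wealth-like Pareto distribution (shape alpha = 1.5).
# paretovariate(1.5) has mean alpha / (alpha - 1) = 3, so scale by 100/3
# to give it the same mean of 100 as the normal samples.
pareto_samples = [random.paretovariate(1.5) * (100 / 3) for _ in range(N)]

def frac_above(samples, threshold):
    """Fraction of samples exceeding the threshold."""
    return sum(s > threshold for s in samples) / len(samples)

# How often does a sample land at 10x the mean?
print(frac_above(normal_samples, 1000))  # 0.0 — essentially impossible under a normal curve
print(frac_above(pareto_samples, 1000))  # small but nonzero — extremes keep showing up
```

Both distributions have the same average, but only the fat-tailed one routinely produces observations many multiples of that average. That’s why using bell-curve intuitions in a fat-tailed domain gets you in trouble.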
So what can you do?
Don’t sit down and imagine every single scenario of every event.
It’s important to recognize whether you’re in a normal or a fat-tailed scenario.
To survive (and thrive) in an unpredictable future, you need to be planning for a world you don’t understand.
Nassim Taleb writes in his books “The Black Swan” and “Antifragile” about how small errors in measuring the risk of extreme events can leave you off by an order of magnitude. We can try to live our lives predicting what will happen (there’s money, prestige, and power in the ability to do so, or in giving the appearance of confidence in uncertainty). But it’s more efficient to prepare than to predict. There are a few paths to take:
Upside optionality: Seek out situations that have good odds of offering us opportunities. This could be investing a portion of your time learning new skills that could be used at your current job or at a new one.
Learn how to fail properly: don’t take risks that will destroy you completely, and develop the ability to learn from your failures.
4. Asymmetries
It’s important to recognize the limitations of your thinking. This is an area called “metaprobability”: the probability that your estimates themselves are any good.
FS notes that, “Far more probability estimates are wrong on the ‘over-optimistic’ side than the ‘under-optimistic’ side.”
Summed up:
To think probabilistically you need to:
Identify what matters
Get a sense of odds of success
Check your prior assumptions
Make a decision