What about the bombs that were never dropped?
Humans are too focused on building correlation and causation for things that have already happened. I was in Kosovo, arguing with a colleague: the path we are walking down right now feels like a straight line of cause and effect, but what we fail to notice is that there were an infinite number of paths we could have taken.
What if we looked at the things that were avoided: the things that didn't happen, but had a chance of happening?
In the final scenes of Tenet (a trippy time-traveling movie), the Protagonist turns to Neil (played by Robert Pattinson) and asks if they can change the future by doing things differently in the past.
Neil replies,
“What's happened has happened… no one cares about the bombs that didn't go off.”
Three things started to synthesize in my head:
- Book, The Black Swan by Nassim Nicholas Taleb. This book is heavy, super philosophical, and really gives me the overview effect. The last book to do this was Sapiens by Yuval Noah Harari. But NNT really digs deep into how it's sometimes the non-decisions that are the clues to ‘Black Swan events’. Black Swan events are game-changing, unpredictable events that usually have a defining impact on the outcome.
- Movie, Tenet, on the flight back to the US. This movie is about time travel, and the final lines of the movie really tied together what NNT talks about in The Black Swan. The long and short of it is that they travel back in time to fix events in the future (a sort of Back to the Future storyline).
- Book, Dark Matter by Blake Crouch. This book is a great ‘time travel’ book as well. A sort of Rick and Morty-esque storyline where the protagonist travels through various dimensions and versions of his own life. Some have environmental changes, where the world is in a nuclear winter or a large pandemic; others have minor changes to his own storyline: maybe he's a little wealthier, or never had a child, etc.
So, what is a Black Swan event?
My simplest way of explaining Nassim Taleb's theory is that Black Swans are highly unpredictable events that have a very large impact on the outcome, and (as a corollary) when we look at them in hindsight, we attribute causality to them, projecting that we ‘knew’ they would happen.
Some examples are:
- The financial crash of 2008
- The COVID-19 pandemic
- The accidental discovery of penicillin
This has a direct play on startups, too. We continuously focus on the startups and successful companies that made it, but what about the graveyard of the 99% that didn't? We attribute some ‘magic’ game plan to Elon or Bezos, but what's really at play is survivorship bias.
We see books on how Elon, Zuckerberg, and Gates achieved success — but not books about the 10 mistakes that didn’t get me to where I wanted to be.
Anne-Laure Le Cunff from Ness Labs has a great article on survivorship bias, which you should totally take the time to read later. There's one story Anne-Laure spotlighted that really caught my attention.
During WWII, the planes that returned from missions had the most bullet holes in the wings and tails. The engineers' first thought was to make those areas (the wings and tails) more robust. Right?
If the tails and the wings are getting hit, let’s make them stronger. Sound strategy.
However, the statistician Abraham Wald pointed out that the planes that were not returning were likely the ones shot in the engine, and that the US military should actually make the protection around the engine more robust.
The Water Puddle Thought Experiment
The water puddle problem (introduced by NNT, taken from his friends Aaron Brown and Paul Wilmott) explains this brilliantly.
There are two operations:
- Operation 1 (the melting cube). From the ice cube, can you figure out the shape of the puddle? Probably yes; with enough engineering and physics modeling, it should be doable.
- Operation 2 (from puddle to cube). From the puddle, can you reconstruct the shape of the ice cube (if there even was one to begin with)? This is a much more difficult operation, and there is an infinite number of possibilities for the ice cube's shape (again, if it even was one).
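The asymmetry between the two operations can be sketched in a few lines of code. This is my own toy illustration, not from the book: the “melting” function is many-to-one, so the forward direction has one answer and the backward direction has infinitely many.

```python
def melt(cube_dimensions):
    """Forward operation: a cube's dimensions determine its volume,
    and therefore the size of the puddle it leaves behind."""
    length, width, height = cube_dimensions
    return length * width * height  # the puddle's volume

# Operation 1 (cube -> puddle) gives exactly one answer:
print(melt((2, 3, 4)))  # -> 24

# Operation 2 (puddle -> cube) does not. Many different cubes
# produce the exact same puddle:
candidates = [(2, 3, 4), (1, 4, 6), (2, 2, 6), (1, 1, 24)]
print([melt(c) for c in candidates])  # -> [24, 24, 24, 24]
```

From the puddle alone, nothing tells you which of the candidates (or the infinitely many others) you started with.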
So, overall, with enough modeling you could probably predict the next, most favorable step. Looking in hindsight, however, Taleb would say, “Our problem is not that we do not know the future, we do not know much of the past either”.
Taleb continues:
Less than 0.25 percent of all the companies listed in the world represent around half the market capitalization, a less than minuscule percentage of novels on the planet accounts for approximately half of fiction sales, less than 0.1 percent of drugs generate a little more than half the pharmaceutical industry’s sales — and less than 0.1 percent of the risky events will cause at least half the damages and losses.
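Taleb's concentration numbers are a signature of fat-tailed distributions, and you can reproduce the shape of that claim with a quick simulation. This is a hedged sketch with made-up numbers (a Pareto distribution standing in for market caps), only meant to show how a tiny fraction of entries can carry half the total:

```python
import random

random.seed(42)

# Draw 10,000 hypothetical "company market caps" from a heavy-tailed
# (Pareto) distribution, largest first. The values are invented; the
# point is the shape of the distribution, not the numbers themselves.
caps = sorted((random.paretovariate(1.1) for _ in range(10_000)), reverse=True)
total = sum(caps)

# How many of the largest companies does it take to reach half the total?
running, count = 0.0, 0
for cap in caps:
    running += cap
    count += 1
    if running >= total / 2:
        break

print(f"{count} of {len(caps)} companies ({count / len(caps):.1%}) "
      f"hold half the total market cap")
```

Run it and the top sliver of companies covers half the total, echoing Taleb's “less than 0.25 percent” observation; a thin-tailed distribution (say, a normal) would need closer to half the companies.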
OK, so now that we don’t know anything — what do we do (from Taleb)?
NNT says it best in The Black Swan. The way to correct this thinking is not to ‘tunnel’, and not to be so ignorant and ego-driven as to believe we're here purely because of causality; it's to understand that there were, and are, decisions being made (and still to be made) that we have no control over.
Switching the mental model to view uncertainty versus certainty allows us a glimmer of hope in potentially understanding the groundbreaking Black Swan events that we continually label ‘once in a lifetime events’.
Building functional redundancies (continued from Taleb):
So, when you have a lot of functional redundancies, randomness helps on balance, but under one condition — that you can benefit from the randomness more than you can be hurt by it (an argument I call more convexity to uncertainty). This is certainly the case with many engineering applications, in which tools emerge from other tools.
Asymmetric outcomes:
Asymmetric outcomes — I will never get to know the unknown since, by definition, it is unknown. However, I can always guess how it might affect me, and I should base my decisions around that.
Put yourself in situations where favorable consequences are much larger than unfavorable ones.
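The idea above can be made concrete with a toy expected-value calculation (my numbers, not Taleb's): judge a decision by its payoff asymmetry, not by the odds alone. Even a bet you lose 90% of the time is worth taking when the upside dwarfs the downside.

```python
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# Favorable asymmetry: 90% chance of losing $1, 10% chance of winning $50.
convex_bet = [(0.9, -1), (0.1, 50)]
# Unfavorable asymmetry: 90% chance of winning $1, 10% chance of losing $50.
concave_bet = [(0.9, 1), (0.1, -50)]

print(expected_value(convex_bet))   # -> 4.1
print(expected_value(concave_bet))  # -> -4.1
```

The convex bet “feels” like constant losing, and the concave bet “feels” like constant winning, which is exactly why the asymmetry is easy to miss until the rare event arrives.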
The barbell strategy
The strategy is to be as hyperconservative and as hyperaggressive as you can be, instead of being mildly aggressive or mildly conservative.
Taleb continues:
As a matter of fact, I suspect the most successful businesses are precisely those that know how to work around inherent unpredictability and exploit it.
…the biotech company whose managers understood the essence of research is in the unknown unknowns. Also, notice how they seized the “corners”, those free lottery tickets in the world.
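A toy sketch of the barbell in code (my own numbers, purely illustrative): put most capital in something hyperconservative and a small slice in something hyperaggressive, so the worst case is capped while the upside stays open.

```python
def barbell_outcome(capital, safe_fraction, risky_multiplier):
    """Total capital after the speculative slice is multiplied by
    risky_multiplier (0 = total loss, 10 = a 10x win). The safe
    slice is assumed to simply hold its value."""
    safe = capital * safe_fraction
    risky = capital - safe  # the hyperaggressive slice
    return safe + risky * risky_multiplier

capital = 100.0

# Worst case: the speculative 10% goes to zero -> you keep 90.
print(barbell_outcome(capital, 0.9, 0))   # -> 90.0
# Good case: the speculative 10% returns 10x -> you end with 190.
print(barbell_outcome(capital, 0.9, 10))  # -> 190.0
```

The downside is bounded at a 10% loss no matter what, while the upside is limited only by what the “free lottery tickets” pay off; a mildly aggressive 100% allocation has no such floor.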
In conclusion
There is really no way to predict the future, and it is even harder to build a ‘blueprint’ of the past (to model the future from). Build for convexity, design for the positive feedback loops in uncertainty, and leave yourself the optionality to change and pivot from external (and internal) sources.
Instead of focusing on the rigid plan and the end result, think about the bombs that could have been dropped, and when they are (if they ever are), allow yourself the optionality to shift and pivot.
…and leaving you with two last quotes that really stuck out:
Pascal’s wager — I do not know whether God exists, but I know that I have nothing to gain from being an atheist if he does not exist, whereas I have plenty to lose if he does. Hence, this justifies my belief in God.
History does not reveal its mind to us; we need to guess what's inside of it.
Music I was listening to while I wrote this
God Rest Ye Merry Gentlemen — Jerry Garcia
This is day 26 of my #90DayOfProse challenge.