This book was not an easy read. Author Nassim Taleb takes the reader on a long, slow, meandering walk, offering opinions about seemingly inconsequential matters all along the way. He opines at one time or another about subjects as diverse as international politics, financial management, personal health, and ancient philosophy. Yet after finishing this book, I made a point of re-reading it just so I could take detailed notes. Many of Mr. Taleb's asides somehow resonated with me, and I was willing to make the monotonous stroll with him once more to make sure that I really understood what he was trying to say.
The term “black swan” refers to an unexpected event. It was thought, at one time, that all swans were white; no European had seen a swan of any other color. However, black swans exist in Australia, and it came as a surprise to some scientists that their assumptions were wrong. This book argues that we set ourselves up for unexpected shocks when we assume that past occurrences are representative of future events. Our reasons for doing so are complex, but for me, the most important lesson of The Black Swan is that we humans are highly susceptible to confusing our abstract models with reality. Mr. Taleb calls this “Platonicity,” named for the philosopher Plato, who believed abstract thought to be more “pure” than perceptions of the material world.
The human mind suffers from three ailments [...], what I call the triplet of opacity. They are: the illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than they realize; the retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and the overvaluation of factual information and the handicap of authoritative and learned people, particularly when they create categories—when they “Platonify.”
There is little novelty in the notion that humans frequently "confuse a map for the territory." We accept that our models are imperfect, never aligning precisely with reality, but we assure ourselves that the differences are not large. This book, however, argues that severe consequences arise within these gaps:
Any reduction of the world around us can have explosive consequences since it rules out some sources of uncertainty; it drives us to a misunderstanding of the fabric of the world.
A recurring analogy used throughout the book is the dilemma of the Thanksgiving turkey. Prior to the end of November, the captive bird thinks all is well. It gets fed frequently and is fenced off from predators. It has no inkling that its life will come to a sudden end, as it has never before seen such an event.
It is misleading to build a general rule from observed facts. Contrary to conventional wisdom, our body of knowledge does not increase from a series of confirmatory observations, like the turkey’s.
This strikes me as a cautionary parable for engineers. The Fukushima power plant in Japan was designed to handle an earthquake of magnitude 8.0, but was not capable of dealing with the magnitude 9.0 event that occurred. Another thousand years could have gone by and everyone would have thought that all was well, as long as no tremor above the design limit had been seen. Yet the potential for such a deadly event would have remained. A 2008 Qantas flight (QF72) cruising at 37,000 feet was sent into a series of rapid descents by a software fault that had obviously never been anticipated. With knowledge of such eventualities, our engineering designs can be improved. However, it is incredibly difficult to envision that which has never before taken place.
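The turkey's flawed inference is easy to make concrete. Here is a toy Python sketch (my own illustration, not from the book) in which confidence is built purely from confirmatory observations, using Laplace's rule of succession as the turkey's naive estimator:

```python
# Toy illustration of the turkey problem: every uneventful day raises the
# turkey's estimated probability that tomorrow will resemble the past,
# so its confidence peaks on the very day the assumption fails.

def naive_confidence(confirmations: int) -> float:
    """Laplace's rule of succession: estimated probability that the
    next observation matches the previous ones."""
    return (confirmations + 1) / (confirmations + 2)

# 1,000 ordinary days of feeding, then the unforeseen event.
history = ["fed"] * 1000 + ["Thanksgiving"]

for day, event in enumerate(history, start=1):
    if event != "fed":
        break

print(day)                                  # 1001
print(round(naive_confidence(day - 1), 3))  # 0.999
```

The point of the sketch is that no amount of confirmatory data moves the estimate toward the catastrophe; the 0.999 confidence on day 1,001 is precisely what Taleb means by the illusion of understanding.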