The main idea of Nassim Nicholas Taleb’s Skin in the Game can be summarized in a simple concept: if there is upside, there must also be downside. Taleb builds on this principle throughout the book and applies it to a wide range of topics including complex systems, analysis, and ethics.
Let’s take a brief look at how this principle applies to complex systems. If every upside opportunity comes with a proportional downside risk, then no one within the system can win without bearing a fair share of the downside in the event of a loss. This works just fine in theory when the people taking the risks can afford the downside: the losers in this setup will not be ruined by a bad outcome, and they will not burden innocent others with an undue share of the loss.
In the best systems, people pool their risk and share the burden so that, collectively, risks can be taken in pursuit of upside without putting the safety of the system in jeopardy. This is the general setup of close-knit communities, fair insurance policies, and equitable access to affordable credit. Over time, the system collectively advances as members and groups benefit from the upside without any single component being ruined by the downside.
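The arithmetic behind pooling is easy to sketch. Here is a minimal simulation, offered as an illustration rather than anything from the book: every number in it (starting wealth, payoffs, loss probability, group size) is an assumption invented for the example. It compares the chance of ruin for someone taking a favorable-but-risky bet alone against the same bets shared across a pool.

```python
import random

# All numbers below are assumptions invented for this sketch, not figures
# from the book: each person starts with 10 units of wealth and takes one
# bet per year that wins 2 units with 90% probability or loses 12 units
# with 10% probability. The bet has positive expected value
# (0.9 * 2 - 0.1 * 12 = +0.6), but a single early loss ruins a lone bettor.
START_WEALTH = 10
WIN, LOSS = 2, -12
LOSS_PROB = 0.10
YEARS = 30
TRIALS = 10_000

def ruined_alone() -> bool:
    """One person bears the full downside of every bet."""
    wealth = START_WEALTH
    for _ in range(YEARS):
        wealth += LOSS if random.random() < LOSS_PROB else WIN
        if wealth <= 0:
            return True  # ruin: out of the game for good
    return False

def ruined_pooled(group_size: int = 20) -> bool:
    """The same bets, but the group shares every win and loss."""
    wealth = START_WEALTH * group_size
    for _ in range(YEARS):
        wealth += sum(
            LOSS if random.random() < LOSS_PROB else WIN
            for _ in range(group_size)
        )
        if wealth <= 0:
            return True
    return False

solo_ruin = sum(ruined_alone() for _ in range(TRIALS)) / TRIALS
pooled_ruin = sum(ruined_pooled() for _ in range(TRIALS)) / TRIALS
print(f"ruin probability, betting alone:  {solo_ruin:.1%}")
print(f"ruin probability, pooled (n=20):  {pooled_ruin:.1%}")
```

The bet has positive expected value either way; what changes is that the pooled group can absorb the occasional loss that would wipe out a lone individual, which is exactly the safety described above.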
The problems arise when the mechanism of risk transfer loses its inherent symmetry. Stated simply, this means a risk taker who would benefit from a win does not suffer from a loss. There are plenty of high-profile examples of this point. My personal favorite is the executive who pockets a megabucks year-end bonus despite the company having laid off hundreds. If the big shots can benefit even as their (former) employees suffer, how can the company expect to attract or retain top employees in the coming year? The obvious answer is that it can’t (and perhaps this explains why big companies are often dragged under by smaller, newer competitors).
Another good example that I’ve highlighted on TOA in the past is how municipalities tend to invest far more in automobile infrastructure than they do in cycling or pedestrian equivalents (in this ancient post, I cited Happy City, a book that put the ratio as high as thirty-to-one). What do we expect to happen in cities where it is far easier (and safer) to drive than it is to bike or walk? My guess: the city becomes a place where people drive more than they bike or walk. Part of the explanation is the safety benefit, of course, but a more theoretical reading suggests driving is cheaper in the sense of ‘buying upside’: a driver gets somewhere faster and, in the event of a collision, stands a much better chance of being unharmed than a biker or walker. Over time, these places lose their sense of community and togetherness as people isolate themselves in their cars and go weeks at a time without interacting with a stranger.
Taleb’s overall point is that a system is in danger of crashing anytime the mechanism of risk transfer starts to lose its symmetry. The somewhat ridiculous example he cites comes from commercial aviation. In the early days, a bad pilot would remove himself from the system by crashing the plane, which meant inexperienced pilots bore the full risk of learning the trade. Over time, training tools such as the flight simulator meant pilots who would once have been, er, ‘ruled out’ of the job could now ‘crash’ their plane without weeding themselves out of the system. If it weren’t for the major advances in flight technology that made planes easier to fly (and therefore made it possible for lesser pilots to fly without transferring risk to passengers), it is entirely possible that the airline industry would have crumbled under the weight of the accidents caused by its inexperienced pilots.
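To make that filtering mechanism concrete, here is a toy simulation in the same hedged spirit as the one above: the population size, skill distribution, and crash probabilities are all invented for illustration. With real crashes, weak pilots are removed and the surviving population’s average skill drifts up; with simulators absorbing the failures, nobody is weeded out.

```python
import random

# Toy model: every number here is invented for illustration. Each pilot
# has a fixed 'skill' in [0, 1]; lower skill means a higher chance of a
# career-ending crash in any given year.
random.seed(1)

def average_surviving_skill(years: int, simulators: bool) -> float:
    """Average skill of the pilot population after `years` of flying."""
    pilots = [random.random() for _ in range(1000)]  # initial skill scores
    for _ in range(years):
        survivors = []
        for skill in pilots:
            crash_prob = 0.2 * (1 - skill)  # weaker pilots crash more often
            crashed = random.random() < crash_prob
            if crashed and not simulators:
                continue  # a real crash removes the pilot from the system
            # With simulators, the 'crash' happens in training and the
            # pilot keeps flying.
            survivors.append(skill)
        pilots = survivors
    return sum(pilots) / len(pilots)

print(f"avg skill when crashes weed pilots out: "
      f"{average_surviving_skill(10, simulators=False):.2f}")
print(f"avg skill when simulators absorb risk:  "
      f"{average_surviving_skill(10, simulators=True):.2f}")
```

The gap between the two numbers is the selection effect the simulator removed; in Taleb’s telling, it was better technology, not better selection, that kept the passengers safe.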
The most conversational way Taleb makes this point is when he talks about having something described as ‘good for you’. No doubt about it, many things are good for you, and it would be unwise to dismiss every such opportunity out of hand. However, a good test is to make sure the person selling the idea can explain why something that is good for you is also good for them. If such an explanation is not forthcoming, it suggests that although you may benefit from the opportunity if things work out, you might also be on the hook for a greater share of the downside if things go wrong.