Sunday, May 26, 2019

reading review - skin in the game (riff offs, part 2 - the effect of the unseen)

Reader, today the long journey through my notes, thoughts, and observations about Nassim Nicholas Taleb’s Skin in the Game concludes at last. If you’ve made it this far, I congratulate you. And if you’ve enjoyed the posts, I humbly recommend reading the book itself.

Of course, you may be protesting at the moment – why read the book after such an exhaustive series of reviews? I agree with the thought, or at least the spirit of it, but in that case I’ll suggest one of his earlier works. A major shared theme among those books is the effect of the unseen or overlooked, and I’m sure those who’ve enjoyed these posts will find plenty to like in those prior works.

Today’s post looks over some thoughts along that same theme from Skin in the Game – the effect of the unseen or overlooked.

Historians often create their own problems by relying too much on observed events and not considering enough the impact of the unseen.

This comment rings at least a little true for me. When I joke around here that there is always a reason and The Real Reason, I think I’m getting at a tendency people have to draw firm conclusions from what they observe rather than trying to account for the full context of what remains to be learned.

But what is the historian supposed to do, study the unseen? This would turn ‘the unseen’ into ‘the seen’, right? How would a historian access the unseen? Perhaps this thought says more about a person’s attitude toward the subject of history (and its perceived use) than toward the historian.

On a semi-related note, one of the funniest job descriptions I’ve ever heard was when my friend described a historian as someone who ‘knows what happened’.

Though it seems a contradiction that the Vatican goes to the doctor before turning to prayer, recall that voluntary death is banned by the faith.

If you like this type of thought, you must be the sort of person who gets excited for new Taleb books…

A degree in a certain discipline is likely BS if the prestige of the school is closely tied to the value of the degree.

…and this thought is the type that upsets readers once they get their hands on the new Taleb book.

Systems often collapse long before any structural defects can be cured.

This comment brings me back to his first bestselling book, The Black Swan, and one of the main premises from the work – instead of trying to predict inherently unpredictable events, we should try to find ways to mitigate the negative effects of such events. Countries in earthquake zones, for example, don’t predict the next tremor and evacuate two days before the coming quake – they design resilient buildings, educate the population on how to respond during an emergency, and conserve resources to help the hardest hit victims.

I think a common tendency is to look at a system’s defects and consider ways to fix them. This is an idea I like in theory, but in practice a system’s defects are usually accepted evils that enable its strengths. My expectation is that those who attempt to fix such defects often encounter a lot of unexpected resistance from those who stand to lose the most from the repairs. A more productive approach might be to identify the negative effects produced by a system’s defects and proactively find ways to limit them.

Isolating one variable and studying how subjects respond to minute changes in the cost or risk burden is not a rigorous way to study attitudes toward risk. For most, this decision is lumped together with all other attitudes or exposures to risk. A person who continues to take on or renew a small, ‘one-off’ risk will eventually face ruin.

This thought gets at the difficult task researchers set for themselves of trying to isolate factors in order to best understand people’s attitudes and preferences. Someone who keeps a fire extinguisher at the bedside probably has a very different attitude toward risk than someone who doesn’t, but this difference is unlikely to be reflected in behavioral or consumer responses to price changes in fire extinguishers.
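The ruin point above is just arithmetic, and it’s worth seeing once. A quick sketch (my numbers, not Taleb’s – I’m assuming a hypothetical 1% ‘one-off’ risk of ruin, repeated independently):

```python
# Hedged illustration: a small risk of ruin, repeated, compounds against you.
# With a 1% chance of ruin per exposure, survival after n exposures is 0.99**n.
def survival(p_ruin, n):
    """Probability of avoiding ruin after n independent exposures."""
    return (1 - p_ruin) ** n

for n in [1, 100, 1000]:
    print(n, round(survival(0.01, n), 4))
# One exposure is nearly safe; a thousand exposures is nearly certain ruin.
```

The ‘one-off’ framing hides the repetition, which is exactly why studying a single isolated decision says so little about lived exposure to risk.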

Comparing the multiplicative with the independent leads to great distortions in understanding risk. One person killed by a bookcase at home is not the same as one person dying from contracting a highly contagious illness because the observation of the latter greatly increases the probability of another person suffering the same fate.

This was my favorite thought from the book and one I referenced a number of times in my many posts about Skin in the Game. It speaks to the difference between actions and interactions (one of the subtle themes from the book) and underscores the importance of knowing how to determine what will happen next given some event in the present.

A story of someone crashing a car is a tragedy but most such stories go ignored because yesterday’s crash has essentially no effect on the chances of you crashing if you go for a drive today. On the other hand, the panicked report about a shark attack (or even a shark sighting) always leads the news because it represents a slightly greater chance that you might get eaten the next time you go for a swim.
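The car/shark contrast can be made concrete with a toy model (my construction, not from the book – the reproduction numbers are made up for illustration): independent events stay at one observed case, while a contagious process seeds new cases each generation.

```python
# Hedged sketch: independent vs. multiplicative risk.
# One bookcase death implies no further bookcase deaths; one contagious
# case with reproduction number r seeds r new cases per generation.
def expected_cases(r, generations):
    """Expected total cases from one seed, summed over generations."""
    return sum(r ** g for g in range(generations + 1))

print(expected_cases(0.0, 5))  # independent: just the one observed case
print(expected_cases(2.0, 5))  # multiplicative: 1 + 2 + 4 + 8 + 16 + 32
```

The observation itself changes the forecast in the second case but not the first, which is the distortion the quote warns about.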

If more people die on the road than in the ocean, maybe we should lock up cars instead of sharks, and put them all in parks, where we can go and view them…

OK, fine, that wasn’t from Skin in the Game – it was from ‘Dead Fox’ by TOA favorite Courtney Barnett, who, in addition to being an excellent guitar player and songwriter, apparently also possesses a highly developed understanding of risk.

But who better to get the last word in a riff off?

Thanks for reading.

Tim