Wednesday, November 9, 2016

moneyball was my favorite book, once

Good morning,

A couple of months ago, I commented briefly on Chuck Klosterman's But What If We're Wrong. My thoughts centered on the portion of his conversation with Dan Carlin, a political commentator, historian, and podcast host, that explored the way President Reagan's legacy was shifting over time. I return to that conversation today to link another topic they discuss in detail (the general trend of history as an academic subject) to a book I once considered my personal favorite.

Historians, according to Carlin, struggle to balance arranging verified facts into a story with presenting informed opinions based on their own interpretations. As Carlin sees it, the trend is constantly moving toward emphasizing the former (and he does not see this as necessarily a good thing). (1)


The way this discussion was presented in the book limited its application to history. But I think this is in some ways a general trend. Our access to facts is greater today than at any point in the past. Our access to facts tomorrow will be greater than our access to facts today. This access is tempting. Surely, if we dig through it all, we will unearth great revelations in the data that will make life better than it ever was before.

The first book I ever read that captured the struggle with this balance was Moneyball. Written by Michael Lewis, this book documented how baseball's first advanced statisticians revolutionized the way baseball teams analyzed their data. It quickly became my favorite book and remained so for most of my high school and college days.

To summarize, the statisticians' approach of seeking verifiable facts within data worked in direct opposition to the methods relied upon by traditional scouting. Drawing on a lifetime of intuition honed through careful observation of the sport, scouts made their living by interpreting the games they watched from the bleachers and forming conclusions. They passed these 'scouting reports' on to the teams that employed them.

When I was young, I believed in the pattern for change described by this book. I was enthralled by the idea of drawing conclusions from data. I sought to learn as much math, statistics, and economics as possible in my college coursework. Fully prepared by a rigorous double major to do this sort of analysis, I sought work that promised direct involvement with analytical projects and exposed me to deeper thinking about the field.

Back then, the possibility that I would drift away from this line of thinking in the coming years would have seemed remote. Surely, big data was the way forward and I was among the fortunate few for whom the future had arrived.

And yet, here we are. Over the past half-decade, I have come to recognize limits in the approach of advanced analytics. Some of those limits were built into the field and could be overcome with diligence (such as the difficulty of gathering clean data of an appropriate sample size) but many of the limits were my own.

I sensed that the skills needed to reflect, interpret, or empathize on a personal level would atrophy if I continued to specialize in analytical work. I feared this trend would either render me incapable of answering any question where the data was insufficient or tempt me to answer anyway, hoping that shaky data would prove a better guide through uncertainty than careful intuition.

Today, I'm drifting somewhere between the two extremes. I do not have a sense of where the right place for me is along the continuum. Perhaps this is the best approach: ready to adjust as the situation requires but always mindful of balancing the extremes.

It is a compromise of sorts between empathetic interpretation of events and calculated arrangement of facts. The middle ground allows clean facts to support unexplainable insight. It gives leeway for intuition to direct the next exploration project within a data set.

One way to summarize this position is to examine the story from Moneyball about how on-base percentage came to be considered a great innovation in baseball analytics. On-base percentage is not a complex statistic. It is an easily calculable ratio. What made the method successful was the simple but thoughtful application of the statistic, not any complexity in computing it.
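To underline just how simple the metric is, here is a minimal sketch in Python. The formula is the standard definition of on-base percentage; the function name and the sample season line are my own illustration, not anything from the book:

```python
def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sac_flies):
    """Times reached base divided by plate appearances, using the
    standard definition: (H + BB + HBP) / (AB + BB + HBP + SF)."""
    reached = hits + walks + hit_by_pitch
    opportunities = at_bats + walks + hit_by_pitch + sac_flies
    return reached / opportunities

# A hypothetical season line: 150 hits, 70 walks, 5 hit-by-pitches,
# 500 at-bats, 5 sacrifice flies.
obp = on_base_percentage(150, 70, 5, 500, 5)
print(round(obp, 3))  # prints 0.388
```

The entire calculation is one ratio. The innovation was in deciding that this ratio, rather than flashier traditional statistics, was the right question to ask about a hitter.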

The story highlights a basic problem with an over-reliance on analytically focused approaches. Rather than remaining open to new intuition and constantly evaluating updated information to make better decisions, analysts too often apply more complex or time-consuming analysis to the same data to answer the same question. As Carlin observed in the case of history, focusing too minutely on the known details often diminishes the understanding of the details not yet known. It also reduces the role of intuition in asking the right questions about complex situations.

Moneyball, I see today, is not the cool story about analyzing data that I long considered it to be. It is a story about how to ask the right questions about the worth of baseball players. Rather than accepting the status quo for player evaluation, those featured asked 'What are the established metrics actually telling us about players?' Pondering that question forced new questions about how to determine the best players.

Just like Moneyball, But What If We're Wrong is a book about asking the best questions. Dan Carlin's two podcasts are examples of the same. Books and shows like these are extremely valuable for how they cultivate the skill of examining, questioning, and interpreting the world around us. Whatever practical value they lack in simple or immediate contexts is offset by how they equip their audiences to approach novel challenges. The skills called on in these examples challenge us to assess rather than compare, to determine what is best rather than settle for what is better. (2)

Thanks again for reading. Back on Monday to wrap up my thoughts about Moneyball.

Tim

Footnotes...

1. Dan Carlin's podcasting style is slanted toward the latter...

I recently learned from one of his shows (can't recall which, though probably Hardcore History) that swordsmanship (again, can't remember where, though probably samurai-era Japan or somewhere in Central Asia) is one of those topics for which historians have no record to draw from. There is apparently no technical documentation of how people fought with swords.

Historians know there was likely sword fighting given depictions in artwork and the recovery of the weapons. However, no primary source explains the tactics or techniques used by swordsmen in battle or in sport. What we see in movies is an interpretation agreed upon over time about how these duels likely took place.

Another example Klosterman and Carlin use to illustrate this point is society's perceptions of democracy and totalitarianism. Someone who suspects that totalitarianism is a better form of governance will always lose an argument against someone who presents the verified facts that history's totalitarians have been horrible leaders. A person who argues that totalitarianism fails only when poor leaders take power is not taken seriously at all.

2. This method requires some patience, though...

In general, writing or programming that explores ways to ask the best questions, rather than ways to find the best answers, seems to suit me.

It is a tricky approach because the answers to bad questions often inspire great ingenuity (just see the mock job interview questions I posted a couple of months ago). Clever responses often hide the uselessness of poor questions.

Most of the time, answers to bad questions mislead us or tell us only trivial information about how things truly work. It is the great questions, those without clear, immediate, or even any answers, that drive away the trivial until only what is true remains.