Performance Perspectives Blog

When the stats don’t add up

Sep 26, 2011

I am a very big fan of Michael Lewis, beginning with his best seller Liar’s Poker. I was advised to read Moneyball and, as with his other books, found it very well written. Given my lifelong love of baseball, I also found it quite intriguing: the idea that the Oakland A’s could assemble a team of low-priced players and meet with great success. The book has been transformed into a movie, which opened this past Friday.

In last Thursday’s Wall Street Journal, Allen Barra wrote a piece titled “The ‘Moneyball’ Myth,” which questions the conclusion drawn by Lewis and others: that by focusing on certain statistics rather than others, a team could succeed without having to spend a bundle.

Thus, Barra was questioning Lewis’ attribution analysis, was he not? Were the conclusions that were drawn, based upon the story told by the team’s general manager, appropriate? Barra shares information that causes one to wonder.

Statistics are curious things, are they not? As the saying (credited to Britain’s former Prime Minister Benjamin Disraeli, and made famous by Mark Twain) goes, “there are lies, damned lies, and statistics.” We can get statistics to prove or disprove just about anything. In the world of performance measurement, we have loads of measures, some of which compete with one another (money- versus time-weighting; equal- versus asset-weighting; geometric versus arithmetic) and produce differing results.
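To make that concrete, here is a minimal sketch, in Python, of how two of those competing measures can disagree about the very same portfolio. The cash flows, the function names, and the bisection IRR solver are all illustrative assumptions of mine, not anything from a particular system or standard:

```python
# A minimal sketch (hypothetical numbers) of how competing performance
# measures can tell different stories about the same portfolio.

def time_weighted_return(subperiod_returns):
    """Geometrically link sub-period returns: TWR = prod(1 + r_i) - 1."""
    total = 1.0
    for r in subperiod_returns:
        total *= 1.0 + r
    return total - 1.0

def money_weighted_return(cash_flows, ending_value, periods):
    """Solve for the per-period IRR r such that the compounded cash flows
    equal the ending value, via simple bisection. cash_flows[t] is the
    contribution at the start of period t (an illustrative convention)."""
    def future_value(r):
        return sum(cf * (1.0 + r) ** (periods - t)
                   for t, cf in enumerate(cash_flows)) - ending_value
    lo, hi = -0.99, 10.0
    for _ in range(200):  # bisection; future_value is increasing in r here
        mid = (lo + hi) / 2.0
        if future_value(mid) > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

# Hypothetical two-period example: start with $100, earn +50% in period 1,
# add $100 of new money, then lose 20% in period 2.
start, deposit = 100.0, 100.0
r1, r2 = 0.50, -0.20
value_after_1 = start * (1 + r1)                      # $150
ending_value = (value_after_1 + deposit) * (1 + r2)   # $200

twr = time_weighted_return([r1, r2])
mwr = money_weighted_return([start, deposit], ending_value, periods=2)

arith = (r1 + r2) / 2.0          # arithmetic mean of period returns
geom = (1 + twr) ** 0.5 - 1.0    # geometric mean of period returns

print(f"Time-weighted return : {twr:+.2%}")    # +20.00% over two periods
print(f"Money-weighted (IRR) : {mwr:+.2%}")    # +0.00% per period
print(f"Arithmetic mean      : {arith:+.2%}")  # +15.00% per period
print(f"Geometric mean       : {geom:+.2%}")   # +9.54% per period
```

In this example the time-weighted return is +20% while the money-weighted return is 0%, simply because the large deposit arrived just before the losing period. Neither figure is wrong; they answer different questions.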

One must be willing to step back and understand what the numbers are supposed to represent, to determine whether they’re appropriate, whether they’re complete, and whether additional information is needed.

Baseball is perhaps the best example of statistics gone wild. For example, ESPN can tell us what a batter’s chances are of getting a hit with two on, facing a right-handed pitcher, with a count of one ball and one strike. In the end, one is tempted to ask, “do we care?” But with performance measurement, and the ability to understand what is working and what isn’t, we surely do care. Whether we have the necessary skills and tools, however, isn’t always so evident.

A friend, a performance measurement professional and veteran of our industry, visited our offices last week to chat. We talked about the world in which we find ourselves and what we like about it. For me, the dynamics are perhaps the best part: we don’t know all the answers, and we are continuing to question and learn. One cannot ever think that “it’s done,” ’cause it isn’t.
