There’s been a lot of talk after the election about how one big winner (after Obama, I imagine) is Nate Silver, of the FiveThirtyEight blog. He had come under fire in the days and weeks leading up to the election for his refusal to call the race a “toss up” even when Obama held only a narrow lead in national polls. He even prompted a couple of posts in his defense. Turns out that Silver called the election right – all fifty states – down to Florida being a virtual tie.
But that’s old news. I want to focus on something that may be just as important, or more so: the underlying polling. We take it for granted that the pollsters did the right thing, but their methodology, too, was under attack. Even now, there are people – quants, even – who were shocked that Romney lost, because their methodology going into the election was just plain wrong.
So, that’s where I want to focus this post – not just on “math” but on principled methodology.
It’s easy to take the pollsters’ methodology for granted. After all, they’ve been doing it for many, many years; the methodology is mostly transparent; and past polls can be measured against outcomes. Taking all of this methodology information into account is where Silver bettered his peers who simply “averaged” polls (and how Silver accurately forecast a winner with some confidence months ago). Everybody was doing the math, but unless that math incorporated quality methodology in a reasonable way, the results suffered.
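To make the distinction concrete, here is a toy sketch – emphatically not Silver’s actual model – of the difference between simply averaging polls and weighting each poll by an assumed “methodology quality” score (the poll numbers and weights below are invented for illustration):

```python
# Toy illustration (not FiveThirtyEight's model): a simple average of
# hypothetical polls vs. an average weighted by an assumed quality score
# (e.g., derived from a pollster's past accuracy and transparency).

polls = [
    # (candidate_share, quality_weight) -- both values invented
    (0.52, 0.9),   # hypothetical high-quality pollster
    (0.48, 0.3),   # hypothetical pollster with a poor track record
    (0.51, 0.7),
]

# Naive approach: treat every poll the same.
simple_avg = sum(share for share, _ in polls) / len(polls)

# Methodology-aware approach: better pollsters count for more.
total_weight = sum(w for _, w in polls)
weighted_avg = sum(share * w for share, w in polls) / total_weight

print(f"simple average:   {simple_avg:.3f}")
print(f"weighted average: {weighted_avg:.3f}")
```

In this made-up example, the low-quality outlier drags the simple average toward a dead heat, while the weighted average stays closer to the better pollsters – the same basic reason incorporating methodology beats raw averaging.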
It didn’t have to be that way, though. As Silver himself noted in a final pre-election post:
As any poker player knows, those 8 percent chances [of Romney winning] do come up once in a while. If it happens this year, then a lot of polling firms will have to re-examine their assumptions — and we will have to re-examine ours about how trustworthy the polls are.
This is the point of my title. Yes, Silver got it right, and did some really great work. But it was the pollsters who, for the most part, used methodologies built on the right assumptions – and who thereby provided the accurate data needed to reach the right answers.
The importance of methodology to quantitative analysis is not limited to polling, of course. Legal and economic scholarship is replete with empirical work based on faulty methodology. The numbers add up correctly, but the underlying theory and data collection might be problematic, or the conclusions drawn might not be supported by those calculations.
I live in a glass house, so I won’t be throwing any stones by giving examples. My primary point, especially for those who are amazed by the math but not so great at it themselves, is that you have to do more than calculate. You have to have methods, and those methods have to be grounded in sound scientific practice. Evaluation of someone else’s results should demand as much.