I hate to do this to the Professional Left's darling pollster, but Nate Silver is simply not that good at predicting closely contested statewide races for federal office.
True, he catapulted to national attention by "correctly predicting" the presidential outcome in 49 out of 50 states in 2008. Nicely done. But not nearly as nicely as the number looks at first glance. You could have predicted the outcomes of about 40 states without any polling data at all. News flash: most states aren't "in play" for president. Still, 9 out of 10 is pretty impressive, but I could have done it by simply guessing that Barack Obama would basically run the table in the swing states. And did you really need polling data to know that after McCain and Palin showed us who they were? Seriously, no "calculus" needed.
We got a much closer look at Silver's ability to call close races in 2010, though. Once again, FiveThirtyEight's Wikipedia entry will tell you that Nate Silver correctly called 34 out of the 37 Senate races that year. Technically, yes. But then again, you could have called at least 32 of them yourself. I mean, really. Did we need Nate Silver to predict that Barbara Boxer was going to win in California, or that Blanche Lincoln was going to get thrown out in Arkansas? Come on. There really were only five (maybe actually four) closely contested races for the Senate in 2010: Colorado, Nevada, Pennsylvania, Alaska, and Washington state (if you can even consider WA and NV close, but let's). Do you know how many of those races Silver called correctly? Two. That's right: out of the 5 close Senate races, Nate Silver missed the mark on 3.
Wanna know something funnier? Which way do you think all of those models that led him to a wrong prediction skewed? That's right, they all slanted GOP. So much so that Nate Silver actually predicted that Sharron Angle, of Second-Amendment-remedies fame, was going to win Nevada by 3 points. Harry Reid cleaned her clock by 5.5 points. Silver got that one wrong by a whopping 8.5 points. To be fair to Nate, he had an excuse: the pollsters didn't poll enough Latinos, he said. Hmm, I wonder if that might put in doubt the further inclusion of polls, without questioning their demographics, in Nate's future "calculus." Nah.
Here are the other races that year that Nate Silver got wrong: he predicted that Ken Buck would win in Colorado by one point. On election day, Buck lost by that margin to Democratic Senator Michael Bennet. And in Alaska, Silver confidently predicted that crazy Teabagger Joe Miller was on his way to the Senate, when in fact Sen. Lisa Murkowski, running as a write-in after losing the Republican primary, ended up winning.
In essence, leaving out Washington state (which in that election could hardly be considered a tossup, even though it was hotly contested by a two-time loser on the Republican side), Nate Silver basically predicted a GOP victory in all the close races, and came up short 3 out of 4 times. That's his record, as far as predicting close statewide federal races goes.
That brings me to a conclusion: Nate Silver is in fact often wrong when predicting the results of statewide federal contests, and when he is, he pretty much always errs on the side of a greater Republican advantage than is actually borne out by the election results. The same was true in the one state he got wrong in 2008: he predicted McCain would win Indiana (in reality, Obama won the state). He also underestimated Obama's popular vote edge by a seemingly small margin (Silver's estimate: 6.1 points; actual: 7.2 points), but that 1.1-point miss works out to roughly an 18% relative error.
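As a quick check of the arithmetic above, the "relative error" here is just the miss divided by the predicted margin. A minimal sketch:

```python
# Silver's predicted 2008 popular-vote margin vs. the actual result
predicted_margin = 6.1  # points, Obama over McCain (Silver's estimate)
actual_margin = 7.2     # points (actual result)

miss = actual_margin - predicted_margin       # 1.1 points
relative_error = miss / predicted_margin      # miss relative to the prediction

print(f"miss: {miss:.1f} points")             # miss: 1.1 points
print(f"relative error: {relative_error:.0%}")  # relative error: 18%
```

Measured against the prediction itself, the miss is about 18%; measured against the actual result, it's about 15%. Either way, it's a sizable error for a supposedly small one-point gap.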
I am not telling you not to look at Nate Silver's numbers. Look at them. But look at them with the knowledge that he continues to incorporate polls without regard to demographic data, despite having specifically admitted that entire lines of polls can underestimate minority votes. Look at his numbers with the knowledge that he is often wrong on contested states, and almost always wrong in favor of Republicans. Look at his numbers, but know that even the best models (and there actually isn't much evidence that his model is the best model) can be screwed by bad data, and his data are limited to polls that are being seriously tinkered with thanks to the right's working-the-refs strategy.
Oh, and look at them with another thing in mind: Nate Silver works for the New York Times. That's not a bad thing in itself, but the Times has as much interest as anyone in the corporate media in keeping the horse race going. FiveThirtyEight is not an independent blog.
All I'm saying is this: it's best not to get all animated by polls. Your time will be far better spent getting the vote out for the president, or for your local candidate for Congress. But if you're going to get all animated by the polls and projections, please take the time to understand both the strengths and the pitfalls of those polls and projections.
UPDATE: See the comment below from Rustbelt_Dem about Nate Silver's disastrously wrong predictions for the UK's 2010 general election. Quoting:
Link to Nate's predictions, one week before the general election:
Let's see how accurate it was:
                    Nate         Result       Miss
Conservatives
  %                 34.4         36.1         -1.7
  Seats (change)    299 (+89)    306 (+96)    -7
Labour
  %                 27.0         29.0         -2.0
  Seats (change)    199 (-150)   258 (-91)    -59
Liberal Democrats
  %                 29.5         23.0         +6.5
  Seats (change)    120 (+58)    57 (-5)      +53
So close, Nate. By the way, why did the Lib Dems get such a favorable vote prediction? Because Clegg had a good debate against Gordon Brown. Hmmm, that does sound familiar, doesn't it?