How to tell if a poll is an outlier


It just so happens that we have excellent real-world examples, contained in a poll of Minnesotans recently conducted by SurveyUSA for KSTP. SUSA is among the most accurate pollsters, cycle after cycle. But in this game, even the best sometimes miss.

A fundamental rule of public polling, arguably the #1 rule, is that if numbers are way out of line with everybody else's, they're wrong. There are three prime examples here.

  • Obama job approval: 36% approve, 54% disapprove. The current national average is about 44% job approval, and that’s where it’s been for quite a while. It doesn’t make sense that blue Minnesota would have him down in George W. Bush territory.
  • Marijuana legalization: 29% approve, 64% disapprove. I'm not suggesting Minnesota favors full legalization at this time. For one thing, the drug is simply not as popular here as in many other states, especially on the coasts. But, again, these numbers are way out of whack with the national ones, which have pretty regularly been around 50/50, or even slight majority approval, lately.
  • Obamacare: 33% approve, 54% disapprove. Note that the question is worded a little differently than a straight up/down; the same goes for the marijuana question. Even so, this is well into the double digits below the approval levels found in other polling. The biggest problem here, though, is readily identifiable: like much Obamacare polling, the question doesn't distinguish, among disapprovers, between those who want the law repealed and those who want it strengthened, in the direction of universal single-payer. When that distinction is explored, you get 2-1 overall support. (The relevant bar graph is near the bottom.)

It's OK to use common sense when evaluating poll results. And if numbers diverge widely from everything else out there, for no apparent reason, it's not conspiracy-theory time. The occasional outlier is inevitable with statistical sampling. If you flip a coin 50 times every morning, just for the heck of it, once in a while you're going to get a split of, say, 35-15 or more lopsided. And if you do a lot of sampling for public polling on the questions of the day, sometimes you're going to get a skewed (in this case, to the right) sample.
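How rare is that coin-flip split, exactly? Here's a quick back-of-the-envelope check using the exact binomial distribution (the function name is mine, just for illustration):

```python
from math import comb

def prob_split_at_least(n, k):
    """P(a k-to-(n-k) split or more lopsided, in either direction)
    for n fair coin flips, assuming k > n/2 so the tails don't overlap."""
    one_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return 2 * one_tail  # count both 35-15 heads and 35-15 tails

# 50 flips: mean is 25 heads, standard deviation is sqrt(50 * 0.25) ≈ 3.5,
# so 35 heads sits nearly 3 standard deviations from the mean.
p = prob_split_at_least(50, 35)
print(f"Chance of a 35-15 split or worse on a given morning: {p:.4f}")
print(f"Expected about once every {1 / p:.0f} mornings")
```

The point survives the arithmetic: a nearly-3-sigma result is rare on any one morning, but a daily flipper will still see one a couple of times a year, and a prolific pollster will likewise land the occasional badly skewed sample by chance alone.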

Obviously, if polling yet to be done also shows numbers like these, then these examples are not outliers, after all. But that seems unlikely.