What Does the Public Really Think?
Weighing in on the statistical validity of surveys and polls
Getting public input that reasonably represents the voting public may be easier and cheaper than many public policymakers and special interest advocates have thought. Springs Insight Exchange, a non-partisan community group in Colorado Springs, has tested public sentiment on a number of local issues, including the priorities engaged citizens would like to see addressed by local government. One valid criticism of its approach is that the survey results are not statistically valid. So what is the value of seeking public input, or for that matter business stakeholder input, on important matters if the results are not statistically valid?
According to Tom Binnings, executive committee member of Springs Insight Exchange, leaders and decision-makers often assume that statistical validity is needed and that it is readily achievable only through expensive surveys. Statistical validity requires random samples, which opinion surveys, by their nature, never truly achieve. There will always be sampling bias: who gets called on the phone, who receives the email link, and who is willing to spend the five to 12 minutes interacting with the survey? The 2016 Brexit and U.S. presidential election polls revealed that such findings can be wrong and seem to be missing the mark at an increasing rate.
What is statistical validity anyway?
In a nutshell, we want the sample results from respondents to accurately reflect the views we would get from the full population if we were able to ask, and get answers from, everyone. But we can never be certain. So survey companies might shoot for a 95 percent confidence level that the results of the survey reflect the population within +/- 5 percent. In Colorado Springs, for example, to get accurate poll results on an election with 100,000 active voters, 95 percent certainty at +/- 5 percent requires that 400 random surveys be completed. Depending upon the length of such a survey, the sponsor can expect to spend $20,000 to $40,000.
But what if, as part of assessing public sentiment or customer feedback, we could be happy with lower reliability?
Moving from 95 percent to 90 percent confidence while still shooting for +/- 5 percent lowers the required number of completed surveys from 400 to 275. This means that, assuming randomness (which we already determined is never really achieved with surveys), the results will reflect the population's view within +/- 5 percent 90 times out of 100 instead of 95 times out of 100. Are the additional five correct times out of 100 worth the additional cost?
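For readers who want to check the arithmetic, the figures above follow from the standard sample-size formula for a proportion, n = z²·p(1−p)/e², using the most conservative assumption p = 0.5. A short sketch (the textbook formula gives roughly 385 and 271; the article's 400 and 275 appear to be rounded up, as survey firms commonly pad for non-response):

```python
import math

def sample_size(z, margin, p=0.5, population=None):
    """Completed surveys needed for a given z-score and margin of error.

    Uses n = z^2 * p(1-p) / margin^2, optionally applying the finite
    population correction. p=0.5 is the most conservative assumption.
    """
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)  # finite population correction
    return math.ceil(n)

# 95 percent confidence (z ~ 1.96) at +/- 5 percent, 100,000 voters
print(sample_size(1.96, 0.05, population=100_000))

# 90 percent confidence (z ~ 1.645) at +/- 5 percent, 100,000 voters
print(sample_size(1.645, 0.05, population=100_000))
```

Note that the required sample size barely depends on the population size once the population is large; the confidence level and margin of error do nearly all the work.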
Springs Insight Exchange, responding to reluctance by some policymakers and special interest groups to assess public sentiment because of the cost of getting it right, conducted an experiment around the November 7 election. For $50 and 30 man-hours, the group posted a one-minute survey on Facebook and sent a survey link to the 200 members who had signed up over the last couple of years to share their point of view on local issues. The result: 85 percent of the membership favored a city ballot initiative to create a stormwater fee. This favorable response “clearly reflected a bias of the membership and had to be adjusted,” according to Binnings. In contrast, the Facebook responses, which one would assume were more random, supported the initiative at 60 percent. The number of Facebook responses yielded a 90 percent confidence level at +/- 10 percent, which could be interpreted to mean that nine times out of 10 the initiative should pass, since 60 percent support minus the 10 percent margin of error still exceeds 50 percent.

Combining the responses with historical voting patterns, Springs Insight Exchange estimated the actual election result would fall somewhere between 56 percent and 63 percent in favor, while most of the local pundits thought it would be a very close election. The actual result was 54 percent in favor of the initiative. A previous “statistically valid phone survey” in August had come in at 56 percent.
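The “+/- 10 percent at 90 percent confidence” claim can be sanity-checked with the margin-of-error formula for a proportion, e = z·sqrt(p(1−p)/n). The article does not report the Facebook response count, but working backward, roughly 65 responses (a hypothetical figure, inferred here for illustration) would produce the stated margin at 60 percent support:

```python
import math

def margin_of_error(p_hat, n, z=1.645):
    """Margin of error for an observed proportion p_hat from n responses.

    z ~ 1.645 corresponds to a 90 percent confidence level.
    """
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# n = 65 is a back-of-envelope inference, not a figure from the article.
moe = margin_of_error(0.60, 65)
print(f"+/- {moe:.1%}")
```

The general lesson is the one the article draws: even a small, cheap sample can bound an answer usefully, so long as the margin of error is reported alongside the headline number.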
According to Binnings, Springs Insight Exchange’s effort to calibrate its process “was pretty good, although there is more work to do especially when it comes to getting a more diverse membership.”
For a full summary of the effort go to: www.springsinsightexchange.com.