Polling election day voter registration in Maine

The September 7 Pulse/Rasmussen poll

Now that the People’s Veto to bring back election day voter registration is officially on, Maine has its first public poll on the topic. (See these pdfs for results and cross-tabs.) It reports that 53% support the law that ended election day voter registration in Maine, while 47% oppose it.

Before evaluating this as public opinion research, I discuss some standard questions to ask about any poll.

As new polls are released on the subject, I hope to address those same questions and will link back to this post. Other posts may look at some additional aspects of polling.

If you want to skip ahead to the assessment, scroll down past the next section.

Some poll questions and issues:

One starting point is that polls take samples from a population in order to get a valid estimate of that population’s views or behaviors. OK, on to the questions.

1. What is the sample size? This question matters because, if the poll used sound methods involving random sampling, a larger sample yields a smaller margin of error. People should always read a poll’s result together with its margin of error and its confidence level. The confidence level tells you how confident you can be that the sample reflects the actual percentage in the population, within the margin of error. (See the sketch after this list.)

2. Over how many days was the poll taken? A one-day poll is problematic because it misses people who are not home. Best practice is to take a poll over two to three days and to call back people who were not reached on the first attempt. Without callbacks, the sample risks built-in error, since people who are out more often may differ from those who were contacted.

3. Did the poll include cell phones along with landlines? It is increasingly problematic if a poll did not call cell phones. Research indicates that people who only have cell phones differ from others in politically significant ways.

4. Does the poll report everyone or a subset that it considers to be likely voters? If it’s likely voters, was the selection method sensible? This is terribly important. When a polling firm focuses on likely voters, it throws out responses from people it thinks are unlikely to vote. On the one hand, this makes sense, since many adults who are eligible to vote won’t vote. However, it is also a major source of poll error. Most pollsters will not report likely voters until close to the election because it is too hard to accurately identify likeliness of voting beforehand. It is also best practice to disclose one’s method of placing respondents into the likely voter category.
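For concreteness, here is a minimal sketch of the standard margin-of-error formula for a simple random sample. The Python code and the function name are mine, purely for illustration; real polls add weighting and design effects, which make the true error somewhat larger.

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a proportion from a simple random sample.

    n: sample size
    p: assumed proportion (0.5 maximizes the error, so it is the usual default)
    z: critical value (1.96 corresponds to a 95% confidence level)
    """
    return z * sqrt(p * (1 - p) / n)

# Margins of error at the 95% confidence level for common sample sizes:
for n in (400, 500, 800, 1000):
    print(f"n = {n:4d}: +/- {100 * margin_of_error(n):.1f} points")
```

Note that the error shrinks with the square root of the sample size, so quadrupling the sample only halves the margin of error.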

Now, let’s look at the MHPC poll on election day registration.

1. Sample size. The poll had a sample size of 500. Compared with past polling in Maine, say on the same-sex marriage referendum and the last race for governor, that is typical for this polling firm, but it falls on the low side. Other state polls have used even smaller samples. With this relatively small sample, the margin of error is larger than that of some other polls. (See the calculation after this list.)

2. Over how many days was the poll taken? This poll was taken on just one day. This is not good practice.

3. Did the poll use cell phones and landlines or just landlines? This poll is landline only. This is not the best practice, as it disproportionately leaves out younger voters. This especially matters because college students have been a key focus of the debate. 

4. Is this a likely voter poll and, if so, how did it determine who is likely to vote? This is a likely voter poll. As explained above, it is quite early to cut out respondents because the pollster thinks they are not likely to vote. Nor can one evaluate how it determined who is likely to vote: Rasmussen has a track record of not disclosing its likely voter screen, and the MHPC has not included this information. Rasmussen’s lack of disclosure about its likely voter screen has drawn criticism from the polling and scholarly communities. The procedure it uses to sort out likely voters probably contributes to its skew toward Republicans, a skew confirmed by the 2010 election results.
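Plugging the poll’s sample size into the formula sketched earlier, and assuming the 53–47 figures come from the full sample of 500 under simple random sampling (an assumption on my part; weighting and the likely voter screen would push the real error higher):

```python
from math import sqrt

n = 500                            # the poll's reported sample size
moe = 1.96 * sqrt(0.5 * 0.5 / n)   # worst-case margin at a 95% confidence level

print(f"Margin of error on each figure: +/- {100 * moe:.1f} points")

# For a two-option question, the 53-47 gap is 2p - 1, so the margin of
# error on the gap itself is twice the margin on either figure.
print(f"Margin of error on the gap:     +/- {200 * moe:.1f} points")
```

On those assumptions, each figure carries roughly a plus-or-minus 4.4-point margin, and the six-point gap between 53% and 47% falls within the roughly 8.8-point margin on the gap itself.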

The upshot: There are some problematic aspects of this poll. Does that mean that its findings are incorrect? It’s hard to say.

While some issues undermine its credibility, it would be better if we had results from other public polls for comparison. Every poll has the potential to be off by more than its margin of error: at a 95% confidence level, there is a one-in-twenty chance that the true figure falls outside that margin.
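To see the one-in-twenty point in action, here is a quick simulation. This is generic statistics, not a model of this particular poll; the 53% figure is used only as a stand-in true value.

```python
import random

TRUE_P = 0.53    # a hypothetical "true" level of support, purely illustrative
N = 500          # sample size
MOE = 1.96 * (TRUE_P * (1 - TRUE_P) / N) ** 0.5
TRIALS = 10_000

misses = 0
for _ in range(TRIALS):
    # Simulate one poll: N independent respondents, each supporting at TRUE_P.
    p_hat = sum(random.random() < TRUE_P for _ in range(N)) / N
    if abs(p_hat - TRUE_P) > MOE:
        misses += 1

print(f"Polls off by more than their margin of error: {misses / TRIALS:.1%}")
# Expect roughly 5%, i.e., about one poll in twenty.
```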

Also, every poll is a snapshot. We don’t know how Rasmussen determined likely voters, but the likelihood of voting often changes over the course of a campaign. So even if its likely voter screen is reasonable (something we can’t really assess, since the firm isn’t transparent about its methodology), it may not be picking up the voters who will actually vote.

Amy Fried
