Your opinion matters. Just ask any pollster.

But randomized telephone surveys—a historically reliable way to assess public opinion—have become too pricey for many organizations to conduct, while cheaper online surveys—where the future seems to lie—have delivered some famously inaccurate results.

So what happens when research organizations decide it’s time to move traditional polls online? How do they ensure their trusted results will remain reliable?

This year, two established surveys took the plunge.

The University of Michigan’s Panel Study of Income Dynamics, the longest-running longitudinal household survey in the world, and the U.S. portion of the Pew Research Center’s Global Attitudes Survey, the largest multinational series of surveys focused on global issues, both shifted to the web.

Both cited cost savings as a reason for the change.

One of the biggest roadblocks to running repeated randomized telephone surveys is the time-consuming chore of convincing enough people to participate.

“It took over 40,000 numbers to have a sample of 300 for our landline phones and over 67,000 numbers for a sample of 1,200 cellphone responses,” said Martha McRoy, a research methodologist at the Pew Research Center. She noted the organization would dial a number five times before giving up.
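For a sense of scale, those figures imply completion yields of well under 2% of numbers dialed. A back-of-the-envelope sketch, treating “over 40,000” and “over 67,000” as round figures (so the true yields are slightly lower):

```python
# Yields implied by the figures Ms. McRoy cites (rounded; "over 40,000"
# and "over 67,000" mean the true yields are a bit lower still).
landline_dialed, landline_completed = 40_000, 300
cell_dialed, cell_completed = 67_000, 1_200

print(f"Landline yield: {landline_completed / landline_dialed:.2%}")  # 0.75%
print(f"Cellphone yield: {cell_completed / cell_dialed:.2%}")         # 1.79%

# Pew dialed each number up to five times before giving up, so the
# worst-case call volume is several times the count of sampled numbers.
max_attempts = 5
print(f"Worst-case dials: {(landline_dialed + cell_dialed) * max_attempts:,}")  # 535,000
```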

To improve efficiency, Pew, which has already migrated much of its U.S. polling online, assembled what it calls its American Trends Panel in 2014.

Initially, panelists for the online polls were recruited using randomized calls to landlines and cellphones, creating the kind of probability-based sample typically used for scientific research and telephone surveys. Now, participants are invited by mail using randomly selected addresses, and those who don’t have internet access at home are provided a tablet and internet service.

The cost of setting up such a panel is comparable to that of a typical telephone survey. But over time, a panel becomes less expensive because the same people are surveyed repeatedly.
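That trade-off is easy to see with a toy calculation. The dollar figures below are entirely hypothetical, since the article reports no amounts; the only assumption taken from the text is that panel setup costs about as much as one telephone survey while repeat waves cost much less:

```python
# Hypothetical costs; the article gives no dollar figures.
PHONE_SURVEY_COST = 100_000  # assumed cost of one standalone telephone survey
PANEL_SETUP_COST = 100_000   # assumed one-time panel recruitment cost (comparable)
PANEL_WAVE_COST = 25_000     # assumed cost of re-surveying existing panelists

for n in (1, 2, 5, 10):
    phone = PHONE_SURVEY_COST * n
    panel = PANEL_SETUP_COST + PANEL_WAVE_COST * n
    print(f"{n:>2} surveys: phone ${phone:,} vs. panel ${panel:,}")
```

Under these assumptions, the panel pulls ahead by the second survey, and the gap widens with every wave after that.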

To prepare for its latest shift to the web, Pew administered a bridge survey on global attitudes last year, asking the same 78 questions online and by phone. It was the organization’s first extensive test of how a different survey mode might affect the way Americans answer foreign-policy and internationally focused questions.

Most responses differed by less than 4 percentage points between the two versions of the survey, but 19 differed by at least 10 percentage points.

The largest variations were associated with questions that offered four answer choices, such as “strongly agree,” “agree,” “disagree” or “strongly disagree.” The disparities generally indicated differences in intensity, not direction, meaning that results weren’t contradictory.

“The phone respondents were more likely to give extreme responses, whereas panelists were more likely to give neutral, softer responses,” Ms. McRoy said.
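A made-up example shows what “intensity, not direction” looks like on a four-point scale; the article gives no item-level numbers, so these shares are purely illustrative:

```python
# Hypothetical answer shares (percent) for one question; not real Pew data.
phone  = {"strongly agree": 30, "agree": 25, "disagree": 20, "strongly disagree": 25}
online = {"strongly agree": 18, "agree": 37, "disagree": 32, "strongly disagree": 13}

for mode, shares in (("phone", phone), ("online", online)):
    direction = shares["strongly agree"] + shares["agree"]              # overall agreement
    intensity = shares["strongly agree"] + shares["strongly disagree"]  # extreme options
    print(f"{mode}: {direction}% agree overall, {intensity}% choose an extreme option")
```

Both modes show 55% agreement, so the headline result doesn’t flip; they differ only in how often respondents reach for the extreme ends of the scale.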

Answers to self-administered online surveys sometimes diverge from those collected by an interviewer during a telephone call because of what researchers term “mode effects,” or variations in responses attributed to the survey format rather than to differences of opinion.

One example of a mode effect is that online survey respondents tend to favor the first answer choice they read, Ms. McRoy said, while people answering by telephone tend to favor the last choice they hear.
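Survey designers commonly counter these primacy and recency effects by varying the order in which choices are presented. A minimal sketch of that tactic (the function names are illustrative, not Pew’s):

```python
import random

SCALE = ["strongly agree", "agree", "disagree", "strongly disagree"]

def presented_scale(respondent_id: int) -> list[str]:
    # Ordered scales are reversed, not shuffled, for half of respondents:
    # the scale still reads sensibly, but first/last effects average out.
    return SCALE if respondent_id % 2 == 0 else list(reversed(SCALE))

def presented_options(options: list[str], rng: random.Random) -> list[str]:
    # Unordered option lists can be fully shuffled for each respondent.
    shuffled = options[:]
    rng.shuffle(shuffled)
    return shuffled

print(presented_scale(7))  # reversed scale for odd-numbered respondents
print(presented_options(["economy", "health care", "climate"], random.Random(42)))
```

Recording which order each respondent saw also lets analysts test directly for order effects in the results.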

In a 2014 study designed to measure these kinds of inconsistencies, Pew randomly assigned 3,003 respondents to either a telephone or online survey group and then asked them the same 60 questions.

Differences in responses between the two groups ranged from 0 to 18 percentage points.

In this test, the discrepancies were especially large when respondents were asked to assess their quality of family and social life (telephone respondents reported higher levels of satisfaction); rate political figures (online respondents were more likely to choose “very unfavorable”); and judge the discrimination faced by different groups (telephone respondents were more likely to say gay, lesbian, Black and Hispanic people face a lot of discrimination).

Explanations for the differences include the possibility that individuals responding to an interviewer during a telephone call might be more inclined to offer socially acceptable answers—but the effects differ by individual and type of question.


In its preliminary tests, Michigan’s Panel Study of Income Dynamics found few discrepancies between responses to its online and telephone surveys, perhaps because of the kinds of questions it asks.

“We conducted two pilot surveys and a pretest with the web and phone, and did not find many differences,” said David Johnson, a spokesman for the University of Michigan.

“The PSID does not ask many attitudinal questions, hence, we don’t expect many differences.”

When responses to the same questions diverge, there’s no way to determine whether the telephone or online response is more accurate, according to Pew. So, for now, researchers have to weigh the trade-offs when they analyze survey results and factor in the differences they’ve discovered.

Write to Jo Craven McGinty at [email protected]
