Mixing Modes in Survey Administration

Publication Date: 
Friday, July 22, 2016

When a researcher is planning a survey to collect people’s opinions on a topic, one of the first decisions is how the study will be conducted: by phone, in person, by mail, via web, or some mixture of those modes.

Mixing modes has been a common approach among survey professionals for decades. Each method has its advantages and disadvantages, and having a toolkit of options is helpful because no single mode fits every potential participant: a hearing-impaired person can respond more easily by mail or web, while a blind person may prefer doing the survey by phone.

But mixing modes must be done carefully, and it is a challenge to write a questionnaire that works equivalently whether it is administered by a professional interviewer or self-administered with paper and pencil.

How to offer choices

Americans value the freedom to choose, whether voting at the ballot box, seeking health care, or finding a career. Our supermarket shelves bear witness to the array of choices consumers expect. There is an assumption that more choice means better options and greater satisfaction. The expectation was that if survey participants were offered a variety of ways to complete a survey, they would choose the best fit and be more likely to respond.

However, several investigators have found that response rates actually declined when people were offered multiple choices at once.

One such study came from UF researcher Glenn Israel, who surveyed Florida Cooperative Extension’s customers using various approaches.[1] Some customers were sent a survey only by mail, while another group was offered the choice to respond by mail or web in both the initial mailing and the follow-up mailing of the questionnaire. He reported that overall response was lower in the group given the mail/web choice (59.2 percent) than in the mail-only group (64.8 percent).

 

Table 1. Response by experimental treatment, mail only vs. mail/web choice (Israel 2009)

Treatment         Sample size   Mail completes   Web completes   Total completes   Percent by mail   Percent by Web   Total response rate
Mail only         437           283              0                283               64.8%             0.0%             64.8%
Mail/Web choice   436           224              34               258               51.4%             7.8%             59.2%
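The percentages in Table 1 are simple ratios of completed questionnaires to sample size. As a quick check, the cells can be recomputed from the raw counts (a minimal sketch; the function name and rounding convention here are my own, not Israel's):

```python
# Recompute the response-rate cells of Table 1 from the raw counts.
def response_rates(sample_size, mail_completes, web_completes):
    """Return mode-specific and total response rates as percentages."""
    total = mail_completes + web_completes
    pct = lambda n: round(100 * n / sample_size, 1)
    return {"mail": pct(mail_completes), "web": pct(web_completes), "total": pct(total)}

print(response_rates(437, 283, 0))    # mail-only group: 64.8% total
print(response_rates(436, 224, 34))   # mail/web choice group: 59.2% total
```

Note that the 5.6-point gap in total response comes entirely from the mail column: the 7.8 percent who answered by web did not offset the drop in mail returns.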

 

Other studies around the country have found a similar pattern: offering more choices yields lower response.

This may be due to “choice overload,” a phenomenon discussed by psychologist Barry Schwartz in his popular book The Paradox of Choice[2] and a much-viewed TED Talk. Schwartz makes the counterintuitive argument that too many choices can cause stress and self-doubt, whereas eliminating choices can reduce the anxiety and busyness of our lives.

So survey researchers now use mixed modes, but carefully offer the options one at a time, sequentially rather than all at once. For example, patient experience surveys conducted with the methods developed by the National Committee for Quality Assurance use a mixed-mode approach with mail followed by telephone.[3]

Achieving equivalence in question design

If some people will be answering by phone and others via web or on paper, how can questions be written to be as functionally similar as possible when one version is read with the eyes and the other heard through the ear?

There has been a lot of research on validating survey questionnaires across modes. This pain intensity scale is an example. The self-administered version for paper or web is clear about the time period and labels the boxes at either end, allowing the survey respondent to choose a midpoint if they don’t have a strong opinion in either direction.

[Image: pain intensity scale, self-administered version]

Then, to translate that question into an ear-friendly version for interviewer administration by phone or in person, the question again starts with the time period, then explains the scale, and finally asks for a response. The word “zero” is spelled out for clarity, because if it were the numeral 0, an interviewer might inadvertently read it as “Oh,” causing confusion. So the oral version is:

In the past 7 days, how bad was your pain on average? Using any number from zero to ten, where zero is no pain and ten is the worst pain you can think of, what number would you use to describe your pain?

But questionnaire wording itself is not the only difference between survey modes. One of the strengths of interviewer administration is that an interviewer can politely follow up when a survey participant gives a “don’t know” answer, whereas in a self-administered survey the item would simply be left blank. This can contribute to item nonresponse and reduce survey quality. The best practice in telephone interviewing is to probe once, since there are many reasons someone might say they don’t know. One respected guide to survey interviewing suggests:

Respondents who start their answers by saying, "Oh, I don't know," or, "it depends," do not always mean it. Some say it as a way to stall while searching their memories for an appropriate answer. Others are embarrassed to admit that they did not hear the question fully. Lazy respondents who do not want to exert much mental effort rely on "don't know" answers to sidestep questions. Cautious or uncertain respondents may say they "don't know" while seeking clues from you on how to answer. Probe "don't know" answers for a complete answer.[4]

In an effort to mimic the follow-up that is possible in a conversation, some researchers design web surveys that show a pop-up screen when someone tries to move forward without answering an item.[5] The screen politely reminds the survey participant that they did not answer and asks again for a response.

The respondent can still decline to answer, but the new screen gives them encouragement. This offers a technological way of emulating a polite interviewer probe in an online survey format.
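The decision logic behind such a reminder screen can be sketched in a few lines. This is a hypothetical illustration of the “remind once, then accept” pattern, not the implementation used in the cited study; the function name and return values are invented for the example:

```python
# Hypothetical "soft prompt" logic for one web-survey item (illustrative only).
def next_action(answer, already_prompted):
    """Decide what happens when the respondent clicks Next on an item."""
    if answer is not None:
        return "advance"        # item answered: move on
    if not already_prompted:
        # Mimic a single polite interviewer probe: remind exactly once.
        return "show_reminder"
    return "advance"            # respondent declined again: record as missing
```

Limiting the screen to a single reminder parallels the telephone practice of probing a “don’t know” only once, so the web and interviewer modes stay functionally similar.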

Mixed-mode studies can be very successful and new research continues to provide insights on the best way to conduct surveys with multiple methods.


[1] Glenn D. Israel, “Obtaining Responses by Mail or Web: Response Rates and Data Consequences,” paper presented to the American Association for Public Opinion Research, May 2009, https://www.amstat.org/sections/srms/proceedings/y2009/Files/400050.pdf

[2] Barry Schwartz, The Paradox of Choice: Why More Is Less, Harper Perennial, 2004.

[3] HEDIS® (The Healthcare Effectiveness Data and Information Set) Volume 3: Specifications for Survey Measures, September 2015.

[4] Patty Gwartney, The Telephone Interviewer's Handbook: How to Conduct Standardized Conversations, Jossey-Bass, 2007.

[5] Edith D. de Leeuw, Joop J. Hox, and Anja Boeve, “Handling Do-Not-Know Answers: Exploring New Approaches in Online and Mixed-Mode Surveys,” Social Science Computer Review, Vol. 34, No. 1 (2016), pp. 116-132.
