
Comparison of Mobile Online, Landline, Cell Phone, & Online Access Panel

The landline, cell phone, and online access panel are well known and frequently used channel modes for data collection. In this research report, online access panel users were found to be more like landline respondents than cell phone or mobile online respondents. Learn more in this white paper from Discovery Research Group.

 

Executive Summary

The landline, cell phone, and online access panel are well-known and frequently used modes for data collection. Given the relatively new frontier of mobile online data collection using smartphones, it is important to understand this mode and the characteristics of the respondents who use it. The present study focuses on four modes (traditional landline, cell phone, PC-based online access panel, and mobile-based online) for collecting data from the general population in a national survey of the contiguous United States. A total of 852 individuals responded to the survey, answering between 7 and 12 questions.

Question wording and response options were identical across modes. The questions fell into three categories: demographics, cell phone preferences and usage, and one potentially sensitive question. Most findings are as expected: landline respondents are older, use fewer data functions on their cell phones, and are predominantly female.

One finding, however, stands out. In this sample, online access panel users were more like landline respondents than like cell phone or mobile online respondents. Compared with cell phone and mobile online respondents, online access panel respondents are on average older, more likely to be landline-only or to make calls with a landline instead of a cell phone, and use fewer data functions on their cell phones in general.

Each mode is better suited to reaching a different age demographic: landlines are best for reaching respondents age 55 and older; online access panels appear to represent 45-54 year olds most accurately; cell phone respondents clustered in the 35-44 age category; and mobile online respondents were the youngest, at 18-34 years old. A combination of the four modes differs less from US Census data than any single mode does.
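The claim that the combined sample tracks Census data more closely than any single mode can be quantified with a chi-square goodness-of-fit statistic over age buckets. The sketch below is illustrative only: the bucket proportions and respondent counts are hypothetical, not the paper's actual figures.

```python
def chi_square_gof(observed_counts, expected_props):
    """Chi-square goodness-of-fit statistic for observed counts
    against expected population proportions."""
    n = sum(observed_counts)
    return sum((o - n * p) ** 2 / (n * p)
               for o, p in zip(observed_counts, expected_props))

# Hypothetical age buckets: 18-34, 35-44, 45-54, 55+
census_props = [0.30, 0.18, 0.19, 0.33]   # illustrative Census shares
landline = [20, 30, 45, 105]              # illustrative landline counts (skews older)
combined = [60, 38, 40, 62]               # illustrative pooled four-mode counts

# The sample that tracks Census more closely yields the smaller statistic.
print(chi_square_gof(landline, census_props))   # larger
print(chi_square_gof(combined, census_props))   # smaller
```

A smaller statistic indicates a distribution closer to the Census benchmark, which is the sense in which a pooled multi-mode sample can outperform each individual mode.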

To adjust for effects attributable to each mode, weighting schemes were created and are briefly discussed in this paper. After adjusting for modal differences relative to Census data, a small percentage of response comparisons gained significance where none was previously found, while a large percentage of the categorical differences lost significance.

 

Introduction

Discovery Research Group (DRG) has been a key market research industry data provider for over 20 years and, with the advent of new modes of data collection, seeks not only to provide the capability but also to become an industry leader in these modes and methods.

Landline telephone surveys are becoming increasingly difficult to administer due to rising costs and declining response rates. Cell phone dominance over landline use continues to grow, further eroding landline response rates and representativeness. The internet as a tool for data collection is fast becoming a viable and acceptable alternative.

The big question mark among data collection channels is the mobile online mode. Mobile online on smartphones is relatively new and thus raises many unanswered questions about its appropriateness as a mode for data collection. The demographic characteristics of its respondents, the instances in which it is a feasible option, and how its cost of administration compares are a few of the presently unknown issues.

DRG, in an effort to better understand some of the unknowns concerning this mode, has commissioned this study on mobile online survey administration. Results of mobile online responses are compared against landline, cell phone, and online access panel samples. A combined sample file of the four modes is also compared against each individual mode.

Mobile online, much like the online access panel, has some enticing practical advantages: consistent question delivery, no interviewer bias, less pressure to respond in a socially acceptable fashion, the ability to easily present visual and/or audible materials, spatial and time flexibility for respondents to complete the survey, and the ability for a respondent to transfer to a different mode (e.g., from self-directed mobile online to an interviewer).

 

Secondary Research

Using an internet-based research approach has many benefits over other methods like landline or paper and pencil. The number of individuals who have web access continues to increase, thus potentially providing access to large samples of individuals that have been unreachable by traditional means (Michalak 1998; Pasveer & Ellard 1998).

As reported by The Nielsen Company from its March 2010 mobile insights survey, the market share of smartphones is projected to surpass that of the "feature" cell phone by Christmas of 2011, increasing the accessibility of a traditionally difficult-to-reach young subpopulation. A mobile online approach, like the traditional online access panel, benefits from a shorter fielding period, the ability to include visual and auditory materials, no need for data entry by an interviewer, and support for complex skip patterns (Blyth 2008; Bronner & Kuijlen 2007; Roberts 2007; Webb, Zimet, Fortenberry, & Blythe 1999).

There is a concern that data from a mobile online sample will carry higher sampling error. Sampling error occurs whenever the whole population is not surveyed. Like most online access panel sampling, mobile online sampling is drawn not from a probability design but from a convenience approach; this may introduce systematic bias, and the confidence interval of the associated variability cannot be reliably described (Nagelhout et al 2010). When using a mixed-mode administration, weighting was found to reduce bias and enhance poststratification adjustments (Baines, Partin, Davern, & Rockwood 2007). Population weighting schemes typically include geography, race, age, gender, and ethnicity to improve the representation of these groups.
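The post-stratification weighting described above can be sketched as follows: each respondent receives a weight equal to the population share of their demographic cell divided by that cell's share of the sample. The cell labels and proportions here are illustrative, not the paper's actual scheme, which weighted on several variables at once.

```python
from collections import Counter

def poststrat_weights(sample_cells, population_props):
    """Map each cell label to a weight = population share / sample share,
    so under-represented cells are weighted up and over-represented ones down."""
    counts = Counter(sample_cells)
    n = len(sample_cells)
    return {cell: population_props[cell] / (counts[cell] / n)
            for cell in counts}

# Illustrative sample that over-represents the 55+ cell (like a landline frame)
sample = ["18-34"] * 10 + ["35-54"] * 20 + ["55+"] * 70
population = {"18-34": 0.30, "35-54": 0.37, "55+": 0.33}

weights = poststrat_weights(sample, population)
# 18-34 respondents are weighted up (0.30 / 0.10), 55+ down (0.33 / 0.70),
# so the weighted age distribution matches the population proportions.
```

Applying these weights makes weighted cell totals match the population shares, which is why significance tests can change once the adjustment is in place.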

Baines, Partin, Davern, and Rockwood (2007) report that post-stratification weighting adjustments are enhanced and non-response bias is decreased with the use of a mixed-mode survey administration. Gallagher et al (2005), de Leeuw (2005), and Sala and Lynn (2007) found that mixed-mode surveys encouraged responses, reduced respondent burden, and were an effective means of increasing response rates. When the response rate of a survey is less than 100%, some attention must be paid to how non-responders differ from responders. With an understanding of how these two groups differ, more appropriate assumptions and conclusions concerning the target population can be made from the collected data. Generally, it is assumed that the data are "missing at random" (MAR). MAR rests on the premise that the subgroups that are not well represented do not differ qualitatively in their survey responses from the members of those subgroups who did respond (Groves 1989).

Baines, Partin, Davern, and Rockwood (2007) found that phone non-responders who were followed up with a mail survey differed from phone responders. A mixed-mode design allowed them to reach segments that would otherwise have been missed using only one mode. Their findings are not universally supported: research on mixed modes does not clearly establish whether the modes yield equivalent data or different results (O'Toole, Battistutta, Long, & Crouch 1986; Wu et al 1997; Weinberger et al 1994; McHorney, Kosinski, & Ware 1994; Galobardes et al 1998). Differences in findings have been attributed to mode of administration (interviewer-directed versus self-directed; visual versus auditory), variable question responses, characteristics of the sample, and social desirability bias (Herzog & Rodgers 1988; Sala & Lynn 2007; Brambilla & McKinlay 1987; Groves 1989; Hox & de Leeuw 1994).

Interviewer presence, in person or over the phone, can be a distraction to respondents. Respondents have been shown to give a large proportion of positive or socially desirable responses out of a desire not to reveal beliefs they feel the interviewer is unlikely to endorse (Bowling 2001). Interviewers also vary in their ability to remain neutral: probing, listening, intonation, appearance, and the recording of responses are all areas where interviewers can bias responses (Bowling 2001). Monitoring and training of interviewers can minimize this source of bias, but self-administered modes, such as online access panel and mobile online, avoid it entirely.

Bowling continues by stating that when respondents interact with an interviewer, the interview becomes a social interaction, increasing the likelihood that respondents will take social norms and the interviewer's perception into consideration. Self-administered methods, like online access panels and mobile online, are more impersonal than interviewer-directed methods, like landline and cell phone. As a result, findings may be confounded by respondents under-reporting undesirable behaviors and over-reporting desirable ones, and relationships between questions found during analysis will be suspect. Bowling's finding has much support in other studies: Lyons et al (1999), Bowling, Bond, Jenkinson, & Lamping (1999), Presser & Stinson (1998), Tourangeau et al (1997), and Tourangeau & Smith (1996) all found that respondents gave more positive and socially acceptable responses in interviewer-directed (phone and face-to-face) surveys than in self-administered (postal) surveys, even after taking order and contextual effects into account.

Many studies also clearly show that estimates of positive health status, health-related quality of life, and desirable behavior and activity engagement are exaggerated when based on face-to-face or telephone interviews (Lyons et al 1999; Vuillemin et al 2000; Brambilla & McKinlay 1987; McHorney, Kosinski, & Ware 1994; Perkins & Sanson-Fisher 1998; Weinberger, Oddone, Samsa, & Landsman 1996).

Self-administered modes, compared with interviewer-directed ones, are likely to increase respondents' willingness to disclose sensitive information. Anonymity in mail surveys, with their weak social presence, has been shown to yield higher item response rates and more accurate reporting on sensitive topics of health and behavior (Tourangeau, Rips, & Rasinski 2000; Pruchno & Hayden 2000; Parker & Dewey 2000; Davis 1999; Pasveer & Ellard 1998; Smith & Leigh 1997; Siemiatycki 1979).

This is an excerpt; the full white paper contains further detailed analysis, charts, and appendices. Download the full PDF here.

This content was provided by Discovery Research Group. Visit their website at www.discoveryresearchgroup.com.

Company profile

Discovery Research Group


Salt Lake City, Utah, United States of America
Telephone:
(800) 678-3748
Email:
info@discoveryresearchgroup.com
Website:
http://www.DiscoveryResearchGroup.com
About Discovery Research Group:
Discovery Research Group provides multi-service market research solutions by leveraging traditional research with new market research techniques.
