
The impact of multimode data collection on the survey data itself

In my previous post, I talked about the effects of different sampling approaches on the proportion of web versus telephone completes in our newly multimode customer satisfaction program. Now let's look at how the resulting survey data were affected by the transition from a phone-only methodology to collecting a portion of the completes online.


For this evaluation, we looked at data from a seven-month field period, February through August 2018. During this time, about three-fourths of our completes came from phone dialing and about one-fourth from emailed invitations. These data comprise thousands of surveys across a variety of topics, but two key questions are asked of all respondents: their satisfaction with their utility, and their likelihood to choose that utility again if they were able to select from more than one utility in their area. Both questions are measured on a ten-point scale; for satisfaction the scale runs from 1 (very dissatisfied) to 10 (very satisfied), and for choice from 1 (very unlikely to choose this utility again) to 10 (very likely to choose this utility again).
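To make the discussion concrete, here is a minimal sketch of how one completed survey might be represented as a row in a flat table. The column names and values are illustrative assumptions for this post, not our actual schema.

```python
import pandas as pd

# Hypothetical layout: one row per completed survey.
# Column names are illustrative assumptions, not the real schema.
completes = pd.DataFrame(
    {
        "respondent_id": [1001, 1002, 1003, 1004, 1005, 1006],
        "mode": ["phone", "phone", "web", "phone", "web", "phone"],
        "gender": ["F", "M", "F", "M", "M", "F"],
        "age_group": ["65+", "35-44", "18-34", "45-54", "35-44", "55-64"],
        "satisfaction": [9, 8, 7, 10, 8, 9],  # 1 = very dissatisfied .. 10 = very satisfied
        "choice": [9, 7, 8, 10, 7, 9],        # 1 = very unlikely .. 10 = very likely to choose again
    }
)

# The split in the real field period was roughly three-fourths phone to one-fourth web.
print(completes["mode"].value_counts(normalize=True))
```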


When comparing web to phone, there are differences on both measures. For satisfaction, phone respondents give an average score of 8.8 on that ten-point scale, significantly higher than our web respondents at 8.3 out of 10. Similarly, likelihood to choose the same utility again scored higher on the telephone (average rating 8.7) versus the web (8.3). The tendency of telephone respondents to give higher scores on scale questions than web respondents is something we have observed in other research we have conducted, and it has been a well-documented phenomenon in the research industry for many years; see Christian, Dillman & Smyth (2005).[1]
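The gap above is reported as significant, but the post doesn't name the test used; a Welch's two-sample t-test is one standard way to check a difference in means like this. The sketch below runs it on simulated scores with roughly the reported means and the rough 3:1 phone-to-web sample split, not on the real data.

```python
import numpy as np
from scipy import stats

# Simulated 1-10 satisfaction scores approximating the reported means.
rng = np.random.default_rng(0)
phone = np.clip(rng.normal(8.8, 1.5, 3000).round(), 1, 10)
web = np.clip(rng.normal(8.3, 1.5, 1000).round(), 1, 10)

# Welch's t-test: does not assume equal variances between the two modes.
t_stat, p_value = stats.ttest_ind(phone, web, equal_var=False)
print(f"phone mean={phone.mean():.2f}, web mean={web.mean():.2f}, p={p_value:.3g}")
```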


Digging a little deeper, we see other differences in the data for these key metrics. Females give higher satisfaction scores with their utility (8.8 average) than males (8.6), and higher likelihood-to-choose scores (8.7 versus 8.4). Age also has an impact on how positively customers respond to these questions; those in the 65+ age group give higher scores on average than any other age group (8.9 for satisfaction and 9.0 for choice). The lowest scores for both measures come from people aged 35-44 (8.3 satisfaction and 7.9 choice). The differences between web and phone scores are largely consistent across every age group and both genders: males and females both give higher scores for both measures on the phone compared to the web, and the same pattern holds within each individual age group.
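Checking that the mode gap holds within each subgroup is a simple cross-tabulation of means. Continuing with the hypothetical `completes` table from the first sketch, a pivot like this would produce the phone-versus-web comparison for each gender and age group.

```python
# Mean satisfaction by collection mode within each subgroup, using the
# hypothetical `completes` DataFrame defined earlier.
by_gender = completes.pivot_table(
    index="gender", columns="mode", values="satisfaction", aggfunc="mean"
)
by_age = completes.pivot_table(
    index="age_group", columns="mode", values="satisfaction", aggfunc="mean"
)

print(by_gender)  # one phone and one web column per gender
print(by_age)     # the same mode gap checked within each age group
```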


Finally, we looked at how the individual age groups responded to the survey, and again we see some differences. People aged 18-34 and 65+ are the most likely to take the survey on the telephone (75.7% and 88.3% respectively) instead of on the web. The three middle age groups (35-44, 45-54 and 55-64) are the most likely to respond online (37.5%, 39.9% and 39.5%). Method of data collection for this study is largely driven by whether the customer has provided an email address to their utility and given permission to be contacted that way. If an email address exists in the customer record, every attempt is made to reach them via the web before reaching out over the phone.
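That routing rule is simple enough to state as code. Here is a minimal sketch of it; the field names (`email`, `email_opt_in`) are illustrative assumptions about the customer record, not actual system fields.

```python
def initial_contact_mode(customer: dict) -> str:
    """Pick the first data-collection mode to attempt for a customer.

    Customers with an email address on file who have agreed to be
    contacted that way are attempted on the web first; everyone else
    goes straight to phone dialing.
    """
    has_email = bool(customer.get("email"))
    has_permission = bool(customer.get("email_opt_in"))
    return "web" if has_email and has_permission else "phone"

print(initial_contact_mode({"email": "a@example.com", "email_opt_in": True}))  # web
print(initial_contact_mode({"email": None}))                                   # phone
```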


While it is plausible that many customers in the 65+ age group do not regularly use the Internet, that explanation is much less likely for 18-34 year olds, given their near-universal Internet access in the general population.[2] It's possible that many people in this youngest age group are newer customers for the utilities, having recently finished their education and moved into housing where they are directly responsible for paying the bills. In that scenario, many of them may not yet have had a customer service experience that would have triggered a request by the utility for an email address. Currently, many of the utilities participating in this study do not collect an email address by default when a new customer signs up. This may therefore be the most important finding of this evaluation: a concerted effort by the utilities to collect email addresses when establishing new service could be the fastest and most efficient way to maximize web response across all age cohorts.


[1] Christian, Dillman & Smyth (2005). http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.472.6819&rep=rep1&type=pdf
[2] Pew Research Center, Internet/Broadband Fact Sheet. http://www.pewinternet.org/fact-sheet/internet-broadband/


Heather Primm

Heather Primm, Director of Operations, came to Data Decisions Group in 1996 in a part-time capacity, interviewing in the call center at night while working in property management during the day. Now, more than twenty years later, Heather has spent time in almost every functional area of the company, working in quality control, project coordination, project management, product management, and now management of the operations team that delivers high-quality data and research solutions to our clients.

In her role at Data Decisions Group, Heather wears a number of different hats: consulting at the project level, keeping the team abreast of legislative changes that impact our industry, product development, process improvement, and relationship management. What she loves best about the work she does is helping deliver high-quality, actionable results to our clients that allow them to improve their business performance…and their bottom line.

Heather received her undergraduate degree from the University of North Carolina at Chapel Hill in 1992.