
Do You Even Data

A data-driven marketing blog


TCPA Legislation

For the past twenty years, DDG has been collecting data for a massive, multi-utility customer satisfaction study.  Because many of these utilities service residential customers in heavily rural geographies, where Internet access is not a given, this study has heretofore been conducted using a telephone data collection methodology.  However, as TCPA restrictions on the use of predictive dialers, the decreasing penetration of landlines, and flagging telephone response rates made meeting monthly targets more and more challenging, DDG undertook a significant redesign of the study to allow for multimode data collection.  For the first time, online interviews would supplant some of the telephone interviews.

The surveys were reprogrammed into a single multimode instrument, enabling the efficient sharing of complex, nested quotas irrespective of mode.  This meant that no complicated management of monthly targets was required.  Beta testing was conducted in Q4 of 2017 before full rollout in January of 2018. 

A critical aspect of the sample plan for this research is the EPSEM (Equal Probability Selection Method) requirement, whereby each member of the population has an equal and known probability of being chosen to participate in the study.  Simply put, the sample selection for each month of fieldwork was designed so that no customer had a higher chance of being selected for the study than any other, regardless of whether an email address was available for them.  If a sampled customer does not respond to the utility-branded email invitation, the data collection system automatically routes them to the telephone queue at a predetermined point in fieldwork.
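As a rough illustration (the field names below are hypothetical, not our production system), an EPSEM draw is simply a random sample in which every customer shares the same inclusion probability, whether or not an email address is on file:

```python
import random

def draw_epsem_sample(population, n, seed=None):
    """Select n customers so that every member of the population has
    the same, known inclusion probability n / len(population),
    regardless of whether an email address is on file."""
    rng = random.Random(seed)
    return rng.sample(population, n)

# Hypothetical customer records; only half have an email address.
customers = [{"id": i, "email": (i % 2 == 0)} for i in range(1000)]
monthly_sample = draw_epsem_sample(customers, 100, seed=2018)
```

Because selection never looks at the email field, customers with and without email addresses enter the sample at the same rate; mode is decided only after selection.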

So far, we have conducted twelve months of multimode fieldwork. To maximize the number of online completes we acquire (thus minimizing costs for the utilities), telephone and web data collection must be carefully timed and managed. Generally, these utilities have email addresses for about half of their customers, so we aim to deliver about half of the completed interviews on the web.  In early months, the proportion of online interviews was lower, yielding between 25% and 35% of all interviews.  Each month, we tested different combinations of phone sample release and web communications until we found the mix that yielded our target of 50% web completes.  Currently we are working with the utilities on ways they can eventually boost this percentage, and possibly even transition to conducting the entire study on the web.

When comparing our telephone completes to our web completes, we do see some significant differences.  For our initial evaluation, we looked at data from a seven-month field period – February through August 2018. During this time, about three-fourths of our completes came from phone dialing, and about one-fourth from emailed invitations.  These data comprise thousands of surveys across a variety of different topics, but there are two key questions that are asked of all respondents, related to their satisfaction with their utility and their likelihood to choose their utility if they were able to select from more than one utility in their area.  Both questions are measured on a ten-point scale; for satisfaction the scale runs from 1 (very dissatisfied) to 10 (very satisfied), and for choice from 1 (very unlikely to choose this utility again) to 10 (very likely to choose this utility again).

When comparing web to phone, there are differences on both measures.  For satisfaction, phone respondents give an average score of 8.8 on that ten-point scale, significantly higher than web respondents at 8.3. Similarly, likelihood to choose the same utility again scored higher on the telephone (average rating 8.7) than on the web (8.3).  This tendency of telephone respondents to give higher scores on scale questions is something we have observed in other research we have conducted, and it has been a well-documented phenomenon in the research industry for many years; see Christian, Dillman & Smyth’s paper dated 12/15/05.[1]
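For readers who want to gauge whether a phone/web gap like this is statistically meaningful, here is a sketch using only Python's standard library. The scores below are made up for illustration; they are not our study data:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for the difference between two independent
    sample means, allowing unequal variances between the groups."""
    se = sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical 10-point satisfaction scores, for illustration only.
phone_scores = [9, 8, 10, 9, 8, 9, 10, 8, 9, 9]
web_scores = [8, 7, 9, 8, 8, 9, 8, 7, 9, 8]
t = welch_t(phone_scores, web_scores)
```

In practice you would compare |t| against a t distribution with Welch-Satterthwaite degrees of freedom; with samples the size of a real tracking study, even a few tenths of a point can be significant.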

Digging a little deeper, we see other differences in the data for these key metrics. Female respondents give higher satisfaction ratings on average (8.8) than males (8.6).  They also rate their likelihood to choose that utility again higher (8.7 versus 8.4).  Age also has an impact on how positively customers respond to these questions; those in the 65+ age group give higher scores on average than any other age group (8.9 for satisfaction and 9.0 for choice).  The lowest scores for both measures were given by people aged 35-44 (8.3 satisfaction and 7.9 choice). The differences between web and phone scores are largely consistent across every age group and both genders: males and females alike give higher scores for both measures on the phone than on the web, and the same pattern holds within each individual age group.

Finally, we looked at how the individual age groups responded to the survey, and again we see some differences.  People aged 18-34 and 65+ are the most likely to take the survey on the telephone (75.7% and 88.3%, respectively) rather than on the web.  The three middle age groups (35-44, 45-54 and 55-64) are the most likely to respond online (37.5%, 39.9% and 39.5%).  Mode of data collection for this study is largely driven by whether the customer has provided an email address to their utility and given permission to be contacted that way.  If an email address exists in the customer record, every attempt is made to reach them via the web before reaching out over the phone.
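That web-first routing rule can be pictured as a simple decision function. This is a hypothetical sketch of the logic described above; the field names and the cutover day are illustrative assumptions, not our production system:

```python
def contact_mode(customer, day_in_field, phone_cutover_day=14):
    """Pick the next contact attempt for a sampled customer: web
    first when a permissioned email address exists, with an automatic
    move to the telephone queue after the cutover day."""
    if not customer.get("email") or not customer.get("email_opt_in"):
        return "phone"  # no usable email: phone from the start
    if day_in_field < phone_cutover_day:
        return "web"    # still within the web-invitation window
    return "phone"      # nonresponder past cutover: move to phone

# Examples of the rule in action:
no_email = {"email": None, "email_opt_in": False}
opted_in = {"email": "a@example.com", "email_opt_in": True}
```

The equal-probability draw happens before this function is ever called, which is why the routing rule can favor the cheaper web mode without biasing who gets sampled.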

Researchers (and sometimes clients) tend to view differences by data collection mode as cause for alarm, a potential sign that one method produces the “correct” result and the other is somehow inaccurate. This is not necessarily the case. The sampling design for this study ensures that each potential participant has an equal opportunity to be contacted to complete the survey, and the dual-mode approach means they can complete it in the way that is most convenient for them. A telephone-only version of the study has the potential to overrepresent people in the youngest and oldest cohorts, while an online-only version would have the reverse effect. So in addition to reducing costs for the cooperatives, the new version of this research program allows them to reach a more representative sample of their customers.

 

[1] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.472.6819&rep=rep1&type=pdf

Heather Primm

Heather Primm, Director of Operations, came to Data Decisions Group in 1996 in a part-time capacity as an interviewer in the call center at night while working in property management during the day. Now, more than twenty years later, Heather has spent time in almost every functional area of the company, working in quality control, project coordination, project management, product management, and now management of the operations team that delivers high-quality data and research solutions to our clients.

In her role at Data Decisions Group, Heather wears a number of different hats: consulting at the project level, keeping the team abreast of legislative changes that impact our industry, product development, process improvement, and relationship management. What she loves best about the work she does is helping deliver high-quality, actionable results to our clients that allow them to improve their business performance…and their bottom line.

Heather received her undergraduate degree from the University of North Carolina at Chapel Hill in 1992.