

Survey Response Rates in the Time of COVID-19

The past few weeks have required fundamental changes in the way consumers and businesses operate, both in the US and globally. With 175 million people in the US currently being asked to shelter at home, activity in the streets of many of our cities and towns has slowed to a crawl.

So, what does this mean for consumer research?

We know that in times of natural disaster (hurricanes, fires, etc.), behaviors among consumers, as reflected by randomly selected research respondents, change.

Response to our surveys frequently drops so much during these events that we suspend fieldwork until people are no longer focused on preparing their homes for an impending storm or cleaning up in the aftermath. COVID-19, however, is a very different kind of crisis.

We theorized that response behaviors might be affected by COVID-19. In order to quantify this (we are researchers, after all), we looked at detailed response patterns to an ongoing customer satisfaction tracker we’ve been running for the past twenty years. As a multi-mode project using a combined telephone/online sample, it seemed like the perfect study for determining whether the current situation was having an impact on response behaviors.

The study runs monthly, with a survey under 10 minutes in length on average and a consistent incidence rate. With March’s fieldwork already completed, we decided to compare several key metrics against the same study’s performance in February, when news of COVID-19 was predominantly limited to the initial cases in China and a handful of isolated cases elsewhere.

On the web     

For the online component we looked at two key metrics:

  • Percent of those invited to take the survey who completed it
  • Requests to be removed from future survey communications

On the phone

To gauge differences in phone survey behavior, we evaluated the following (a short sketch after this list shows how metrics like these can be computed):

  • Response rate
  • Refusal rate
  • No answer rate (i.e. unresolved records)
  • Dialing attempts required
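
To make these metrics concrete, here is a minimal sketch, in Python, of how rates like these can be computed from simple tallies. Every count below, along with the structure of the disposition log, is an illustrative assumption rather than a figure from the actual study.

    from collections import Counter

    # Hypothetical final dispositions for the phone sample
    # (illustrative counts, not actual study data).
    dispositions = Counter({
        "complete": 1600,   # finished interviews
        "refusal": 900,     # contact made, respondent declined
        "no_answer": 5500,  # unresolved records: no answer, busy, voicemail
    })
    total_records = sum(dispositions.values())

    # Phone metrics
    response_rate = dispositions["complete"] / total_records
    refusal_rate = dispositions["refusal"] / total_records
    no_answer_rate = dispositions["no_answer"] / total_records

    # Online metrics (also illustrative counts)
    invited, web_completes, opt_outs = 20000, 2400, 60
    completion_rate = web_completes / invited  # share of invitees who completed
    opt_out_rate = opt_outs / invited          # removal requests per invitation

    print(f"Phone response rate: {response_rate:.1%}")
    print(f"Web completion rate: {completion_rate:.1%}")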

Here’s what we found:

  • Response rates on the web were largely unchanged. There was no significant difference in completion rate between February and March, although March’s was nominally higher (by less than 1%). There was also no change in opt-out behavior between the two months. (A sketch of the kind of significance test behind a comparison like this follows the list.)
  • On the phone, we saw similarly stable performance for the key response metrics: there was no difference between February and March in response rate, refusal rate, or no answer rate.
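
For readers who want to reproduce this kind of month-over-month check, here is a hedged sketch: a two-proportion z-test is a standard choice for comparing completion rates, though we can’t confirm it is the exact test used on this study, and the counts below are invented purely for illustration.

    from statsmodels.stats.proportion import proportions_ztest

    # Invented invite/complete counts (not actual study data).
    completes = [2350, 2410]  # completed surveys: Feb, Mar
    invites = [20000, 20000]  # invitations sent:  Feb, Mar

    z_stat, p_value = proportions_ztest(count=completes, nobs=invites)
    print(f"z = {z_stat:.2f}, p = {p_value:.3f}")
    # A p-value above the chosen alpha (e.g., 0.05) means the small nominal
    # difference in completion rates is not statistically significant.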

At first glance, this would seem to indicate that there was, for all intents and purposes, no functional change in response behavior between February and March. With the last metric we evaluated, however (dialing attempts required to obtain a phone complete), we did see something interesting.

For both months we delivered roughly the same number of completes from roughly the same amount of starting sample. In March, though, less dialing was required to deliver the needed completes: it took 12% fewer dialing attempts to obtain a complete in March than in February. On a study delivering ~1,600 phone completes per month, that adds up to a substantial reduction in the call center’s interviewer hours, especially considering that, due to TCPA restrictions on predictive dialing, most of these calls are dialed manually.
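
The arithmetic behind a figure like that 12% is straightforward. In this sketch, the ~1,600 monthly completes come from the study; the dial totals are invented to illustrate the calculation.

    # Completes per month are from the post; dial totals are invented.
    feb_completes, mar_completes = 1600, 1600
    feb_dials, mar_dials = 40000, 35200

    feb_attempts = feb_dials / feb_completes  # 25.0 dials per complete
    mar_attempts = mar_dials / mar_completes  # 22.0 dials per complete
    reduction = 1 - mar_attempts / feb_attempts
    print(f"Dialing attempts per complete fell by {reduction:.0%}")  # 12%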

Conclusion 

From a production and response standpoint, this study is not suffering any apparent ill effects from the coronavirus outbreak.

On the web, people are continuing to respond at the rates they normally do.

On the phone, it’s requiring fewer attempts to reach the people we want to survey. It’s possible that people who typically do not respond to phone surveys are doing so now because they are home for prolonged periods of time or have had their regular activities curtailed.

We will continue to monitor response behaviors on this and other studies, but for now, it would seem to be a good time to reach out to consumers on either the web or the phone. If you have questions for your customers about how their behavior is changing in the current environment, please don’t hesitate to call me at 919-932-8852 or reach out via email at heather.primm@datadecisionsgroup.com.

Heather Primm

Heather Primm, Director of Operations, came to Data Decisions Group in 1996 in a part-time capacity as an interviewer in the call center at night while working in property management during the day. Now, over twenty years later, Heather has spent time in almost every functional area of the company, working in quality control, project coordination, project management, product management, and now management of the operations team that delivers high-quality data and research solutions to our clients.

In her role at Data Decisions Group, Heather wears a number of different hats: consulting at the project level, keeping the team abreast of legislative changes that impact our industry, product development, process improvement, and relationship management. What she loves best about the work she does is helping deliver high-quality, actionable results to our clients that allow them to improve their business performance…and their bottom line.

Heather received her undergraduate degree from the University of North Carolina at Chapel Hill in 1992.