The past few weeks have required fundamental changes in the way consumers and businesses operate in the US and around the world. With 175 million people in the US currently being asked to shelter at home, activity in the streets of many of our cities and towns has slowed to a crawl.
So, what does this mean for consumer research?
We know that in times of natural disaster (hurricanes, fires, etc.), behaviors among consumers, as reflected by randomly selected research respondents, change.
Response to our surveys often drops so sharply during these events that we suspend fieldwork until people are no longer focused on preparing their homes for an impending storm or cleaning up in the aftermath. COVID-19, however, is a very different kind of crisis.
We theorized that response behaviors might be affected by COVID-19. In order to quantify this (we are researchers, after all), we looked at detailed response patterns in an ongoing customer satisfaction tracker we’ve been running for the past twenty years. A multi-mode project using a combined telephone and online sample, it seemed like the perfect study for determining whether the current situation was having an impact on response behaviors.
The study runs monthly, with a survey under 10 minutes in length on average and a consistent incidence rate. With March’s fieldwork already completed, we decided to compare several key metrics against the same study’s performance in February, when news of COVID-19 was predominantly limited to the initial cases in China and a handful of isolated cases elsewhere.
On the web
For the online component we looked at two key metrics:
- Percent of those invited to take the survey that completed it
- Requests to be removed from future survey communications
On the phone
To gauge differences in phone survey behavior, we evaluated the following:
- Response rate
- Refusal rate
- No answer rate (i.e. unresolved records)
- Dialing attempts required
Here’s what we found:
- Response rates on the web appear largely unchanged. There was no significant difference in completion rate in March compared to February, although March’s was nominally higher (by less than 1%). There was also no change in opt-out behavior between the two months.
- On the phone we saw similar performance for key response metrics: there was no difference between February and March in response rate, refusal rate, or no answer rate.
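For readers who want to run the same kind of month-over-month check on their own trackers, a two-proportion z-test is one straightforward way to compare completion rates. The sketch below is illustrative only: the invitation and complete counts are hypothetical, not our study’s actual figures.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions.

    x1, x2: number of completes; n1, n2: number of invitations.
    Returns the z statistic and two-sided p-value.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal distribution
    p = math.erfc(abs(z) / math.sqrt(2))
    return z, p

# Hypothetical counts: completes out of invitations sent each month
z, p = two_proportion_ztest(480, 4000, 455, 4000)  # March vs. February
print(f"z = {z:.2f}, p = {p:.3f}")
```

With counts like these, the p-value comes back well above 0.05, which is what “no significant difference” means in practice.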
At first glance, this would seem to indicate that there was, for all intents and purposes, no functional change in response behavior between February and March. However, the last metric we evaluated, the number of dialing attempts required to obtain a phone complete, did show something interesting.
For both months we delivered roughly the same number of completes, with roughly the same amount of starting sample. In March, less dialing was required to deliver the needed completes. In fact, it took 12% fewer dialing attempts to obtain a complete in March compared to February. On a study delivering ~1600 phone completes per month, that adds up to a substantial reduction in the call center’s interviewer hours, especially when considering that due to TCPA restrictions on predictive dialing, most of these calls are dialed manually.
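To put that 12% figure in perspective, here is a back-of-the-envelope calculation. The February dials-per-complete baseline is an assumption chosen purely for illustration; only the ~1,600 completes and the 12% reduction come from the study above.

```python
# Back-of-the-envelope estimate of dialing effort saved in March.
# Only 'completes' and the 12% reduction come from the study;
# the February dials-per-complete baseline is a hypothetical figure.
completes = 1600
feb_dials_per_complete = 8.0                         # assumed baseline
mar_dials_per_complete = feb_dials_per_complete * (1 - 0.12)

feb_total = completes * feb_dials_per_complete
mar_total = completes * mar_dials_per_complete
print(f"Dials saved in March: {feb_total - mar_total:,.0f}")
```

Even with a modest assumed baseline, a 12% reduction across 1,600 completes translates into over a thousand fewer dials in a month, which is where the interviewer-hour savings come from.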
From a production and response standpoint, this study is not suffering any apparent ill effects from the coronavirus outbreak.
On the web people are continuing to respond at the rates that they normally do.
On the phone, it’s requiring fewer attempts to reach the people we want to survey. It’s possible that people who typically do not respond to phone surveys are doing so now, due to being home for prolonged periods of time, or having regular activities curtailed.
We will continue to monitor response behaviors on this and other studies, but for now, it would seem to be a good time to reach out to consumers on either the web or the phone. If you have questions for your customers about how their behavior is changing in the current environment, please don’t hesitate to call me at 919-932-8852 or reach out via email at email@example.com.