Survey incentives: What’s the right answer?
Marketers rely on surveys to help understand attitudes, opinions and behaviors in the marketplace. But obtaining an adequate survey response can be a challenge.
To help overcome target apathy, market researchers commonly include incentives with surveys. These gifts, ranging from trinkets to cash, can help increase survey response rates.¹ But why do incentives work? Which ones work best? And do they affect the quality of survey responses? The answers may surprise you.
Motivating factor
According to the National Business Research Institute (NBRI), including incentives with surveys significantly increases response rates. Why? Because people feel obligated to return the favor by giving the survey special attention. This behavior is known to psychologists as the Norm of Reciprocity.¹
In his book, Influence: The Psychology of Persuasion, Dr. Robert Cialdini states that reciprocity is a powerful drive found in all societies and can often turn a “no” into a “yes,” even for requests that would normally be refused.²
Surprisingly, it appears that any kind of incentive can have a positive effect on response rate. In two trials with over 10,700 targets, including a pen or pencil with a questionnaire increased response rates by 15 to 19 percentage points compared with surveys sent without incentives.³ According to researchers, people feel obligated to repay the kindness, regardless of the type or value of the gift.¹,²
Executive summary
- Psychologists believe incentives included with a survey invoke a response called the Norm of Reciprocity that creates feelings of obligation and drives recipients to respond.¹,²
- Researchers have observed that using incentives can increase data quality as well as response rate.⁴
- Some studies show higher-value incentives don’t significantly increase survey response rates compared to lower-value incentives.¹,⁵ That’s welcome news for budgets, because incentives are paid out up front, before targets respond.
- Separate research studies show incentives that are effective for one audience may not be effective for a different audience.⁶,⁷
Incentives and data quality
Studies show incentives included with surveys can increase response rates, but can they also influence the quality of the data collected?¹,³,⁴ Researchers typically find positive effects on data quality⁴:
- Significantly fewer “don’t know” or “no answer” responses
- Increased time dedicated to completing surveys
- More complete answers to open-ended questions
- More comments
Marketers considering incentives may worry that a gift will nudge people to respond favorably, biasing the data. But based on a social attitude survey of more than 5,000 people offered cash incentives, British researchers concluded that incentives didn’t bias the data.⁴
Sometimes less is more
If even small incentives can increase response rate and improve data quality, do higher-value incentives create a larger impact? In one study, including a logo pen or pencil significantly increased survey response rate compared to mailings with no incentive.³ In another study, researchers found response rates were not significantly different among people given $1, $2, $5 or $10 cash incentives.⁵ Their conclusion: Increasing the value of an incentive doesn’t automatically increase response.
Cash incentives: Less can be more
In a Penn State University mail-survey study, researchers concluded cash incentives can significantly improve response rates. The incentive’s value, however, didn’t seem to matter: higher amounts didn’t produce significantly higher response rates.⁵ The study’s 36% response rate without any incentive is remarkable. A closer look reveals a well-constructed list and a survey on a subject of passionate regional interest: sport fishing.
Audience considerations
Effective incentives for one audience may not work for another. In one study, monetary incentives increased response rates among physicians,⁶ but a separate study showed monetary incentives had no effect on executives of nonprofit organizations.⁷ Delivery method, however, affected response rates in both studies: delivery by courier, rather than standard mail, significantly increased response rates in both groups.⁶,⁷ Testing incentive and delivery options on a small subset of your list can help you find the right formula before committing to a full mailing.
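When comparing pilot cells, a simple two-proportion z-test can tell you whether a higher response rate reflects a real difference or just noise. The sketch below is illustrative only; the cell sizes and response counts are hypothetical, not drawn from the studies cited here.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is the difference in response rates
    between two pilot cells statistically significant?"""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled response rate under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return p_a, p_b, z

# Hypothetical pilot: 500 surveys mailed with a $1 incentive
# (190 responses) vs. 500 mailed without one (120 responses)
rate_incentive, rate_control, z = two_proportion_z(190, 500, 120, 500)
print(f"incentive: {rate_incentive:.0%}, control: {rate_control:.0%}, z = {z:.2f}")
# |z| > 1.96 suggests a real difference at the 95% confidence level
```

With cells of a few hundred names each, a gap of several percentage points is usually detectable; very small pilots may show differences that don’t hold up at scale.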
1 Survey Incentives: Response Rates and Data Quality. (n.d.). National Business Research Institute. Retrieved from https://www.nbrii.com/customer-survey-white-papers/survey-incentives-response-rates-and-data-quality
2 Cialdini, R. (1993). Influence: The Psychology of Persuasion. New York, NY: W. Morrow and Company.
3 White, E., Carney, P., & Shattuck Kolar, A. (2005). Increasing Response Rate to Mailed Questionnaires by Including a Pencil/Pen. American Journal of Epidemiology, 162(3):261-266.
4 Tzamourani, P. & Lynn, P. (2000). The effect of monetary incentives on data quality: Results from the British Social Attitudes Survey 1998 Experiment. Centre for Research into Elections and Social Trends. Working Paper 73. Retrieved from https://www.researchgate.net/publication/251639408_The_effect_of_monetary_incentives_on_data_quality_-_Results_from_the_British_Social_Attitudes_Survey_1998_experiment
5 Wheeler, J., Lazo, J., Heberling, M., Fisher, A. & Epp, D. (1997). Monetary Incentive Response Effects in Contingent Valuation Mail Surveys. University Library of Munich, Germany. Retrieved from http://ideas.repec.org/p/wpa/wuwpot/9703001.html
6 Kasprzyk, D., Montano, D., St. Lawrence, J., & Phillips, W. (2001). The Effects of Variations in Mode of Delivery and Monetary Incentive on Physicians’ Responses to a Mailed Survey Assessing STD Practice Patterns. Evaluation & the Health Professions, 24(1):3-17.
7 Hager, M., Pollak, T., Rooney, P., & Wilson, S. (2003). Response Rates for Mail Surveys of Nonprofit Organizations: A Review and Empirical Test. Nonprofit and Voluntary Sector Quarterly, 32(2):252-267.