Targeted letters: Effects on sample composition and item non-response

Abstract

General population surveys and business surveys alike are affected by low response rates. There is therefore a need to investigate ways to increase survey participation and to motivate interviewees to answer questionnaires. The problem is relevant even in official statistics, since lack of motivation undermines not only participation in the survey but also the timeliness of participation, the quality of the answers, and hence the quality of the resulting measurements of socio-economic phenomena. One approach to this problem that has received attention in recent years is targeted design. In targeted designs, several survey features may be the object of interventions (timing of contact, number of contact attempts, mode of contact, interviewer assignment, communication features, etc.). In this paper, we consider one such targeted feature: the invitation letter, which relates to the motivation to respond to surveys and to respond accurately, completely, and in a timely manner. We analyze a case study from the UK Household Longitudinal Study Innovation Panel. At wave 6 of the panel, a randomized experiment was carried out to study the effects of targeted initial letters [1]. We investigate the effects of targeted initial letters on sample composition, representativeness, and item non-response.

References

[1] Lynn P., Targeted appeals for participation in letters to panel survey members, Public Opinion Quarterly 80(3) (2016), 771-782.

[2] Groves R.M., and Heeringa S.G., Responsive design for household surveys: Tools for actively controlling survey errors and costs, Journal of the Royal Statistical Society: Series A 169 (2006), 439-457.

[3] Wagner J., Adaptive survey design to reduce non-response bias. PhD Thesis, University of Michigan, 2008.

[4] Luiten A., and Schouten B., Tailored fieldwork design to increase representative household survey response: An experiment in the Survey of Consumer Satisfaction, Journal of the Royal Statistical Society: Series A 176 (2013), 169-189.

[5] Massey D., and Tourangeau R., New challenges to social measurement, The Annals of the American Academy of Political and Social Science 645 (2013), 6-22.

[6] Tourangeau R., Brick M., Lohr S., and Li J., Adaptive and responsive survey designs: A review and assessment, Journal of the Royal Statistical Society: Series A (2016), 1-21.

[7] Lundquist P., and Särndal C.E., Aspects of responsive design with applications to the Swedish Living Conditions Survey, Journal of Official Statistics 29 (2013), 557-582.

[8] Shlomo N., Schouten B., and de Heij V., Designing Adaptive Survey Designs with R-Indicators. Paper presented at the New Techniques and Technologies for Statistics Conference, Brussels, 2013.

[9] Schouten B., Cobben F., Lundquist P., and Wagner J., Does more balanced survey response imply less non-response bias? Journal of the Royal Statistical Society: Series A 179(3) (2016), 727-748.

[10] Dillman D.A., Smyth J.D., and Christian L.M., Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method, 4th ed. Hoboken, NJ: John Wiley, 2014.

[11] Groves R.M., and Couper M.P., Nonresponse in Household Interview Surveys. New York: Wiley, 1998.

[12] Singer E., Exploring the Meaning of Consent: Participation in Research and Beliefs about Risks and Benefits, Journal of Official Statistics 19 (2003), 273-285.

[13] Groves R.M., Cialdini R.B., and Couper M.P., Understanding the decision to participate in a survey, Public Opinion Quarterly 56(4) (1992), 475-495.

[14] Couper M.P., Singer E., Conrad F.G., and Groves R.M., Risk of Disclosure, Perceptions of Risk, and Concerns about Privacy and Confidentiality as Factors in Survey Participation, Journal of Official Statistics 24(2) (2008), 255-275.

[15] Kropf M., and Blair J., Testing Theories of Survey Cooperation: Incentives, Self-Interest and Norms of Cooperation, Evaluation Review 29(6) (2005), 559-575.

[16] Martin C.L., The impact of topic interest on mail survey response behaviour, Journal of the Market Research Society 36 (1994), 327-338.

[17] Roose H., Waege H., and Agneessens F., Respondent related correlates of response behavior in audience research, Quality & Quantity 37 (2003), 411-434.

[18] Roose H., Lievens J., and Waege H., The joint effect of topic interest and follow-up procedures on the response in a mail questionnaire, Sociological Methods & Research 35 (2007), 410-428.

[19] Groves R.M., Singer E., and Corning A., Leverage-saliency theory of survey participation: Description and an illustration, Public Opinion Quarterly 64 (2000), 299-308.

[20] Groves R.M., and Couper M.P., Nonresponse in Household Interview Surveys. New York: Wiley, 1998.

[21] Cannell C.F., Miller P.V., and Oksenberg L., Research on Interviewing Techniques, Sociological Methodology 12 (1981), 389-437.

[22] Groves R.M., Survey Errors and Survey Costs. New York: Wiley, 1989.

[23] De Leeuw E.D., Hox J., and Huisman M., Prevention and treatment of item nonresponse, Journal of Official Statistics 19 (2003), 153-176.

[24] Beatty P., and Herrmann D., To answer or not to answer: Decision processes related to survey item nonresponse. In: R. Groves, D. Dillman, J. Eltinge, R. Little, eds, Survey Nonresponse. New York: John Wiley & Sons, 2002, pp. 71-69.

[25] Mathiowetz N., Presentation in response to T. Murato and P.A. Gwartney, Question Saliency, Question Difficulty, and Item Nonresponse in Survey Research. International Conference on Survey Nonresponse, Portland, October 1999.

[26] Lynn P., Targeted response inducement strategies on longitudinal surveys. In: U. Engel, B. Jann, P. Lynn, A. Scherpenzeel and P. Sturgis, eds, Improving Survey Methods: Lessons from Recent Research. New York: Routledge/Psychology Press, 2015, pp. 322-338.

[27] Bianchi A., and Biffignandi S., Responsive design for economic data in mixed-mode panels. In: F. Mecatti, P.L. Conti and M.G. Ranalli, eds, Contributions to Sampling Statistics. Springer, 2014, pp. 85-102.

[28] Calderwood L., Cleary A., Flore G., and Wiggins R.D., Using response propensity models to inform fieldwork practice on the 5th wave of the Millennium Cohort Study. Paper presented at the International Panel Survey Methods Workshop, Melbourne, Australia, July 2012.

[29] Buck N., and McFall S., Understanding Society: Design overview, Longitudinal and Life Course Studies 3(1) (2012), 5-17.

[30] Uhrig S.C.N., Using experiments to guide decision making in Understanding Society: Introducing the Innovation Panel, chapter 13. In: S.L. McFall and C. Garrington, eds, Understanding Society: Early Findings from the First Wave of the UK's Household Longitudinal Study. Colchester: University of Essex, 2011. Available from: http://research.understandingsociety.org.uk/findings/early-findings.

[31] Jäckle A., Lynn P., and Burton J., Going Online with a Face-to-Face Household Panel: Effects of a Mixed Mode Design on Item and Unit Non-Response, Survey Research Methods 9 (2015), 57-70.

[32] Bianchi A., Biffignandi S., and Lynn P., Web-CAPI sequential mixed mode design in a longitudinal survey: Effects on participation rates, sample composition and costs. To appear in Journal of Official Statistics, 2017.

[33] Lugtig P., Panel Attrition: Separating Stayers, Fast Attriters, Gradual Attriters, and Lurkers, Sociological Methods and Research 43 (2014), 699-723.

[34] Schoeni R.F., Stafford F., McGonagle K.A., and Andreski P., Response Rates in National Panel Surveys, Annals of the American Academy of Political and Social Science 645 (2013), 60-87.

[35] American Association for Public Opinion Research, Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, 8th ed. Lenexa: American Association for Public Opinion Research, 2015.

[36] Schouten B., Cobben F., and Bethlehem J., Indicators for the representativeness of survey response, Survey Methodology 35 (2009), 101-113.

[37] Schouten B., Shlomo N., and Skinner C., Indicators for Monitoring and Improving Representativeness of Response, Journal of Official Statistics 27 (2011), 231-253.

[38] Bianchi A., and Biffignandi S., Representativeness in panel surveys. To appear in Mathematical Population Studies, 2017.