Web Based Surveys: An Analysis of Nonresponse Causes


Authors

  • Bangalore University, Bangalore, IN
  • Utkal University, Bhubaneswar, IN
  • IBSB, Bangalore, IN

Keywords:

ANOVA, Demographic Block, Factor Analysis, Noncoverage, Nonresponse, Questionnaire Design, Web Based Surveys

Abstract

Web based surveys (WBS) are becoming increasingly common as internet connectivity widens. The WBS mode has both strengths and weaknesses: cost and time reductions are on the plus side, while the randomness of the sample, nonresponse, and the quality of responses are concerns on the other. Thus, WBS presents a mixed bag. This paper examines the reasons for nonresponse in WBS. The often-quoted reasons are examined using factor analysis, and questionnaire design is found to be a predominant factor. This aspect is examined further by analyzing the effect of the positioning of the demographic block, using ANOVA. Placing this block at the beginning of the questionnaire results in more dropouts than placing it later. Also examined are the impact of the sensitivity of questions and its interaction with the positioning of the demographic block.
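The second analysis the abstract describes (demographic-block position crossed with question sensitivity) is a standard two-way ANOVA with interaction. The sketch below shows the computation for a balanced design; the data, factor levels, and replicate counts are synthetic placeholders for illustration only and do not reproduce the paper's actual data.

```python
import numpy as np

def two_way_anova(data):
    """Two-way ANOVA with interaction for a balanced design.

    data: array of shape (a, b, n) -- a levels of factor A,
    b levels of factor B, n replicate observations per cell.
    Returns the F statistics for A, B, and the A x B interaction.
    """
    a, b, n = data.shape
    grand = data.mean()
    cell = data.mean(axis=2)          # cell means, shape (a, b)
    A = data.mean(axis=(1, 2))        # factor-A level means
    B = data.mean(axis=(0, 2))        # factor-B level means

    # Sums of squares for main effects, interaction, and error
    ss_a = b * n * ((A - grand) ** 2).sum()
    ss_b = a * n * ((B - grand) ** 2).sum()
    ss_ab = n * ((cell - A[:, None] - B[None, :] + grand) ** 2).sum()
    ss_err = ((data - cell[:, :, None]) ** 2).sum()

    ms_err = ss_err / (a * b * (n - 1))
    f_a = (ss_a / (a - 1)) / ms_err
    f_b = (ss_b / (b - 1)) / ms_err
    f_ab = (ss_ab / ((a - 1) * (b - 1))) / ms_err
    return f_a, f_b, f_ab

# Synthetic dropout percentages: factor A = position of the demographic
# block (start vs. end), factor B = question sensitivity (low vs. high),
# with 3 replicate survey runs per cell (all values are made up).
rng = np.random.default_rng(0)
base = np.array([[30.0, 45.0],   # block at the start
                 [15.0, 25.0]])  # block at the end
data = base[:, :, None] + rng.normal(0.0, 2.0, size=(2, 2, 3))
f_pos, f_sens, f_inter = two_way_anova(data)
```

Each F statistic would then be compared against the F distribution with the corresponding degrees of freedom; a large value for the interaction term would indicate that the effect of block position depends on question sensitivity.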

Downloads

Published

2010-12-17

How to Cite

Srivenkataramana, T., Mishra, G., & Saisree, M. (2010). Web Based Surveys: An Analysis of Nonresponse Causes. DHARANA - Bhavan’s International Journal of Business, 4(2), 77–82. Retrieved from https://informaticsjournals.com/index.php/dbijb/article/view/18039


Section

Research Articles


References

Bickart, B., and Schmittlein, D. (1999). The distribution of survey contact and participation in the United States: Constructing a survey-based estimate. Journal of Marketing Research, 36(2), 286-294.

Bosnjak, M., and Tuten, T. L. (2001). Classifying response behaviors in Web-based surveys. Journal of Computer-Mediated Communication, 6(3).

Couper, M. P. (2000). Web surveys: A review of issues and approaches. Public Opinion Quarterly, 64, 464-494.

Couper, M. P., Blair, J., and Triplett, T. (1999). A comparison of mail and email for a survey of employees in federal statistical agencies. Journal of Official Statistics, 15, 39-56.

De Leeuw, E. D. (2005). To mix or not to mix data collection modes in surveys. Journal of Official Statistics, 21, 233-255.

Dillman, D. A., Tortora, R. D., Conradt, J., and Bowker, D. (1998). Influence of plain versus fancy design on response rates for Web surveys. Paper presented at the annual meeting of the American Statistical Association, Dallas, TX.

Groves, R. M., and Couper, M. P. (1998). Nonresponse in Household Interview Surveys. New York: Wiley (Wiley Series in Survey Methodology).

Knapp, F., and Heidingsfelder, M. (2001). Drop-out analysis: Effects of the survey design. Pabst Science Publishers, pp. 221-230.

MacElroy, B. (2000). Variables influencing dropout rates in Web-based surveys. Quirk's Marketing Research Review, July/August 2000. http://www.quirks.com/ (accessed November 21, 2001).

Pratesi, M., and Lozar Manfreda, K. (2004). List-based Web surveys: Quality, timeliness, and nonresponse in the steps of the participation flow. Journal of Official Statistics, 20, 451-465.

Schaefer, D. R., and Dillman, D. A. (1998). Development of a standard e-mail methodology: Results of an experiment. Public Opinion Quarterly, 62, 378-397.

Toepoel, V., Das, M., and van Soest, A. (2009). Design of Web questionnaires: The effect of layout in rating scales. Journal of Official Statistics, 25, 509-528.