The Quality of Responses to Open-ended Questions

A few years ago I teamed up with Linelle Blais (my client at the American Cancer Society) to conduct a test of online versus traditional paper survey mode effects. At issue was whether the Volunteer Satisfaction Study, which we had been conducting nationwide for many years, could be converted from a self-administered paper questionnaire delivered via USPS to an online format. By shifting online we hoped to save the Society money as well as provide a faster turnaround of the findings. This experiment set out to determine: the rate at which volunteers would respond to a Web survey; the rate at which sub-groups would respond; the completeness of volunteers' responses; and the degree to which item responses vary as a result of Web vs. paper completion.

A fairly complete account of this research was published in Quirk's last year ("A Volunteered Response"), and much of the following commentary is taken from that article.

Among other things, this research confirmed what researchers elsewhere had found, including a lower response rate on the Web but also a lower rate of item nonresponse. Interestingly, respondents to the Web survey not only answered more questions but were also significantly more likely to respond to the open-ended question asking for their suggestions to improve volunteer satisfaction. Sixty-five percent (65%) of the volunteers in the Web mode answered this question, compared to 52% of the volunteers in the paper mode. The sentiment of these comments (i.e., positive vs. negative vs. neutral remarks) did not differ greatly across modes; however, the length of these comments varied dramatically by mode. The average word count of comments made by volunteers responding to the online survey was 13 times higher than the word count among volunteers responding on paper: 268 words per comment vs. 20 words per comment, respectively.
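
For readers who want to run the same tally on their own open ends, here is a minimal sketch in Python. The record layout (a "mode" field and a free-text "comment" field) is an assumption for illustration only, not the study's actual file format.

    def summarize_open_ends(responses):
        # Tally open-end answer rates and average word counts by survey mode.
        stats = {}
        for r in responses:
            s = stats.setdefault(r["mode"], {"n": 0, "answered": 0, "words": 0})
            s["n"] += 1
            comment = r["comment"].strip()
            if comment:
                s["answered"] += 1
                s["words"] += len(comment.split())
        for mode, s in stats.items():
            rate = s["answered"] / s["n"]
            avg = s["words"] / s["answered"] if s["answered"] else 0.0
            print(f"{mode}: {rate:.0%} answered, {avg:.0f} words per comment")

    # Illustrative records only; a real run would read the survey data file.
    summarize_open_ends([
        {"mode": "web", "comment": "More training for new event chairs would help."},
        {"mode": "paper", "comment": "More training."},
        {"mode": "paper", "comment": ""},
    ])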

There is also some indication that the quality of the open-end comments in the Web mode may be superior to that in the paper mode.  A cursory analysis of comments in both modes suggests that comments from Web respondents are more detailed (e.g., references to specific examples or names) and tend to be more constructive (i.e., offer suggestions for improvement) than comments from the paper questionnaire.

The readability of these comments, however, appears to be a different issue. Looking at readability scores based on the Flesch-Kincaid Grade Level analysis, the comments from the paper survey read at an 8th-grade level while comments from the online respondents read at a 7th-grade level. Whether this is a function of the younger age of Web respondents, the informal (even sloppy) writing style many email users have adopted, or something else is left for further research.
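
For reference, the Flesch-Kincaid Grade Level is computed from average sentence length and average syllables per word: grade = 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59. The sketch below implements it in Python with a crude vowel-group syllable counter; production readability tools use more careful syllable rules.

    import re

    def count_syllables(word):
        # Crude heuristic: count runs of consecutive vowels, minimum one.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        if not sentences or not words:
            return 0.0
        syllables = sum(count_syllables(w) for w in words)
        # Grade = 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
        return (0.39 * (len(words) / len(sentences))
                + 11.8 * (syllables / len(words)) - 15.59)

    print(round(flesch_kincaid_grade(
        "The staff was helpful. I would gladly volunteer again."), 1))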

Open-ended questions, in both qualitative and quantitative research, are typically added to interview and questionnaire designs to enrich the data, bringing meaningful insight that might otherwise be lost in a completely close-ended approach. The fact that the quality of these responses can vary across modes should be an important consideration in constructing our research designs, as well as in our analyses and interpretation.

2 comments

  1. There was debate at our company about whether or not to include open-ended questions in the survey, and the experts each had their own opinions. What became clear was that even though people agree in principle that open-ended questions are key to finding out the real reasons, the analysis of the collected data can be a big pain in the neck. There are some expensive and time-consuming solutions available, but we were not exactly on a million-dollar budget and a 6+ month timeline.

    We ran into this tool (http://insight-magnet.com) when we wanted to analyze our survey data mart quickly and economically. You load the file and the tool lets you slice and dice the data any which way you want. The best part is that it actually reads through your open-ended responses and tells you the categories of feedback you received.

    They have a short tour on their website as well – http://insight-magnet.com/tour

    Don’t know what the latest thinking of the survey experts is, but we think that asking open-ended questions is critical to collecting feedback. The biggest mistake one can make is to ask for feedback and not do anything about it.

    I have included the link in the website field as well.


  2. Interesting data. We conducted a visitor satisfaction study for the Georgia Aquarium about two years ago. We used an online version and an on-site version. We found much fuller responses to the open ends with the online version. Of course, those responding on-site were more likely to have distractions (kids) and time constraints (they filled out the questionnaire while exiting). And we didn't do the type of analysis you did on the content. But your data confirms our recommendation to the Aquarium to do their satisfaction study online, especially given the data collection environment.

