Reporting What We Know From What We Ask

For most of us, it is important to write a final research report that goes beyond the questions we asked and the responses we received. Unlike a topline debriefing that may require a simple rundown of the questions and responses, our qualitative and quantitative studies typically culminate in write-ups that provide thoughtful discussions of our analyses and interpretations of the data.

The consumers of our research reports take it on blind faith that the data, along with the corresponding questions and issues, are reported accurately, and that the researchers’ interpretations of the findings are consistent with both the data and the questions asked or issues raised. And yet blind faith is not always enough. Those are the times when a closer look is needed at what the research actually asked and what is actually reported.

One example is a July 2014 report from Gallup on its research concerning Americans’ consumption habits. The report, in part, shows that nearly all respondents (more than 90%) “actively try to include” fruits and/or vegetables in their diet. The report’s author suggests this percentage may be too high, stating that “it is not clear that such a high proportion of Americans really do eat this healthily.” Although the consumption rate may seem a bit inflated, the reality is that we do not know what respondents were including when thinking about the category of fruit or of vegetables. Some may have limited their notion of “fruit” to fresh, frozen, or canned; while others may believe that their concerted efforts to choose strawberry over brown sugar cinnamon Pop-Tarts® are a deliberate attempt to put fruit in their diet. And what about the respondent who considers his daily consumption of French fries an active effort to eat more vegetables? If we include Pop-Tarts® and French fries under fruits and vegetables, the reported 90%+ figure may not be unrealistic after all. Without the added question, “What did you consider when stating that you actively include fruit in your diet?” the researchers (and research users) are not able to make health claims related to fruit consumption.

Pew Research released a report on a 2013 study – “Social Media and the ‘Spiral of Silence’” – that set out to understand people’s willingness to speak openly about public policy issues and consider others’ views in various face-to-face and online settings. In particular, the researchers focused on Edward Snowden’s leaks to the media pertaining to the government surveillance of Americans’ telephone and email communications. Throughout the report, the authors refer to the “Snowden-NSA story” – as in “People were less willing to discuss the Snowden-NSA story in social media than they were in person” and “Those who said they were very interested in the Snowden-NSA story were more likely than those who were not as interested to express their opinions.” The problem is, the survey interview never explicitly asked about the “Snowden-NSA story” but rather asked about “a government program with the aim of collecting information about people’s telephone calls, emails and other online communications” or simply “the government’s surveillance programs.” It is understandable that directly referring to Snowden and the NSA in the interview might have biased the responses, yet it is reasonable to wonder whether respondents actually interpreted the survey questions as intended by the researchers – i.e., as referring to the “Snowden-NSA story” – or instead had a different understanding of “surveillance programs,” were thinking of a different media story altogether, or were not thinking of anything in particular but only in generalities. A simple add-on question at the end of the survey interview – such as, “Were you thinking of anything in particular when I asked you about the government’s surveillance programs?” [IF YES] “What were you thinking of?” – would have shed some light on the extent to which respondents were in sync with the researchers’ meaning and, specifically, whether they were thinking of the “Snowden-NSA story” when responding.

The readers and users of our research shouldn’t have to double check the veracity of our assumptions and interpretations. But when they do, they should find that what we report is derived from what we actually asked in the research. Until we know what Americans include in the category of “fruit,” strawberry Pop-Tarts® might as well fall into the same basket with fresh peaches and pears.
