It is not unusual for an in-depth interview (IDI) or focus group participant to wonder at some point in an interview or discussion whether they “did okay” – that is, whether they responded to the researcher’s questions in the manner the researcher intended. For instance, an interviewer investigating parents’ healthy food purchases for their children might ask a mother to describe a typical shopping trip to the grocery store. In response, the mother might talk about the day of the week, the time of day, where she shops, and whether she is alone or with her children or someone else. Afterward she might ask the interviewer, “Is that the kind of thing you were looking for? Is that what you mean? Did I do okay in answering your question?” The interviewer’s follow-up might be, “Tell me something about the in-store experience, such as the sections of the store you visit and the kinds of food items you typically buy.”
It is one thing to misinterpret the intention of a researcher’s question – e.g., detailing the logistics of food purchasing rather than the actual food purchase experience – but quite another to adjust responses based on any number of factors influenced by the researcher-participant interaction. These interaction effects stem, in part, from the participant’s attempt to “do okay” in their role in the research process. Dr. Kathryn Roulston at the University of Georgia has written extensively about interaction in research interviews, including the edited volume Interactional Studies of Qualitative Research Interviews.
The dynamics that come into play in an IDI or focus group study – and, in varying degrees, ethnographic research – are of great interest to qualitative researchers and important considerations in the overall quality of the research. This is the reason that a lot has been written about the researcher’s reflexive journal and its importance in …
In March 2018, Mario Luis Small gave a public lecture at Columbia University on “Rhetoric and Evidence in a Polarized Society.” In this terrific must-read speech, Small asserts that today’s public discourse concerning society’s most pressing issues – poverty, inequality, and economic opportunity – has been seriously weakened by the absence of “qualitative literacy.” Qualitative literacy has to do with “the ability to understand, handle, and properly interpret qualitative evidence” such as ethnographic and in-depth interview (IDI) data. Small contrasts the general lack of qualitative literacy with the “remarkable improvement” in “quantitative literacy,” particularly among those in the media, where data-driven journalism is on the rise, published stories are written with a greater knowledge of quantitative data and use of terminology (e.g., the inclusion of means and medians), and more care is given to the quantitative evidence cited in media commentary (i.e., op-eds).
Small explains that the extent to which a researcher (or journalist or anyone involved in the use of research) possesses qualitative literacy can be determined by looking at the person’s ability to “assess whether the ethnographer has collected and evaluated fieldnote data properly, or the interviewer has conducted interviews effectively and analyzed the transcripts properly.” This determination serves as the backbone of “basic qualitative literacy,” which enables the research user to identify the difference between a rigorous qualitative study and …
Researchers of every ilk care about bias and how it may creep into their research designs, resulting in measurement error. This is true among quantitative researchers as well as among qualitative researchers, who routinely demonstrate their sensitivity to potential bias in their data by building interviewer training, careful recruitment screening, and appropriate modes into their research designs. Measures of this type acknowledge qualitative researchers’ concerns about quality data; and yet there are many other ways to mitigate bias in qualitative research that are often overlooked.
Marketing researchers (and marketing clients) in particular could benefit from thinking more deeply about bias and measurement error. In the interest of “faster, cheaper, better” research solutions, marketing researchers often lose sight of quality design issues, not the least of which concern bias and measurement error in the data. If marketing researchers care enough about mitigating bias to train interviewers and moderators, develop screening questions that effectively target the appropriate participants, and carefully select the suitable mode for the population segment, then it is sensible to adopt broader design standards that more fully embrace the collection of quality data.
An example of a tool that serves to raise the design standard is the reflexive journal. The reflexive journal has been the subject (in whole or in part) of many articles in Research Design Review, most notably …