Error in (Qualitative) Research
May 14, 2010
It should be pretty obvious from my earlier posts that I am a big believer in the idea that research design is governed by core principles that apply to everything we do. I believe that it is not good enough to be a qualitative researcher or a quantitative researcher or an online researcher or an ethnographer or whatever. Regardless of our mode or technique, we are obligated as researchers to practice “good research,” defined by adhering to basic tenets that we all should have learned in school. Unfortunately, college marketing research courses may fuel silo thinking in research design by organizing in-class discussions around research “classifications” rather than focusing on the discipline of research itself. It might not be a bad thing if students of marketing research were required to take research methods classes across fields – such as psychology, sociology, and political science – to gain an appreciation for the fundamentals of this thing we call “research.” In this respect I have often thought that I would like to come back in another life as a methodologist – not unlike what Bill Neal of SDR described back in 1998, i.e., someone who has “specific education in, and knowledge of, a variety of converging disciplines” that would enable me to evaluate and craft efficient, powerful research designs. I published a short article on the idea of qualitative researchers as methodologists in 2001. I am nothing if not consistent.
What I really want to talk about is error. The preceding remarks were not so much a diversion as a reminder that, yes, it is okay to talk about error in the qualitative as well as the quantitative realm.
Both quantitative and qualitative research designs are typically shaped to ensure that responses to research questions are interpreted correctly and that analyses are accurate. Both of these aims – accuracy in response interpretation and in analysis – are realized to the extent that certain parameters are built into the conduct of the research. Quantitative studies, because of their structured designs, can control for or logically theorize about sampling and non-sampling errors. Errors in qualitative research, on the other hand, are not as easily seen, yet they exist to a considerable degree and are often willingly introduced by the researcher. Knowing that error exists in (for example) focus group research is problematic because all researchers aim for confidence in their findings. Highly aware of the error introduced by convenience samples, as well as of non-sampling errors (such as interviewer and selection bias in recruiting, and moderator and response bias in the discussions themselves), qualitative researchers build measures into their selection and interviewing procedures to control error, much as their quantitative colleagues do (e.g., questionnaire design protocols in recruiting screeners, properly trained recruiting interviewers, non-leading interview techniques).
The notion of error in qualitative marketing research is rarely discussed, but it is a concept worth exploring. Without it, qualitative research is weakened under scrutiny and simply becomes an exercise where all ideas are “good ideas,” where individual differences don’t matter, and where all responses to qualitative questions are legitimate. Some might go further and say that focus group research devolves into a haphazard process of ransacking the moderator’s projective toolbox. If this were true (which it is not), researchers wouldn’t incorporate any controls into their qualitative research designs or care deeply about analysis. But as researchers we do care about the design and analytical elements of our qualitative research, because we care about the transparency of our processes and the degree of confidence with which we can report study findings.
Error – controls – transparency – confidence in results. These are all issues that I come back to time and again. Am I building my own list of core research principles?
Neal, William D. “The Marketing Research Methodologist.” Marketing Research Magazine, Spring 1998.