March 30, 2014
In November 2012, Research Design Review posted an article titled “Interviewer Bias & Reflexivity in Qualitative Research.” The article discusses why self-reflection is an important and necessary step for qualitative researchers to take in order to address “the distortions or preconceptions researchers unwittingly introduce in their qualitative designs.” Although the article focuses on the need for reflection as it relates to the potential for bias in the in-depth interview (IDI) method, the relatively intimate, social component of qualitative research generally, and of other methods specifically – focus groups, ethnography, narrative – makes them equally susceptible to researcher biases and suppositions.
The outcomes from a qualitative study are only as good as the data the researcher returns from the field. And one of the biggest threats to the quality of the research data is the ever-present… [Read Full Text]
February 26, 2014
A graduate course in qualitative research methods may be framed around discussions of the particular theoretical or philosophical paradigms – belief systems or worldviews – that qualitative researchers use in varying degrees to orient their approach for any given study. And, indeed, if the instructor is using popular texts such as those from Norman Denzin and Yvonna Lincoln (2011) or John Creswell (2013), among many others, students would be learning first about the different implications and approaches associated with various paradigm orientations, followed by (or along with) the corresponding methodological considerations.
Over the years there have been debates in the academic qualitative research community about how best to identify and talk about these paradigms, as well as about quality concerns related to conducting research based on any one of these belief systems. In the broadest sense, the most oft-discussed paradigms in qualitative research are: postpositivism – often allied with a more quantitative approach, where the emphasis is on maintaining objectivity and controlling variables in order to approximate “reality”; constructivism or interpretivism – in which the belief is not hinged to one objective reality but to multiple… [Read Full Text]
February 17, 2014
Every week new email invitations arrive asking me to participate in an online survey concerning some product or service I recently used. And each time, as I read the stated reasons why I should comply with the request, I find myself taking a mental inventory: what I know or don’t know about the subject matter, what I can or cannot recall about my user experience, how positive or not that experience was, and whether this product or service is important enough in my life to be worth the time to answer their survey questions.
Last week I was asked by one of my trade organizations to participate in an online survey about their quarterly magazine. Or is it a monthly magazine? Maybe every two months? I am not sure, but I do know that I receive it and I read it. I stared at the email invitation taking the usual inventory, sifting through my usual battery of qualifying questions, pondering whether I should complete this survey or not. Yes, I told myself, I remember receiving this magazine, I know that I read it when it arrives, but do I really have anything to say about this magazine? My opinion of this magazine falls in some neutral territory… [Read Full Text]
Designing Research to Understand How People Think: The Bridge that Connects Quantitative & Qualitative Research
January 15, 2014
In 2013, Research Design Review posted five articles that directly speak to common design considerations in quantitative and qualitative research that address the basic goal of understanding how people think. These common concerns, and the articles where they are discussed, include: using effective content analysis procedures to reveal underlying subjective connections for each respondent/participant (“Content Analysis & Navigating the Stream of Consciousness”); the importance of design approaches that target people’s stories (“‘Tell Me What Happened’ & Other Stories”); research designs that incorporate good listening techniques with appropriate, well-constructed questions (“Listening: A Lesson from New Coke”); utilizing qualitative research to examine the thinking that helps explain quantitative data (“Looking Under the Hood: What Survey Researchers Can Learn from Deceptive Product Reviews”); and the role of Daniel Kahneman’s System 1 (intuitive) and System 2 (cognitive) thinking framework in considering behavior in the marketplace (“Fast & Slow Thinking in Research Design”).
These five articles have been compiled into one PDF document that can be accessed here. Anyone who has read this blog since its inception in 2009 knows that a recurring theme revolves around research design issues that impact how well (or not) researchers gain an understanding of how people think. There is no reason to believe that the tradition won’t continue in 2014.
January 6, 2014
Research Design Review published 13 articles in 2013 that dealt explicitly with qualitative research design. These range from general topics – such as the “10 Distinctive Qualities of Qualitative Research,” Unilever’s accreditation program (and the absence of design considerations in its priorities), and the “messiness” (even scariness) of qualitative research design to quantitative-leaning researchers and/or clients, with a discussion of ways to overcome their trepidation – to method-specific issues – such as the implications of group composition and interactions in focus group research, the roles of “analytical sensibilities” and deception in ethnography, and the unique design factors associated with multi-method, case-centered research. And, while all of the articles discuss quality design measures in some fashion, several posts extend quality considerations to particular facets of the research process, such as participant cooperation, the use of projective techniques, data validation, proposal writing, and the reporting of design in the final research document.
The 13 articles on qualitative research design from RDR in 2013 have been compiled into one PDF document which can be accessed here. As always, it is hoped that greater awareness of the issues impinging on quality in qualitative research design will foster a greater appreciation for and discussion of these issues, leading to more credible, analyzable, transparent, and ultimately more useful qualitative research studies.
December 17, 2013
The November/December issue of ESOMAR’s Research World is largely devoted to behavioral economics (BE), an increasingly popular topic in marketing circles. In it, various researchers discuss the virtues of embracing a BE model, with repeated reference specifically to Daniel Kahneman and his System 1–System 2 theory, which is the foundation of his 2011 book Thinking, Fast and Slow.
The overall takeaway is the idea that marketing researchers would do well to focus their efforts on research that gets at System 1 thinking – intuitive, instinctive, automatic, fast thinking – rather than System 2 thinking – deliberative, “effortful,” attentive, slow thinking – because of its predominance in many of the decisions people make. Indeed, Kahneman emphasizes in his book that, unbeknownst to many of us, System 1 (automatic, effortless) thinking exerts significant influence on our experiences and is “the secret author of many of the choices and judgments you make” (p. 13). And this, of course, can be a very… [Read Full Text]
November 26, 2013
Eric Anderson and Duncan Simester published a paper in May 2013 titled “Deceptive Reviews: The Influential Tail.” It reports their analysis of many thousands of reviews for a major apparel “private label retailer,” focusing on a comparison between reviews from customers who had made a prior transaction (i.e., customers who actually purchased the item they were reviewing) and reviews from customers who had not (i.e., customers who reviewed items they had not actually purchased). The comparison largely revolved around four key measures or indicators that characterize deception in online reviews and messaging: 1) a greater number of words (compared to reviews from customers who had bought the item); 2) the use of simpler, shorter words; 3) inappropriate references to family (i.e., mentioning a family event unrelated to the product being reviewed, such as “I remember when my mother took me shopping for school clothes…”); and 4) the extraordinary use of exclamation points (i.e., “!!” or “!!!”). Apparently, deceivers tend to overcompensate for their lack of true knowledge and wax eloquent about something they know nothing about. This wouldn’t matter except that deceptive reviews (i.e., reviews from customers who have not purchased the item reviewed) are more likely to be… [Read Full Text]
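As a rough illustration only – the function name, the word list, and the counting rules below are hypothetical, and Anderson and Simester’s actual measures rest on statistical comparisons across thousands of reviews, not a per-review screen – the four indicators can be sketched as simple text counts:

```python
import re

# Hypothetical family-reference terms; NOT the authors' actual word list.
FAMILY_TERMS = {"mother", "father", "mom", "dad", "sister", "brother",
                "family", "husband", "wife", "son", "daughter"}

def deception_signals(review: str) -> dict:
    """Compute naive versions of the four indicators for one review."""
    words = re.findall(r"[a-zA-Z']+", review.lower())
    return {
        # 1) deceptive reviews tend to use more words
        "word_count": len(words),
        # 2) ...but simpler, shorter words
        "avg_word_length": (sum(len(w) for w in words) / len(words)
                            if words else 0.0),
        # 3) inappropriate references to family
        "family_mentions": sum(w in FAMILY_TERMS for w in words),
        # 4) extraordinary use of exclamation points ("!!" or "!!!")
        "multi_exclamations": len(re.findall(r"!{2,}", review)),
    }

example = "I remember when my mother took me shopping!! Best shirt ever!!!"
print(deception_signals(example))
```

In the paper these indicators are relative (e.g., more words *compared to* verified purchasers’ reviews), so any real application would compare distributions across groups rather than flag individual reviews.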
November 15, 2013
A focus group moderator’s guide will often include group exercises or facilitation techniques as alternative approaches to direct questioning. While many of these alternative tactics are not unique to the group discussion method – they are also used in in-depth interview research – they have become a popular device in focus groups, especially in the marketing research field. These alternative approaches can be broadly categorized as either enabling or projective techniques, the difference being whether the moderator’s intent is simply to modify a direct question to make it easier for group participants to express their opinions (enabling techniques) or to delve into participants’ less conscious, less rational, less socially acceptable feelings by way of indirect exercises (projective techniques). Examples of enabling techniques are: sentence completion – e.g., “When I think of my favorite foods, I think of _____.” or “The best thing about the new city transit system is _____.”; word association – e.g., asking prospective college students, “What is the first word you think of when… [Read Full Text]
October 30, 2013
Last week, Susan Eliot posted a terrific piece on listening (a common theme on her blog The Listening Resource) titled “Listening For Versus Collecting Data.” In it, she talks about the power imbalance – and, I would add, the insensitive mindset – implied by the idea that researchers are “collecting data from subjects” compared to the more useful notion that we are listening “one human to another.” Eliot goes on to cite Martin Buber and his distinction of I-Thou and I-It interactions or relationships between people, with Eliot stating “When we look upon the other person as a ‘thou’ (a unique, sentient human being) rather than an ‘it’ (a data repository), we approach the research with a humanistic perspective, one that is likely to net us rich and meaningful data.”
Extolling the virtues of listening seems almost trite (we all claim to “listen” in some shape or form), yet why is it so very difficult? It is difficult not only among researchers, for whom listening is (or should be) a required skill, but among all of us, for whom listening is a fundamental component of human interaction.
The October 18, 2013 NPR TED Radio Hour program “Haves and Have-Nots” presents two important examples of the importance of listening and, more particularly, the negative effects of not listening well. The first is a TED talk given by Ernesto Sirolli titled “Want to help someone? Shut up and listen!” in which he tells the story of an ill-fated attempt to teach people in Zambia… [Read Full Text]