April 22, 2015
The analysis of qualitative research data is no small thing. Because the very nature of qualitative research is complicated by the complexities inherent in being human, attempting to qualitatively measure and then make sense of behavior and attitudes is daunting. In fact, it is this overwhelming aspect of qualitative research that may lead researchers – who live in the real world of time and budget constraints – to succumb to a less-than-rigorous analytical process.
And yet, Analyzability* is a critical component in qualitative research design.
All of the data collection in the world – all the group discussions, IDIs, observations, storytelling, or in-the-moment research – amounts to a meaningless exercise unless and until a methodical processing and verification of the data is conducted. Without the thoughtful work required to achieve a quality research product, qualitative data simply sits as an inert compilation of discrete elements lacking import.
Finding the connections in the qualitative data that make sense of the phenomenon, concept, or construct under investigation may, for some, be difficult and worthy of shortcuts; but proper analysis is the only thing that separates an honest, professional qualitative study from a random amalgamation of conversations or online snapshots.
In April of last year, this blog discussed one facet of Analyzability, i.e., verification. Verification, however, only comes after the researcher has conducted the all-important processing phase that converts qualitative data – that amalgamation of discrete elements – into meaningful connections that give rise to interpretations and implications, and the ultimate usefulness, of the research.
A quality approach to qualitative research design necessitates a well-thought-out plan for finding connections and making sense of the data. Here are six recommended steps in that process*:
• Select the unit of analysis – a subject matter, an activity, a complete narrative or interview.
• Develop unique codes – an iterative process utilizing a codebook that pays particular attention to context to derive explicit, closely-defined code designations.
• Code – a dynamic process that incorporates pretesting of codes, inter-coder checks, and coder retraining as necessary.
• Identify categories – a group of codes that share an underlying construct.
• Identify themes or patterns – by looking at the coding overall and the identified categories to reveal the essence of the outcomes. This is made easier by way of visual displays created in programs such as PowerPoint or CAQDAS (Computer-Assisted Qualitative Data Analysis Software).
• Draw interpretations and implications – from scrutinizing the coded and categorized data as well as ancillary materials such as reflexive journals, coders’ coding forms (with their comments), and other supporting documents.
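As a simple illustration of the middle steps above – coding units of analysis and rolling codes up into categories – the sketch below shows one minimal way the bookkeeping might look in code. The codebook, categories, and coded interview excerpts are entirely hypothetical, and a real CAQDAS program does all of this far more richly; this is only a toy illustration of the structure.

```python
from collections import defaultdict

# Hypothetical codebook: each code carries an explicit, closely-defined designation.
CODEBOOK = {
    "TIME": "mentions of time pressure or scheduling",
    "COST": "mentions of price, budget, or affordability",
    "TRUST": "expressions of trust or distrust in a provider",
}

# Hypothetical categories: groups of codes that share an underlying construct.
CATEGORIES = {
    "Constraints": {"TIME", "COST"},
    "Relationship": {"TRUST"},
}

def tally_categories(coded_units):
    """Count, per category, how many coded units contain at least one member code."""
    counts = defaultdict(int)
    for codes in coded_units.values():
        for category, members in CATEGORIES.items():
            if codes & members:  # the unit was assigned a code from this category
                counts[category] += 1
    return dict(counts)

# Hypothetical coded units of analysis (e.g., complete interviews).
coded = {
    "interview_01": {"TIME", "TRUST"},
    "interview_02": {"COST"},
    "interview_03": {"TIME", "COST"},
}

print(tally_categories(coded))  # → {'Constraints': 3, 'Relationship': 1}
```

The category tallies are only the raw material for the later steps – identifying themes and drawing interpretations still rest on the researcher's scrutiny of the data, not on the counts themselves.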
* Analyzability is one of four components of the Total Quality Framework. This framework and the six general steps in qualitative research analysis are discussed fully in Applied Qualitative Research Design: A Total Quality Framework Approach (Roller, M. R. & Lavrakas, P. J., 2015).
Image captured from: http://www.breakthroughresults.co.uk/interim-management.php/
Every researcher working with human subjects strives to ensure the highest ethical standards. Regardless of whether the research is quantitative or qualitative in nature – or in the field of health, communications, education, psychology, marketing, anthropology, or sociology – researchers care about protecting the confidentiality, anonymity, and basic “rights” (such as privacy and freedom of thought) of the people who agree to be part of their studies. It is with this in mind that, in addition to gaining IRB approval (as required), researchers openly discuss the goals and intended use of their research with participants, as well as ask them to carefully read and agree to the appropriate consent forms.

Online group discussions (focus groups) present a particularly delicate matter. Unlike any other overt form of research – unlike an online survey dominated by closed-ended questions, or an online in-depth interview with one person at a time – the online group discussion – with its amalgamation of many people (typically, strangers to each other) responding at length to many open-ended questions over the course of multiple (possibly, many) days – potentially raises important security and identity concerns among participants. Even with a signed consent form, online group participants may still have serious doubts about the containment of their input to the discussion and, hence, their willingness to contribute…
Transparency plays a pivotal role in the final product of any research study. It is by revealing the study’s intricacies and details in the final document that the ultimate consumers of the research gain the understanding they need to (a) fully comprehend the people, phenomena, and context under investigation; (b) assign value to the interpretations and recommendations; and/or (c) transfer some aspect of the study to other contexts. Transparency, and its importance to the research process, has been discussed often in this blog, with articles in November 2009 and December 2012 devoted to the topic.
At the core of transparency is the notion of “thick description.” The use of the term here goes beyond its traditional meaning of
“describing and interpreting observed social action (or behavior) within its particular context…[along with] the thoughts and feelings of participants as well as the often complex web of relationships among them. Thick meaning of findings leads readers to a sense of verisimilitude, wherein they can cognitively and emotively ‘place’ themselves within the research context” (Ponterotto, 2006, p. 543).
to also include detailed information pertaining to data collection and analysis. Ethnography, for example, is greatly enriched (“thickened”) by the reporting of specifics in 25 areas related to the…
In all sorts of research it is common to ask not only about behavior – When did you first begin smoking cigarettes? How often do you take a multivitamin? Where did you go on your most recent vacation? – but also about the “why” and/or “what” questions – What prompted you to start smoking? Why do you take a multivitamin? Why did you pick that particular spot for your most recent vacation? It is usual for the researcher to want to know more than just what happened. The researcher’s goal is typically to go beyond behavior, with a keen interest in getting to the thinking that can be linked with the behavior. It is this “probing” that enables the researcher to make associations and otherwise interpret – give meaning to – the data.
This is, after all, what keeps marketing researchers up at night. It is difficult to remember a time when marketing researchers were not obsessed with the reasons people buy certain products/services and not others. Whether rational or irrational, conscious or…
January 30, 2015
As discussed elsewhere in this blog, there is a “new day” dawning for qualitative research; one that not only brings new life into its use but, along with it, an evolving enthusiasm for the idea that researchers of any ilk cannot truly grapple with human behavior and attitudes without an understanding of contexts, constructs, and the human condition. It is truly gratifying, for instance, to watch this enthusiasm grow in organizations such as the American Psychological Association where just this month a featured article in the American Psychologist is titled, “The Promises of Qualitative Inquiry” (Gergen, Josselson, & Freeman, 2015).
In 2014, Research Design Review published four articles pertaining to the ways survey research can be “made whole” with a nod to the use and/or sensitivities of qualitative research. This is because it is the role of qualitative research to unlock the human condition by providing the context and meaning of the constructs that define what is being measured. Without a direct or underlying qualitative research component, how is the survey researcher to understand – be comfortable in the knowledge of – his or her analysis and interpretation of the data?
These articles emphasize the challenges survey researchers face when they ask about vague yet highly-personal constructs – such as “the good life,” “happiness,” “satisfaction,” “preference,” or (even) the idea of “actively” incorporating “fruits” and “vegetables” in the diet – without the benefit of context or meaning from the respondent, or at least a concise definition by the researcher.
These four articles have been compiled into one document which can be downloaded here.
Gergen, K. J., Josselson, R., & Freeman, M. (2015). The promises of qualitative inquiry. American Psychologist, 70(1), 1-9.
Image captured from: http://www.designboom.com/history/friedrich2.html
December 16, 2014
Survey research is pretty good at allowing people to describe “things” in such a way that the researcher winds up with a fairly accurate idea of the thing being described. The most straightforward example is a survey question that asks, “Which of the following features came with your new Toyota Corolla?” followed by a list of possible features. However, survey research can also get at descriptions of more experiential phenomena with questions such as, “On a scale from ‘1’ to ‘5’, how well does each of the following statements describe your experience in buying a new home?” In these cases, the use of survey methods to research a great number of people, and to compile and report the data as efficiently as possible, makes good use of closed-ended questions to gain an understanding of respondents’ accounts of the “things” of interest. This can also be said of beliefs. Pew’s recent survey pertaining to the Christmas story asked…
The fourth edition of Michael Quinn Patton’s book Qualitative Research & Evaluation Methods has just been published by Sage. It is a big book – over 800 pages – with updated and new content from earlier editions, including something he calls “ruminations” which are highlighted sections in each chapter that present Patton’s commentary and reflections on issues that have “persistently engaged, sometimes annoyed” him throughout his long career in qualitative research. Patton has made some of these ruminations available online via his posts on the betterevaluation.org blog.
In his November 14th post, Patton shares his “Rumination #2: Confusing empathy with bias.” In it, he raises an important issue – having to do with the personal nature of qualitative research and how that impacts data collection – that, on some level, runs through the qualitative-quantitative debates waged by researchers who argue for one form of research over another. Such a debate might involve a survey researcher who, entrenched in statistical analysis, wonders…
November 22, 2014
Back in April 2013, a post in RDR talked about the “daunting job of conducting a content analysis that reveals how people think [the “stream of consciousness”] while at the same time answers the research question and takes the sponsoring client to the next step.” The article outlines the basic steps in a content analysis, including the analysis and interpretation phases of the process. Making interpretations from a content analysis is tricky, especially when conducting a “primary content analysis,” in which the content being analyzed is derived from non-research-related, pre-existing sources such as newspapers, blog posts, Hollywood films, YouTube videos, television broadcasts, and the like. The issue here is the “trap” content analysts can fall into by (a) thinking there are causal relationships in the data when there are not, and/or (b) trying to build a story around their interpretations when the story (based on the data) has little merit. In this way, an overabundance of subjectivity can creep into the qualitative content analysis method.
These traps, related to causality and storytelling, are fairly easy to fall into unless a systematic and conscientious approach is taken in the analysis and interpretation phases. In particular, …
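One common systematic safeguard in content analysis – the inter-coder check mentioned among the six analysis steps – can be sketched briefly. The coders, codes, and coded units below are hypothetical; percent agreement and Cohen’s kappa (which corrects agreement for chance) are standard reliability statistics, but this is only a minimal illustration, not a substitute for a full reliability assessment.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of units on which two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    # Chance agreement: probability both coders pick the same code at random,
    # given each coder's own code frequencies.
    expected = sum(
        (freq_a[code] / n) * (freq_b[code] / n)
        for code in set(coder_a) | set(coder_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two coders to the same six units.
a = ["TIME", "COST", "TIME", "TRUST", "COST", "TIME"]
b = ["TIME", "COST", "COST", "TRUST", "COST", "TIME"]

print(round(percent_agreement(a, b), 3))  # → 0.833
print(round(cohens_kappa(a, b), 3))       # → 0.739
```

Low agreement on a particular code is a signal to revisit the codebook definition or retrain coders – the numbers flag the problem, but the remedy is still the analyst’s judgment.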