The analysis of qualitative research data is no small thing. Because qualitative research grapples with the complexities inherent in being human, attempting to qualitatively measure and then make sense of behavior and attitudes is daunting. In fact, it is this overwhelming aspect of qualitative research that may lead researchers – who live in the real world of time and budget constraints – to succumb to a less-than-rigorous analytical process.
And yet, Analyzability* is a critical component in qualitative research design.
All of the data collection in the world – all the group discussions, IDIs (in-depth interviews), observations, storytelling, or in-the-moment research – amounts to a meaningless exercise unless and until the data are thoroughly processed and verified. Without the thoughtful work required to achieve a quality research product, qualitative data simply sits as an inert compilation of discrete elements lacking import.
Finding the connections in the qualitative data that make sense of the phenomenon, concept, or construct under investigation may, for some, seem difficult enough to justify shortcuts; but proper analysis is the only thing that separates an honest, professional qualitative study from a random amalgamation of conversations or online snapshots.
In April 2014, Research Design Review discussed one facet of Analyzability: verification. Verification, however, only comes into play after the researcher has conducted the all-important processing phase that converts qualitative data – that amalgamation of discrete elements – into meaningful connections, which in turn give rise to the interpretations, implications, and ultimate usefulness of the research.
A quality approach to qualitative research design necessitates a well-thought-out plan for finding connections and making sense of the data. Here are six recommended steps in that process* (a rough illustrative sketch of how coded data might be organized follows the list):
• Select the unit of analysis – a subject matter, an activity, a complete narrative or interview.
• Develop unique codes – an iterative process utilizing a codebook that pays particular attention to context to derive explicit, closely-defined code designations.
• Code – a dynamic process that incorporates pretesting of codes, inter-coder checks, and coder retraining as necessary.
• Identify categories – groups of codes that share an underlying construct.
• Identify themes or patterns – by looking at the coding overall and the identified categories to reveal the essence of the outcomes. This may be made easier by creating visual displays in programs such as PowerPoint or CAQDAS**.
• Draw interpretations and implications – from scrutinizing the coded and categorized data as well as ancillary materials such as reflexive journals, coders’ coding forms (with their comments), and other supporting documents.
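For researchers who supplement CAQDAS with their own scripts, the sketch below is one hypothetical way to organize steps two through five: a codebook, coded segments, a simple inter-coder agreement check, and a roll-up of codes into categories. The names (CodedSegment, CODEBOOK, percent_agreement, category_counts) and the sample codes and excerpts are purely illustrative assumptions, not part of the Total Quality Framework or of any particular CAQDAS package.

```python
from collections import Counter
from dataclasses import dataclass

# A minimal, hypothetical sketch of how coded qualitative data might be
# organized for analysis. All code names, categories, and excerpts below
# are illustrative only.

@dataclass
class CodedSegment:
    unit_id: str   # the unit of analysis, e.g., a complete interview
    coder: str     # who applied the code (supports inter-coder checks)
    code: str      # an explicit, closely defined code from the codebook
    excerpt: str   # the contextual text the code was applied to

# Hypothetical codebook: explicit code definitions grouped under categories
# (codes that share an underlying construct).
CODEBOOK = {
    "cost_concern": {"definition": "Mentions price or affordability", "category": "barriers"},
    "time_pressure": {"definition": "Mentions lack of time", "category": "barriers"},
    "peer_influence": {"definition": "Mentions friends or family shaping a choice", "category": "social_context"},
}

segments = [
    CodedSegment("IDI_01", "coder_A", "cost_concern", "It's just too expensive for us right now."),
    CodedSegment("IDI_01", "coder_B", "cost_concern", "It's just too expensive for us right now."),
    CodedSegment("IDI_02", "coder_A", "time_pressure", "I never have time to even think about it."),
    CodedSegment("IDI_02", "coder_B", "peer_influence", "I never have time to even think about it."),
]

def percent_agreement(segs):
    """Simple inter-coder check: share of excerpts where all coders applied the same code."""
    by_excerpt = {}
    for s in segs:
        by_excerpt.setdefault((s.unit_id, s.excerpt), set()).add(s.code)
    agreements = sum(1 for codes in by_excerpt.values() if len(codes) == 1)
    return agreements / len(by_excerpt)

def category_counts(segs):
    """Roll coded segments up into categories as a first step toward spotting themes."""
    return Counter(CODEBOOK[s.code]["category"] for s in segs)

if __name__ == "__main__":
    print(f"Inter-coder agreement: {percent_agreement(segments):.0%}")
    print("Category counts:", dict(category_counts(segments)))
```

Even a toy structure like this makes the analytical decisions explicit: which unit carries each code, how codes map to categories, and how much coders actually agree before themes are drawn.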
* Analyzability is one of four components of the Total Quality Framework. This framework and the six general steps in qualitative research analysis are discussed fully in Applied Qualitative Research Design: A Total Quality Framework Approach (Roller, M. R. & Lavrakas, P. J., 2015).
** Computer-assisted qualitative data analysis software, such as NVivo, Atlas.ti, MAXQDA.