Research Analysis

Analyzable Qualitative Research: The Total Quality Framework Analyzability Component

A March 2017 article in Research Design Review discussed the Credibility component of the Total Quality Framework (TQF). As stated in the March article, the TQF “offers qualitative researchers a way to think about the quality of their research designs across qualitative methods and irrespective of any particular paradigm or theoretical orientation” and revolves around the four phases of the qualitative research process – data collection, analysis, reporting, and doing something of value with the outcomes (i.e., usefulness). The Credibility piece of the TQF has to do with data collection. The main elements of Credibility are Scope and Data Gathering – i.e., how well the study is inclusive of the population of interest (Scope) and how well the data collected accurately represent the constructs the study set out to investigate (Data Gathering).

The present article briefly describes the second TQF component – Analyzability. Analyzability is concerned with the “completeness and accuracy of the analysis and interpretations” of the qualitative data derived in data collection and consists of two key parts – Processing and Verification. Processing involves the careful consideration of the steps – such as selecting the unit of analysis, coding, and identifying categories and themes – by which raw qualitative data are converted into meaningful connections (steps outlined later in this article).

Words Versus Meanings

There is a significant hurdle that researchers face when considering the addition of qualitative methods to their research designs. This has to do with the analysis – making sense – of the qualitative data. One could argue that there are certainly other hurdles that lie ahead, such as those related to a quality approach to data collection, but the greatest perceived obstacle seems to reside in how to efficiently analyze qualitative outcomes. This means that researchers working in large organizations that hope to conduct many qualitative studies over the course of a year are looking for a relatively fast and inexpensive analysis solution compared to the traditionally more laborious, thought-intensive efforts utilized by qualitative researchers.

Among these researchers, efficiency is defined in terms of speed and cost. And for these reasons they gravitate to text analytic programs and models powered by underlying algorithms. The core of modeling solutions – such as word2vec and topic modeling – rests on “training” models on text corpora to produce vectors or clusters of co-occurring words or topics. Any number of programs support these types of analytics, including those that incorporate data visualization functions that enable the researcher to see how words or topics congregate (or not).
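To make the mechanics concrete, below is a minimal sketch of the word2vec approach described above. It assumes the open-source gensim library (one of many possible tools; the article does not endorse any specific program), and the toy corpus, parameter values, and word choices are invented purely for illustration:

```python
# A minimal word2vec sketch, assuming gensim 4.x is installed.
# The tiny corpus below is invented purely for illustration; real text
# analytics would train on thousands of tokenized documents.
from gensim.models import Word2Vec

# Each "document" is a pre-tokenized list of words (a toy corpus).
corpus = [
    ["respondents", "described", "the", "service", "as", "friendly"],
    ["participants", "called", "the", "staff", "friendly", "and", "helpful"],
    ["the", "service", "was", "slow", "but", "the", "staff", "helpful"],
    ["respondents", "found", "the", "process", "slow", "and", "confusing"],
]

# Train a small model: each word becomes a 50-dimensional vector whose
# position reflects the contexts (neighboring words) it co-occurs with.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the word vectors
    window=3,         # context window on either side of a word
    min_count=1,      # keep every word (only sensible for a toy corpus)
    seed=42,
)

# Words appearing in similar contexts end up with similar vectors; on a
# real corpus this is what surfaces clusters of co-occurring terms.
print(model.wv.most_similar("friendly", topn=3))
```

On a corpus this small the similarity scores are essentially noise; the point is only to show how “training” on co-occurring words yields vectors that the visualization programs can then cluster and display.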

Finding Connections & Making Sense of Qualitative Data

The analysis of qualitative research data is no small thing. Because the very nature of qualitative research is complicated by the complexities inherent in being human, attempting to qualitatively measure and then make sense of behavior and attitudes is daunting. In fact, it is this overwhelming aspect of qualitative research that may lead researchers – who live in the real world of time and budget constraints – to succumb to a less-than-rigorous analytical process.

And yet, Analyzability* is a critical component in qualitative research design.

All of the data collection in the world – all the group discussions, IDIs, observations, storytelling, or in-the-moment research – amounts to a meaningless exercise unless and until a thorough processing and verification of the data is conducted. Without the thoughtful work required to achieve a quality research product, qualitative data simply sits as an inert compilation of discrete elements lacking import.

Finding the connections in the qualitative data that make sense of the phenomenon, concept, or construct under investigation may, for some, seem difficult enough to justify shortcuts; but proper analysis is the only thing that separates an honest, professional qualitative study from a random amalgamation of conversations or online snapshots.

In April of 2014, Research Design Review discussed one facet of Analyzability, i.e., verification. Verification, however, only comes into play after the researcher has conducted the all-important processing phase that converts qualitative data – that amalgamation of discrete elements – into meaningful connections that give rise to interpretations and implications, and the ultimate usefulness, of the research.

A quality approach to qualitative research design necessitates a well-thought-out plan for finding connections and making sense of the data. Here are six recommended steps in that process*:

•  Select the unit of analysis – a subject matter, an activity, a complete narrative or interview.
•  Develop unique codes – an iterative process utilizing a codebook that pays particular attention to context to derive explicit, closely-defined code designations.
•  Code – a dynamic process that incorporates pretesting of codes, inter-coder checks, and coder retraining as necessary (a simple inter-coder agreement check is sketched after this list).
•  Identify categories – groups of codes that share an underlying construct.
•  Identify themes or patterns – by looking at the coding overall and the identified categories to reveal the essence of the outcomes. This may be made easier with visual displays created in programs such as PowerPoint and CAQDAS**.
•  Draw interpretations and implications – from scrutinizing the coded and categorized data as well as ancillary materials such as reflexive journals, coders’ coding forms (with their comments), and other supporting documents.
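As a concrete illustration of the inter-coder checks mentioned in the Code step above, below is a minimal sketch of one common agreement statistic, Cohen’s kappa. It assumes the scikit-learn library; the code labels, excerpts, and 0.70 threshold are all hypothetical, and the TQF does not prescribe this particular statistic or cutoff:

```python
# A minimal inter-coder agreement sketch, assuming scikit-learn is installed.
# The code labels below are invented; in practice they would come from the
# study's codebook, applied independently by each coder to the same excerpts.
from sklearn.metrics import cohen_kappa_score

# Codes assigned by two coders to the same ten interview excerpts.
coder_a = ["trust", "cost", "trust", "access", "cost",
           "trust", "access", "cost", "trust", "access"]
coder_b = ["trust", "cost", "access", "access", "cost",
           "trust", "access", "trust", "trust", "access"]

# Cohen's kappa corrects raw percent agreement for the agreement
# expected by chance alone (1.0 = perfect, 0.0 = chance-level).
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")

# A hypothetical decision rule: flag the codebook for revision and
# coder retraining when agreement falls below an agreed threshold.
if kappa < 0.70:
    print("Agreement below threshold - revisit code definitions and retrain.")
```

In practice, a low kappa would prompt exactly the codebook revision and coder retraining described in the steps above.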

* Analyzability is one of four components of the Total Quality Framework. This framework and the six general steps in qualitative research analysis are discussed fully in Applied Qualitative Research Design: A Total Quality Framework Approach (Roller, M. R. & Lavrakas, P. J., 2015).

** Computer-assisted qualitative data analysis software, such as NVivo, Atlas.ti, MAXQDA.
