A March 2017 article in Research Design Review discussed the Credibility component of the Total Quality Framework (TQF). As stated in the March article, the TQF “offers qualitative researchers a way to think about the quality of their research designs across qualitative methods and irrespective of any particular paradigm or theoretical orientation” and revolves around the four phases of the qualitative research process – data collection, analysis, reporting, and doing something of value with the outcomes (i.e., usefulness). The Credibility component of the TQF has to do with data collection. The main elements of Credibility are Scope and Data Gathering – i.e., how well the study represents the population of interest (Scope) and how well the data collected accurately represent the constructs the study set out to investigate (Data Gathering).
The present article briefly describes the second TQF component – Analyzability. Analyzability is concerned with the “completeness and accuracy of the analysis and interpretations” of the qualitative data derived in data collection and consists of two key parts – Processing and Verification. Processing involves the careful consideration of: (a) how the preliminary data are transformed into the final dataset that is used in analysis and (b) the actual analysis of the final set of data. The transformation of preliminary data typically involves converting audio or video recordings to a written transcript. From a TQF perspective, the qualitative researcher needs to give serious thought to, among other things, the quality of the transcripts created, with particular attention to the knowledge and accuracy of the transcriptionist*. The qualitative researcher also needs to reflect on the limitations of transcripts and, specifically, what can and cannot be learned from the data in transcript form.
Once the final dataset has been developed, the qualitative researcher is ready to make sense of the data by way of analysis. The analysis process may vary among researchers depending on their particular approach or orientation. Broadly speaking, the analysis involves: (a) selecting the unit of analysis (e.g., an entire in-depth interview), (b) developing codes (designations that give meaning to some portion of the data in the context of the interview and research question), (c) coding, (d) identifying categories (i.e., groups of codes that share an underlying construct), (e) identifying themes or patterns across categories, and (f) drawing interpretations and implications.
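For readers who find it helpful to see the code-to-theme progression in steps (b) through (e) laid out concretely, the following is a small, purely illustrative sketch. The excerpts, codes, and categories are hypothetical inventions for this example only; actual qualitative coding is an interpretive, human activity, and the sketch shows nothing more than how codes sharing an underlying construct are grouped into categories.

```python
from collections import defaultdict

# Hypothetical excerpts from in-depth interview transcripts, each assigned a
# code (step c) drawn from a hypothetical codebook (step b).
coded_excerpts = [
    ("I worry about paying rent every month.", "financial strain"),
    ("Some months I cut back on groceries to save money.", "cost tradeoffs"),
    ("My sister helps me get to appointments.", "family support"),
    ("Neighbors check in on me when I'm ill.", "community support"),
]

# Step (d): the researcher's interpretive judgment that certain codes share an
# underlying construct, represented here simply as a mapping.
code_to_category = {
    "financial strain": "economic pressure",
    "cost tradeoffs": "economic pressure",
    "family support": "social support",
    "community support": "social support",
}

# Group the coded excerpts by category.
categories = defaultdict(list)
for excerpt, code in coded_excerpts:
    categories[code_to_category[code]].append(code)

# Step (e) - identifying themes across categories - remains an interpretive
# step; here we only display the grouped codes that would inform it.
for category, codes in sorted(categories.items()):
    print(category, "->", sorted(set(codes)))
```

The sketch is deliberately minimal: the analytically important work in steps (d) through (f) lies in the researcher's reasoning, not in any mechanical grouping.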
Verification is the other principal piece of the TQF Analyzability component. It is at the Verification stage – that is, when interpretations and implications are being conceptualized – that qualitative researchers give critical attention to the data by looking for alternative sources of evidence that support or contradict early interpretations of the study data. The verification step is an important one that contributes heavily to the overall quality of a qualitative research design. The various verification techniques include: (a) peer debriefing (the unbiased review of the research by an impartial peer), (b) a reflexive journal (the researcher’s diary of what went on in the study including reflections on their own values or beliefs that may have impacted data gathering or analysis), (c) triangulation (contrasting and comparing the data with other sources, such as data from different types of participants, different methods, or different interviewers or moderators), and (d) deviant cases (looking for “negative cases” or outliers that contradict the prevailing interpretation). There is another verification technique – member checking – that many researchers endorse but, from a TQF perspective, potentially weakens the quality of a qualitative study**.
Verification is the topic of discussion in a 2014 article posted in RDR – “Verification: Looking Beyond the Data in Qualitative Data Analysis.” Readers of this blog will also be interested in the Morse et al. (2002) article in International Journal of Qualitative Methods on verification strategies, in which the authors advocate utilizing verification “mechanisms” during the course of the qualitative research itself (i.e., not just at the analysis stage) to ensure the “reliability and validity and, thus, the rigor of a study.”
Not unlike credible qualitative research (the subject of the March 2017 RDR post), analyzable qualitative research is the product of knowing how to think about quality approaches to data processing and verification. It is not about concrete procedures to follow but rather the ability to conceptualize and integrate research practices that maximize the validity as well as the ultimate usefulness of a qualitative research study. The TQF Analyzability component is a vehicle by which qualitative researchers can think about where and how to apply quality principles in the processing and verification of their data. In doing so, researchers gain rich interpretations of the data leading to outcomes that address the research question and have value.
Value or usefulness, however, is not solely dependent on credible and analyzable research. Before a qualitative study can be truly useful it must be effectively communicated. That is where Transparency – the third component of the TQF and the subject of the next blog post – comes in.
*Specific recommended qualities of a transcriptionist are delineated in Roller & Lavrakas (2015, p. 35).
**A discussion of member checking and its potential to weaken study design can be found in Roller & Lavrakas (2015, p. 43).
Morse, J. M., Barrett, M., Mayan, M., Olson, K., & Spiers, J. (2002). Verification strategies for establishing reliability and validity in qualitative research. International Journal of Qualitative Methods, 1(2), 13–22.