Actively Conducting an Analysis to Construct an Interpretation

It is not uncommon for researchers who are reporting the results of their quantitative studies to go beyond describing their numerical data and attempt to interpret the meaning associated with this data. For example, in a survey concerning services at a healthcare facility, the portion of respondents who selected the midpoint on a five-point scale to rate the improvement of these services from the year before might be interpreted as having a neutral opinion, i.e., these respondents believe the caliber of services has remained the same, neither better nor worse than a year earlier. And yet there are other interpretations of the midpoint response that may be equally viable. These respondents may not know whether the services have improved or not (e.g., they were not qualified to answer the question). Or, these respondents may believe that the services have gotten worse but are reluctant to give a negative opinion.

Survey researchers fall into this gray area of interpretation because they often lack the tools to build a knowledgeable understanding of vague data types, such as scale midpoints. Unless the study employs a hybrid research design (i.e., a quantitative study that incorporates qualitative components), the researcher is left to guess at respondents’ meaning.

In contrast, the unique attributes of qualitative research methods offer researchers the tools they need to construct informed interpretations of their data. Drawing on context, latent (as well as manifest) meanings, the participant-researcher relationship, and other fundamentals associated with qualitative research, the trained researcher collects thick data from which to build an interpretation that addresses the research objectives in a way that is profound and valuable for the users of the research.

Qualitative data analysis is a process by which the researcher is actively involved in the creation of themes from the data and the interpretation within and across themes to construct results that move the topic of investigation forward in some meaningful way. This active involvement is central to what it means to conduct qualitative research. Faithful to the principles that define qualitative research, researchers do not rest on manifest content, such as words alone, or on automated tools that exploit the obvious, such as word clouds.

This is another way of saying — as stated in this article on sample size and saturation — that “themes do not simply pop up…but rather are the result of actively conducting an analysis to construct an interpretation.” As Staller (2015) states, “In lieu of the language of ‘discovering’ things with its positivistic roots, the researcher is actually interpreting the evidence” (p. 147).

Braun and Clarke (2006, 2016, 2019, 2021) have written extensively about the idea that “themes do not passively emerge” (2019, p. 594, italics in original) from thematic analysis and that meaning

is not inherent or self-evident in data, that meaning resides at the intersection of the data and the researcher’s contextual and theoretically embedded interpretative practices – in short, that meaning requires interpretation. (2021, p. 210)

An article posted in 2018 in Research Design Review — “The Important Role of ‘Buckets’ in Qualitative Data Analysis” — illustrates this point. The article discusses the analytical step of creating categories (or “buckets”) of codes representing shared constructs prior to building themes. As an example, the discussion focuses on three categories that were developed from an in-depth interview study with financial managers — Technology, Partner, Communication. The researcher constructed themes by looking within and across categories, considering the meaning and context associated with each code. One such theme was “strong partnership,” as illustrated below.

[Figure: Themes from buckets]

The theme “strong partnership” did not simply emerge from the data; it was not lying in the data waiting to be discovered. Rather, the researcher utilized their analytical skills, in conjunction with their constructed understanding of each participant’s contribution to the data, to create contextually sound, meaningful themes such as “strong partnership.” Then, with the depth of definition associated with each theme, the researcher looked within and across themes to build an interpretation of the research data targeted at the research objectives, and provided the users of the research with a meaningful path forward.
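The buckets-to-themes movement described above can be sketched as a simple data structure. This is only an illustrative sketch: the category names (Technology, Partner, Communication) and the theme “strong partnership” come from the article’s example, but the individual codes and the `build_theme` helper are hypothetical, and no real analysis reduces to a lookup like this; the researcher’s interpretive judgment within and across buckets is the point.

```python
# Illustrative sketch of the buckets-to-themes workflow.
# Category names and the theme come from the article's example;
# the individual codes below are hypothetical placeholders.

# Step 1: codes grouped into categories ("buckets") of shared constructs
buckets = {
    "Technology": ["shared dashboards", "real-time reporting"],
    "Partner": ["trusts partner's advice", "long-term relationship"],
    "Communication": ["frequent check-ins", "proactive updates"],
}

def build_theme(name, bucket_names, buckets):
    """Step 2: a theme is constructed by the researcher from codes
    drawn across one or more buckets; it does not 'emerge' on its own."""
    codes = [code for b in bucket_names for code in buckets[b]]
    return {"theme": name, "supporting_codes": codes}

# The researcher, not the data structure, decides which buckets inform a theme.
theme = build_theme("strong partnership", ["Partner", "Communication"], buckets)
print(theme["theme"], "<-", theme["supporting_codes"])
```

The structure only records the outcome of interpretive decisions; it does not make them.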

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Braun, V., & Clarke, V. (2016). (Mis)conceptualising themes, thematic analysis, and other problems with Fugard and Potts’ (2015) sample-size tool for thematic analysis. International Journal of Social Research Methodology, 19(6), 739–743. https://doi.org/10.1080/13645579.2016.1195588

Braun, V., & Clarke, V. (2019). Reflecting on reflexive thematic analysis. Qualitative Research in Sport, Exercise and Health, 11(4), 589–597. https://doi.org/10.1080/2159676X.2019.1628806

Braun, V., & Clarke, V. (2021). To saturate or not to saturate? Questioning data saturation as a useful concept for thematic analysis and sample-size rationales. Qualitative Research in Sport, Exercise and Health, 13(2), 201–216. https://doi.org/10.1080/2159676X.2019.1704846

Staller, K. M. (2015). Qualitative analysis: The art of building bridging relationships. Qualitative Social Work, 14(2), 145–153. https://doi.org/10.1177/1473325015571210

Member Checking & the Importance of Context

A social constructionist orientation to qualitative research leans heavily on many of the unique attributes of qualitative research. Along with the absence of “truth,” the importance of meaning, the participant-researcher relationship, and flexibility of design, context plays an important role as the social constructionist researcher goes about collecting, analyzing, interpreting, and reporting qualitative data. As depicted in the Total Quality Framework, the phases of the research process are connected and support each other to the extent that the integrity of the contextually rich data is maintained throughout.

Lincoln and Guba (1985) are often cited for their discussion of “member checks” or “member checking,” one of five approaches they advocate toward adding credibility to qualitative research. The authors describe the member check as “the most crucial technique for establishing credibility” (p. 314) because it requires the researcher to go back to participants (e.g., by way of a written summary or transcript, in-depth interview, group discussion) and gain participants’ input on the researcher’s data, analytic categories, interpretations, and conclusions. This, according to Lincoln and Guba (1985), allows the researcher to “assess intentionality” on the part of the participant while also…

Qualitative Data Analysis: The Unit of Analysis

The following is a modified excerpt from Applied Qualitative Research Design: A Total Quality Framework Approach (Roller & Lavrakas, 2015, pp. 262-263).

As discussed in two earlier articles in Research Design Review (see “The Important Role of ‘Buckets’ in Qualitative Data Analysis” and “Finding Connections & Making Sense of Qualitative Data”), the selection of the unit of analysis is one of the first steps in the qualitative data analysis process. The “unit of analysis” refers to the portion of content that will be the basis for decisions made during the development of codes. For example, in textual content analyses, the unit of analysis may be at the level of a word, a sentence (Milne & Adler, 1999), a paragraph, an article or chapter, an entire edition or volume, a complete response to an interview question, entire diaries from research participants, or some other level of text. The unit of analysis may not be defined by the content per se but rather by a characteristic of the content originator (e.g., the person’s age), or the unit of analysis might be at the individual level with, for example, each participant in an in-depth interview (IDI) study treated as a case. Whatever the unit of analysis, the researcher will make coding decisions based on various elements of the content, including length, complexity, manifest meanings, and latent meanings based on such nebulous variables as the person’s tone or manner.
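As a rough illustration of how the choice of unit changes what the coder works with, the sketch below segments one hypothetical interview response at three different units. The `segment` function and the example text are invented for illustration; real unit-of-analysis decisions are analytic judgments, not mechanical string splitting.

```python
# Hypothetical sketch: the chosen unit of analysis determines how the
# same text is segmented before any codes are applied.
import re

# Invented example of a participant's response
text = "The service was fine. Staff were helpful, though wait times grew."

def segment(text, unit):
    """Split text into coding units at the requested level."""
    if unit == "word":
        return re.findall(r"\w+", text)
    if unit == "sentence":
        # split after sentence-ending punctuation
        return [s for s in re.split(r"(?<=[.!?])\s+", text) if s]
    if unit == "response":
        # the complete answer treated as a single unit
        return [text]
    raise ValueError(f"unknown unit: {unit}")

for unit in ("word", "sentence", "response"):
    print(unit, "->", len(segment(text, unit)), "segment(s)")
```

A word-level unit yields many small segments, a sentence-level unit a few, and a response-level unit just one, which is why the choice directly shapes the granularity of the codes that follow.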

Deciding on the unit of analysis is a very important decision because it guides the development of codes as well as the coding process. If a weak unit of analysis is chosen, one of two outcomes may result: 1) If the unit chosen is too precise (i.e., at a more micro level than is actually needed), the researcher will set in motion…