Total Quality Framework

Qualitative Tech Solutions: Coverage & Validity Considerations

Back in 2018, Research Design Review posted an article titled “Five Tech Solutions to Qualitative Data Collection: What Strengthens or Weakens Data Quality?” The focus of that article is a presentation given in May 2018 concerning technological alternatives to qualitative research data collection. Importantly, the aim of the presentation was not simply to identify different approaches to data collection beyond the in-person and telephone modes but rather to examine the strengths and limitations of these technological solutions from a data quality – specifically, Credibility – standpoint.

Broadly speaking, technological approaches to qualitative research data gathering offer clear advantages over in-person methods, particularly in the areas of:

  • Representation, e.g., geographic coverage, potential access to hard-to-reach population segments;
  • Cooperation, e.g., convenience and flexibility of time and place for participants, appropriateness for certain demographic segments (18-49 year olds*);
  • Validity associated with data accuracy, e.g., research capturing in-the-moment experiences does not rely on memory recall;
  • Validity associated with the depth of data, e.g., capturing multiple contextual dimensions through text, video, and images;
  • Validity associated with data accuracy and depth allowing for the triangulation of data;
  • Researcher effects, e.g., mitigated by the opportunity for greater reflection and consistency across research events;
  • Participant effects, e.g., mitigated by the multiple ways to express thoughts, willingness to discuss sensitive issues, and (possibly) a lower tendency for social desirability responding; and
  • Efficient use of resources (i.e., time, money, and staff).

There are also potential drawbacks to any technological solution, including those associated with:

  • Uneven Internet access and comfort with technology among certain demographic groups, including hard-to-reach and marginalized segments of the population (e.g., sampling favors “tech-savvy” individuals);
  • Difficulty in managing engagement, including the unique researcher skills and allocation of time required;
  • Potential participant burnout from the researcher’s requests for multiple input activities and/or days of engagement, a type of participant effect that negatively impacts validity;
  • Nonresponse due to mode, e.g., unwillingness or inability to participate in a mostly text-based discussion;
  • Data accuracy, e.g., participant alters behavior in a study observing in-home meal preparation;
  • Missing important visual and/or verbal cues, which may interfere with rapport building and an in-depth exploration of responses;
  • Difficulty managing analysis due to the sheer volume and varied formats of the data;
  • Fraud, misrepresentation – “Identity is fluid and potentially multiple on the Internet” (James and Busher, 2009, p. 35) and people may not share certain images or video that reveal something “embarrassing” about themselves**; and
  • Security, confidentiality, anonymity (e.g., data storage, de-identification).

* https://www.pewresearch.org/internet/fact-sheet/internet-broadband/

** https://www.businesswire.com/news/home/20180409006050/en/Minute-Maid-Debuts-New-Campaign-Celebrates-Good

James, N., & Busher, H. (2009). Online interviewing. London: Sage Publications.

Supporting Observational Research

The following is a modified excerpt from Applied Qualitative Research Design: A Total Quality Framework Approach (Roller & Lavrakas, 2015, pp. 217-219), which is a qualitative methods text covering in-depth interviews, focus group discussions, ethnography, qualitative content analysis, case study, and narrative research.

An important element in the Total Quality Framework Analyzability component is Verification, i.e., taking steps to establish some level of support for the data gathered in order to move the researcher closer to achieving high-quality outcomes. The verification tools at the ethnographer’s disposal go beyond those identified for the in-depth interview (IDI) and group discussion methods in that they include the technique of expanded observation. For example, Lincoln and Guba (1985) stated that it is “more likely that credible findings and interpretations” will come from ethnographic data with “prolonged engagement” in the field and “persistent observation” (p. 301). The former refers to spending adequate time at an observation site to experience the breadth of stimuli and activities relevant to the research, and the purpose of the latter (i.e., persistent observation) is “to identify those characteristics and elements in the situation that are most relevant to the problem or issue” (p. 304)—that is, to provide a depth of understanding of the “salient factors.” Both prolonged engagement and persistent observation speak to the idea of expanding observation in terms of time as well as diligence in exploring variables as they emerge in the observation. Although expanding observations in this way may be unrealistic due to the realities of deadlines and research funding, it is an important verification approach unique to ethnography. When practicable, it is recommended that researchers maximize the time allotted for observation and train observers to look for the unexpected or examine more closely seemingly minor occurrences or variables that may ultimately support (or contradict) the observer’s dominant understanding.

The ultimate usefulness of expanded observation is not unlike deviant or negative case analysis (see earlier link). In both instances, the goal is to identify and investigate observational events (or particular variables in these events) that defy explanation or otherwise contradict the general patterns or themes that appear to be emerging from the data. For example, a researcher conducting in-home nonparticipant observations of young mothers …

Qualitative Research Analysis: Selected Articles from 2019

Research Design Review is a blog first published in November 2009. RDR currently consists of more than 220 articles and has 650+ subscribers along with nearly 780,000 views. As in recent years, many of the articles published in 2019 centered on qualitative research. This paper — “Qualitative Research: Analysis” — represents a compilation of four of these articles pertaining to qualitative research analysis.

These articles cover a range of topics, including: considerations when defining the unit of analysis; a discussion of handling “gaps” in the data; a cautionary perspective on coding, i.e., reminding researchers that an overemphasis on coding may miss the true intention of qualitative data analysis; and a look at a Total Quality Framework approach to the qualitative content analysis method.

A separate paper consisting of fourteen 2019 RDR articles on design and methods can be found here.