One of the 10 unique attributes of qualitative research is the “absence of truth.” This refers to the idea that the highly contextual and social constructionist nature of qualitative research yields data that is not absolute “truth” but useful knowledge that is a matter of the researcher’s own subjective interpretation. For all these reasons – contextuality, social constructionism, and subjectivity – qualitative researchers continually question their data, scrutinize outliers (negative cases), and implement other steps towards verification.
Qualitative researchers also conduct their research in such a way as to maximize the accuracy of the data. Accuracy should not be confused with “truth.” Accuracy in the data refers to gaining information that comes as close as possible to what the research participant is thinking or experiencing at any moment in time. This information may be the product of any number of contextual (situational) and co-constructed factors – i.e., the absence of “truth” – yet it nonetheless represents an accurate account of a participant’s stance on a given issue or topic.
It is accuracy that qualitative researchers strive for when they craft their research designs to mitigate bias and inconsistency. For example, focus group moderators are trained to give equal attention to their group participants – allowing everyone an opportunity to communicate their thoughts – rather than bias the data (i.e., lead to inaccurate information) by giving more attention to some participants than to others. A trained moderator is also skilled at listening for inconsistencies or contradictions throughout a discussion in order to follow up on each participant’s comments.
In Conceptual Blockbusting: A Guide to Better Ideas, James Adams offers readers a varied and ingenious collection of approaches to overcoming the barriers to effective problem solving. Specifically, Adams emphasizes the idea that to solve complex problems, it is necessary to identify the barriers and then learn to think differently. Regarding barriers, he discusses four “blocks” that interfere with conceptual thinking – perceptual, emotional, cultural and environmental, and intellectual and expressive – as well as ways to modify thinking to overcome these blocks – e.g., a questioning attitude, looking for the core problem, list-making, and soliciting ideas from other people.
Adams’ chapter on emotional blocks discusses ways that the thinking process builds barriers to problem solving. One of these is the inability or unwillingness to think through “chaotic situations.” Adams contends that a path to complex problem solving is bringing order to chaos, yet some people have “an excessive fondness for order in all things,” leaving them with an “inability to tolerate ambiguity.” In other words, they have “no appetite for chaos.” Adams puts it this way –
The solution of a complex problem is a messy process. Rigorous and logical techniques are often necessary, but not sufficient. You must usually wallow in misleading and ill-fitting data, hazy and difficult-to-test concepts, opinions, values, and other such untidy quantities. In a sense, problem-solving is bringing order to chaos. (p. 48)
Problem solving is a “messy process,” and no less so when carrying out an analysis of qualitative data. Several articles in Research Design Review discuss this aspect of qualitative analysis.
There is a significant hurdle that researchers face when considering the addition of qualitative methods to their research designs. This has to do with the analysis – the making sense – of the qualitative data. One could argue that there are certainly other hurdles that lie ahead, such as those related to a quality approach to data collection, but the greatest perceived obstacle seems to reside in how to efficiently analyze qualitative outcomes. This means that researchers working in large organizations that hope to conduct many qualitative studies over the course of a year are looking for a relatively fast and inexpensive analysis solution compared to the traditionally more laborious, thought-intensive efforts utilized by qualitative researchers.
Among these researchers, efficiency is defined in terms of speed and cost, and for these reasons they gravitate to text analytic programs and models powered by underlying algorithms. The core of modeling solutions – such as word2vec and topic modeling – rests on “training” models on text corpora to produce vectors or clusters of co-occurring words or topics. Any number of programs support these types of analytics, including those that incorporate data visualization functions that enable the researcher to see how words or topics congregate (or not).
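As a minimal sketch of the idea these models share, word co-occurrence is the raw signal that word2vec and topic models build on: words that frequently appear together across documents are treated as related. The tiny corpus below is invented purely for illustration; real tools add weighting, dimensionality reduction, and much larger training data on top of this basic counting step.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_counts(docs):
    """Count how often each pair of words appears in the same document.

    This simple pairwise counting is the underlying signal that
    vector models (word2vec) and topic models elaborate on.
    """
    pair_counts = Counter()
    for doc in docs:
        # Unique, lowercased words per document; sorting makes each
        # pair a canonical (alphabetical) tuple so counts aggregate.
        words = sorted(set(doc.lower().split()))
        pair_counts.update(combinations(words, 2))
    return pair_counts

# Hypothetical mini-corpus for illustration only
docs = [
    "the moderator guided the focus group",
    "the focus group discussed the topic",
    "participants in the group shared opinions",
]

top_pairs = cooccurrence_counts(docs).most_common(3)
```

In this toy corpus, “focus” and “group” co-occur in two of the three documents, so they surface near the top of the pair counts; clustering or embedding such counts is what produces the word groupings and visualizations the analytic programs display.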