
The Stanford Prison Experiment: A Case for Sharing Data

The October 2019 issue of American Psychologist included two articles on the famed Stanford Prison Experiment (SPE) conducted by Philip Zimbardo in 1971. The first, “Rethinking the Nature of Cruelty: The Role of Identity Leadership in the Stanford Prison Experiment” (Haslam, Reicher, & Van Bavel, 2019), discusses the outcomes of the SPE within the context of social identity and, specifically, identity leadership theories espousing, among other things, the idea that “when group identity becomes salient, individuals seek to ascertain and to conform to those understandings which define what it means to be a member of the relevant group” (p. 812) and “leadership is not just about how leaders act but also about their capacity to shape the actions of followers” (p. 813). It is within this context that the authors conclude from their examination of the SPE archival material that the “totality of evidence indicates that, far from slipping naturally into their assigned roles, some of Zimbardo’s guards actively resisted [and] were consequently subjected to intense interventions from the experimenters” (p. 820), resulting in behavior “more consistent with an identity leadership account than…the standard role account” (p. 819).

In the second article, “Debunking the Stanford Prison Experiment” (Le Texier, 2019), the author discusses his content analysis study of the documents and audio/video recordings retrieved from the SPE archives located at Stanford University and the Archives of the History of American Psychology at the University of Akron, including a triangulation phase by way of in-depth interviews with SPE participants and a comparative analysis utilizing various publications and texts referring to the SPE. The purpose of this research was to learn whether the SPE archives, participants, and comparative analysis would reveal “any important information about the SPE that had not been included in and, more importantly, was in conflict with that reported in Zimbardo’s published accounts of the study” (p. 825). Le Texier derives a number of key findings from his study that shed doubt on the integrity of the SPE, including the fact that the prison guards were aware of the results…

The Asynchronous Focus Group Method: Participant Participation & Transparency

A great deal has been written about transparency in research. It is generally acknowledged that researchers owe it to their research sponsors as well as to the broader research community to divulge the details of their designs and the implementation of their studies. Articles pertaining to transparency have been posted throughout Research Design Review.

The need for transparency in qualitative research is as relevant for designs utilizing offline modes, such as in-person interviews and focus group discussions, as it is for online research, such as asynchronous focus groups. A transparency detail that is critical for the users of online asynchronous – not-in-real-time – focus group research is the level of participant participation. This may, in fact, be the most important information concerning an asynchronous study that a researcher can provide.

Participation level in asynchronous discussions is particularly important because participation in the online asynchronous mode can be erratic and weak. Nicholas et al. (2010) found that “online focus group participants offered substantially less information than did those in the [in-person] groups” (p. 114), and others have underscored a serious limitation of this mode; that is, “it is very difficult to get subjects with little interest in [the topic] to participate and the moderator has more limited options for energising and motivating the participants” (Murgado-Armenteros et al., 2012, p. 79). Indeed, researchers have found that “participation in the online focus group dropped steadily” during the discussion period (Deggs et al., 2010, p. 1032).

The integrity and ultimate usefulness of focus group data hinge squarely on the level of participation and engagement among group participants. This is true regardless of mode, but it is a particularly critical consideration when conducting asynchronous discussions. Because of this, and because transparency is vital to the health of the qualitative research community, focus group researchers employing the online asynchronous method are encouraged to continually monitor, record, and report on the rate and level of participation, e.g., how many and who (in terms of relevant characteristics) of the recruited sample entered into the discussion; how many and who responded to all questions; how thoughtful and in-depth (or not) the responses were; how many and who engaged with the moderator; and how many and who engaged with other participants.
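
To make these metrics concrete, here is a minimal illustrative sketch (in Python) of how a researcher might tally them from an export of an asynchronous discussion board. The file layout, the column names (participant_id, question_id, reply_to, text), and the function name are hypothetical assumptions for illustration only, not features of any particular platform; the counts are merely a starting point for the qualitative judgments described above.

# Illustrative sketch only: tally participant participation in an
# asynchronous focus group from a hypothetical CSV export of posts.
# Assumed columns (not tied to any specific platform): participant_id,
# question_id, reply_to ("moderator", another participant_id, or blank),
# and text.

import csv
from collections import defaultdict

def participation_summary(csv_path, recruited_ids, question_ids):
    """Summarize who posted, who answered every question, and who
    engaged with the moderator or with other participants."""
    posts_by_participant = defaultdict(list)   # participant -> list of post texts
    questions_answered = defaultdict(set)      # participant -> set of questions answered
    engaged_moderator = set()
    engaged_others = set()

    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            pid = row["participant_id"]
            posts_by_participant[pid].append(row["text"])
            if row["question_id"]:
                questions_answered[pid].add(row["question_id"])
            if row["reply_to"] == "moderator":
                engaged_moderator.add(pid)
            elif row["reply_to"]:
                engaged_others.add(pid)

    entered = set(posts_by_participant)
    answered_all = {p for p, qs in questions_answered.items()
                    if qs >= set(question_ids)}
    avg_words = {p: sum(len(t.split()) for t in texts) / len(texts)
                 for p, texts in posts_by_participant.items()}

    return {
        "recruited": len(recruited_ids),
        "entered_discussion": len(entered),
        "never_posted": sorted(set(recruited_ids) - entered),
        "answered_all_questions": sorted(answered_all),
        "avg_words_per_post": avg_words,   # rough proxy for depth of response
        "engaged_with_moderator": sorted(engaged_moderator),
        "engaged_with_other_participants": sorted(engaged_others),
    }

# Example call (hypothetical file name and IDs):
# summary = participation_summary("discussion_posts.csv",
#                                 recruited_ids=["P01", "P02", "P03"],
#                                 question_ids=["Q1", "Q2", "Q3"])

A tally like this would still need to be joined with each participant’s relevant characteristics so that the report addresses the “who,” not just the “how many,” and a word count is only a rough proxy for how thoughtful or in-depth responses actually were.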

This transparent account of participant participation offers the users of asynchronous focus group research an essential ingredient as they assess the value of the study conducted.

Deggs, D., Grover, K., & Kacirek, K. (2010). Using message boards to conduct online focus groups. The Qualitative Report, 15(4). Retrieved from http://www.nova.edu/ssss/QR/QR15-4/deggs.pdf

Murgado-Armenteros, E. M., Torres-Ruiz, F. J., & Vega-Zamora, M. (2012). Differences between online and face-to-face focus groups, viewed through two approaches. Journal of Theoretical and Applied Electronic Commerce Research, 7(2), 73–86.

Nicholas, D. B., Lach, L., King, G., Scott, M., Boydell, K., Sawatzky, B., … Young, N. L. (2010). Contrasting Internet and face-to-face focus groups for children with chronic health conditions: Outcomes and participant experiences. International Journal of Qualitative Methods, 9(1), 105–122.


The Use of Quotes & Bringing Transparency to Qualitative Analysis

The use of quotes or verbatims from participants is a typical and necessary component of any qualitative research report. It is by revealing participants’ exact language that the researcher helps the user of the research understand the key takeaways, clarifying through illustration the essential points of the researcher’s interpretations. The idea is not to display an extensive list of what people said but rather to provide quotes that have been carefully selected for being the most descriptive or explanatory of the researcher’s conceptual interpretation of the data. As Susan Morrow has written:

“An overemphasis on the researcher’s interpretations at the cost of participant quotes will leave the reader in doubt as to just where the interpretations came from [however] an excess of quotes will cause the reader to become lost in the morass of stories.” (Morrow, 2005, p. 256)

By embedding carefully chosen extracts from participants’ words in the final document, the researcher uniquely gives participants a voice in the outcomes while contributing to the credibility – and transparency – of the research. In essence, the use of verbatims gives the users of the research a peek into the analyst’s codebook…