A vibrant conversation is underway about how research data should be curated, managed, and shared. While these were not initially prominent questions in qualitative social science traditions, discussions have more recently ensued across a wide range of scholarly contexts. In the last three years alone, see, for example: Aremu, F. & MacLean, L.M. 2016; Batty, F. 2016; Blattman, C. 2015; Breuning, M. & Ishiyama, J. 2016; Capoccia, G. 2016; Carsey, T. 2014; Dafoe, A. 2014; Ellett, R. & Massoud, M.F. 2016; Elman, C. & Kapiszewski, D. 2014; Elman, C. & Lupia, A. 2016; Flaherty, C. 2015; Fujii, L. A. 2016; Gastinger, M. 2015; Gelman, A. 2015; Grynaviski, E. 2016; Hall, P. A. 2016; Hayes, J. 2015; Htun, M. 2016; Isaac, J.C. 2016; Ishiyama, J. 2014; Janz, N. 2015; LeBas, A. 2016; Leeper, T. J. 2015; Lupia, A. & Alter, G. 2014; Lupia, A. & Elman, C. 2014; Lynch, M. 2016; McDermott, R. 2014; Moravcsik, A. 2014; Moravcsik, A. 2016; Morrison, J.A. 2016; Patty, J. 2015; Pepinsky, T. 2015 and 2016; Rohlfing, I. 2015; Sil, R., Castro, G., & Calasanti, A. 2016; Subotic, J. 2016; Tripp, A. 2016; Wilson, R. 2015a, 2015b, and 2015c; Yao, J. 2016; and Yashar, D. 2016.

Different approaches to research contribute to knowledge-making in different ways, and this methodological pluralism is a source of great strength in social inquiry. Not surprisingly, however, this diversity is also reflected in the wide range of scholars’ positions on openness. Our view is that particular research communities should reach their own consensus on these questions based on their members’ collective judgments. They are in the best position to decide what kinds of information they should provide to each other in order to enhance the credibility and legitimacy of their conclusions.

Often these local conversations are framed as cost-benefit calculations. On the benefit side, the call for transparency in part springs from the recognition that procedurally contingent knowledge claims are strongest when they can be reproduced, which in turn requires open data and materials (Miguel et al. 2014). This suggests that scholars should strive to share the data they used, and to describe how they generated and analyzed those data to reach their conclusions, or explain why they cannot (Lupia and Elman 2014; Elman and Kapiszewski 2014). Simultaneously, there is broad recognition that openness may sometimes be costly, particularly when social science data, both qualitative and quantitative, are infused with ethical and legal complications.

Where scholars stand on these cost-benefit calculations will strongly influence their position on transparency. Academics who think there are considerable knowledge-making advantages to openness, and whose work involves fewer ethical and logistical constraints, are likely to be more optimistic about its promise. Those who believe openness yields meager epistemic dividends and carries substantial risks will likely take a more negative view. It is thus important that scholars making such calculations rely on accurate information and make appropriate comparisons.

For example, one compelling statement about openness suggests that data, as well as documentation about how data are generated (i.e., “materials”), should be “FAIR”: findable, accessible, interoperable, and reusable (Wilkinson et al. 2016). Just as importantly, FAIR-ness also ensures that data can be used for secondary analysis. Making data and documentation FAIR requires that scholars actively engage in data management planning. Scholars may consider generating and implementing a data management plan to be a significant burden. Yet scholars who collect data carefully and document their procedures are likely already undertaking much of what a formal data management plan calls for. Hence, the steps required to make their research more transparent may not impose unreasonable burdens.
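
To make the data management point concrete, the sketch below shows one hypothetical way to record study-level documentation in machine-readable form; the field names are ours for illustration, not a prescribed metadata standard, and a real deposit would follow the repository’s own schema.

```python
import json

# Hypothetical study-level metadata for a qualitative data collection.
# Field names are illustrative, not a prescribed metadata standard.
metadata = {
    "title": "Interviews on local governance reform (illustrative)",
    "creator": "Jane Researcher",
    "dates_collected": "2016-03 to 2016-08",
    "methodology": "Semi-structured elite interviews; purposive sampling",
    "data_types": ["interview transcripts", "field notes"],
    "languages": ["en", "fr"],
    "access_conditions": "De-identified transcripts available to registered users",
    "persistent_identifier": "doi:10.0000/EXAMPLE",  # placeholder, not a real DOI
    "documentation": ["interview protocol", "consent form template", "codebook"],
}

# Keeping the record alongside the data files preserves the link between
# the data and the procedures that generated them.
with open("study_metadata.json", "w", encoding="utf-8") as f:
    json.dump(metadata, f, indent=2, ensure_ascii=False)
```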

Moreover, costs and benefits are likely to change as technology develops further. The notion that innovation can facilitate openness is hardly novel and is widely shared among quantitative scholars. Technological development also holds great promise for augmenting openness in qualitative research. For instance, transparency requires that data and materials be available to readers in ways that facilitate understanding and evaluating research. Data and analyses are interwoven throughout the text of qualitative scholarship. (While we use “article” throughout this statement, the challenges hold equally true for, and our proposed solution works equally well with, any web-based digital scholarship, including blogs, preprints, and working papers.) To optimize openness in such work, digital data sources (e.g., archival documents, audio recordings, interview transcripts, ethnographic field notes) and relevant analytic information must be immediately available where the data sources are invoked in the publication (i.e., across the span of an article). They must also be accessible from the article as it is displayed on a journal’s web page (i.e., on the publisher’s platform). However, making data proximate in these ways implies fiduciary and technical responsibilities that exceed what most publishers are willing or able to deliver.

A second transparency requirement concerns protection. Tensions can sometimes arise between transparency and the protection of human participants. Proponents of openness explicitly acknowledge this tension when calling for scholars to make their data “as open as possible [but] as closed as necessary” (ERAC 2016, 15; emphasis added). The key question scholars must ask themselves is what counts as “necessary.” Technology is providing an increasingly sophisticated toolbox of techniques for sharing sensitive data. For example, de-identification might allow some types of data to be shared while still protecting human participants. Sensitive data can be protected by encryption while in transit. Moreover, access controls can limit who may see sensitive data and how much of the data they can view. Safely sharing sensitive data, including developing and administering systems to authenticate users and encrypting data and materials, is another burden that few publishers are likely to want to assume.
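
As a concrete illustration of de-identification, the following minimal sketch (our own, not a description of QDR’s actual procedures) pseudonymizes named interviewees in a transcript excerpt before sharing; real workflows also handle indirect identifiers and involve human review.

```python
import re

# Hypothetical mapping from real names to pseudonyms. The mapping itself is kept
# separately under restricted access and is never deposited with the transcript.
pseudonyms = {
    "Amina Diallo": "Respondent 01",
    "Carlos Mendes": "Respondent 02",
}

def deidentify(text: str, mapping: dict[str, str]) -> str:
    """Replace each known name with its pseudonym, longest names first."""
    for name in sorted(mapping, key=len, reverse=True):
        text = re.sub(re.escape(name), mapping[name], text)
    return text

excerpt = "Amina Diallo said the council ignored Carlos Mendes's proposal."
print(deidentify(excerpt, pseudonyms))
# -> "Respondent 01 said the council ignored Respondent 02's proposal."
```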

FAIR-ness, proximity, and protection can be provided in qualitative research through digital enrichment using the open annotation framework. “Open annotation” allows for the generation, sharing, and discovery of digital annotations across the web (Sanderson et al. 2017). The Qualitative Data Repository (QDR, www.qdr.org) and Hypothes.is (https://hypothes.is/), a nonprofit technology organization that develops open-source software enabling the creation, storage, and sharing of web-standard annotations, are collaborating to build on that framework. Specifically, they are developing a new approach to transparency in qualitative research: annotation for transparent inquiry (ATI). ATI draws on an earlier technique, Active Citation (Moravcsik 2010), which pursued similar transparency goals but lacked ATI’s FAIR-ness.

Using ATI, authors produce a data supplement to their article consisting of a set of digital annotations. Each annotation is anchored to a segment of article text published on the web and contains one or more of the following elements (a hypothetical encoding is sketched after the list):

  • A source excerpt: typically 100 to 150 words from a textual source; for handwritten material, audiovisual material, or material generated through interviews or focus groups, an excerpt from the transcription;
  • A source excerpt translation: if the excerpt is not in English, a translation of the key passage(s);
  • An analytic note: discussion that illustrates how the data were generated and analyzed and how they support the empirical claim or conclusion being annotated in the text;
  • A persistent link to the underlying data.
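
The W3C data model cited above (Sanderson et al. 2017) suggests one natural encoding for these elements. The following is a hypothetical sketch, written as a Python dictionary, of how an annotation’s excerpt, translation, analytic note, and persistent data link might be carried as annotation bodies anchored to a passage of article text; all identifiers are placeholders, and the exact schema used by QDR and Hypothes.is may differ.

```python
import json

# Hypothetical ATI-style annotation following the W3C Web Annotation Data Model.
# URLs and DOIs are placeholders, not real records.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "commenting",
    "body": [
        {"type": "TextualBody", "purpose": "describing",
         "value": "Source excerpt: 'Le conseil a rejeté la proposition...' (c. 120 words)."},
        {"type": "TextualBody", "purpose": "describing",
         "value": "Translation: 'The council rejected the proposal...'"},
        {"type": "TextualBody", "purpose": "commenting",
         "value": "Analytic note: this memorandum was generated during 2016 fieldwork "
                  "and supports the claim that reform stalled at the municipal level."},
        {"id": "https://doi.org/10.0000/EXAMPLE-DATA", "type": "Dataset"},  # persistent link (placeholder)
    ],
    "target": {
        "source": "https://publisher.example/article/123",  # article page (placeholder)
        "selector": {
            "type": "TextQuoteSelector",
            "exact": "reform stalled at the municipal level",
        },
    },
}

print(json.dumps(annotation, indent=2, ensure_ascii=False))
```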

Digitally enriching qualitative scholarship using ATI requires participation by and partnership between publishers and repositories, and allows each stakeholder to do what it does best.

Repositories host data and materials (including annotations), making them FAIR and protecting them. Publishers publish articles and facilitate making the relevant data proximate on their platforms. Annotation software coordinates between the two types of stakeholders.
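
To illustrate that coordination, the sketch below queries the public Hypothes.is search API for annotations anchored to a given article URL, the kind of lookup an annotation client performs before rendering annotations alongside the text. The article URL is a placeholder, and the ATI workflow may rely on restricted groups or different endpoints.

```python
import requests  # third-party HTTP client (pip install requests)

# Placeholder: in practice, the article's page on the publisher's platform.
article_url = "https://publisher.example/article/123"

# Ask the public Hypothes.is search API for annotations anchored to that page.
resp = requests.get(
    "https://api.hypothes.is/api/search",
    params={"uri": article_url, "limit": 50},
    timeout=10,
)
resp.raise_for_status()

for row in resp.json().get("rows", []):
    # Each row carries the annotation's text and its anchor on the page.
    print(row.get("text", "")[:100])
```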

Importantly, access to sensitive data and materials can be controlled, as they are stored in a trusted digital repository. Indeed, the utility of ATI is premised on a counterintuitive observation: controlling access to data will lead to more data being made accessible. By empowering scholars to provide unfettered access to as much data as possible (given legal and ethical imperatives) while protecting sensitive data as much as necessary, ATI should dramatically increase the amount of qualitative data and materials that are shared. As such, ATI has the potential to enhance the clarity with which descriptive and causal claims are made in qualitative research, to transform how transparency is achieved, and to impact the way qualitative social science scholarship is evaluated. This new approach to transparency should thus have a direct impact on the credibility and legitimacy of qualitative scholarship and its utility for evidence-based policy.

Further Reading
Aremu, Fatai & MacLean, Lauren M. 2016. “The Challenges of Collaboration among Africanist Scholars Conducting Field Research: Unintended Consequences of the DA-RT Initiative for the Study of African Politics.” Newsletter of the African Politics Conference Group 12 (2): 4–5.

Batty, Fodei. 2016. “Darting around DA-RT: Why Debates about Research Transparency Will Ultimately Help the Study of Africa.” Newsletter of the African Politics Conference Group 12 (2): 6–7.

Blattman, Chris. 2015. “Political scientists are debating a new initiative to make research more trustworthy. Here’s why I’m skeptical.” The Monkey Cage (blog), Washington Post, November 9. https://www.washingtonpost.com/news/monkey-cage/wp/2015/11/09/political-scientists-are-debating-a-new-initiative-to-make-research-more-trustworthy-heres-why-im-skeptical/.

Breuning, Marijke & Ishiyama, John. 2016. “Implementing DA-RT Principles in the American Political Science Review.” Comparative Politics Newsletter 26 (1): 54–67.

Capoccia, Giovanni. 2016. “Deferred Automatic Disclosure: Ensuring Data Access and Protecting the ‘Right to First Use’.” International History and Politics Newsletter 1 (2): 7–9.

Carsey, Thomas M. 2014. “Making DA-RT a Reality.” PS: Political Science & Politics 47 (1): 72–77. doi:10.1017/S1049096513001753.

Elman, Colin & Kapiszewski, Diana. 2014. “Data Access and Research Transparency in the Qualitative Tradition.” PS: Political Science & Politics 47 (1): 43–47.

Elman, Colin & Lupia, Arthur. 2016. “DA-RT: Aspirations and Anxieties.” Comparative Politics Newsletter 26 (1): 44–52.

European Commission Directorate-General for Research & Innovation. 2016. “H2020 Programme Guidelines on FAIR Data Management in Horizon 2020,” Version 3.0, July 26.

Flaherty, Colleen. 2015. “Political scientists seek delay of transparency standards for publications.” Inside Higher Ed (blog), November 16. https://www.insidehighered.com/news/2015/11/16/political-scientists-seek-delay-transparency-standards-publications/.

Fujii, Lee Ann. 2016. “The Dark Side of DA-RT.” Comparative Politics Newsletter 26 (1): 25–27.

Gastinger, Markus. 2015. “The DA-RT initiative — boon or bane?” November 10. http://markus-gastinger.eu/the-da-rt-initiative-boon-or-bane/.

Gelman, Andrew. 2015. “Political scientists are debating how to make research more transparent. Here’s a way forward.” The Monkey Cage (blog), Washington Post, November 13. https://www.washingtonpost.com/news/monkey-cage/wp/2015/11/13/political-scientists-are-debating-how-to-make-research-more-transparent-heres-a-way-forward/.

Grynaviski, Eric. 2016. “Thinking Holistically about Transparency.” International History and Politics Newsletter 1 (2): 4–7.

Hall, Peter A. 2016. “Transparency, Research Integrity and Multiple Methods.” Comparative Politics Newsletter 26 (1): 28–31.

Hayes, Jarrod. 2015. “Put a DA-RT in it.” Duck of Minerva (blog), November 4. http://duckofminerva.com/2015/11/put-a-da-rt-in-it.html.

Htun, Mala. 2016. “DA-RT and the Social Conditions of Knowledge Production in Political Science.” Comparative Politics Newsletter 26 (1): 32–36.

Isaac, Jeffrey C. 2016. “In Praise of Transparency, But Not of DA-RT.” International History and Politics Newsletter 1 (2): 24–29.

Ishiyama, John. 2014. “Replication, Research Transparency, and Journal Publications: Individualism, Community Models, and the Future of Replication Studies.” PS: Political Science & Politics 47 (1): 78–83. https://doi.org/10.1017/S1049096513001765.

Janz, Nicole. 2015. “Political Scientists Trying to Delay Research Transparency.” Political Science Replication (blog), November 7. https://politicalsciencereplication.wordpress.com/2015/11/07/political-scientists-trying-to-delay-research-transparency/.

King, Gary. 1995. “Replication, Replication.” PS: Political Science and Politics 28: 444–452.

LeBas, Adrienne. 2016. “Research Transparency, DA-RT, and the Challenges of Fieldwork in Africa.” Newsletter of the African Politics Conference Group 12 (2): 11–12.

Leeper, Thomas J. 2015. “Don’t Fear DA-RT.” November 5. http://thomasleeper.com/2015/11/reproducible-qualitative-research/.

Lupia, Arthur & Alter, George. 2014. “Data Access and Research Transparency in the Quantitative Tradition.” PS: Political Science & Politics 47 (1): 54–59.

Lupia, Arthur & Elman, Colin. 2014. “Openness in Political Science: Data Access and Research Transparency.” PS: Political Science & Politics 47 (1): 19–42. https://doi.org/10.1017/S1049096513001716.

Lynch, Marc. 2016. “Area Studies and the Cost of Prematurely Implementing DA-RT.” Comparative Politics Newsletter 26 (1): 36–39.

McDermott, Rose. 2014. “Research Transparency and Data Archiving for Experiments.” PS: Political Science & Politics 47 (1): 67–71.

Miguel, E., C. Camerer, K. Casey, J. Cohen, K. M. Esterling, A. Gerber, R. Glennerster, D. P. Green, M. Humphreys, G. Imbens, D. Laitin, T. Madon, L. Nelson, B. A. Nosek, M. Petersen, R. Sedlmayr, J. P. Simmons, U. Simonsohn, and M. Van der Laan. 2014. “Promoting Transparency in Social Science Research.” Science 343 (6166): 30–31. DOI: 10.1126/science.1245317.

Moravcsik, Andrew. 2010. “Active citation: A precondition for replicable qualitative research.” PS: Political Science & Politics 43 (1): 29–35.

———. 2014. “Transparency: The Revolution in Qualitative Research.” PS: Political Science & Politics 47 (1): 48–53.

———. 2016. “Qualitative Transparency: Pluralistic Humanistic and Policy-Relevant.” International History and Politics Newsletter 1 (2): 17–23.

Morrison, James A. 2016. “Dearly Bought Wisdom: My Experience with DA-RT.” International History and Politics Newsletter 1 (2): 12–16.

Patty, John. 2015. “Responding to a Petition to Nobody (or Everybody).” The Math of Politics, November 6. http://www.mathofpolitics.com/2015/11/06/responding-to-a-petition-to-nobody-or-everybody/.

Pepinsky, Tom. 2015. “The DA-RT Petition.” November 5. http://tompepinsky.com/2015/11/05/the-da-rt-petition/.

———. 2016. “Qualitative Transparency Deliberations.” April 24. https://tompepinsky.com/2016/04/24/qualitative-transparency-deliberations/.

Rohlfing, Ingo. 2015. “Different tools, shared standards: The debate about DA-RT.” Politics, Science, Political Science (blog), November 13. https://ingorohlfing.wordpress.com/2015/11/13/different-tools-shared-standards-the-debate-about-da-rt/.

Sanderson, Robert, Ciccarese, Paolo, and Young, Benjamin. 2017. “Web Annotation Data Model: W3C Proposed Recommendation,” January 17. https://www.w3.org/TR/2017/PR-annotation-model-20170117/.

Sil, Rudra, Castro, Guzman, & Calasanti, Anna. 2016. “Avant-Garde or Dogmatic? DA-RT in the Mirror of the Social Sciences.” Comparative Politics Newsletter 26 (1): 40–43.

Subotic, Jelena. 2016. “DA-RT Controversy: An Old Methodological War in New Clothing.” International History and Politics Newsletter 1 (2): 2–4.

Tripp, Aili Mari. 2016. “DA-RT and Publishing Research from Authoritarian and Conflict Settings.” Newsletter of the African Politics Conference Group 12 (2): 13–14.

Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J., Appleton, G., Axton, M., Baak, A., Bouwman, J., et al. 2016. “The FAIR Guiding Principles for scientific data management and stewardship.” Scientific Data 3: 160018.

Wilson, Rick. 2015a. “Transparency Openness and Replication.” Social Science and the Academy (blog), May 21. https://rkwrice.wordpress.com/2015/05/21/transparency-openness-and-replication/.

———. 2015b. “DA-RT, TOP and Rolling Back Transparency.” Social Science and the Academy (blog), November 10. https://rkwrice.wordpress.com/2015/11/10/da-rt-top-and-rolling-back-transparency/.

———. 2015c. “Setting the bar.” Social Science and the Academy (blog), December 15. https://rkwrice.wordpress.com/2015/12/15/setting-the-bar/.

Yao, Joanne. 2016. “Introduction: DA-RT in History and Politics.” International History and Politics Newsletter 1 (2): 2.

Yashar, Deborah. 2016. “Editorial Trust, Gatekeeping, and Unintended Consequences.” Comparative Politics Newsletter 26 (1): 57–64.