
Can Principles of Effective Team Science Promote More Robust and Reproducible Research?

As the philosopher of science Karl Popper once argued, progress in scientific knowledge can occur only if researchers openly share the details of their findings so that others can subject them to empirical scrutiny. To achieve these ends, scientists implicitly work together in their own "invisible college" (Crane, 1972) of interconnected interests to advance the collective knowledge base. They test and retest one another's assumptions to find out which theories withstand efforts at falsification, and then report their empirical findings transparently in the open literature. Each member of these implicit, distributed teams adheres rigorously to the standards and ethics of his or her discipline. When violations of those standards are uncovered, they are dealt with by professional societies in explicit and judicious ways.

If that is the way science is supposed to work, why, then, has the popular and scientific press been replete with stories of irreproducible findings, dead-end scientific endeavors, and stagnation in the scientific accretion of new knowledge? Might there be a hidden flaw in the way science has historically been produced, one that stands in the way of cumulative progress? And what contribution might the theoreticians of team science offer in resolving this issue?

These were the questions entertained, quite candidly, by a working group of social scientists specially assembled at the National Science Foundation (NSF) to discuss ways of promoting robust and replicable research across all disciplines. Chaired by social psychologist John Cacioppo, the committee reviewed the scope and extent of the irreproducibility problem. What became apparent from the ensuing discussion was that the conduct of "normal science" (i.e., nondisrupted science; see Kuhn & Hacking, 2012) is, of course, a social enterprise.

That enterprise is, in turn, embedded within the historical context of a 19th- and 20th-century publishing, tenure, promotion, and funding environment. Systemic, unintended consequences of that environment's constraints have created subtle threats to reproducible science, such as the often unconscious manipulation of statistical tests and degrees of freedom to reach significance at the p < .05 level, a practice colloquially referred to as "p-hacking" (Neuroskeptic, 2012). Likewise, the limitations of an era of print publication have prevented authors from including the data they collected to reach their original conclusions, or even the instruments and measures used to collect those data, as an invitation for others to replicate their findings. Page restrictions have also put pressure on editors to publish only those papers reporting findings that reach significance, creating a not-so-subtle "prejudice against the null hypothesis," as highlighted years ago by NSF workshop participant Anthony Greenwald (Greenwald, 1975).
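To make the arithmetic behind p-hacking concrete, the brief simulation below (a hypothetical sketch added for illustration, not part of the original post) shows one common form of the practice: measuring several outcomes and reporting whichever one "works." Even when no true effect exists, the chance that at least one of five independent tests reaches p < .05 is roughly 1 − 0.95⁵ ≈ 23%, far above the nominal 5% error rate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def p_hacked_study(n=30, n_outcomes=5):
    """Simulate one two-group study in which the null hypothesis is TRUE,
    but the analyst tests five outcomes and reports any p < .05."""
    for _ in range(n_outcomes):
        control = rng.normal(size=n)
        treatment = rng.normal(size=n)  # same distribution: no real effect
        _, p = stats.ttest_ind(control, treatment)
        if p < 0.05:
            return True  # a "significant" finding, despite no real effect
    return False

n_studies = 2000
hits = sum(p_hacked_study() for _ in range(n_studies))
print(f"False-positive rate: {hits / n_studies:.1%}")  # ~23%, not 5%
```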

These pressures on the invisible colleges make sense within the complex socioecological framework of factors influencing team science (Stokols, Misra, Moser, Hall, & Taylor, 2008). That framework also provides an entrée for creating changes in the context of science with the intent of nudging the larger system back toward more idealized goals.

One of these entrée points is the publishing paradigm itself. Psychologist Brian Nosek, who opened the NSF meeting, has argued that traditional scientific publishing has inadvertently created barriers among researchers studying the same phenomena (Nosek & Bar-Anan, 2013). Opening up that communication might be a necessary first step in addressing perennial threats to reproducibility such as the "file drawer problem," in which unpublished null results and methodological details are tucked away from further scrutiny or consideration in physical filing cabinets (Dalton, Aguinis, Dalton, Bosco, & Pierce, 2012).
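A similarly small simulation (again a hypothetical sketch, not from the original post) illustrates why the file drawer matters: when only studies that happen to reach significance are published, the published literature systematically overstates the true effect, because the underpowered null results that would temper the estimate never appear.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_effect, n = 0.2, 30  # a small true effect, modest per-group samples

all_estimates, published = [], []
for _ in range(5000):
    control = rng.normal(0.0, 1.0, n)
    treatment = rng.normal(true_effect, 1.0, n)
    estimate = treatment.mean() - control.mean()
    _, p = stats.ttest_ind(treatment, control)
    all_estimates.append(estimate)
    if p < 0.05:                    # only "significant" studies see print;
        published.append(estimate)  # the rest go into the file drawer

print(f"True effect:                   {true_effect:.2f}")
print(f"Mean estimate, all studies:    {np.mean(all_estimates):.2f}")
print(f"Mean estimate, published only: {np.mean(published):.2f}")  # inflated
```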

Creating a virtual space for housing that information in a shared commons can be a first step in enabling communities of scientists to verify and build on each other's work (Nosek, 2014; Winerman, 2013). The American Psychological Association, arguably one of the largest publishers in the social sciences, has been experimenting with "open science" through a new online journal, "The Archives of Scientific Psychology," described as "an open methodology, collaborative data sharing, open access journal."

Journal co-editor Gary VandenBos, PhD, who was present at the NSF meeting, explained how the new journal can support better information sharing by: (a) providing an online, secure repository for original data; (b) providing co-authorship incentives for data creators and secondary users; (c) reducing space limitations for methods and results descriptions; (d) opening up access to publications through a publishing fee rather than a subscription fee; and (e) offering translated abstracts for the public and press to make the scientific content more universally accessible (cf. Winerman, 2013).

From the socioecological perspective of team science, it is easy to imagine how incentives could be further restructured through other modifications to the publishing environment, in a world of easy electronic access to scientific journals. In his book, "Reinventing Discovery: The New Era of Networked Science," author Michael Nielsen refers to this type of online engineering as creating new "architectures of attention" (Nielsen, 2012). The effect of these small changes, he argued, is often felt in substantial shifts in the culture of a community, as individual actions accumulate over millions of interactions. Consider the figure below:

Figure 1. Improving the openness of scientific publication, as illustrated through Elsevier's vision for an "Article of the Future".

The figure builds on the "article of the future" project sponsored by the international publisher Elsevier, but highlights the types of functionality that could, in theory, help nudge science toward more openness and transparency. For example, electronic author links could tie into the ORCID system of persistent researcher identifiers for connecting research to researchers, while also linking to researchers' virtual presence in their own professional social media spaces. Embedded visualizations could help make the contents of the article accessible to a wider audience. Other links could provide much more detail on protocols, measures, and findings; take readers to secure data enclaves for downloading the original data on which the article's analyses were performed; and connect to other published resources for immediate comparisons.
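As a concrete illustration of the kind of author-link functionality described above, the sketch below queries ORCID's public REST API for a researcher's record. It assumes the v3.0 public endpoint at pub.orcid.org and uses ORCID's own documented example iD; exact JSON field names may vary across API versions, so treat this as an illustrative sketch rather than a definitive integration.

```python
import requests

# ORCID's documented example iD (a fictitious researcher used for testing).
orcid_id = "0000-0002-1825-0097"

# The public API requires no authentication for publicly visible record data.
response = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/record",
    headers={"Accept": "application/json"},
    timeout=10,
)
response.raise_for_status()
record = response.json()

# Pull the researcher's public name out of the record.
name = record["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
```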

All of these efforts to change the publishing paradigm in favor of a more open, interconnected, and accretive science are still exploratory. As with any experiment, data will be needed to help designers adjust the environment to achieve desirable outcomes. The difference is that, in this case, those adjustments should be made transparently, with an open invitation for other publishing entities to replicate or modify them within their own spheres.

About the Author

Bradford W. Hesse, PhD, Chief, Health Communication and Informatics Research Branch, Behavioral Research Program, National Cancer Institute

References

Crane, D. (1972). Invisible colleges: Diffusion of knowledge in scientific communities. Chicago, IL: University of Chicago Press.

Dalton, D. R., Aguinis, H., Dalton, C. M., Bosco, F. A., & Pierce, C. A. (2012). Revisiting the file drawer problem in meta-analysis: An assessment of published and nonpublished correlation matrices. Personnel Psychology, 65(2), 221-249.

Greenwald, A. G. (1975). Consequences of prejudice against the null hypothesis. Psychological Bulletin, 82(1), 1-20. doi:10.1037/h0076157

Kuhn, T. S., & Hacking, I. (2012). The structure of scientific revolutions (4th ed.). Chicago, IL: University of Chicago Press.

Neuroskeptic. (2012). The nine circles of scientific hell. Perspectives on Psychological Science, 7(6), 643-644.

Nielsen, M. A. (2012). Reinventing discovery: The new era of networked science. Princeton, NJ: Princeton University Press.

Nosek, B. A. (2014). Improving my lab, my science with the Open Science Framework. Observer, 27, 12-15.

Nosek, B. A., & Bar-Anan, Y. (2013). Scientific utopia: I. Opening scientific communication. Psychological Inquiry, 23(3), 217-243.

Stokols, D., Misra, S., Moser, R. P., Hall, K. L., & Taylor, B. K. (2008). The ecology of team science: Understanding contextual influences on transdisciplinary collaboration. American Journal of Preventive Medicine, 35(2 Suppl), S96-S115.

Winerman, L. (2013). Interesting results: Can they be replicated? Monitor on Psychology, 44, 38-41.


