A Field in Crisis
In 2011, the social psychologist Daryl Bem published an article suggesting that either (a) ESP was a real phenomenon or (b) we needed to fix the way we conducted and published psychology research and re-evaluate the standards by which we accept evidence as credible (Bem, 2011). This was a historic moment for psychological science! In the period of self-reflection that followed the article’s publication, it became clear that many common research practices were contributing to a flimsy literature and a lack of self-correction. For instance:
- exercising undisclosed flexibility in data analyses (p-hacking; Simmons, Nelson, & Simonsohn, 2011),
- hypothesizing after the results are known (HARKing; Kerr, 1998),
- publishing only statistically significant results (Kühberger, Fritz, & Scherndl, 2014), and
- rarely attempting or publishing direct replications (Makel, Plucker, & Hegarty, 2012).
As an illustrative example, the biggest effort to assess the replicability (the extent to which a new study can find the same results when using the same methods) of psychological research found that only 39% of original studies published in 2008 could be successfully replicated (Open Science Collaboration, 2015). Other collaborative efforts have since cast doubt on effects that were believed to be solid, such as ego depletion (Hagger et al., 2016), facial feedback (Wagenmakers et al., 2016), and professor priming (O’Donnell et al., 2018). It was clear that change was needed.
A Reform Movement
Since 2011, there has been a sea change in the way psychological research is conducted. Researchers routinely
- commit to their hypotheses and analysis plans prior to data collection (i.e., preregistration),
- share their study materials and stimuli on publicly available platforms (e.g., the Open Science Framework), and
- publish preprints of their research to solicit feedback from peers prior to submitting their work for formal peer review.
Academic journals likewise have
- increased the options for “open-access” publishing, which increases the accessibility of published research;
- increased their willingness to publish replication studies and well-conducted research that fails to find evidence of an effect;
- offered innovative ways of publishing research, such as Registered Reports (Chambers, Feredoes, Muthukumaraswamy, & Etchells, 2014); and
- incentivized open-science practices by offering authors badges for using them as part of the research process.
Finally, professional societies have featured symposia on open science, replications, and metascience, and a new professional organization has been created with the explicit purpose of improving psychological science (What Is SIPS?, n.d.).
Undergraduate researchers typically respond positively and enthusiastically to these changes. When my (Chris Chartier) students hear about these reforms, they almost universally respond with something akin to “of course we should preregister our hypotheses and share our data,” or simply “wait, psychologists haven’t been doing these things all along?” It seems that undergrads still imagine and hope that psychological science follows the idealized version of the hypothetico-deductive model of science they learned in middle school and are unaware of the problematic deviations from this model that were common practice in psychology (see Figure 1).
Research-focused faculty may perceive, incorrectly in our opinion, that they have little incentive to join large-scale collaborations and conduct replication studies. Undergraduate researchers, on the other hand, immediately see that they can gain valuable experience and CV lines that will help them get into graduate school and kick-start their careers while also contributing to the field of psychology. First, engaging in direct replications and joining collaborative projects offers a unique training opportunity that will be attractive to graduate programs. Involvement in such projects demonstrates the ability to manage a study through its full life cycle and provides exposure to methodologies beyond the idiosyncratic techniques of one’s mentor’s lab. Although faculty may be concerned with their number of high-profile, first-author publications, any publication, regardless of the number or order of authors, is impressive for an undergraduate.
Undergrads to the Rescue
Luckily, there are some very exciting ways that these undergraduate researchers can begin making significant contributions to psychological science that will benefit both the field and their CVs.
The Collaborative Replication & Education Project (CREP; https://osf.io/wfc6u/) connects undergraduate researchers to high-impact published studies in need of direct replication attempts. CREP selects studies that students can replicate by reviewing well-cited and recent empirical articles in psychological science to determine the most feasible ones (Grahe, 2018). Students can then select from among the available studies to identify one that will work in their lab. Students from all around the world collect data on the same studies and have an opportunity to write and publish their findings as a group.
Having a large, global data set is beneficial to science; it provides more evidence on the question at hand and can provide information on the generalizability of an effect across locations. These replication studies are also extremely beneficial for undergraduate researchers because they get to contribute their own small data set to a larger project that will eventually be published in a peer-reviewed journal and is likely to make an impact on the field.
The Psychological Science Accelerator (PSA) is a network of labs that have devoted some of their research capacity (e.g., time, participants) to multisite, collaborative research projects. The goal of the PSA is to quickly gather large samples that are both geographically and demographically diverse, which will speed the accumulation of evidence about psychological phenomena. If you want to learn more about the PSA, or if you want to contribute to a PSA project, visit https://psysciacc.wordpress.com/. Although the PSA is not uniquely focused on undergraduate involvement, its studies would be perfect projects for interested undergraduates to join.
StudySwap is a platform for researchers to find collaborators and exchange resources (McCarty & Chartier, 2017). Researchers from all around the world can post their research NEEDs (e.g., a lab for an independent replication) or HAVEs (e.g., capacity to collect data for another researcher). It has been called a “craigslist for researchers” (Chawla, 2017). StudySwap is a great resource for undergraduate students to post HAVEs so that they can collect data for researchers outside of their home institution.
Savannah Lewis, an undergraduate researcher at Ashland University, did just that and writes, “My experience working with Liam Satchell (a UK-based forensic psychologist) has been extremely interesting. It started with my research advisor (Chris Chartier) and I posting a HAVE (https://osf.io/pqwj4/) on StudySwap describing that a research assistant (myself) would be able to run participants during the Fall 2017 semester. Liam Satchell then responded, saying he had a perfect study for me to run. This was a great use of StudySwap because Liam had a study ready to go, but was changing universities and did not yet have his lab set up for data collection. I am glad that I volunteered to help Liam Satchell out because it allowed me to learn more about how research is conducted in other areas, and I got to work on a different style of research than what I typically work on with my current professors. The study I collaborated on explores if an individual’s perceptions of threat are influenced by walking gait. This collaboration has given me a chance to become a better researcher and allowed me to have a wider range of experiences on graduate school applications. This is all thanks to StudySwap.”
The Psi Chi Journal of Psychological Research also provides a great outlet for undergraduate work focused on replication and engaging in open science. The journal currently offers four badges: open data, open materials, preregistered, and replications. Undergraduate researchers can conduct independent studies, outside of the collaborative opportunities mentioned above, and then submit their work for publication in Psi Chi’s very own journal at https://www.psichi.org/?page=JN_Submissions. By offering these badges, the journal is incentivizing undergraduate researchers to engage in these excellent practices early in their research careers. This provides a great opportunity to engage in rigorous research and then disseminate it in a peer-reviewed outlet!
We hope that the current generation of undergraduate researchers will take the practices they learn in their undergraduate research experiences and “infect” their graduate school labs and future students. They are the next generation of psychological scientists, and they will establish the norms of research in the coming decades.
To undergraduate psychologists, we say, “you hold the power to move psychological science from a crisis to a golden age of discovery and reproducibility!”
Bem, D. J. (2011). Feeling the future: Experimental evidence for anomalous retroactive influences on cognition and affect. Journal of Personality and Social Psychology, 100, 407–425. https://doi.org/10.1037/a0021524
Chambers, C. D., Feredoes, E., Muthukumaraswamy, S. D., & Etchells, P. J. (2014). Instead of ‘playing the game’ it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1, 4–17. https://doi.org/10.3934/Neuroscience.2014.1.4
Hagger, M. S., Chatzisarantis, N. L. D., Alberts, H., Anggono, C. O., Batailler, C., Birt, A. R., . . . Zwienenberg, M. (2016). A multilab preregistered replication of the ego-depletion effect. Perspectives on Psychological Science, 11, 546–573. https://doi.org/10.1177/1745691616652873
Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2, 196–217. https://doi.org/10.1207/s15327957pspr0203_4
Kühberger, A., Fritz, A., & Scherndl, T. (2014). Publication bias in psychology: A diagnosis based on the correlation between effect size and sample size. PLoS ONE, 9(9). https://doi.org/10.1371/journal.pone.0105825
Makel, M. C., Plucker, J. A., & Hegarty, B. (2012). Replications in psychology research: How often do they really occur? Perspectives on Psychological Science, 7, 537–542. https://doi.org/10.1177/1745691612460688
O’Donnell, M., Nelson, L. D., Ackermann, E., Aczel, B., Akhtar, A., Aldrovandi, S., Alshaif, N., . . . Zrubka, M. (2018). Registered replication report: Dijksterhuis and van Knippenberg (1998). Perspectives on Psychological Science. Advance online publication. https://doi.org/10.1177/1745691618755704
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349. https://doi.org/10.1126/science.aac4716
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359–1366. https://doi.org/10.1177/0956797611417632
Wagenmakers, E.-J., Beek, T., Dijkhoff, L., Gronau, Q. F., Acosta, A., Adams, Jr., R. B., . . . Zwaan, R. A. (2016). Registered replication report: Strack, Martin, & Stepper (1988). Perspectives on Psychological Science, 11, 917–928. https://doi.org/10.1177/1745691616674458
What Is SIPS? (n.d.). Society for the Improvement of Psychological Science. Retrieved March 2, 2018, from https://improvingpsych.org/
Christopher R. Chartier, PhD, is an associate professor of psychology at Ashland University (OH) where he supervises undergraduate research as the Director of the Ashland University International Collaboration Research Center. He is also the Director of the Psychological Science Accelerator and cofounder of StudySwap. His work focuses on increasing the collaborative nature of psychology research, and creating tools that can make psychological science more diverse and representative.
Savannah Lewis is an undergraduate student and psychology researcher at Ashland University. She is majoring in psychology with minors in child and family studies, and religion. She hopes to become either a psychology professor or an abuse and addiction counselor. She is also a research assistant at the Ashland University International Collaboration Research Center.
Randy McCarthy, PhD, is a research associate at Northern Illinois University’s Center for the Study of Family Violence and Sexual Assault. He studies the cognitive contributors to aggression, family violence, and spontaneous trait inferences (STIs) that are based on observations of behaviors. He cofounded StudySwap with Dr. Chris Chartier.
Copyright 2018 (Vol. 22, Iss. 4) Psi Chi, the International Honor Society in Psychology