REVIEWS/ESSAYS

REES: A Registry of Efficacy and Effectiveness Studies in Education

Dustin Anderson1, Jessaca Spybrook1, and Rebecca Maynard2

1Western Michigan University, Kalamazoo, MI
2University of Pennsylvania, Philadelphia, PA

Educational Researcher, Vol. 48, No. 1, pp. 45–50. DOI: 10.3102/0013189X18810513. © 2018 AERA. January/February 2019. http://er.aera.net

Calls for improving transparency across the social sciences are increasing. One strategy that is gaining momentum in the quest to increase transparency is the practice of preregistering a study and analysis plan prior to conducting a study. In this article, we examine the potential of preregistration as a strategy for increasing transparency in education studies. We review existing registries within the social sciences and provide a rationale for why we need a registry specifically geared toward education studies. Finally, we introduce the Registry of Efficacy and Effectiveness Studies (REES), developed by the Society for Research on Educational Effectiveness (SREE). The goal of REES is to increase transparency for studies seeking to draw causal conclusions within the education research and evaluation community.

Keywords: decision making; educational policy; evaluation; planning; policy analysis

Openness and transparency have long been recognized as vital for science (e.g., Nosek, Spies, & Motyl, 2012). However, calls to reexamine foundational practices around these norms are becoming prevalent in social science research. The calls for increased transparency stem from concerns that the growing knowledge base may be skewed, incomplete, or untrustworthy (e.g., Kepes, Bennett, & McDaniel, 2014). For example, in a recent systematic review of studies on questionable research practices, 91% of studies found severe evidence of such practices (Banks, Rogelberg, Woznyj, Landis, & Rupp, 2016). Many are concerned with the legitimacy of the knowledge base as a result of some of these questionable research practices (Gehlbach & Robinson, 2018; Ioannidis, 2005; John, Loewenstein, & Prelec, 2012).

In this article, we briefly review practices that may bias the knowledge base that have been documented across the social sciences. Then we examine calls for the adoption and use of publicly available registries and preanalysis plans as a strategy to broaden the scope of accessible research and help mitigate the influences of these practices. Next, we review major registries in the social sciences and provide an argument for why we need an independent registry within education. Then we introduce the Registry of Efficacy and Effectiveness Studies (REES) (https://www.sreereg.org), developed by the Society for Research on Educational Effectiveness (SREE), to increase transparency for studies seeking to draw causal conclusions within the education research community.1 We conclude by exploring the role of journals and funders in incentivizing researchers to preregister impact studies in REES. We hope this article highlights the importance of increasing transparency in the education research community and how the implementation of REES has the potential to help achieve this goal.

Reporting Bias

The Cochrane Handbook defines reporting bias as the systematic error associated with reported and unreported findings (Higgins, Altman, & Sterne, 2011). Reporting bias may result from choices researchers make (a) while conducting a study or (b) during the dissemination stage. We briefly review practices in each phase that may contribute to reporting bias and the factors that may inflate these practices. Note that we focus strictly on studies seeking causal conclusions or studies testing the impact of an intervention.

Researchers are faced with many choices while conducting a study that may lead to the manipulation of effect sizes (Miguel et al., 2014). These choices are particularly apparent during the design phase and the data analysis phase. In the absence of prespecified design plans or a priori decision rules, there are opportunities for researchers to engage in practices that are problematic. For example, researchers may need to decide what groups to compare if there are more than two conditions, what observations to exclude, what outcome measures to analyze, and so forth. If they make these decisions on the basis of obtaining statistically significant findings, a practice often known as p-hacking, they may produce results that appear to be more favorable than they actually are. Distortions in the distribution of p-values in the published literature have been attributed to p-hacking (Simonsohn, Nelson, & Simmons, 2014). A second and more extreme practice that can lead to bias in the knowledge base is the fabrication of data, which involves intentionally creating new, false data or modifying existing data to produce statistically significant results (National Academy of Sciences, 1992). Instances of this are rare but do exist in the literature base (Fanelli, 2009).

Practices that occur while reporting findings from a study, or during the dissemination process, can also contribute to bias in the knowledge base. Selective outcome reporting, HARKing, and the file drawer problem are three well-known reporting practices that lead to publication bias. Selective outcome reporting occurs when only a select number of outcomes measured in a study are reported, as opposed to reporting on the full set of outcomes that were measured (Norris et al., 2013). Typically, the outcomes that are reported are those that are statistically significant. HARKing (Kerr, 1998) refers to an instance when a researcher presents a post hoc hypothesis as an a priori hypothesis or, similarly, presents exploratory results as if they are confirmatory results. Often these post hoc hypotheses are those that resulted in statistically significant findings. The file drawer problem refers to the lack of null findings being reported and/or published and can lead to an overrepresentation of positive results in the published literature (Franco, Malhotra, & Simonovits, 2014).

The dissemination process may also be influenced by external structures and pressures that contribute to a biased knowledge base. Studies have revealed evidence of journals' propensity to favor studies that show statistically significant positive effects for publication (Franco et al., 2014). This trend has been particularly evident in top-tier journals, as they are in a competitive market and seek to publish studies that would be highly cited (Gerber, Malhotra, Dowling, & Doherty, 2010). Concurrent with journals favoring significant effects, there are often external incentives and pressure on researchers to publish (Leis-Newman, 2011; Pigott, Valentine, Polanin, Williams, & Canada, 2013). Faculty positions specifically incentivize publication through employment actions such as tenure and promotion, which also carry financial benefits (Brodeur, Lé, Sangnier, & Zylberberg, 2013; Gerber et al., 2010). Further, in a world where the employment of many researchers rests on the acquisition of external grant funding, statistically significant results may also affect further funding opportunities. That is, publishing positive, statistically significant findings may positively influence one's career advancement or help increase the chances of funding for a follow-up study or a different study. These external structures and systems might attract researchers to use methods that increase the probability of publication (John, Loewenstein, & Prelec, 2012; Nosek et al., 2012).

Registries and Preanalysis Plans

In an effort to minimize practices that may contribute to reporting bias and increase transparency, we have started to see an increase in attention to publicly available registries of studies (Miguel et al., 2014). A registry is a public database where researchers register their studies before the study begins, during the study, or upon completion (Banks & McDaniel, 2011). If registries are searchable by intervention and study characteristics, such registration of studies can help mitigate the effects of publication bias within the knowledge base by increasing researcher access to information on all studies, including those with findings that are not statistically significant, which may not be published and otherwise be difficult to find (Casey, Glennerster, & Miguel, 2012; Ioannidis, Munafò, Fusar-Poli, Nosek, & David, 2014).

A basic registry may include information such as the name of the principal investigator, funder, dates of study, and so forth. A more in-depth prospective registration might also include a preanalysis plan (PAP), or a prestudy plan that explicates details of the analysis protocol, including but not limited to planned primary and secondary outcome variables, outcome measures, covariates, and/or plans to handle missing data or multiple comparisons in the same analytic domain (Gelman & Loken, 2013; Olken, 2015).

A PAP allows the prespecified plans and any subsequent post hoc exploratory analyses to be distinguishable (Ioannidis et al., 2014). In addition, it minimizes the researcher's flexibility around analysis of confirmatory research questions, which reduces the likelihood of p-hacking or fabrication (Brodeur et al., 2013). In essence, specification of a PAP increases the confidence in the findings (Miguel et al., 2014; Olken, 2015).

Current Registries in the Social Sciences

The calls for transparency have led to the launch of various registries across the social sciences. Currently, there are four primary registries in the social sciences: (a) the American Economic Association's registry of randomized controlled trials, known as the AEA RCT Registry; (b) the International Initiative for Impact Evaluation's Registry for International Development Impact Evaluations (RIDIE); (c) the Evidence in Governance and Politics (EGAP) registry; and (d) the Open Science Framework (OSF) Registry launched by the Center for Open Science (see Table 1). Although each of these four registries seeks to promote information sharing and increase transparency and accountability, we assert that the education research community will benefit from a stand-alone, independent registry, much like the field of medicine. Our rationale is based on the theory that a registry will be easier to use and more useful to the larger community if (a) it has a relevant, targeted substantive focus, (b) it includes all pertinent designs for the substantive field, and (c) it allows for easy and efficient searching and exporting of relevant studies. As we describe next, while each of these four registries has strengths, none of the four registries in Table 1 meets all three of these criteria for education impact studies.

Table 1. Basic Information on the Four Primary Existing Registries in the Social Sciences

American Economic Association Registry of Randomized Control Trials (AEA RCT Registry)
  Sponsoring group: American Economic Association
  Substantive focus: Economics, political science, and other social sciences
  Types of study: Impact studies
  Designs(b): RCT
  Website: https://www.socialscienceregistry.org/

Registry for International Development Impact Evaluations (RIDIE)
  Sponsoring group: International Initiative for Impact Evaluation (3ie)
  Substantive focus: International development
  Types of study: Impact studies
  Designs(b): RCT; RDD; matching; dif in dif/FE; natural experiment; IV; regression with controls
  Website: http://www.ridie.org/

Evidence in Governance and Politics (EGAP) Registry
  Sponsoring group: Evidence in Governance and Politics
  Substantive focus: Governance and politics
  Types of study: All types of studies
  Designs(b): experiments; field experiments; lab experiments; mixed method; statistics; survey methodology; other
  Website: http://egap.org/content/registration

Open Science Framework (OSF) Registry(a)
  Sponsoring group: Center for Open Science
  Substantive focus: Any topic area
  Types of study: All types of studies
  Designs(b): experiments; observational study; meta-analysis; other
  Website: https://osf.io/registries/#!

(a) There are multiple registry types available within the OSF Registry, such as AsPredicted Preregistration, Election Research Preacceptance Competition, and so forth. The information in this table is based on the Prereg Challenge, the OSF registration type that is the most comprehensive for preregistration of causal impact studies.
(b) Design names: RCT = randomized controlled trial; RDD = regression discontinuity design; IV = instrumental variable; Dif in dif/FE = difference in differences/fixed effects.

We first examine the adequacy of RIDIE and EGAP against these criteria. RIDIE and EGAP are focused on studies related to international development and governance/politics, respectively. Researchers searching for impact studies in the domains of international development and government/politics, respectively, would likely find these registries very useful. However, an education researcher searching for impact studies would not likely search RIDIE or EGAP since education is not the substantive focus of these registries. Hence, it would be unlikely that registering an education impact study with either of these registries would make the study more visible within the education research community. This lack of substantive relevance makes RIDIE and EGAP weak fits for education impact studies.

Next we consider the AEA RCT Registry. One could argue that the AEA RCT Registry has a targeted substantive focus, the social sciences; hence, education falls within that domain. Following that line of reasoning, we consider the second criterion: All relevant design options are available. From Table 1, we can see that the AEA RCT Registry is limited to researchers planning a randomized controlled trial (RCT). In education impact studies, we see RCTs, or studies in which units are randomly assigned to condition; quasi-experimental designs (QEDs), or studies in which the treatment and comparison groups are not formed by random assignment; and single case designs (SCDs), or studies in which an outcome is measured multiple times across various phases, which are defined by whether or not an intervention is present. In fact, all three of these designs are deemed eligible designs for assessing the impact of an intervention by the What Works Clearinghouse (https://ies.ed.gov/ncee/wwc/).2 Limiting the registry to only RCTs, as would be the case in the AEA RCT Registry, would exclude many education impact studies, making the AEA RCT Registry too narrow to meet the needs of the education research community.

Finally, we examine the OSF option in light of the three criteria. The OSF Registry, and specifically the Prereg Challenge we highlight in Table 1, includes a broad substantive focus and a wide variety of design options that may be appealing for some researchers. However, the breadth of the registry made it challenging to search for and identify a specific set of studies, in our case education impact studies using an RCT, QED, or SCD, quickly and efficiently (Criterion 3). For example, we conducted a search within the OSF Registries for education impact studies, narrowing results to those registered with the Prereg Challenge. We tried several search strings including education AND impact, education AND random*, and elementary AND education AND random*. Each search yielded a large number of studies. However, in many cases the titles did not appear relevant. Further, in order to determine whether or not a study was relevant and to learn more about the study details, such as grade level, outcome domain, and so forth, a user must click on the study itself and read through each entry, which can be very time-consuming. In terms of export options after a search is conducted, OSF allows a user to view and print individual entries. However, there is no option to export data from multiple studies into a usable format, which can often help a user quickly summarize the findings from the search. As such, we assert that while the breadth of substantive areas and designs in the OSF results in an extensive database, this database can be challenging to navigate and export when there is interest in one particular type of study, in this case an education impact study.

To illustrate the value of a registry that meets all three criteria, we briefly turn to the field of medicine and ClinicalTrials.gov, a registry that has a clear substantive focus, accommodates relevant designs, and is easily searchable. ClinicalTrials.gov is the central source for researchers conducting clinical trials to preregister their studies. The questions are tailored to clinical trials and use language that is familiar to researchers planning and conducting clinical trials, which makes it easier for researchers to enter their studies. Further, researchers, practitioners, and individuals searching for clinical trials know to go to ClinicalTrials.gov to search for clinical trials and are able to easily search for in-process or completed trials on a given outcome domain (condition or disease) within a relevant sample age group. Just like in medicine, by creating one registry of impact studies for education, we aim to make it easy for those entering studies and those searching for studies. As we discuss in detail in the next section, the targeted substantive focus of REES allows for the use of language that is familiar to education researchers and relevant to the designs of education impact studies, in an effort to make the process of entering a study quick and easy. Further, because researchers must describe their studies using a limited set of categorical terms, searching REES is intended to be easy and efficient, and export options include individual registry entries or Excel-based spreadsheets with data from multiple studies.

The REES

SREE, with the support of the Institute of Education Sciences (IES) (R305U150001), developed and launched REES, a registry for impact studies in education. The vision for REES is to be a reliable source for identifying all impact studies in education, including planned, in-process, or completed studies. We define impact studies as those seeking to determine the efficacy or effectiveness of an educational intervention or strategy (Institute of Education Sciences and National Science Foundation, 2013). Consistent with the designs deemed eligible by the What Works Clearinghouse (WWC) Standards Handbook Version 4.0 (2018), REES accepts RCTs, QEDs, regression discontinuity designs (RDDs), and SCDs. Both RCTs and QEDs are commonly used in impact studies in education and are considered acceptable designs by the WWC, although the two types of studies differ in terms of their highest potential rating under the WWC Group Design Standards, with RCTs having the potential to meet standards without reservations and QEDs having the potential to meet standards with reservations. RDDs are also used in impact studies in education and can meet WWC RDD standards with or without reservations. SCDs have a set of pilot standards through the WWC, and an SCD has the potential to meet the WWC pilot SCD standards with or without reservations.

In addition to the goal of trying to increase transparency and potentially reduce reporting bias, we are optimistic that the establishment of REES will improve education research, policy, and practice in several other ways. First, completing a REES entry compels researchers to think carefully about all aspects of the study, as a PAP is included in registering a study. We believe this has the potential to improve the overall methodological rigor and quality of the study design and analysis. Second, it allows researchers, policymakers, and funders to easily identify studies that are in process or complete, which we anticipate will make it easier to identify gaps in the research and areas to invest resources. Third, we hope that as the REES database grows, it will facilitate more efficient planning and expedite the process of conducting (a) research syntheses, as studies that are not in the published literature will be more easily located; and (b) replication studies, since study details and PAPs are a part of a REES entry. Fourth, and also contingent on growing the REES database, we believe it has the potential to provide a valuable mechanism for assessing the extent and nature of publication bias in education research.

As noted above, REES was designed specifically to accommodate impact studies in education. In an effort to make the registry accessible to education researchers, the language used is similar to that used by key infrastructures in education such as the WWC, IES, and the National Science Foundation (NSF). Recognizing that studies often undergo changes, REES allows users to update study entries and chronicles study changes in a clear and nonjudgmental manner. Changes are time-stamped, and researchers are encouraged to include a narrative description of the changes.

REES is an interactive website. It was designed with the goal of enabling researchers to quickly and easily create a registry entry. Any designated study administrator can enter study data into REES and make updates at any time in the future. A designated collaborator can view the entry while the study data are entered but cannot make changes to the entry. Registry entries can be started and stopped at any time, and a portable document format (PDF) version of a partially complete or fully complete entry can be saved, downloaded, and printed at any point. The aim is that a study with a detailed proposal, such as an IES-funded Goal 3 efficacy or replication project, should transfer easily into a REES entry. Entries within REES are searchable and can be exported into an Excel file.

A registry entry includes basic study information as well as details related to the design and analysis plan, or the PAP. A complete REES entry includes eight sections and the following information:

- Section 1: General Study Information
  - Study title, principal investigator(s) names and affiliations, registration date, funder(s), award number, institutional review board (IRB) approval date and number, any other registration numbers, study start and end date, intervention start and end date, phase of study, brief abstract, keywords
- Section 2: Description of Study
  - Type of intervention, topic area, number of intervention arms, target school level, target school type, locations of implementation, brief description of intervention condition(s), brief description of comparison condition
- Section 3: Research Questions
  - Description of confirmatory and exploratory research question(s)
- Section 4: Study Design
  - Identification of research design, including presence of blocking, unit of assignment, probability of assignment, unit at which outcome data are measured
- Section 5: Sample Characteristics
  - Number of units in the intervention condition(s) at each level, number of units in the comparison condition at each level, sample exclusion criteria at each level, sample inclusion criteria at each level
- Section 6: Outcomes
  - Number of outcome(s) for each confirmatory question; for each outcome: domain, name of outcome measure, scale associated with outcome measure, whether the same outcomes are being collected in both groups
- Section 7: Analysis Plan
  - Description of baseline measures, identification of covariates to be included in the model at each level, description of analytic model, plan for missing data
- Section 8: Additional Materials
  - Links to study data, links to reports or study websites, links to publications, or upload of relevant files such as study proposals, findings, data, measures, and so forth

A unique feature of REES is the manner in which information is captured. To the extent possible, the information is collected through questions with discrete response categories. This serves three purposes. First, it promotes consistency in language across REES entries. For example, in a narrative description of the design, a user might talk about how clusters are randomly assigned to conditions and call the design a cluster randomized trial. A different user may call the same design a group randomized trial or a field trial. In REES, once a user selects RCT as the design option, she or he answers a series of questions to identify the specific type of RCT, resulting in the same design names being used across all entries. Second, it allows users to more easily search the database by study characteristics such as design, topic, or grade level, since responses are recorded and stored using the same categories for all studies. Third, it ensures a minimum depth of information for each study. For example, in narrative form, the information related to the primary outcomes for a study could vary greatly across studies. In REES, a standard set of information, such as the number of outcome measures per research question and the outcome domain for each research question, is elicited using discrete responses for all studies. Additional information can be included in Section 8 of the registry, but completing Sections 1 through 7 should provide a similar level of detail for all studies.

Engaging the Research Community

REES was launched in October 2018. Engagement by the research community is critical to the success of REES. This includes funders, journal editors, and researchers. Funders of education research play a key role in the likelihood that researchers will preregister impact studies in education in REES. The role of funders could vary from encouraging researchers to preregister studies in REES to mandating preregistration. As an example, EGAP's Metaketa initiative mandates preregistration of analysis details prior to collecting outcome data in order to obtain funding (Dunning, 2016). In education, the Fiscal Year 2019 IES Request for Applications (RFA) (https://ies.ed.gov/funding/ncer_progs.asp) states that Data Management Plans for Goal 3 studies should include a "Plan for pre-registering the study in an education repository (e.g. see the SREE Registry of Efficacy and Effectiveness Studies)" (p. 79). This represents the first year that plans for preregistration were included as part of the RFA.

Journal editors also play a key role in how they support preregistration. At the extreme end, journals may require preregistration for publication. For example, in medicine, the International Committee of Medical Journal Editors released a statement that studies must be preregistered in order to be considered for publication (De Angelis et al., 2005). For journals not yet ready to require preregistration, badges of transparency for reported results that were preregistered may be added to publications. We have seen examples of this in some areas of the social sciences, including Psychological Science, which gives authors an opportunity to earn transparency badges if they meet established criteria (Kidwell et al., 2016). It is important to note that new funding or publication policies around preregistration would require structural changes, as each would need qualified reviewers in charge of verifying such preregistration. Lastly, support for preregistration from researchers planning impact studies is critical. This support may come in the form of not only preregistering their own studies but also encouraging others to preregister relevant studies and embracing the push towards more transparency to improve education research. Like other areas of the social sciences, we are at a critical time, and we believe that active engagement and participation in REES across the education research community has the potential to improve the rigor and credibility of education research in the future.

NOTES

Dustin Anderson and Jessaca Spybrook are joint first authors.

1. The authors of this article led the design and development of REES. REES was supported by a grant from the Institute of Education Sciences (R305U150001).
2. Note that single-case design standards and procedures are still in the pilot phase.

REFERENCES

Banks, G. C., & McDaniel, M. A. (2011). The kryptonite of evidence-based I–O psychology. Industrial and Organizational Psychology, 4(1), 40–44. https://doi.org/10.1111/j.1754-9434.2010.01292.x
Banks, G. C., Rogelberg, S. G., Woznyj, H. M., Landis, R. S., & Rupp, D. E. (2016). Editorial: Evidence on questionable research practices: The good, the bad, and the ugly. Journal of Business and Psychology, 31(3), 323–338.
Brodeur, A., Lé, M., Sangnier, M., & Zylberberg, Y. (2013). Star wars: The empirics strike back (IZA Discussion Paper No. 7268). Bonn, Germany: Institute for the Study of Labor (IZA).
Casey, K., Glennerster, R., & Miguel, E. (2012). Reshaping institutions: Evidence on aid impacts using a pre-analysis plan. Quarterly Journal of Economics, 127(4), 1755–1812.
De Angelis, C. D., Drazen, J. M., Frizelle, F. A., Haug, C., Hoey, J., Horton, R., . . . International Committee of Medical Journal Editors. (2005). Is this clinical trial fully registered? A statement from the International Committee of Medical Journal Editors. New England Journal of Medicine, 352(23), 2436–2438. https://doi.org/10.1056/NEJMe058127
Dunning, T. (2016). Transparency, replication, and cumulative learning: What experiments alone cannot achieve. Annual Review of Political Science, 19(1), S1–S23. https://doi.org/10.1146/annurev-polisci-072516-014127
Fanelli, D. (2009). How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLoS ONE, 4(5), e5738. https://doi.org/10.1371/journal.pone.0005738
Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345(6203), 1502–1505.
Gehlbach, H., & Robinson, C. D. (2018). Mitigating illusory results through preregistration in education. Journal of Research on Educational Effectiveness, 11(2), 296–315.
Gelman, A., & Loken, E. (2013). The garden of forking paths: Why multiple comparisons can be a problem, even when there is no "fishing expedition" or "p-hacking" and the research hypothesis was posited ahead of time. Retrieved from http://www.stat.columbia.edu/~gelman/research/unpublished/p_hacking.pdf
Gerber, A., Malhotra, N., Dowling, C. M., & Doherty, D. (2010). Publication bias in two political behavior literatures. American Politics Research, 38(4), 591–613.
Higgins, J. P., Altman, D. G., & Sterne, J. A. (2011). Chapter 8: Assessing risk of bias in included studies. In J. P. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (Version 5.1.0). The Cochrane Collaboration. Retrieved from www.handbook.cochrane.org
Institute of Education Sciences and National Science Foundation. (2013). Common guidelines for education research and development. Retrieved from http://ies.ed.gov/pdf/CommonGuidelines.pdf
Ioannidis, J. P. (2005). Why most published research findings are false. PLoS Medicine, 2(8), 0696–0701.
Ioannidis, J. P., Munafò, M. R., Fusar-Poli, P., Nosek, B. A., & David, S. P. (2014). Publication and other reporting biases in cognitive sciences: Detection, prevalence and prevention. Trends in Cognitive Sciences, 18(5), 235–241. https://doi.org/10.1016/j.tics.2014.02.010
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532.
Kepes, S., Bennett, A. A., & McDaniel, M. A. (2014). Evidence-based management and the trustworthiness of our cumulative scientific knowledge: Implications for teaching, research, and practice. Academy of Management Learning & Education, 13, 446–466. https://doi.org/10.5465/amle.2013.0193
Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2, 196–217.
Kidwell, M. C., Lazarević, L. B., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L.-S., . . . Nosek, B. A. (2016). Badges to acknowledge open practices: A simple, low-cost, effective method for increasing transparency. PLOS Biology, 14(5), e1002456. https://doi.org/10.1371/journal.pbio.1002456
Leis-Newman, E. (2011). Securing tenure: On the tenure track? Here are four keys to making sure you get the ultimate prize. Monitor on Psychology, 42(5), 7.
Miguel, E., Camerer, C., Casey, K., Cohen, J., Esterling, K. M., Gerber, A., . . . Van der Laan, M. (2014). Promoting transparency in social science research. Science, 343(6166), 30–31.
National Academy of Sciences. (1992). Responsible science: Vol. I. Ensuring the integrity of the research process. Washington, DC: National Academy Press. Retrieved from https://www.nap.edu/catalog/1864/responsible-science-volume-i-ensuring-the-integrity-of-the-research
Norris, S. L., Moher, D., Reeves, B. C., Shea, B., Loke, Y., Garner, S., . . . Wells, G. (2013). Issues relating to selective reporting when including non-randomized studies in systematic reviews on the effects of healthcare interventions. Research Synthesis Methods, 4(1), 36–47. doi:10.1002/jrsm.1062
Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631. doi:10.1177/1745691612459058
Olken, B. A. (2015). Promises and perils of pre-analysis plans. Journal of Economic Perspectives, 29, 61–80.
Pigott, T. D., Valentine, J. C., Polanin, J. R., Williams, R. T., & Canada, D. D. (2013). Outcome-reporting bias in education research. Educational Researcher, 42(8), 424–432.
Simonsohn, U., Nelson, L. D., & Simmons, J. P. (2014). P-curve: A key to the file drawer. Journal of Experimental Psychology: General, 143, 534–547. doi:10.1037/a0033242
What Works Clearinghouse. (2018). Standards handbook (Version 4.0). Retrieved from https://ies.ed.gov/ncee/wwc/Handbooks

AUTHORS

DUSTIN ANDERSON is an assistant director of research for the High Impact Leadership Project at Western Michigan University, 1903 W. Michigan Ave, Kalamazoo, MI 49008-5283; [email protected]. His research focuses on strategies for increasing transparency across education research and increasing leadership capacity within schools that support school renewal initiatives.

JESSACA SPYBROOK is a professor of evaluation, measurement, and research at Western Michigan University, 1903 W. Michigan Ave, Kalamazoo, MI 49008-5283; [email protected]. Her research focuses on improving the design of large-scale field trials in education.

REBECCA MAYNARD is the university trustee professor of education and social policy at the University of Pennsylvania, 3700 Walnut Street, Philadelphia, PA 19104; [email protected]. Her research focuses on methods for integrating program evaluation and improvement science and on improving the quality and utility of research syntheses.

Manuscript received December 4, 2017
Revisions received April 11, 2018, and September 14, 2018
Accepted October 8, 2018