Chang, Andrew C., and Phillip Li (2015). “Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say ‘Usually Not’,” Finance and Economics Discussion Series 2015-083. Washington: Board of Governors of the Federal Reserve System.
We attempt to replicate 67 papers published in 13 well-regarded economics journals using author-provided replication files that include both data and code. Some journals in our sample require data and code replication files as a condition of publication; others do not. Aside from 6 papers that use confidential data, we obtain data and code replication files for 29 of 35 papers (83%) published in journals that require such files, compared to 11 of 26 papers (42%) published in journals that do not. We successfully replicate the key qualitative result of 22 of 67 papers (33%) without contacting the authors. Excluding the 6 papers that use confidential data and the 2 papers that use software we do not possess, we replicate 29 of 59 papers (49%) with assistance from the authors. Because we are able to replicate less than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable. We conclude with recommendations on improving replication of economics research.
Introduction
In response to the failed attempt by McCullough and Vinod (2003) to replicate several articles in the American Economic Review (AER), then-editor Ben Bernanke strengthened the AER’s data and code availability policy to enable successful replication of published results, requiring authors to submit data and code replication files to the AER (Bernanke, 2004). Since the AER strengthened its policy, many of the other top journals in economics, such as Econometrica and the Journal of Political Economy, have also started requiring data and code replication files.
There are two main goals of these replication files: (1) to bring economics more in line with the natural sciences by embracing the scientific method’s power to verify published results, and (2) to help improve and extend existing research, which presumes the original research is replicable. These benefits are illustrated by the policy-relevant debates between Card and Krueger (1994, 2000) and Neumark and Wascher (2000) on minimum wages and employment; Hoxby (2000, 2007) and Rothstein (2007) on school choice; Levitt (1997, 2002) and McCrary (2002) on the causal impact of police on crime; and, more recently, Reinhart and Rogoff (2010) and Herndon, Ash, and Pollin (2014) on fiscal austerity. In extreme cases, replication can also facilitate the discovery of scientific fraud, as in the case of Broockman, Kalla, and Aronow (2015)’s investigation of the retracted article by LaCour and Green (2014).
This article is a broad, cross-journal analysis of the state of replication in economics. We attempt to replicate articles using author-provided data and code files from 67 papers published in 13 well-regarded general interest and macroeconomics journals from July 2008 to October 2013. This sampling frame is designed to be more comprehensive across well-regarded economics journals than those used in existing research. Previous work has tended to focus on a single journal, such as McCullough, McGeary, and Harrison (2006), who look at the Journal of Money, Credit and Banking (JMCB); McCullough and Vinod (2003), who attempt to replicate a single issue of the AER (but end up replicating only Shachar and Nalebuff (1999) with multiple software packages); or Glandon (2010), who replicates a selected sample of nine papers only from the AER.
Using the author-provided data and code replication files, we are able to replicate 22 of 67 papers (33%) independently of the authors by following the instructions in the author-provided readme files. The most common reason we are unable to replicate the remaining 45 papers is that the authors do not provide data and code replication files. We find that some authors do not provide such files even when their article is published in a journal whose policy requires submission as a condition of publication, indicating that editorial offices do not strictly enforce these policies; nevertheless, provision of replication files is more common at journals that have such a policy than at journals that do not. Excluding 6 papers that rely on confidential data for all of their results and 2 papers that provide code written for software we do not possess, we successfully replicate 29 of 59 papers (49%) with help from the authors. Because we successfully replicate less than half of the papers in our sample even with assistance from the authors, we conclude that economics research is usually not replicable.