According to a new study (September 4th, 2015) from Federal Reserve analysts Andrew C. Chang and Phillip Li, more than two-thirds of the peer-reviewed research in well-known economics journals is not replicable with the data and code files provided. Even more troubling, the key results in these published economics studies cannot be replicated 51% of the time, even with assistance from the original authors.
Chang and Li offer an overview of their disturbing conclusions about economics studies: “We successfully replicate the key qualitative result of 22 of 67 papers (33%) without contacting the authors. Excluding the 6 papers that use confidential data and the 2 papers that use software we do not possess, we replicate 29 of 59 papers (49%) with assistance from the authors. Because we are able to replicate less than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable. We conclude with recommendations on improving replication of economics research.”
More on non-replicability of modern economics studies
The new research is a cross-journal analysis of the current state of replication in economics. The authors attempt to replicate articles using author-provided data and code files from 67 papers published in 13 well-known general and macroeconomics journals between July 2008 and October 2013. They note that this sample is designed to be more comprehensive, covering more economics journals than the samples used in prior replication studies.
Chang and Li were able to replicate 22 of 67 papers (33%) independently of the authors by following the instructions in the author-provided data and readme files. They note that the most common reason they could not replicate the other 45 papers is that data and code replication files were not provided. In fact, some authors did not provide data and code replication files even when their article was published in a journal that requires submission of such files, suggesting that journals do not strictly enforce their own policies.
Setting aside the six papers that rely on confidential data for all of their results and the two papers whose code was written for software Chang and Li did not possess, they managed to replicate 29 of 59 papers (49%) with assistance from the original authors.
Chang and Li note: “Because we successfully replicate less than half of the papers in our sample even with assistance from the authors, we conclude that economics research is usually not replicable.”
Still, despite these worrisome findings, Chang and Li point out that their replication success rates are notably higher than the rates found in earlier replication studies in economics. For example, McCullough, McGeary, and Harrison (2006) replicated only 14 of 186 papers (8%) published in the Journal of Money, Credit and Banking (JMCB), given access to the appropriate software and the original articles’ use of non-proprietary data, but without assistance from the original authors. Restricting the sample to papers for which the JMCB archive contains both data and code replication files, their success rate rises to 14 of 62 papers (23%).
The comparable success rates for Chang and Li are 22 of 59 papers (37%) given appropriate software and non-proprietary data, and 22 of 38 papers (58%) with the additional requirement that data and code files be available. Dewald, Thursby, and Anderson (1986) successfully replicated 7 of 54 papers (13%) from the JMCB given data and code files, the original articles’ use of non-confidential data, the necessary software, and help from the articles’ authors. The comparable figure for the new study is 29 of 38 papers (76%).
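To make these side-by-side comparisons easier to follow, here is a minimal Python sketch that tabulates the success counts quoted above and recomputes the corresponding rates. Every count comes from the figures cited in this article; the labels and the structure of the snippet are purely illustrative.

```python
# Replication success rates quoted in the article, as (successes, attempts)
# under roughly comparable conditions. Counts are taken from the article;
# the labels are shorthand added here for illustration.
rates = {
    "Chang & Li, no author contact": (22, 67),
    "Chang & Li, author help (excl. confidential data/missing software)": (29, 59),
    "McCullough et al. (2006), JMCB, no author help": (14, 186),
    "McCullough et al. (2006), JMCB, data+code files present": (14, 62),
    "Chang & Li, comparable conditions, no author help": (22, 59),
    "Chang & Li, comparable conditions, data+code files present": (22, 38),
    "Dewald et al. (1986), JMCB, with author help": (7, 54),
    "Chang & Li, comparable conditions, with author help": (29, 38),
}

# Print each rate as quoted in the article (rounded to whole percent).
for label, (successes, attempts) in rates.items():
    print(f"{label}: {successes}/{attempts} = {successes / attempts:.0%}")
```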
See full study below.