Article ID: 5055144
Journal: Economic Modelling
Published Year: 2010
Pages: 17
File Type: PDF
Abstract
The paper focuses on how the traditional textbook approach to econometrics, by conflating statistical and substantive information, has contributed significantly to the mountains of untrustworthy evidence accumulated over the last century. In a nutshell, the problem is that when one's favorite theory is foisted on the data, the end result is invariably an empirical model which is both statistically and substantively misspecified, but one has no way to disentangle the two sources of error in order to draw reliable inferences. It is argued that ignoring statistical misspecification, and focusing exclusively on the evaluation of the statistical results - taken at face value - on substantive grounds, has proved a disastrous strategy for learning from data. Moreover, the traditional textbook stratagems of error-fixing designed to alleviate statistical misspecification often make matters worse. Instead, the paper proposes a number of strategies that separate the statistical and substantive sources of information, ab initio, and address the problem by replacing goodness-of-fit with statistical adequacy to secure the statistical reliability of inference, before proceeding to pose questions of substantive adequacy.
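The distinction the abstract draws between goodness-of-fit and statistical adequacy can be illustrated with a small simulation. The sketch below is not taken from the paper; it is a minimal, hypothetical example of the general point, using the classic "spurious regression" setup: regressing one random walk on an independent random walk yields a high R-squared and an apparently significant slope, yet a simple residual misspecification check (here the Durbin-Watson statistic) reveals that the model's probabilistic assumptions are violated, so the nominal error probabilities cannot be trusted.

```python
# Minimal sketch (not from the paper): goodness-of-fit vs. statistical adequacy.
# Two independent random walks are unrelated by construction, yet OLS reports
# a high R-squared and a "significant" slope; the residual diagnostics expose
# the statistical misspecification that invalidates those inferences.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(0)
n = 200
y = np.cumsum(rng.normal(size=n))   # random walk (trends purely by chance)
x = np.cumsum(rng.normal(size=n))   # independent random walk

model = sm.OLS(y, sm.add_constant(x)).fit()

# Goodness-of-fit looks respectable and the slope appears "significant" ...
print(f"R-squared:       {model.rsquared:.3f}")
print(f"t-stat on slope: {model.tvalues[1]:.2f}")

# ... but a basic misspecification test on the residuals tells another story:
# a Durbin-Watson statistic far below 2 signals strong residual autocorrelation,
# i.e. the assumptions underwriting the t-statistic above do not hold.
print(f"Durbin-Watson:   {durbin_watson(model.resid):.2f}")
```

In this illustration, "statistical adequacy" corresponds to checking the model's assumptions against the data before reading anything substantive into the estimates, rather than taking the fit statistics at face value.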
Keywords
Related Topics
Social Sciences and Humanities > Economics, Econometrics and Finance > Economics and Econometrics
Authors