Econometric Modeling as Junk Science

Originally printed in condensed form in the Spring 2002 edition of the Skeptical Inquirer as Myths of Multiple Regression & Murder: Econometric Modeling as Junk Science. Reprinted with the author’s permission.

Do you believe that every time a prisoner is executed in the United States, eight future murders are deterred? Do you believe that a 1% increase in the percentage of a state's citizens carrying concealed weapons causes a 3.3% decrease in the state's murder rate? Do you believe that 10 to 20% of the decline in crime in the 1990s was caused by an increase in abortions in the 1970s? Or that the murder rate would have increased by 250% since 1974 if the United States had not built so many new prisons? Did you believe predictions that the welfare reform of the 1990s would force 1,100,000 children into poverty?

If you were misled by any of these studies, you may have fallen for a pernicious form of junk science: the use of mathematical modeling to evaluate the impact of social policies. These studies are superficially impressive. Produced by reputable social scientists from prestigious institutions, they are often published in peer-reviewed scientific journals. They are filled with statistical calculations too complex for anyone but another specialist to untangle. They give precise numerical "facts" that are often quoted in policy debates. But these "facts" turn out to be will-o'-the-wisps. Often before the ink is dry on one apparently definitive study, another appears with equally precise and imposing, but completely different, "facts." Despite their numerical precision, these "facts" have no more validity than the visions of soothsayers.

These predictions are based on a statistical technique called multiple regression that uses correlational analysis to make causal arguments. Although economists are the leading practitioners of this arcane art, sociologists, criminologists, and other social scientists have versions of it as well. It is known by various names, including “econometric modeling,” “structural equation modeling,” “path analysis,” and simply “multivariate analysis.” All of these are ways of using correlational data to make causal arguments.
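
For readers who have not met the technique, the minimal Python sketch below shows what a multiple regression actually does. The data are simulated and the variable names are hypothetical; nothing here comes from any real study. The point is only that the procedure fits a linear equation to observed correlations, and the fitted coefficients are then read, rightly or wrongly, as causal effects.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500

    # Simulated "state-level" data; names and numbers are illustrative only.
    poverty = rng.normal(0, 1, n)
    policing = rng.normal(0, 1, n)
    murder_rate = 0.5 * poverty - 0.3 * policing + rng.normal(0, 1, n)

    # Ordinary least squares: fit murder_rate as a linear function of the
    # predictors plus an intercept.
    X = np.column_stack([np.ones(n), poverty, policing])
    coefficients, *_ = np.linalg.lstsq(X, murder_rate, rcond=None)
    print(coefficients.round(2))  # roughly [0.0, 0.5, -0.3]

    # The arithmetic only summarizes correlations in this sample; reading the
    # -0.3 as "more policing causes less murder" is an interpretive leap that
    # the regression itself cannot justify.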

The problem with this, as anyone who has studied statistics knows, is that correlation is not causation. A correlation between two variables may be “spurious” if it is caused by some third variable. Multiple regression researchers try to overcome this spuriousness problem by including all the relevant variables in the analysis. The data available for this purpose are simply not up to the task, however, and such studies have consistently failed. But many social scientists have devoted years to learning and teaching regression modeling. They continue to use regression to make causal arguments that are not justified by their data, but that get repeated over and over in policy debates. I call these arguments the myths of multiple regression.
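
One way to see why “controlling for” other variables is not a cure-all is to simulate a spurious correlation directly. In the sketch below (again Python, with invented variables and made-up numbers), summer temperature drives both ice cream sales and drownings. A regression that omits temperature “finds” an effect of ice cream on drownings, and the effect disappears only if the analyst has happened to measure and include the confounder.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000

    # A hidden confounder drives both of the observed variables.
    temperature = rng.normal(0, 1, n)
    ice_cream_sales = 0.8 * temperature + rng.normal(0, 1, n)
    drownings = 0.8 * temperature + rng.normal(0, 1, n)  # no ice cream effect at all

    def ols(y, *predictors):
        """Ordinary least squares with an intercept; returns the coefficients."""
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        return np.linalg.lstsq(X, y, rcond=None)[0]

    print(ols(drownings, ice_cream_sales).round(2))               # sizable "effect"
    print(ols(drownings, ice_cream_sales, temperature).round(2))  # effect near zero

    # The spurious coefficient vanishes only because the confounder was measured
    # and included. Real policy data never contain every relevant variable, which
    # is the weakness described above.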

Five Myths of Multiple Regression

Myth One: More Guns, Less Crime

John Lott, an economist at Yale University, used an econometric model to argue that “allowing citizens to carry concealed weapons deters violent crimes, without increasing accidental deaths.” Lott estimated that each one percent increase in gun ownership in a population causes a 3.3% decrease in homicide rates. Lott and his co-author, David Mustard, released the first version of their study on the Internet in 1997, and tens of thousands of people downloaded it. It was the subject of policy forums, newspaper columns, and often quite sophisticated debates on the World Wide Web. The debate followed predictable ideological lines, with one prominent critic denouncing the study as methodologically flawed before she had even received a copy. In a book with the catchy title More Guns, Less Crime, Lott taunted his critics, accusing them of putting ideology ahead of science.

Lott's work is an example of statistical one-upmanship. He has more data and a more complex analysis than anyone else studying the topic. He demands that anyone who wants to challenge his arguments become immersed in a very complex statistical argument, based on a data set so large that it cannot even be manipulated with the desktop computers most social scientists use. He is glad to share his data with any researcher who wants to use it, but most social scientists have tired of this game. How much time should researchers spend replicating and criticizing studies using methods that have repeatedly failed? Most gun control researchers simply brushed off Lott and Mustard's claims and went on with their work. Two highly respected criminal justice researchers, Frank Zimring and Gordon Hawkins (1997: 57), wrote an article explaining that:

“just as Messrs. Lott and Mustard can, with one model of the determinants of homicide, produce statistical residuals suggesting that 'shall issue' laws reduce homicide, we expect that a determined econometrician can produce a treatment of the same historical periods with different models and opposite effects. Econometric modeling is a double-edged sword in its capacity to facilitate statistical findings to warm the hearts of true believers of any stripe.”
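
The point of the quotation, that the same data can be made to yield opposite conclusions, is easy to reproduce in a toy simulation. The Python sketch below uses invented variables and simulated data; it is not a reanalysis of Lott's data set. Which background variables the analyst chooses to control for determines the sign of the coefficient on the policy variable.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 2000

    # Simulated data in which the policy variable's true effect is negative, but
    # the policy is strongly tied to a background factor that raises the outcome.
    background = rng.normal(0, 1, n)
    policy = background + rng.normal(0, 0.5, n)
    outcome = 2.0 * background - 1.0 * policy + rng.normal(0, 1, n)

    def ols(y, *predictors):
        """Ordinary least squares with an intercept; returns the coefficients."""
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        return np.linalg.lstsq(X, y, rcond=None)[0]

    print(ols(outcome, policy).round(2))              # slope on policy comes out positive
    print(ols(outcome, policy, background).round(2))  # slope on policy comes out near -1.0

    # Two defensible-looking specifications of the same data set give opposite
    # signs, which is the double-edged sword the quotation describes.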


