How many recessions has North America had in the last two decades? How long has the Eurozone crisis been going on? Does anyone know anymore? It’s been nothing but doom and gloom for years. Doom and gloom which immediately followed periods of growth that was too rapid or optimism that was too unfounded. What the heck happened?
We can blame the governments for failing to keep currencies in check and failing to invest in innovation and jobs; we can blame the private sector for rampaging out of control; or we can blame the economists who give everyone bad advice. Maybe that last one is what we should be doing. According to some very recent research by Emre Soyer and Robin Hogarth, published in a special section of the July-September 2012 issue of the International Journal of Forecasting, we have The Illusion of Predictability [preprint] (How Regression Statistics Mislead Experts), which can be succinctly summarized by saying economists are overconfident [and] so are you (as summarized by Justin Fox over on the HBR blogs).
Soyer and Hogarth did a study with 257 economists, who were asked to read about a regression analysis relating an independent variable X to a dependent variable Y and then answer questions about the probabilities of various outcomes. When the results were presented in the typical manner (as average outcomes followed by a few error terms), the economists did a really bad job of answering the questions. They paid too much attention to the averages, and too little to the uncertainties inherent in them, thereby displaying too much confidence. Moreover, they did only slightly better when they were shown the numerical results plus scatter graphs. Only the economists who were shown the graphs alone actually got most of the answers [close to] right.
In other words, when the data is presented in standard form, statistically literate experts are just as likely as innumerate journalists to glom onto the point estimate and discount the uncertainty, and they make the same mistakes. (They could use Pinky and the Brain's refresher lesson on statistics.) Ouch!
We in Supply Management know that the world is often much less predictable than economists would lead us to believe — having to deal with the effects of demand spikes, supply shortages, currency fluctuations, labour strikes, and natural disasters on a(n almost) weekly basis — and that no economic model is going to capture the full extent of the reality of the situation. It’s too bad that the average economist doesn’t, because if (s)he did, then maybe the advice would be better, and the markets, as a result of more rational actions, would be more stable and make our job a little easier. In the interim, we can do our part by making sure that, whenever we help procure services that require an economist or economic analysis, we select an economist or group with a track record of presenting and analyzing data the right way, without unnecessary exuberance in either direction. Because this result, captured in the preprint, is scary:
72% of the participants believe that for an individual to obtain a positive outcome with 95% probability, a small X (X < 10) would be enough, given the regression results. A majority state that any small positive amount of X would be sufficient to obtain a positive outcome with 95% probability. However, in order to obtain a positive outcome with 95% probability, a decision maker should choose approximately X=47.
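The gap between intuition and arithmetic is easy to see with a quick back-of-the-envelope calculation. Here is a minimal Python sketch; the intercept, slope, and error spread below are illustrative stand-ins (not the study's actual regression numbers), chosen only so the required value lands near the X ≈ 47 quoted above:

```python
from statistics import NormalDist

# Hypothetical regression: Y = a + b*X + e, with e ~ Normal(0, sigma).
# These values are illustrative assumptions, NOT the study's coefficients.
a, b, sigma = 0.0, 1.0, 28.6

# One-sided 95% quantile of the standard normal (about 1.645).
z95 = NormalDist().inv_cdf(0.95)

# P(Y > 0) = Phi((a + b*X) / sigma) >= 0.95  implies
# X >= (z95 * sigma - a) / b
x_needed = (z95 * sigma - a) / b
print(f"X needed for a 95% chance that Y > 0: {x_needed:.1f}")  # ~47

# The trap: at X = 10 the *average* Y is comfortably positive,
# yet the chance of an actually positive outcome is far below 95%.
p_pos_at_10 = 1 - NormalDist(a + b * 10, sigma).cdf(0)
print(f"P(Y > 0 | X = 10) = {p_pos_at_10:.2f}")  # ~0.64
```

The point estimate (a + b·X) says nothing about how often an individual outcome clears zero; that depends entirely on sigma, which is exactly the number the participants ignored.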
Simple math says that the majority of the participants were off by a factor of almost 5 (or much more, for those who said any small positive X would do). Ouch! Late last year the BBC ran a Point of View article that said we should beware of experts when it comes to running things. Maybe they were right!