In general, people are terrible at forecasting. Every day we forecast a wide range of things, and we get most of them wrong. Companies and individuals in particular are notoriously bad at judging the likelihood of uncertain events, as numerous studies have shown.
But improving a firm’s forecasting competence even a little can yield a competitive advantage. A company that is right three times out of five on its judgement calls is going to have an ever-increasing edge on a competitor that gets them right only two times out of five. It doesn’t take an expert forecaster to predict that!
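The compounding effect of a modest accuracy edge can be made concrete with a toy simulation. The sketch below, a purely illustrative model with made-up firm names and a simplified one-call-at-a-time setup, tallies correct judgement calls for a firm that is right three times out of five against a rival that is right only two times out of five:

```python
import random

def simulate(accuracy, n_calls, seed):
    """Count correct judgement calls for a firm with a given hit rate."""
    rng = random.Random(seed)
    return sum(rng.random() < accuracy for _ in range(n_calls))

# Hypothetical firms facing 1,000 independent judgement calls each.
n_calls = 1000
firm_a = simulate(0.6, n_calls, seed=1)  # right ~3 times out of 5
firm_b = simulate(0.4, n_calls, seed=2)  # right ~2 times out of 5

print(f"Firm A correct calls: {firm_a}")
print(f"Firm B correct calls: {firm_b}")
print(f"Cumulative edge: {firm_a - firm_b}")
```

Over many decisions, the gap between the two firms widens roughly in proportion to the number of calls made, which is the "ever-increasing edge" described above.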
Most predictions made in companies, whether they concern project budgets, sales forecasts, or the performance of potential hires or acquisitions, are not the result of statistical analysis and data-driven calculation. They are coloured by the forecaster's grasp of basic statistical arguments, susceptibility to cognitive biases, desire to influence others' thinking, and concerns about reputation. Indeed, predictions are often intentionally vague to maximize wiggle room should they prove wrong. The good news is that training in reasoning and debiasing can reliably strengthen a firm's forecasting competence. The Good Judgment Project demonstrated that as little as one hour of training improved forecasting accuracy by about 14% over the course of a year.
About the Good Judgment Project
In 2011, Philip Tetlock teamed up with Barbara Mellers, of the Wharton School, to launch the Good Judgment Project. The goal was to determine whether some people are naturally better than others at prediction and whether prediction performance could be enhanced. The GJP was one of five academic research teams that competed in an innovative tournament funded by the Intelligence Advanced Research Projects Activity (IARPA), in which forecasters were challenged to answer the types of geopolitical and economic questions that U.S. intelligence agencies pose to their analysts.