In September 2015 Phil Tetlock and Dan Gardner released Superforecasting: The Art and Science of Prediction. The book summarizes findings from about five years of the Good Judgment Project (GJP), one of five participants in IARPA's Aggregative Contingent Estimation program. The overall goal was to take the forecasts made by the U.S. Intelligence Community (a federation of 17 intelligence agencies, e.g., the NSA and CIA) and compare them with forecasts made by laypeople working in their spare time and without access to classified information.
For me, the book offers three key learnings:
- a good introduction to the Brier score: this scoring method measures the accuracy of probabilistic predictions by combining the “calibration” and “resolution” of a series of forecasts
- Calibration compares predicted probabilities with actual outcomes; example: if a meteorologist predicts a 70% chance of rain, does it actually rain in 70% of those cases? (If it rains in only 50% of the cases, the forecaster is over-confident; if it rains in 90% of the cases, under-confident.)
- Resolution is the decisiveness of a predictor: confidently saying “this will happen” (a probability near 100% rather than a hedge near 50%) and being right
- Traits of people with exceptional forecasting skills, most importantly:
- number affine: often with a background in math, science, or computer programming
- thinking style: starting from the “outside” and only then working towards the specific question to avoid confirmation bias
- active open-mindedness: from the book – “for superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded”
- Teams are better than individuals: predictions from people working in groups were about 10% more accurate than those working alone
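The scoring ideas above can be sketched in a few lines of Python. This is a simplified binary-outcome version (the tournament questions used multi-category variants); the function names and the bucketing approach are my own illustration, not taken from the book:

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between predicted probability and outcome (0 or 1).
    0.0 is perfect; always guessing 50% on binary questions yields 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

def calibration_table(forecasts, outcomes, bins=10):
    """Group forecasts into probability buckets and compare the mean predicted
    probability with the observed frequency in each bucket. A well-calibrated
    forecaster has the two columns roughly matching."""
    buckets = {}
    for p, o in zip(forecasts, outcomes):
        b = min(int(p * bins), bins - 1)  # e.g. 0.7 falls into bucket 7 of 10
        buckets.setdefault(b, []).append((p, o))
    table = []
    for b in sorted(buckets):
        pairs = buckets[b]
        mean_p = sum(p for p, _ in pairs) / len(pairs)
        freq = sum(o for _, o in pairs) / len(pairs)
        table.append((mean_p, freq, len(pairs)))
    return table

# The meteorologist example: ten 70% rain forecasts, and it rains 7 times.
forecasts = [0.7] * 10
outcomes = [1] * 7 + [0] * 3
print(brier_score(forecasts, outcomes))       # → 0.21
print(calibration_table(forecasts, outcomes))  # mean forecast 0.7, observed 0.7
```

Note that the well-calibrated forecaster above still scores 0.21, not 0.0: calibration alone is not enough, and resolution rewards pushing probabilities toward 0% or 100% when the evidence allows it.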
If you do not want to buy the book, here is a link to a great paper by the same author covering the most important aspects of this topic.