Are you a hedgehog or a fox?
Tuesday, March 3, 2009 at 11:21PM
Melody in Decision making, Wisdom of Crowds, experts, judgment, prediction markets

Philip Tetlock's book Expert Political Judgment explores what constitutes good judgment in predicting future events. His 20 years of research find that, when asked to forecast political phenomena, experts fare only slightly better than informed dilettantes and worse than simple extrapolation models based on current trends.

This is bad news for experts, pundits, and expensive consultants paid to give their opinions on what is going to happen. Indeed, Dr. Tetlock found that the more famous the expert, the less accurate the forecasts. He also found that education and experience had little bearing on a forecaster's skill. Forecasters systematically over-predicted unlikely events (longshot bias), exaggerated the degree to which they "saw it coming all along" (hindsight bias), and committed a number of other mistakes, relying too often on intuition over logic.

This does not mean that expert opinions are only as reliable as a chimp throwing random darts (although it is close...). One thing does make a real impact on forecasting skill: how an expert thinks.

To explore this idea, Dr. Tetlock borrows the famous line from Isaiah Berlin (who in turn had borrowed from a Greek poet): "The fox knows many things, but the hedgehog knows one big thing." According to the book, the better forecasters were foxes: self-critical, eclectic thinkers willing to update their beliefs when faced with contrary evidence, doubtful of grand schemes and modest about their predictive ability. The less successful forecasters were hedgehogs: thinkers who tended to have one big idea that they loved to stretch, sometimes to the breaking point. They tended to be articulate and very persuasive as to why their idea explained everything.

Unfortunately, because they often offer simpler, easily digestible messages, hedgehogs are far more likely to be the faces that we see interviewed on the major news circuits with theories of the future of global finance, American decline and other complex topics that often fit a little too neatly with past analogies.

Do prediction markets help at all with this problem? Can they help reduce forecasting bias? One challenge I see is that many of the incentivizing constructs offered on current platforms are leaderboards, appealing to the ego of the participants and, in turn, I would assume, increasing longshot bias. Experts make their names from BIG predictions that no one else has the "courage" to make. Sometimes they are right but usually not, and the payoff is far greater for eventually being right than the penalty is for continuously being wrong. If the same lopsided incentives exist in prediction markets, then the predictions made there will presumably also tend toward the extremes.
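That lopsided payoff can be sketched as a toy expected-value calculation. All the numbers below are hypothetical assumptions chosen for illustration, not figures from Dr. Tetlock's research:

```python
# Toy model of reputational payoffs for a pundit (all numbers hypothetical).
# Bold longshot calls: right only 10% of the time, but a hit is celebrated
# (+20 "reputation points") while a miss is quickly forgotten (-1 point).
p_hit = 0.10
reward_hit = 20.0
penalty_miss = 1.0
ev_bold = p_hit * reward_hit - (1 - p_hit) * penalty_miss

# Cautious calls: right 60% of the time, but rewards and penalties are
# symmetric (+1 / -1), since nobody remembers a safe prediction.
ev_cautious = 0.60 * 1.0 - 0.40 * 1.0

print(f"bold: {ev_bold:.2f}, cautious: {ev_cautious:.2f}")
```

Under these assumed payoffs, the bold strategy has the higher expected value (1.10 vs. 0.20 points per prediction) even though it is wrong 90% of the time. A leaderboard that rewards participants this way would, by the same arithmetic, pull forecasts toward the extremes.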

Here is some good advice from Dr. Tetlock on how amateurs like you and me can test our own hunches:

"Listen to yourself talk to yourself. If you're being swept away with enthusiasm for some particular course of action, take a deep breath and ask: Can I see anything wrong with this? And if you can't, start worrying; you are about to go over a cliff."

