Daniel Kahneman was once asked how we should respond when we make an analysis mistake. He said:
Whenever we are surprised by something, even if we admit that we made a mistake, we say, ‘Oh I’ll never make that mistake again.’ But, in fact, what you should learn when you make a mistake because you did not anticipate something is that the world is difficult to anticipate. That’s the correct lesson to learn from surprises: that the world is surprising.
That last line should be written on forecasters’ walls: The correct lesson to learn from surprises is that the world is surprising.
Isn't this why we run simulations and calculate odds? The world is surprising. Still, if you stick to the odds and something breaks, maybe your model really needs tweaking. We have, wisely, gone back and changed thousands of formal models in fields from physics to economics to social science for this reason: We don't want to make that mistake again. And in many of those cases we didn't make that particular mistake again.
So it seems like while it's worth understanding that the world is surprising, this doesn't diminish the value of fixing a broken model or preparing better next time.
I think this highlights the difference between a regime like physics, where we expect models to generally work, and one like the stock market, where it's unclear whether they work at all.
The point of the above quote is that in some regimes, models [that we can currently build] don't work. So the lesson to take away from a bad prediction isn't "this model is bad", it's "any model I build will be bad". There might not be enough signal in the data you have (or that it is even possible to have). This is exacerbated in cases where it's hard to assess the model's performance, e.g. the stock market mostly goes up, so any model that predicts that will look pretty good most of the time.
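A quick sketch of that last point: a "model" with zero insight that always predicts "up" looks impressive on any series with an upward drift. The 55% up-rate below is a hypothetical number chosen for illustration, not a market statistic.

```python
import random

random.seed(0)

# Simulate 10,000 daily market directions with an upward drift:
# 1 = up, 0 = down; the market rises on ~55% of days (hypothetical rate).
days = [1 if random.random() < 0.55 else 0 for _ in range(10_000)]

# A "model" that always predicts "up" -- no signal, no understanding.
always_up_accuracy = sum(days) / len(days)
print(f"always-up accuracy: {always_up_accuracy:.1%}")  # roughly 55%
```

Judged on raw accuracy alone, this empty model beats a coin flip, which is why accuracy against the base rate (rather than against 50%) is the fairer test.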
Sure, but in assessing risk, it pays to not overestimate our ability to predict the world. Some model subjects, such as the stock market and the weather, exhibit chaotic behavior. Those who try to ignore that fact get burned.
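The textbook illustration of chaotic behavior is sensitive dependence on initial conditions: two starting states that differ by an amount far below any measurement error end up nowhere near each other. The logistic map below is a standard toy example (not a market or weather model):

```python
# Logistic map x' = r * x * (1 - x) with r = 4, a classic chaotic system.
def iterate_logistic(x, r=4.0, steps=50):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = iterate_logistic(0.2)
b = iterate_logistic(0.2 + 1e-9)  # a perturbation far below measurement error

# After 50 steps the trajectories have fully decorrelated: the gap is
# typically of order 1, even though it started at one part in a billion.
print(abs(a - b))
```

This is why long-range point forecasts of chaotic systems get burned: the error in your starting measurement, however small, eventually dominates the prediction.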
I only take predictions seriously if you have some stake in the outcome. Otherwise, it's easy to casually predict something outrageous like "Trump will probably quit before his term is over", then turn around and say you were kidding if it doesn't happen, but take all the credit if it does. Go put $10,000 on some prediction market contract, or make a side bet with someone: anything.
One of my favorite quotes about predicting comes from Taleb: "Don't tell me what you think, tell me what you have in your portfolio."
I like the concept of predictions for the same reason. Per Philip Tetlock's relentless crusading on the topic, it at least implies some sense of rigor, measurability, and accountability.
This is one of the distinctions I make between predictions and forecasts in a business context (and one I've been trying to impress upon stakeholders in my company). A prediction implies accountability.
I mean, you could attach consequences to a forecast (Tetlock himself likes the term "Superforecasting") and likewise ignore them with predictions. It's all just words in the end, and everything is a human construct. But still, I haven't given up completely.
Predictions can be a tool to test your understanding, a bit like a fantasy investing site that shows how well you would do if you were investing real money.
I've been playing a bunch of Metaculus[1] recently for this reason.
I do agree that there are too many folks (pundits and the like) who make predictions all the time with no consequences, and I wish there were a way to hold bad predictors to account. I've thought about making tools that, automatically or with crowdsourcing, track the predictions of public personas.
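A tracking tool like that needs a scoring rule, and the standard one in Tetlock's forecasting tournaments is the Brier score: the mean squared error between a probability forecast and the 0/1 outcome. Lower is better; always hedging at 50% earns 0.25. A minimal sketch, with a made-up track record for illustration:

```python
# Brier score: mean squared error between probability forecasts
# and binary outcomes. 0.0 is perfect; 0.25 is what always
# guessing 50% earns; 1.0 is confidently wrong every time.
def brier_score(forecasts, outcomes):
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical pundit track record: stated probabilities vs. what happened.
forecasts = [0.9, 0.8, 0.3, 0.6]
outcomes  = [1,   1,   0,   0]
print(brier_score(forecasts, outcomes))  # 0.125
```

Because it rewards calibration as well as correctness, the Brier score penalizes the "outrageous prediction, claim you were kidding" strategy: confident misses cost far more than they gain.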
Recently, I asked a similar question about websites or repositories that keep a track record of predictions/promises [0]; one person pointed me to Long Bets.
Long Bets has the famous Buffett/hedge-fund wager, as well as the Ted Danson wager on the Red Sox World Series vs. the USMNT World Cup. Both now settled! But otherwise, it's looked a bit moribund lately to me.
Here's a more active prediction site in this vein associated with Philip Tetlock of Superforecasting fame: