Posted on February 4 by Scott Alexander

Philip Tetlock, author of Superforecasting, got famous by studying prediction. His first major experiment found that the average expert's predictions were about as accurate as a chimp throwing darts at a list of possibilities. Although this was generally true, he was able to distinguish a small subset of people who were able to do a little better than chance: the "hedgehogs" did worse than the chimp, and the "foxes" did a little better.

Cut to the late 2000s. The US intelligence community set up IARPA, the Intelligence Advanced Research Projects Activity, to try crazy things and see if any of them worked. One of those things was a forecasting tournament pitting teams of researchers against one another. Tetlock was one of these scientists, and his entry into the competition was called the Good Judgment Project. Its best participants, the superforecasters, consistently outperformed everyone else. So what makes them special?

First of all, is it just luck? No: the year-to-year correlation in who was most accurate was 0.65. This is definitely a real thing.

Are superforecasters just really smart? Not remarkably: the average superforecaster is only at the 80th percentile for IQ.

Are superforecasters just really well-informed about the world? The superforecasters whom Tetlock profiles in his book include a Harvard physics PhD who speaks 6 languages, an assistant math professor at Cornell, a retired IBM programmer data wonk, et cetera. But none of them are remarkable for spending every single moment behind a newspaper, and none of them had as much data available as the CIA analysts with access to top-secret information. The correlation between well-informedness and accuracy was about the same as the correlation between IQ and accuracy.

Are they just really good at math? The superforecasters are a numerate bunch, but the strongest predictor of forecasting ability (okay, fine, not by much; it was pretty much the same as IQ and well-informedness and all that, but it was a predictor) was the Cognitive Reflection Test, which includes three questions with answers that are simple, obvious, and wrong. The test seems to measure whether people take a second to step back from their System 1 judgments and analyze them critically. Superforecasters seem especially good at this.

Most interesting, they seem to be partly immune to cognitive bias. Take scope insensitivity. How much should an organization pay to save the lives of endangered birds? It seems you can get people to change their estimate of the value of bird life just by changing the number of birds in the question. Poor forecasters do the same thing on their predictions. Superforecasters, in contrast, showed much reduced scope insensitivity, and their probability of a war in five years was appropriately lower than their probability of a war in fifteen.
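The five-versus-fifteen-year comparison is really a coherence constraint: a war within five years is a special case of a war within fifteen, so the shorter horizon can never be more probable. A minimal sketch of such a consistency check (the function and the sample numbers are hypothetical, not from Tetlock's data):

```python
def scope_consistent(horizon_probs):
    """Check that probabilities of nested events never decrease as the
    horizon widens: P(war within 5y) <= P(war within 15y), and so on.

    horizon_probs: list of (years, probability) pairs for the same event.
    """
    ordered = sorted(horizon_probs)  # sort by horizon length
    probs = [p for _, p in ordered]
    # every wider horizon must be at least as probable as a narrower one
    return all(a <= b for a, b in zip(probs, probs[1:]))

# A scope-insensitive forecaster quotes the same 20% for both horizons:
# technically consistent, but it ignores the extra decade of risk.
print(scope_consistent([(5, 0.20), (15, 0.20)]))   # True
# An incoherent forecaster rates the narrower event as MORE likely:
print(scope_consistent([(5, 0.25), (15, 0.20)]))   # False
```

The check only catches outright incoherence; a superforecaster's edge is going further and spreading the probability sensibly across the extra years.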
So what are they really good at? Having established that this is all pretty neat, Tetlock turns to figuring out how superforecasters are so successful. Part of it is just understanding the basics. Part of it is technique: they might break the problem down into pieces and estimate each piece separately. Anyway, the Good Judgment Project then put these superforecasters on teams with other superforecasters, averaged out their decisions, slightly increased the final confidence levels (to represent the fact that it was 60 separate people, all of whom were that confident), and presented that to IARPA as its final answer.
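That "average the team, then nudge the confidence outward" step can be sketched with the standard extremizing transform from the forecast-aggregation literature; the transform and the exponent here are assumptions for illustration, not necessarily the exact formula the Good Judgment Project used:

```python
def aggregate(probabilities, a=1.5):
    """Average independent forecasts, then extremize the mean.

    The transform p**a / (p**a + (1-p)**a) pushes the average away
    from 0.5, on the theory that many independent forecasters agreeing
    at 70% justifies more confidence than any one of them has alone.
    The exponent a > 1 sets how hard to push; 1.5 is an arbitrary
    illustrative choice, not the Good Judgment Project's actual value.
    """
    p = sum(probabilities) / len(probabilities)
    return p ** a / (p ** a + (1 - p) ** a)

team = [0.70, 0.65, 0.75, 0.70]       # four forecasters near 70%
print(round(aggregate(team), 3))      # 0.781, more confident than the 0.70 mean
```

With a = 1 the function reduces to a plain average; larger exponents push harder toward 0 or 1, which is only justified when the forecasters' information really is independent.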
Maybe all this stuff about probability calibration, inside vs. outside view, and so on is already familiar. Even so, it is important that the Good Judgment Project exists. So as I said before, Superforecasting is not necessarily too useful for people who are already familiar with the cognitive science/rationality tradition, but great for people who need a high-status and official-looking book to justify it.