So.
Julia Galef has this lovely book, The Scout Mindset, and there’s a delightful game in chapter six, “How sure are you?”, where you attempt to calibrate your uncertainty through a quick series of trivia questions. Just answer the questions below and then indicate your confidence. Per Galef:
“As you go through the list, you should notice your level of certainty fluctuating. Some questions might feel easy, and you’ll be near certain of the answer. Others may prompt you to throw up your hands and say, ‘I have no idea!’ That’s perfectly fine. Remember, the goal isn’t to know as much as possible. It’s to know how much you know.”
Someone with good uncertainty calibration will miss about half of the questions they put in the “55%” confidence bucket, but only one in twenty of the questions they put in the “95%” bucket. A scatter plot of their stated confidence versus their actual accuracy will hug the x = y diagonal line below. When you’ve answered all the questions, the plot will update to show your performance along with 50% and 90% confidence intervals. If your red dots all fall within the 50% confidence interval, you’re well-calibrated!
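The scoring idea behind the plot is simple enough to sketch in a few lines. This is not the app’s actual code, just a minimal illustration of the bucketing: group each (stated confidence, was-it-correct) pair by confidence level, then compute the fraction actually answered correctly in each bucket.

```python
from collections import defaultdict

def calibration_curve(answers):
    """Group (stated_confidence, was_correct) pairs by confidence level
    and return each level's fraction of correct answers.
    A well-calibrated guesser lands near the x = y diagonal."""
    buckets = defaultdict(list)
    for confidence, correct in answers:
        buckets[confidence].append(correct)
    return {
        conf: sum(results) / len(results)
        for conf, results in sorted(buckets.items())
    }

# Hypothetical quiz results: (stated confidence, answered correctly?)
answers = [
    (0.55, True), (0.55, False), (0.55, True), (0.55, False),
    (0.95, True), (0.95, True), (0.95, True), (0.95, True),
]
print(calibration_curve(answers))  # {0.55: 0.5, 0.95: 1.0}
```

Here the guesser is perfectly calibrated at 55% (half right) and slightly overconfident territory is avoided at 95% (all four right); with only a handful of questions per bucket, though, the estimates are noisy, which is why the app draws confidence intervals around the diagonal.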
After reading Galef’s 2021 book, check out (a) Philip Tetlock and Dan Gardner’s Superforecasting (2015) for more on calibrating predictions in general (which, of course, includes uncertainty), and (b) Duncan Watts’ Everything Is Obvious (Once You Know the Answer) (2011) for the big picture on the limits of what you can know.
And if you’re a developer (one of us!), you might be interested in this little app’s source
code.
Further reading…