FiveThirtyEight Assesses Their Accuracy

  • #1
BillTre
This is a podcast, which can be found here. It is basically an interview with Nate Silver, the stat head behind FiveThirtyEight.com.
They have made thousands of predictions on politics and sports (which they publish) and consider their competition to be betting sites and other modelers. Nate used to play professional poker.

What I found interesting was how they go about calibrating their statistical models by comparing predictions with outcomes.
Because they have so many results (large numbers of predictions paired with outcomes; sports is especially good for this), they can statistically assess how accurate their predictions have been.
From there, they try to improve their models.
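
As a rough illustration of that calibration idea (my own sketch, not FiveThirtyEight's actual code), you can group predictions by their stated probability and compare each group's predicted rate with the observed frequency of outcomes:

from collections import defaultdict

def calibration_table(predictions, n_bins=10):
    # predictions: list of (forecast_probability, outcome) pairs, outcome 0 or 1
    bins = defaultdict(list)
    for p, outcome in predictions:
        # e.g. forecasts near 0.7 all land in the 0.7 bucket
        bins[round(p * n_bins) / n_bins].append(outcome)
    return {b: (len(o), sum(o) / len(o)) for b, o in sorted(bins.items())}

# Made-up data: a well-calibrated forecaster's 0.7 bucket should show
# an observed frequency near 0.7
fake = [(0.7, 1)] * 7 + [(0.7, 0)] * 3 + [(0.2, 1)] * 2 + [(0.2, 0)] * 8
for prob, (n, freq) in calibration_table(fake).items():
    print(f"forecast {prob:.1f}: {n} events, observed frequency {freq:.2f}")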
 

Answers and Replies

  • #2
BWV
Thanks, I'll have to listen to the podcast. If this is all they are benchmarking themselves against
https://projects.fivethirtyeight.com/checking-our-work/presidential-elections/
then it's incomplete. The interesting questions are how FiveThirtyEight compares to prediction markets and, now that they are widely followed, what their impact on prediction markets is. One has to assume that FiveThirtyEight's predictions are incorporated and discounted in prediction markets like PredictIt, making FiveThirtyEight just one piece of information that the market price reflects.

The other interesting issue is the impact of FiveThirtyEight's predictions on actual events. If FiveThirtyEight says candidate X has a better chance of winning, that information is now taken seriously by donors, party insiders, and voters.
 
  • #3
gleem
Kind of a self-fulfilling prophecy scenario?
 
  • #4
Ygggdrasil
Here is the written article, published last week by FiveThirtyEight, on the backtesting of their predictions: https://fivethirtyeight.com/features/when-we-say-70-percent-it-really-means-70-percent/

It discusses two main measures by which they evaluate their predictions. First, are the predictions well calibrated? That is, of all the events to which they assigned a 70% probability, do about 7 out of 10 occur while the remaining 3 do not? Second, they examine the skill of their predictions. For example, predicting that each of the 68 teams in the NCAA men's basketball tournament has a 1/68 chance of winning the tournament produces a well-calibrated forecast, but not a particularly skillful one.
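
As a rough sketch of those two measures (my own illustration, not necessarily the exact scoring the article uses), the Brier score measures the squared error of probabilistic forecasts, and a skill score compares it with an "unskilled" reference forecast such as giving every team the same 1/68 chance:

def brier_score(forecasts, outcomes):
    # Mean squared error between forecast probabilities and 0/1 outcomes
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def skill_score(forecasts, outcomes, reference):
    # 1.0 = perfect, 0.0 = no better than the reference, negative = worse
    return 1.0 - brier_score(forecasts, outcomes) / brier_score(reference, outcomes)

# Toy tournament with made-up numbers: one champion among 68 teams
outcomes = [1] + [0] * 67
uniform = [1 / 68] * 68                  # calibrated on average, but no skill
confident = [0.60] + [0.40 / 67] * 67    # puts most probability on the eventual winner
print(skill_score(uniform, outcomes, uniform))     # 0.0 by construction
print(skill_score(confident, outcomes, uniform))   # positive, i.e. skillful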

Here are some independent evaluations/commentary on FiveThirtyEight's methods:
https://www.buzzfeednews.com/article/jsvine/2016-election-forecast-grades
https://towardsdatascience.com/why-...lver-vs-nassim-taleb-twitter-war-a581dce1f5fc
 
  • #5
BWV
FiveThirtyEight does not beat prediction markets

 
  • #6
BillTre
Here is another article from FiveThirtyEight on the same issue, but with results from the 2019 NCAA men's basketball tournament.
They also have some graphs of their measures of skill, uncertainty, and resolution for their forecasts.
The graphs also compare their political and sports forecasts.
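
For what it's worth, uncertainty and resolution are standard terms in forecast verification (the Murphy decomposition of the Brier score). Assuming the article uses them in that standard sense, here is a small sketch of how the three components can be computed from binned forecasts:

from collections import defaultdict

def murphy_decomposition(forecasts, outcomes, n_bins=10):
    # Returns (reliability, resolution, uncertainty), where
    # Brier score = reliability - resolution + uncertainty
    n = len(forecasts)
    base_rate = sum(outcomes) / n
    uncertainty = base_rate * (1 - base_rate)   # depends only on the events themselves

    bins = defaultdict(list)
    for f, o in zip(forecasts, outcomes):
        bins[round(f * n_bins)].append((f, o))

    reliability = resolution = 0.0
    for pairs in bins.values():
        n_b = len(pairs)
        mean_forecast = sum(f for f, _ in pairs) / n_b
        obs_rate = sum(o for _, o in pairs) / n_b
        reliability += n_b * (mean_forecast - obs_rate) ** 2 / n  # calibration error
        resolution += n_b * (obs_rate - base_rate) ** 2 / n       # how well the bins separate outcomes
    return reliability, resolution, uncertainty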
 
  • #7
scottdave
I have heard Nate's book The Signal and the Noise is good. I found a used copy for $5.
 
