We Tied Vegas in Our First Attempt at Predicting NFL Game Winners Using Machine Learning

How Raw, Unedited Machine Learning Models With Information Known Early in the Week Tied Final Vegas Game Winner Projections and the Spread

Chris Seal
Fantasy Outliers
Dec 6, 2018


We at Fantasy Outliers are a small team that, until now, has focused solely on predicting yearly and weekly fantasy football performance using a combination of machine learning and human expertise. Earlier this season, on a whim, we decided to try our hand at predicting actual NFL team point totals, just to see how well we could do. It turns out that the first version of our model isn’t bad: it has more or less matched Vegas’s final game-winner projections so far this year while using far less information and far fewer resources. By “more or less”, I simply mean that there does not appear to be a statistically significant difference between the two sets of 2018 results.

Just to be clear, we are not bettors; we do this in our spare time so we can write articles like this one claiming we could have won some money...

Fantasy Outliers versus Vegas — Dataset versus Goliath

It is our general belief that data science and machine learning, combined with human expertise, will produce better results than either alone, especially on complex problems like predicting NFL game winners. That said, the following analysis compares only the raw, unedited output of our machine learning model. As such, we go into the comparison with several disadvantages:

  • Our projections use only data available as of midnight on Tuesday morning, whereas we’re comparing against Vegas’s final point spreads, which can be updated until kickoff on Sunday (or Monday or Thursday).
  • Our model may know the starting quarterback, but it doesn’t know that a star wide receiver got injured in practice, the head coach was fired, the stud running back got concussed in the 4th quarter, a blizzard is forecast to start during the game, etc.
  • Our dataset is limited to mostly traditional statistics (no “NextGen” stuff here). Additionally, we put this model together in our spare time, leveraging the foundation we’d previously built to predict player-level fantasy football performance. There are many opportunities to improve the model we are presenting today (we have day jobs, after all).

So, given that this analysis uses only raw projections built from information available Tuesday morning, one could reasonably expect that strategically applying human expertise to curate these projections, ignoring some while emphasizing others, would improve on the raw projections alone.

Predicting NFL Game Winners with Machine Learning

Our model was trained on data through 2017. It predicts team point totals, which we used to extract game winners. We then compared our win rate to that of Vegas’s final pre-kickoff point spreads (ignoring games that ended in a tie), as shown below:

Win Rate for Fantasy Outliers and Vegas Game Winners in Weeks 1–13 of 2018
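The winner-extraction step described above (predict point totals, pick the higher side, skip ties) can be sketched roughly as follows. The teams, numbers, and helper function here are made up for illustration and are not the actual MathBox pipeline:

```python
# Hypothetical sketch of turning predicted team point totals into game-winner
# picks and scoring them against actual results. Ties are skipped, as in the
# analysis above. All data below is invented for illustration.

def pick_winner(home, away, pred_home_pts, pred_away_pts):
    """Return the predicted winner, or None if the model predicts a tie."""
    if pred_home_pts > pred_away_pts:
        return home
    if pred_away_pts > pred_home_pts:
        return away
    return None

games = [
    # (home, away, predicted home pts, predicted away pts, actual winner)
    ("KC",  "OAK", 31.2, 17.8, "KC"),
    ("NO",  "DAL", 27.5, 20.1, "DAL"),
    ("CHI", "NYG", 23.4, 16.9, "CHI"),
]

correct = eligible = 0
for home, away, ph, pa, actual in games:
    pick = pick_winner(home, away, ph, pa)
    if pick is None or actual is None:  # skip predicted or actual ties
        continue
    eligible += 1
    correct += (pick == actual)

print(f"win rate: {correct}/{eligible} = {correct / eligible:.1%}")
```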

As you can see, out of 192 eligible games, MathBox (MB) got 123 correct and Vegas got 124, for win rates of 64.1% and 64.6%, respectively. So this model has performed essentially on par with Vegas at predicting game winners this year.
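The “no statistical difference” claim is easy to sanity-check. A standard two-proportion z-test on the two records (our own back-of-the-envelope check, not a figure from the article) shows the gap is nowhere near significance:

```python
# Two-proportion z-test comparing MathBox (123/192) against Vegas (124/192).
# This is generic statistics arithmetic, not part of the MathBox model.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(123, 192, 124, 192)
print(f"z = {z:.3f}")  # |z| is far below 1.96, the 5%-level cutoff
```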

What is even more interesting is what happened when MathBox and Vegas disagreed on a projection. What was our win rate then? It turns out MathBox has gone about 50/50 so far this year when Vegas disagrees: out of 27 such games, MathBox was right on 13 and Vegas on the remaining 14, a 48% win rate in these head-to-head cases. I could be wrong, but it seems to me that a near-50% win rate when going against the odds could win you some dough.
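One caveat worth quantifying: 27 games is a small sample, so that 48% carries a lot of uncertainty. A rough normal-approximation 95% confidence interval (again, a back-of-the-envelope check, not part of the article’s analysis) spans a wide range:

```python
# Rough 95% confidence interval for the disagreement-game win rate (13 of 27),
# using the normal approximation to the binomial. Illustrative only.
import math

wins, n = 13, 27
p = wins / n
half = 1.96 * math.sqrt(p * (1 - p) / n)  # half-width of the interval
print(f"win rate {p:.1%}, 95% CI roughly [{p - half:.1%}, {p + half:.1%}]")
```

The interval stretches from below 30% to above 65%, so the true disagreement-game win rate is still very much an open question.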


Can MathBox’s Artificial Intelligence Beat the NFL Point Spread?

The short answer is not yet, but it’s close. Comparing the results of the aforementioned model to Vegas’s final point spreads, MathBox has hit at about a 48% rate so far this year. Out of 184 eligible games, where the final point difference didn’t land exactly on the spread and our projection didn’t tie Vegas’s line, MathBox’s raw, unedited projections have gone 89W-95L this season. Results by week are shown below:

Games where MathBox’s raw projections beat Vegas’s final point spread

While we did not expect MathBox to beat the spread, we do think it’s kinda cool that it’s close. Remember, these projections are made with information available on Tuesday morning, and we’re comparing them to Vegas’s final point spreads. With some more model tuning and added human expertise, could we do better? I’m pretty confident we could.

This is just our initial analysis. At a future date, we will break games into groups and drill down further to see which types of games, if any, MathBox is better at predicting. It doesn’t have to be good at everything to be useful.

To stay in touch, please join us by following @fantasyoutliers on Twitter or subscribing to our weekly newsletter. We’re a small team, so let’s grow together!


Chief Data Scientist at Whitetower Capital Management; Co-Founder, Lead Data Scientist at Fantasy Outliers