Forecasting the Oscars Like a Boss: The Predictions

UPDATE

This post has been covered by the Wall Street Journal!

Last post, I described my mildly obsessive strategy for making predictions in my Oscar pool this year. I’ve been driven to such measures by the repeated and humiliating losses to my brother Jon for the past quarter-score.

To recap that post: the common wisdom among Oscar pundits is that the “precursor” awards that happen before the Academy Awards (e.g. the Golden Globes, the Screen Actors / Directors / Producers Guild awards, the BAFTAs, and the Critics Choice awards) tend to correlate with who wins Oscars. From what I understand, Jon looks over these when choosing a winner. I tried the same thing last year, and was on track to win until Meryl Streep won Best Actress in an upset (jerk). That night I made a vow – never again.

So, I decided to take the same approach this year, but make it more systematic. I grabbed 20 years’ worth of award data from the Internet Movie Database, and built a model that takes into account the degree to which each precursor ceremony predicts the Oscars (different ceremonies do better in different categories). My theory is that this might provide an edge for close calls. Our Oscar pool also incorporates an interesting twist, in that we have some freedom to down-weight predictions that we aren’t sure of. My model makes probabilistic estimates, and thus also yields a strategy for how to weight each prediction.
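The gist of the approach can be sketched as a weighted vote, where each precursor ceremony votes for the nominee it awarded, weighted by how often that ceremony has historically agreed with the Oscars in that category. To be clear, this is a toy illustration, not the actual model; the accuracy numbers below are made up for the example.

```python
def predict(precursor_winners, historical_accuracy):
    """Combine precursor awards into {nominee: probability}.

    precursor_winners: {ceremony: nominee that ceremony awarded}
    historical_accuracy: {ceremony: fraction of past Oscars it called right}
    """
    scores = {}
    for ceremony, nominee in precursor_winners.items():
        scores[nominee] = scores.get(nominee, 0.0) + historical_accuracy[ceremony]
    total = sum(scores.values())
    return {nominee: s / total for nominee, s in scores.items()}

# An example shaped like this year's Best Actress race
# (accuracy values are invented for illustration):
winners = {
    "SAG": "Jennifer Lawrence",
    "Golden Globe (Drama)": "Jessica Chastain",
    "Critics Choice": "Jessica Chastain",
}
accuracy = {"SAG": 0.9, "Golden Globe (Drama)": 0.4, "Critics Choice": 0.4}
probs = predict(winners, accuracy)
# With these weights, the single SAG vote narrowly outweighs the other two.
```

Because the output is a probability rather than a bare pick, it doubles as the down-weighting strategy for the pool: bet heavily on the near-certain calls and lightly on the coin flips.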

Now that all of the precursor awards have taken place, I’m able to apply the model to the 2013 awards. Here are the main results (who really cares about Best Live Action Short? Sorry, guy nominated for Best Live Action Short), with some punditry for good measure:

Best Picture

Argo (58%)

Les Miserables (12%)

Argo has swept the dramatic awards, and Les Mis won the Golden Globe for Comedy/Musical. The last time a movie swept the dramatic awards and lost Best Picture was when Brokeback Mountain lost to Crash.

Best Actor

Daniel Day Lewis (65%)

Hugh Jackman / Denzel Washington (10%)

Another straightforward call, as Daniel Day Lewis has swept the dramatic acting categories this year. Plus, people love that guy. A DDL loss would be a repeat of 2002, when Denzel Washington (Training Day) unexpectedly beat Russell Crowe (A Beautiful Mind), who had also swept the dramatic awards. The SAG and Critics Choice awards best predict this category.

Best Actress

Jennifer Lawrence (70%)

Jessica Chastain (20%)

Sorry, Quvenzhané Wallis. You may be who the Earth is for, but you aren’t who this award is for. Choosing between Lawrence and Chastain is tricky: Lawrence won the Golden Globe (Comedy/Musical) and the SAG award, while Chastain won the Golden Globe (Drama) and the Critics Choice award. The SAG is the best predictor, and thus the model prefers Jennifer Lawrence. If I hadn’t used a model, I would have guessed Jessica Chastain, thinking that dramatic movies have a better shot at winning Oscars than rom-coms. C’mon, math…

Supporting Actress

Anne Hathaway (87%)

Everyone else (2-5%)

The easiest prediction of the bunch. She’s been unbeatable in other ceremonies, so there’s no reason not to pick her based on the data.

Supporting Actor

Tommy Lee Jones (47%)

Christoph Waltz (30%)

This seems to be the most controversial category. The New York Times is predicting that Robert DeNiro will win, based on his aggressive Oscar campaigning and his icon status (two factors not present in my model, which gives him a 5% shot). Tommy Lee Jones won the SAG, which best predicts the acting categories. Christoph Waltz won both the Golden Globe and the BAFTA. Historically, this is a difficult category to predict from precursor ceremonies.

Director

Who knows? Ben Affleck has swept the other ceremonies, but was notoriously not nominated for an Oscar this year. Thus, there’s very little award information to go off of.

My model gives a slight preference to Ang Lee/Life of Pi (45%) over Steven Spielberg/Lincoln (40%), based on which ceremonies each was nominated for. My model isn’t really precise to within 5%, so it isn’t a statistically significant edge. Personally, I’m inclined to think that Steven Spielberg will win (since, you know, he’s Steven Spielberg, and it’s been a while since he won. Poor little guy.)

Animated Feature

Another tossup, between Wreck-It Ralph (47%) and Brave (40%). Both the BAFTA and Critics Choice awards tend to predict the correct winner about 90% of the time, but this year they awarded different films (BAFTA → Brave, Critics Choice → Wreck-It Ralph). I love Pixar, but their recent movies aren’t as good as they were 5 years ago, and I loved Wreck-It Ralph. I’m rooting for that movie, and its sweet 80s Nintendo soundtrack.

Foreign Film

Amour (56%)

A Royal Affair (15%)

Historically, this is a hard award to predict from precursor ceremonies. This year Amour won BAFTA, the Critics Choice Award, and the Golden Globe, so I think its odds are pretty good. It is also nominated for Best Picture, for which it doesn’t stand a chance. I think voters will feel bad and give it the Foreign Oscar instead.

Original Screenplay

Django Unchained (58%, revised from 73%)

Zero Dark Thirty (36%, replacing Flight at 15%)

Update (Feb 24, 7PM): I noticed an error in my Writers Guild Award data (1 hour before the ceremony!). I had incorrectly recorded the Original Screenplay winner as Flight instead of Zero Dark Thirty, and the Adapted winner as Silver Linings Playbook instead of Argo. This changes the predictions in the two writing categories.

Adapted Screenplay

Argo (65%, revised from Silver Linings Playbook at 65%)

Lincoln and Silver Linings Playbook (11% each, revised from Argo and Lincoln at 12% each)

Update: See above.

 

That’s about it for the major-ish categories. Most of the minor categories don’t have many equivalent awards in other ceremonies, so the model predictions aren’t very compelling (sorry, lady nominated for Best Makeup and Hairstyling).

This was an interesting exercise — one of the key lessons (if you see this stuff as teaching-moment material) is that predicting Oscar winners from the other awards carries a fair amount of irreducible uncertainty — typical forecast accuracies are around 60% (clearly better than random guessing from a field of 5-7 nominees, but nowhere near a lock).
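For a sense of scale, here’s the back-of-the-envelope baseline that the 60% figure is being compared against:

```python
# Random guessing from a field of n nominees wins 1/n of the time,
# so the no-information baseline for a 5-7 nominee category is 14-20%.
for n in (5, 6, 7):
    print(f"{n} nominees: {1 / n:.0%} baseline")
# A ~60% hit rate is three to four times better than chance,
# yet still misses roughly two categories in five.
```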

Perhaps models with more information could do better (genre information seems particularly relevant). However, even the people who do this stuff for a living usually only get ~75% of the categories right. So maybe it’s just hard to predict.

Or maybe we just haven’t seen the Nate Silver of Oscar forecasting yet.

Update

Apparently, Nate Silver is the Nate Silver of Oscar forecasting. His method and conclusions are largely the same as what’s posted here. That’s encouraging.


3 Comments on “Forecasting the Oscars Like a Boss: The Predictions”

  1. Rob says:

    While Spielberg seems like the safe pick, I find it hard to believe that he’ll win Best Director, mainly because if he has enough support from the Academy to win Best Director, why wouldn’t Lincoln be the favorite to win Best Picture? Say he wins Best Director; then, in addition, Lincoln has the most nominations, the highest gross of all the Best Picture films, and we know Daniel Day-Lewis is gonna win Best Actor. That’s the formula to take Best Picture, but we know Argo will win.

    This just makes me believe Spielberg (and Ang Lee) won’t win. I feel David O. Russell is the front-runner because Silver Linings Playbook has a lot of support including acting nominees in all the categories, which usually is a reflection of the director. Then again, I do have Michael Haneke predicted to win as my upset pick, but your guess is as good as mine.

  2. Well, I don’t think Lincoln has enough support for Best Picture because it can’t compete against Argo. However, it doesn’t have to compete in Best Director, so that’s why I think it has a shot.

    A win by David O. Russell or Michael Haneke would be very exciting, and quite an upset — neither movie was nominated for a Directors Guild award, and none of the movies in my ~20-year sample have won Best Picture without at least a DG nomination (and only 2 movies — Braveheart and The Pianist — won without a DG win; this year will add a third).

  3. […] to the winners; I know you’ve all been eagerly awaiting my approval) — how did my grand forecasting experiment turn […]

