In their 2007 paper “Applying discrete choice models to predict Academy Award winners,” researchers Iain Pardoe and Dean K. Simonton built a formula for predicting the winners of the four major Academy Awards: Best Picture, Best Director, Best Actor, and Best Actress.
Using data culled from the Internet Movie Database, the researchers accounted for numerous variables to predict the Oscar winners for each year from 1938 to 2006.
Some of their initial observations indicated higher chances of winning for films with multiple nominations; indeed, only two directors have won the Best Director Oscar for a movie that was not nominated for Best Picture (Lewis Milestone for Two Arabian Nights in 1928 and Frank Lloyd for The Divine Lady in 1929). Other variables included Golden Globe wins, guild awards, previous nominations, and prior Oscar wins.
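To make the idea concrete, here is a minimal sketch of how a discrete choice model turns nominee covariates into win probabilities within a single award category. This is not Pardoe and Simonton's actual specification; the feature names and coefficient values below are hypothetical, chosen only for illustration. The key property is that each nominee receives a score from its covariates and a softmax over the nominees in the category converts scores into probabilities that sum to one.

```python
import math

def win_probabilities(nominees, weights):
    """Conditional-logit-style probabilities for one award category.

    nominees: list of (name, feature dict)
    weights:  feature name -> coefficient (hypothetical values here)
    """
    # Linear score for each nominee from its covariates.
    scores = [
        sum(weights.get(k, 0.0) * v for k, v in features.items())
        for _, features in nominees
    ]
    # Softmax over the category: probabilities sum to 1 across nominees.
    total = sum(math.exp(s) for s in scores)
    return {name: math.exp(s) / total for (name, _), s in zip(nominees, scores)}

# Illustrative features only: total nominations, a Golden Globe win,
# and a guild award win, echoing the kinds of predictors described above.
weights = {"total_nominations": 0.3, "golden_globe_win": 1.2, "guild_win": 1.5}
nominees = [
    ("Film A", {"total_nominations": 11, "golden_globe_win": 1, "guild_win": 1}),
    ("Film B", {"total_nominations": 6, "golden_globe_win": 0, "guild_win": 0}),
    ("Film C", {"total_nominations": 8, "golden_globe_win": 1, "guild_win": 0}),
]
probs = win_probabilities(nominees, weights)
```

Because the probabilities are normalized within each category, the model directly answers the question the Oscars pose: of these particular nominees, who is most likely to win?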
The accuracy of their predictions increased dramatically over the last 30 years of the period analyzed, with each category becoming more predictable over time.
Pardoe and Simonton restricted the data used in each prediction to information that would have been available before Oscar night in the year in question, a more legitimate way to measure the true success of their formula. In certain instances this resulted in incorrect predictions. Pardoe and Simonton explain, “clearly, the model was unable to make use of the late surge that Crash made and the apparent backlash against Brokeback Mountain (in unquantifiable ‘Hollywood buzz’ terms) as the Oscar ceremony drew near.”
Their overall results for 1938–2011 are impressive: across the four major categories since 1982, their predictions have been 82% accurate, correctly calling 98 of 120 Oscar wins.
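The quoted accuracy follows directly from the raw counts: four major awards per ceremony over the 30 ceremonies since 1982 gives 120 predictions, and 98 correct calls works out to just under 82%.

```python
# Four major categories over the 30 ceremonies since 1982.
correct, total = 98, 30 * 4
accuracy = correct / total
print(f"{correct}/{total} = {accuracy:.1%}")  # 98/120 = 81.7%
```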
The formula has also uncovered some interesting aspects of past Academy decisions. They write, “the analysis also reveals which past nominees have really upset the odds (winners with low estimated probability of winning), and which appear to have been truly robbed (losers with high estimated probability of winning).” One example is their prediction that The Aviator would win Best Picture in 2004 with a .97 probability, when in fact the Oscar went to Million Dollar Baby, which had only a .01 probability according to their calculations.
As the 2013 Oscars approach, they have offered their predictions. Best Picture is especially difficult to estimate: Argo received two important indicators of Oscar success, Producers Guild and Golden Globe (Drama) wins, yet failed to secure a matching Best Director nomination.
Click “Read More” below for answers.
1. 2 LITER COKE
2. CHEMISTRY 101 TEXTBOOK
3. HAND SAW
4. COILED ROPE
5. FANGORIA MAGAZINE
6. TAPE MEASURE
7. EIGHT SHOTGUN SHELLS
8. TOOL BOX
9. JUMPER CABLES
10. BOOK ON STEAM POWER
11. FIFTH ANNIVERSARY ISSUE “DARK HORSE PRESENTS” (1986)
12. METAL GAS CAN
13. REPLACEMENT HEADLIGHT