Tuesday, January 01, 2013

Crude NFL Ratings, 2012

Since I have a ranking system for teams and am somewhat interested in the NFL, I don’t see any reason not to take a once-a-year detour into ranking NFL teams (even if I’d much rather have something useful to contribute regarding the second-best pro sport, thoroughbred racing).

As a brief overview, the ratings are based on each team’s win ratio for the season, adjusted over the course of several iterations for the win ratios of its opponents. They know nothing about injuries, about where games were played, or about the distribution of points from game to game; nothing beyond the win ratio of every team in the league and the identity of each team’s opponents. The final result is presented in a format that can be plugged directly into Log5. I call them “Crude Team Ratings” to avoid overselling them, but they tend to agree fairly well with the results of systems that are not undersold.
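To make the Log5 point concrete, here is a minimal sketch of how the ratings are meant to be used head-to-head; the function and the two ratings are illustrations only, assuming CTR behaves as a scaled win ratio:

```python
# A minimal sketch of the Log5 use described above: if CTR is a scaled win
# ratio, the expected W% of team A against team B is simply A/(A + B).
def log5_win_prob(ctr_a: float, ctr_b: float) -> float:
    """Expected probability that team A beats team B, given their CTRs."""
    return ctr_a / (ctr_a + ctr_b)

# Hypothetical example: a 120 team against an 80 team.
print(log5_win_prob(120, 80))  # 0.6
```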

First, I’ll offer ratings based on actual wins and losses, but I would caution against putting too much stock in them given the nature of the NFL. Win-loss records like 2-14 and 15-1, which pop up in the NFL, produce extreme win ratios that the system does not handle easily. To ensure that there are no divide-by-zero errors, I add half a win and half a loss to each team’s record. This is not an attempt at regression, which would require much more than one game of ballast. This year the most extreme records were 2-14 and 13-3, so the system produced fairly reasonable results:



In the table, aW% is an adjusted W% based on CTR. The rank order will be exactly the same, but I prefer the CTR form due to its Log5 compatibility. SOS is the average CTR of a team’s opponents, rk is each team’s CTR rank, and s rk is each team’s SOS rank.
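For anyone curious how the sausage is made, here is a rough sketch of the kind of iteration involved; the number of iterations, the exact form of the opponent adjustment, and the scaling are simplifications for illustration, not a faithful transcription of my spreadsheet:

```python
from statistics import mean

def crude_team_ratings(records, schedules, iterations=10):
    """records: {team: (wins, losses)}; schedules: {team: [opponents faced]}.
    Returns {team: CTR}, scaled so the league average is 100."""
    # Half a win and half a loss of ballast avoids divide-by-zero on 0-16/16-0.
    ratio = {t: (w + 0.5) / (l + 0.5) for t, (w, l) in records.items()}
    rating = dict(ratio)
    for _ in range(iterations):
        # Adjust each team's win ratio by the average rating of its opponents...
        sos = {t: mean(rating[o] for o in schedules[t]) for t in rating}
        rating = {t: ratio[t] * sos[t] for t in rating}
        # ...then rescale so the league average stays at 100.
        scale = 100 / mean(rating.values())
        rating = {t: r * scale for t, r in rating.items()}
    return rating

def adjusted_wpct(ctr):
    """aW%: the Log5 W% a team with this CTR would post against a 100 opponent."""
    return ctr / (ctr + 100)
```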

The rankings that I actually use are based on a Pythagorean estimated win ratio from points and points allowed:



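The Pythagorean win ratio that feeds the ratings is just (points/points allowed) raised to an exponent; the exponent in the sketch below (2.37, a commonly cited NFL value) is a placeholder rather than necessarily the one behind the table above:

```python
def pythagorean_win_ratio(points_for: float, points_against: float,
                          exponent: float = 2.37) -> float:
    """Estimated win ratio from points scored and allowed; 2.37 is a
    placeholder NFL exponent, not necessarily the one used above."""
    return (points_for / points_against) ** exponent

# Hypothetical example: a team that outscores its opponents 416-357.
wr = pythagorean_win_ratio(416, 357)
print(wr, wr / (wr + 1))  # the win ratio and its corresponding W%
```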
Seattle’s #1 ranking was certainly a surprise, but last year Seattle’s 92 CTR ranked 13th in the league, reflecting a team a little better than its 7-9 record. When I have posted weekly updates on Twitter, I’ve gotten a few comments on the high ranking of the Bears. CTR may like Chicago more than some systems do, but comparable systems with comparable inputs also hold them in high regard. Wayne Winston ranks them #5; Andy Dolphin #7; Jeff Sagarin #7; and Football-Reference #6. Chicago ranked sixth in the NFL in P/PA ratio, which is the primary determinant of CTR, and played an above-average schedule (they rank 10th in SOS at 116, which means that their average opponent was roughly as good as the Vikings). The NFC North was the second-strongest division in the league, with Green Bay ranking #6, Minnesota #9, and Detroit #17. They played the AFC South, which didn’t help, although it was marginally better for SOS than playing the AFC West. Their interdivisional NFC foes were Arizona (#24), Carolina (#16), Dallas (#19), Seattle (#1), San Francisco (#3), and St. Louis (#13), which is a pretty strong slate.

Obviously the Bears did not close the season strong, but the system doesn’t know the sequence of games and weights everything equally. Still, their losses came to #1 Seattle, #3 San Francisco, twice to #6 Green Bay, #7 Houston, and #9 Minnesota. I didn’t check thoroughly, but I believe that no other team save Denver was undefeated against the bottom two-thirds of the league (the Broncos’ losses came to #2 New England, #7 Houston, and #8 Atlanta). Even the other top teams had worse losses--for instance, Seattle and New England both lost to #24 Arizona, San Francisco lost to #13 St. Louis, Green Bay and Houston lost to #23 Indianapolis, and Atlanta lost to #20 Tampa Bay.

Last year I figured the CTR for each division and conference as the arithmetic average of the CTRs of each member team, but that approach is flawed. Since the ratings are designed to be used multiplicatively, the geometric average provides a better means of averaging. However, given the properties of the geometric average, the arithmetic average of the geometric averages does not work out to the nice result of 100:



The NFC’s edge here is huge--it implies that the average NFC team should win 64% of the time against an average AFC team. The actual interconference record was 39-25 in favor of the NFC (.609). The NFC’s edge is naturally reflected in the team rankings; 7 of the top 10 teams are from the NFC with 7 of the bottom 8 and 10 of the bottom 12 from the AFC.
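Both the geometric averaging and the 64% figure fall out of the same multiplicative machinery; here is a quick sketch with made-up numbers rather than the actual division and conference figures:

```python
from math import prod

def geometric_mean(ratings):
    """Geometric average of a group's CTRs -- the appropriate average
    for a multiplicative rating."""
    return prod(ratings) ** (1 / len(ratings))

def matchup_prob(ctr_a, ctr_b):
    """Log5 probability that the first team beats the second."""
    return ctr_a / (ctr_a + ctr_b)

# Hypothetical four-team division averaged geometrically.
print(geometric_mean([130, 105, 95, 70]))

# Any pair of conference averages with a ratio of about 1.78 reproduces the
# 64% chance quoted above for an average NFC team over an average AFC team.
print(matchup_prob(1.78, 1.0))  # ~.64
```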

This exercise wouldn’t be a lot of fun if I didn’t use it to estimate playoff probabilities. First, though, we need regressed CTRs. This year, I’ve added 12.2 games of .500 to each team’s raw win ratio based on the approach outlined here. That produces this set of ratings, which naturally compresses the range between the top and bottom of the league and shuffles the positions of a few teams:



The rank orders differ not because the regression changes the order of the estimated win ratios fed into the system (it doesn’t), but because the magnitude of the strength of schedule adjustment is reduced.
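The regression step itself is straightforward: fold 12.2 games of .500 (6.1 wins and 6.1 losses) into the record implied by each team’s raw win ratio before running the ratings. The sketch below converts the raw ratio to a 16-game record first, which is my reading of the procedure rather than a literal transcription of it:

```python
def regress_win_ratio(raw_win_ratio: float, games: int = 16,
                      ballast_games: float = 12.2) -> float:
    """Regress a raw (here, Pythagorean) win ratio by adding 12.2 games of
    .500 to the wins and losses it implies over a 16-game season."""
    wins = games * raw_win_ratio / (1 + raw_win_ratio)
    losses = games - wins
    return (wins + ballast_games / 2) / (losses + ballast_games / 2)

# Hypothetical example: a 2.0 raw win ratio (about 10.7-5.3 over 16 games)
# regresses to roughly 1.47.
print(regress_win_ratio(2.0))
```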

Last year I included tables listing probabilities for each round of the playoffs, but I will limit my presentation here to the first round and the probabilities of advancement. After each round of the playoffs, the CTRs should be updated to reflect the additional data on each team, and thus the extensive tables will be obsolete (although I will share a few nuggets). This updating might not be particularly important for MLB, since a five or seven game series adds little information when we already have a 162 game sample on which to evaluate a team. But for the more limited sample available for the NFL, each new data point helps.

In figuring playoff odds, I assume that having home field advantage increases a team’s CTR by 32.6% (this is equivalent to assuming that the average home W% is .570). Here is what the system thinks about the wildcard round:



The home team is a solid favorite in each game except for Washington, which faces the top-ranked team in the league. Houston is the weakest favorite; the Texans would be estimated to have just a 54% chance on a neutral field, and only a 47% chance if the game were at Cincinnati.
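The home field adjustment amounts to multiplying the home team’s CTR by 1.326 before applying Log5; with two equal teams that works out to 1.326/2.326 = .570. A sketch, with hypothetical ratings rather than the actual regressed CTRs:

```python
HFA_MULTIPLIER = 1.326  # equivalent to an average home W% of .570

def game_prob(home_ctr: float, away_ctr: float, neutral: bool = False) -> float:
    """Probability that the home team (or first-listed team, on a neutral
    field) wins the game."""
    adj_home = home_ctr if neutral else home_ctr * HFA_MULTIPLIER
    return adj_home / (adj_home + away_ctr)

# Sanity check: two equal teams, and the home side wins 1.326/2.326 = .570.
print(game_prob(100, 100))

# Hypothetical wildcard-style matchup: a 110 host against a 95 visitor,
# at home and on a neutral field.
print(game_prob(110, 95), game_prob(110, 95, neutral=True))
```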

The overall estimated probabilities for teams to advance to each round are as follows:



San Francisco, Denver, and New England are all virtually even at 20% to win the Super Bowl. The Patriots are the highest ranked of the three, but San Francisco benefits from the weak NFC and Denver from home field advantage. CTR would naturally pick Seattle to win it all if they weren’t at a seeding disadvantage; however, their probability of winning the Super Bowl given surviving the first round is 14%, greater than Atlanta’s 12%.

The most likely AFC title game is Denver/New England (48% chance), with Denver given a 54% chance to win (it would be 47% on a neutral field and 40% at New England); the least likely AFC title game is Indianapolis/Cincinnati (1% chance). The most likely NFC title game is Atlanta/San Francisco (34%), with a 53% chance of a 49ers road win; the least likely matchup is Washington/Minnesota (2%). The most likely Super Bowl matchup is Denver/San Francisco (14% likelihood and 54% chance of a 49er win); the least likely is Indianapolis/Washington (.1%). The NFC is estimated to have a 51% chance of winning the Super Bowl, lower than one might expect given the NFC’s dominance in the overall rankings. However, the NFC’s best team has to win three games on the road (barring a title game against Minnesota) while the probability of New England or Denver carrying the banner for the AFC is estimated to be 77%.
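The advancement odds can be worked out exactly with enough bookkeeping, but a Monte Carlo sketch gets essentially the same answers and is easier to follow. The bracket logic below follows the standard NFL rule that the best surviving seed hosts each round; the seeds and ratings are placeholders, not my actual regressed CTRs:

```python
import random

HFA = 1.326

def win_prob(home, away, ratings):
    """Log5 probability that the home team wins, with home field applied."""
    h = ratings[home] * HFA
    return h / (h + ratings[away])

def sim_conference(seeds, ratings, rng):
    """seeds: teams listed from the 1-seed to the 6-seed. Returns the
    simulated conference champion for one run through the bracket."""
    def play(home, away):
        return home if rng.random() < win_prob(home, away, ratings) else away

    # Wildcard round: 3 hosts 6 and 4 hosts 5; the top two seeds have byes.
    wc = [play(seeds[2], seeds[5]), play(seeds[3], seeds[4])]
    wc.sort(key=seeds.index)  # order the survivors by seed
    # Divisional round: the 1-seed hosts the worst surviving seed.
    div = [play(seeds[0], wc[1]), play(seeds[1], wc[0])]
    div.sort(key=seeds.index)
    # Conference championship: the better seed hosts.
    return play(div[0], div[1])

# Placeholder seeds and ratings, for illustration only.
ratings = {"A1": 125, "A2": 118, "A3": 104, "A4": 97, "A5": 108, "A6": 92}
seeds = ["A1", "A2", "A3", "A4", "A5", "A6"]
rng = random.Random(2012)
trials = 100_000
champs = [sim_conference(seeds, ratings, rng) for _ in range(trials)]
for team in seeds:
    print(team, round(champs.count(team) / trials, 3))
```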

Of course, all of these probabilities are just estimates based on a fairly crude rating system, and last year the Giants were considered quite unlikely to win the Super Bowl (although I didn’t regress enough in calculating the playoff probabilities last year, resulting in overstating the degree of that unlikelihood).
