Monday, March 17, 2014
Army Changes their Head Football Coach
Army has fired head football coach Rich Ellerson and hired Jeff Monken, who was the head football coach at Georgia Southern. Given Ellerson's departure, here is an analysis of the Army Black Knights using the Complex Invasion College Football Production Model. First, here is a snapshot of the Black Knights over the last five years, covering Ellerson's tenure. As you can see, the Black Knights were a team on the rise at the beginning, but their on-field production has trended downward over the last few years.
2013
At the time of Ellerson's departure, not all of the post-season bowl games had been completed, so I will look at the Black Knights as of the end of the regular season (which concludes with the Army-Navy game). Pre-game analysis was done using the Complex Invasion College Football Production Model. Turning to the team, the Black Knights finished the regular season at 3-9 while playing an "easier" strength of schedule (SOS) as compared to the "league". Army's best game was a victory over #90 ranked Louisiana Tech, and their worst was a loss to #112 ranked Hawaii. Army finished as the #102 ranked team out of 125 in total production, with the #110 ranked offense and the #74 ranked defense. As the chart above shows, Army's team production rank has trended downward since 2010, and that decline, along with continued losses to Navy, made changing head coaches a way of trying to turn the program around.
2012
The Black Knights finished the regular season at 2-10 while playing an "average" SOS as compared to the "league". Army's best game was a victory over #81 ranked Air Force (their other win came over #100 ranked Boston College), and their worst was a loss to FCS Stony Brook. Army finished as the #107 ranked team with the #93 ranked offense and the #105 ranked defense. It looks as if I did not write up an analysis of the Army-Navy game this year, but the trend of Army losing to Navy continued.
2011
Army finished the regular season at 3-9 and out of the bowl picture while playing an "average" strength of schedule. Army's best game was a win over #65 ranked Northwestern, and their worst was a loss to #105 ranked Ball State. In terms of overall production, Army finished as the #76 ranked team with the #67 ranked offense and the #73 ranked defense. Army lost their game to Navy; here is an analysis of the Army-Navy game.
2010
Ellerson's second year saw the Black Knights finish the regular season at 6-6 and become bowl eligible. Army defeated #40 ranked SMU in the Armed Forces Bowl to finish 7-6 overall. The Black Knights played an "average" SOS as compared to the "league". Army's best game was their defeat of SMU, and their worst was a loss to #93 ranked Rutgers. Overall, Army was the #57 ranked team in total production with the #68 ranked offense and the #38 ranked defense. Army lost to Navy; here is an analysis of that year's Army-Navy game.
2009
In Ellerson's first year at the helm of the Black Knights, Army finished the regular season at 5-7 and out of post-season bowl contention. The Black Knights played an "easier" SOS as compared to the league as a whole (meaning their SOS was between one and two standard deviations above the "league" average). Another way of thinking about this: Army played eight of their twelve games against teams in the bottom 25% of the league (including one game against FCS VMI), and they went 5-3 against those teams. The Black Knights' best game was against #88 ranked Vanderbilt, and their worst was a one-point loss to #116 ranked Tulane. Overall, the Black Knights were the #106 ranked team in total production with the #118 ranked offense (out of 120). On the bright side, Army had the #26 ranked defense, which is even more impressive given their terrible offense. Probably of most interest to Army fans: Army lost to Navy.
Analysis of 2013 NCAA FBS Head Coach Changes
Vanderbilt and James Franklin
Penn State and Bill O'Brien
UMass and Charley Molnar
Boise State and Chris Petersen
Texas and Mack Brown
Washington and Steve Sarkisian
Wake Forest and Jim Grobe
Wyoming and Dave Christensen
Eastern Michigan and Ron English
Florida Atlantic and Carl Pelini
Miami of Ohio and Don Treadwell
UConn and Paul Pasqualoni
USC and Lane Kiffin
Saturday, March 15, 2014
Franklin Leaves Vanderbilt
After Bill O'Brien left for the NFL, Penn State hired James Franklin, who had been the head football coach at Vanderbilt for the past three seasons. So here is a look at the Vanderbilt Commodores of the Southeastern Conference during Franklin's tenure as head football coach, using the Complex Invasion College Football Production Model. First, here is a snapshot of the Commodores over the last three years.
2011
In Franklin's first year as head football coach, Vanderbilt finished the regular season bowl eligible at 6-6, then lost their bowl game to #16 ranked Cincinnati. Vandy played an "average" strength of schedule as compared to the league average. The Commodores' best game was a 41-7 victory over #68 ranked Wake Forest, and their worst was a 21-27 loss to #77 ranked Tennessee. Overall, Vanderbilt was the #35 ranked team in the Football Bowl Subdivision, with the #54 most productive offense and the #24 ranked defense.
2012
This was a good season for Vanderbilt. They finished the regular season at 8-4 after starting 1-3, then defeated #69 ranked North Carolina State in their bowl game to finish 9-4, all while playing an "average" SOS as compared to the league as a whole. Of course, six of their nine victories came against teams ranked #90 or worse (including one FCS opponent). Vanderbilt's best game was a victory over #45 ranked Mississippi, and their worst was a loss to #37 ranked Northwestern. The Commodores finished the regular season as the #38 ranked team overall, with the #58 ranked offense (slightly above average) and the #21 ranked defense.
2013
The Commodores finished the regular season at 8-4 and defeated #22 ranked Houston in their bowl game to finish 9-4 overall. Vanderbilt played an "average" strength of schedule as compared to the "league" average SOS. Yes, an SEC team with an average SOS: they played four teams ranked over #100 (Austin Peay, Massachusetts, and UAB in non-conference games, plus Kentucky in conference play). That said, Vandy's best game was their bowl win over Houston, and their worst was a loss to #45 ranked Mississippi at the beginning of the season. Overall, Vanderbilt was the #57 ranked team in total production, with the #73 ranked offense and the #25 ranked defense.
Analysis of 2013 NCAA FBS Head Coach Changes
Penn State and Bill O'Brien
UMass and Charley Molnar
Boise State and Chris Petersen
Texas and Mack Brown
Washington and Steve Sarkisian
Wake Forest and Jim Grobe
Wyoming and Dave Christensen
Eastern Michigan and Ron English
Florida Atlantic and Carl Pelini
Miami of Ohio and Don Treadwell
UConn and Paul Pasqualoni
USC and Lane Kiffin
Tuesday, March 11, 2014
O'Brien Leaves Penn State
At the beginning of this year, Bill O'Brien was hired as the new head football coach of the Houston Texans after two years as head football coach at Penn State. Given that O'Brien has now left Happy Valley, let's take a look back at the Nittany Lions under O'Brien using the Complex Invasion College Football Production Model. Below is a snapshot of the team for the last two seasons, along with how the worst team would have performed, as a guide.
2012
In Bill O'Brien's first year as head football coach, Penn State finished the regular season at 8-4. The Nittany Lions played an "average" strength of schedule (SOS) as compared to the "league" average. Penn State's best game this season was a victory over #33 ranked Wisconsin, and their worst was a loss to #86 ranked Virginia. Overall, the Nittany Lions were the #34 ranked team with the #65 ranked offense and the #14 ranked defense.
2013
In what ended up being O'Brien's last season as head football coach, the Nittany Lions finished the regular season at 7-5 while still on post-season probation due to the Sandusky scandal. Penn State played an "average" strength of schedule (SOS) as compared to the "league" average. PSU's best game was a victory over #9 ranked Wisconsin, and their worst was a loss to #77 ranked Indiana. Overall, the Nittany Lions were the #59 ranked team with the #64 ranked offense and the #43 ranked defense. Given the decline in the defense from 2012, with the offense about average in both years, PSU fell to just above average in 2013. Whether that trend will continue is outside the model's design.
Sometime in the future, but not soon, I will come back and look at Penn State's last few seasons under Joe Paterno, but that is a blog for a different time.
Up next, Nittany Lion fans: how Vanderbilt performed under Franklin's tenure.
Analysis of 2013 NCAA FBS Head Coach Changes
UMass and Charley Molnar
Boise State and Chris Petersen
Texas and Mack Brown
Washington and Steve Sarkisian
Wake Forest and Jim Grobe
Wyoming and Dave Christensen
Eastern Michigan and Ron English
Florida Atlantic and Carl Pelini
Miami of Ohio and Don Treadwell
UConn and Paul Pasqualoni
USC and Lane Kiffin
Monday, March 10, 2014
Middle Tennessee State 2008-2013
Since I am now followed on Twitter by Middle Tennessee State's offense, I was curious how the Blue Raiders have performed on the field, both on offense and defense, using the Complex Invasion College Football Production Model. Here is a quick graph using the information presented below. I will present the analysis starting with the 2008 season.
2008
Middle Tennessee State finished the season at 5-7, ineligible for a bowl spot. The Blue Raiders played an "easier" strength of schedule (SOS) as compared to the "league" average. MTSU's best game was a victory over #70 ranked Maryland, and their worst was a loss to #100 ranked Mississippi State. Overall, the Blue Raiders were the #76 ranked team with the #79 ranked offense and the #63 ranked defense.
2009
The Blue Raiders had their best season of this period, finishing the regular season at 9-3 and defeating #36 ranked Southern Mississippi in their bowl game. The Blue Raiders played a "much easier" strength of schedule (SOS) as compared to the "league" average, meaning that their SOS was more than two standard deviations above the "league" average. MTSU's best game was their bowl win over Southern Mississippi, and their worst was again a loss to Mississippi State, ranked #87 this year. Overall, the Blue Raiders were the #28 ranked team with the #25 ranked offense and the #43 ranked defense.
2010
MTSU finished the regular season at 6-6 and, after losing their bowl game, finished 6-7 overall. The Blue Raiders again played a "much easier" strength of schedule (SOS) as compared to the "league" average. MTSU's best game was a victory over #66 ranked Florida International, and their worst was a loss to #118 ranked Memphis. Overall, the Blue Raiders were the #74 ranked team with the #75 ranked offense and the #60 ranked defense, very similar to their 2008 season.
2011
This was a horrible year for the Blue Raiders, who finished the regular season at 2-10 while playing an "easier" strength of schedule (SOS) as compared to the "league" average. MTSU's best game was a victory over #111 ranked Memphis, and their worst was a loss to #106 ranked Troy. Overall, the Blue Raiders were the #114 ranked team with the #108 ranked offense and the #114 ranked defense.
2012
Middle Tennessee State bounced back with an 8-4 regular season, but no bowl bid. The Blue Raiders played an "easier" strength of schedule (SOS) as compared to the "league" average. Middle Tennessee's best game was a victory over #56 ranked Western Kentucky, and their worst was a loss to FCS McNeese State. Overall, the Blue Raiders were the #77 ranked team with the #81 ranked offense and the #59 ranked defense.
2013
The Blue Raiders finished the regular season at 8-4 and lost their bowl game to #44 ranked Navy. The Blue Raiders played an "average" strength of schedule (SOS) as compared to the "league" average. MTSU's best game was a victory over #8 ranked Marshall, and their worst was the loss to #44 ranked Navy. Overall, the Blue Raiders were the #40 ranked team with the #49 ranked offense and the #54 ranked defense.
Friday, March 7, 2014
Guide to Calculating Player Consistency
In the previous blog I discussed how consistent MLB batters were in back-to-back seasons. Here I want to present the step-by-step process for doing this type of analysis.
Step 1: You need an algorithm to evaluate team performance that lets you measure the impact of various team statistics, so that you can then evaluate player performance. For MLB batters I use Runs Created, and for NHL goalies I use the marginal value of a goal against.
Step 2: You need an algorithm to evaluate player performance, whether it is Runs Created, Wins Above Average, WP48, QB Score, or another metric. I will assume that you have a method to evaluate player performance.
(I assume that you have the team data from Step 1 to calculate the marginal values of the team statistics, and that you have used those marginal values to calculate the player performance measure in Step 2. Once this is done, we can get to Step 3. If not, see the links above to replicate the analysis for either MLB batters or NHL goalies.)
Step 3: I use a cut-off point for the players (though this is not required). If you use a cut-off such as a minimum number of ABs for MLB batters, sort the data and eliminate the rows that do not meet your minimum requirement. Then re-sort the data by season and by your player performance measure, from highest to lowest.
Step 4a: Number the players in order from 1 to n (n being the player with the lowest performance number for that season). If the order is in column A, then in cell A2 type a 1, in cell A3 type =A2+1, and copy that formula down for each player in the season. Repeat for each season.
Step 4b: Calculate each player's quintile rank (ranges of 20%) for each season. In Excel I use a formula like this: =IF(A2/$A$438<=0.2,1,IF(A2/$A$438<=0.4,2,IF(A2/$A$438<=0.6,3,IF(A2/$A$438<=0.8,4,5)))), where A2 holds the order number of the highest ranked player (i.e., 1) and A438 holds the order number of the last player for that season.
Step 5: Sort the data by player name and then by season (lowest to highest).
Step 6: Create a formula so that a player's next-season rank is listed only if the same player met the performance criteria in consecutive seasons. For each variable you are interested in, I use the following Excel formula in row 3: =IF($D4-$D3=1,IF($C3=$C4,B4,""),""), where column D holds the season played, column C holds the player's name, and column B holds the player's rank. Since you are looking at row 4, that should be the next season; if it is not, the formula reports a blank cell. Copy and paste for all the players.
Step 7: Calculate consistency (and near consistency) as follows: =IF(I3="","",IF(I3=B3,1,0)), which says that if the column with next year's rank (column I in this case) is blank, leave the cell blank; otherwise enter a one if the ranks are equal and a zero if they are not. For near consistency I use =IF(I3="","",IF(I3+1=B3,1,0)) to count players who moved up one rank and =IF(I3="","",IF(I3-1=B3,1,0)) to count players who moved down one rank. Copy and paste for all the players.
Step 8: Finally, I calculate the percentage of players who are consistent and near consistent, and from there you can find how many are not even nearly consistent from one season to the next. (A consolidated code sketch of Steps 3-8 follows below.)
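For anyone who would rather script this than build it in a spreadsheet, here is a minimal pandas sketch of Steps 3-8. It assumes a tidy DataFrame with one row per player-season and columns named player, season, ab, and rc (the performance measure); those column names, and the 100-AB cut-off, are illustrative assumptions on my part rather than a fixed layout.

```python
import numpy as np
import pandas as pd

def consistency_summary(df: pd.DataFrame, min_ab: int = 100) -> dict:
    """Steps 3-8: quintile ranks per season, then back-to-back comparisons."""
    # Step 3: keep only player-seasons that meet the minimum cut-off.
    df = df[df["ab"] >= min_ab].copy()

    # Steps 4a/4b: within each season, order players by performance (1 = best)
    # and convert the order to quintile ranks 1..5 -- the same logic as the
    # nested-IF Excel formula above.
    df["order"] = df.groupby("season")["rc"].rank(ascending=False, method="first")
    df["n"] = df.groupby("season")["rc"].transform("size")
    df["quintile"] = np.ceil(df["order"] / df["n"] * 5).astype(int)

    # Steps 5-6: sort by player and season, then line each row up with the
    # SAME player's next season (ignored if the seasons are not consecutive).
    df = df.sort_values(["player", "season"])
    next_season = df.groupby("player")["season"].shift(-1)
    next_rank = df.groupby("player")["quintile"].shift(-1)
    pair = next_season == df["season"] + 1

    # Steps 7-8: consistent = same quintile; near = moved exactly one quintile.
    consistent = ((next_rank == df["quintile"]) & pair).sum()
    near = (((next_rank - df["quintile"]).abs() == 1) & pair).sum()
    n_pairs = int(pair.sum())
    return {"pairs": n_pairs,
            "pct_consistent": consistent / n_pairs,
            "pct_near": near / n_pairs}
```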
Thursday, March 6, 2014
MLB Batter Consistency
In chapter 9 of our book, The Wages of Wins, we talk about player consistency and find that players in some sports are more consistent than players in others. For MLB, we found that batters are more consistent than NFL QBs and less consistent than NBA players. Here I want to update the book's analysis for MLB batters. So I am taking the 2000-2012 seasons (actually the pairs of consecutive MLB seasons from 1999-2000 through 2012-2013) and looking at how consistent MLB batters are over this time period.
First, let me acknowledge that I used Sean Lahman's MLB database for this analysis. Second, I restricted the sample to batters who had at least 100 ABs (at bats) in an individual season and then looked at whether those players had two consecutive seasons with at least 100 ABs; only if this requirement is met is a player evaluated. We restricted the number of ABs because some very good players may be injured for a substantial portion of a season, and some players may be called up at the end of one season and given a full-time role the next. In both cases these players would be judged inconsistent, not because of their play but due to circumstances beyond their control.
Given the AB restriction, I ended up with 4694 player observations with back-to-back seasons of at least 100 ABs from the 1999-2000 to 2012-2013 season pairs.
MLB batter consistency is measured by where a batter ranks in runs created in a given season. For simplicity, in The Wages of Wins we gave a player in the top 20% of the league in runs created a rank of 1, the next highest 20% a rank of 2, and so on down to the lowest 20% with a rank of 5. If a player maintains his rank from one season to the next, he is considered consistent; if not, inconsistent. (As you will see, I also looked at players who moved up or down only one rank from one season to the next.)
So how consistent are MLB batters over this time period? I find that 38% (1774 observations) maintain their rank in runs created from one season to the next. Another 39% (1827 observations) move up or down exactly one rank, meaning that during the 1999/00 to 2012/13 MLB seasons only 24% (1093 observations) of MLB batters with back-to-back 100-AB seasons moved up or down two or more ranks from one season to another. Compared to NHL goalies, MLB batters are more consistent.
Being consistent from one season to the next could mean staying in the top 20% or staying in the bottom 20%. So what is the distribution of consistently "good" MLB batters as opposed to consistently "bad" ones? Of the 1774 observations of MLB batters who kept the same rank from one season to the next, 697 (about 39%) were in the top 20% of MLB batters, and only 208 (almost 12%) were in the bottom 20%.
The next blog will be a guide to calculating player consistency using the 20% dividing lines.
Labels:
MLB
Wednesday, March 5, 2014
2012 and 2013 MLB Player Production with Runs Created
In my previous blog I looked at the impact that various team batting statistics have on a team's ability to score runs. Now I want to determine how individual players produce runs, using the regression results from the team runs created model. Since it looks like I did not do this for the 2012 MLB season, I will report the top 10 players for both the 2012 and 2013 seasons using this runs created methodology. (If you are interested, I have blogged previously on how to calculate MLB player production using MLB team runs created.)
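To make the method concrete, here is a minimal sketch of applying the team-level marginal values to a single batter's season line. The coefficients come from the team regression reported in the next post below; the column names follow Sean Lahman's Batting table, but the non-intentional-walk (NBB) and outs constructions are my own rough approximations of the model's variables, and the team-level constant is simply omitted at the player level, so treat this as an illustration rather than the exact methodology.

```python
# Marginal run values from the 2000-2013 team regression reported below.
COEF = {"single": 0.523, "double": 0.723, "triple": 1.116, "hr": 1.423,
        "hbp": 0.425, "sb": 0.112, "sf": 0.621, "nbb": 0.330,
        "gidpcs": -0.167, "outs": -0.144}

def runs_created(row: dict) -> float:
    """Estimate a batter's runs created from a Lahman Batting row."""
    singles = row["H"] - row["2B"] - row["3B"] - row["HR"]
    nbb = row["BB"] - row["IBB"]        # assumption: walks net of intentional
    gidpcs = row["GIDP"] + row["CS"]    # double plays plus caught stealing
    outs = row["AB"] - row["H"] + row["SF"] + row["SH"] + row["GIDP"]  # rough proxy
    return (COEF["single"] * singles + COEF["double"] * row["2B"]
            + COEF["triple"] * row["3B"] + COEF["hr"] * row["HR"]
            + COEF["hbp"] * row["HBP"] + COEF["sb"] * row["SB"]
            + COEF["sf"] * row["SF"] + COEF["nbb"] * nbb
            + COEF["gidpcs"] * gidpcs + COEF["outs"] * outs)

# A purely hypothetical stat line, just to show the call:
line = {"AB": 560, "H": 180, "2B": 35, "3B": 3, "HR": 30, "BB": 70, "IBB": 5,
        "HBP": 4, "SB": 5, "CS": 2, "GIDP": 15, "SF": 6, "SH": 0}
print(f"{runs_created(line):.2f}")
```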
Taking the data from Sean Lahman's database, here are the top 10 batters for 2012:
| Rank | playerID | yearID | teamID | Runs Created |
|------|-----------|--------|--------|--------------|
| 1 | cabremi01 | 2012 | DET | 111.51 |
| 2 | braunry02 | 2012 | MIL | 110.95 |
| 3 | troutmi01 | 2012 | LAA | 109.20 |
| 4 | mccutan01 | 2012 | PIT | 104.01 |
| 5 | fieldpr01 | 2012 | DET | 102.52 |
| 6 | encared01 | 2012 | TOR | 98.66 |
| 7 | beltrad01 | 2012 | TEX | 98.31 |
| 8 | canoro01 | 2012 | NYA | 97.68 |
| 9 | poseybu01 | 2012 | SFN | 96.04 |
| 10 | headlch01 | 2012 | SDN | 95.94 |
Again, taking the data from the same source, here are the top 10 batters for 2013:
| Rank | playerID | yearID | teamID | Runs Created |
|------|-----------|--------|--------|--------------|
| 1 | troutmi01 | 2013 | LAA | 122.32 |
| 2 | cabremi01 | 2013 | DET | 118.15 |
| 3 | davisch02 | 2013 | BAL | 111.98 |
| 4 | vottojo01 | 2013 | CIN | 105.13 |
| 5 | goldspa01 | 2013 | ARI | 103.18 |
| 6 | choosh01 | 2013 | CIN | 100.50 |
| 7 | carpema01 | 2013 | SLN | 98.38 |
| 8 | mccutan01 | 2013 | PIT | 96.15 |
| 9 | donaljo02 | 2013 | OAK | 91.85 |
| 10 | canoro01 | 2013 | NYA | 91.51 |
As you can tell, Cabrera, Trout, McCutchen, and Cano all show up in the top 10 in both years. The next blog will look at how consistent MLB players with at least 100 ABs have been since 2000.
Labels:
Data Analysis,
MLB
Monday, March 3, 2014
2013 MLB Team Runs Created Productivity
In my Sports Economics course we look at how to estimate the productivity of MLB players. To do that, we first estimate the productivity of MLB teams using a model created by Asher Blass that was published in 1992. Almost every year I update his model using MLB data from Sean Lahman's database. I have now run the MLB Team Runs Created analysis (see the step-by-step procedure to estimate MLB Team Runs Created) for the 2013 season, and here are the results. In step 8 of that procedure I describe the statistical analysis, a linear regression adjusted for heteroskedasticity, which is what I report below.
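For readers who want to replicate the regression outside a spreadsheet, here is a minimal sketch using Python's statsmodels. It assumes a team-season DataFrame that already contains the constructed variables named as in the results table below, plus a column R for team runs scored; the HC1 robust covariance is one standard way of adjusting for heteroskedasticity and may differ from the exact correction used in the step-by-step procedure.

```python
import pandas as pd
import statsmodels.api as sm

# Explanatory variables, named as in the results table below.
BATTING_VARS = ["SINGLE", "DOUBLE", "TRIPLE", "HR", "HBP",
                "SB", "SF", "NBB", "GIDPCS", "OUTS2"]

def fit_team_runs_created(teams: pd.DataFrame):
    """OLS of team runs scored on the Blass-style batting variables,
    with heteroskedasticity-robust (HC1) standard errors."""
    X = sm.add_constant(teams[BATTING_VARS])   # adds the constant term
    return sm.OLS(teams["R"], X).fit(cov_type="HC1")

# Usage: results = fit_team_runs_created(teams_2000_2013)
#        print(results.summary())
```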
Here are the estimated results for these 14 seasons of data (2000-2013).
| Variable | Coefficient | Std. Error | t-Statistic | Prob. |
|----------|-------------|------------|-------------|-------|
| SINGLE | 0.523 | 0.020 | 26.460 | 0.000 |
| DOUBLE | 0.723 | 0.041 | 17.710 | 0.000 |
| TRIPLE | 1.116 | 0.127 | 8.785 | 0.000 |
| HR | 1.423 | 0.039 | 36.291 | 0.000 |
| HBP | 0.425 | 0.080 | 5.303 | 0.000 |
| SB | 0.112 | 0.038 | 2.931 | 0.004 |
| SF | 0.621 | 0.161 | 3.863 | 0.000 |
| NBB | 0.330 | 0.019 | 17.252 | 0.000 |
| GIDPCS | -0.167 | 0.063 | -2.671 | 0.008 |
| OUTS2 | -0.144 | 0.025 | -5.857 | 0.000 |
| Constant | 152.521 | 107.078 | 1.424 | 0.155 |
A few observations. First, other than the constant term, each variable is statistically significant at the 99% confidence level and carries the expected sign. Only GIDPCS and OUTS2 are negative; both use up baseball's finite and scarce resource: outs. Second, the coefficient on HRs is greater than the coefficient on singles, which means a home run will on average generate more runs than a single. All the other coefficients make sense as well. Finally, note that the coefficient on grounding into a double play plus caught stealing (GIDPCS) is greater in absolute value than the coefficient on a stolen base (SB). This means that if a player made two stolen base attempts and was successful on only one of them, he would on average have cost his team more than he benefited it (0.112 - 0.167 = -0.055 runs).
Up next: the best hitters of the 2012 and 2013 seasons.
Labels:
Data Analysis,
MLB
Sunday, March 2, 2014
Mark Cuban D-League vs NCAA
Mark Cuban argues that the NBA D-League would be better for elite men's college basketball players than going the NCAA route. I have been making this type of argument for years. Nice to see someone else is on board.
Labels:
NBA