Last week, I asked whether a player with a great peak is worth more than conventional metrics might suggest. I re-introduced a concept known as Pennants Added, which tells us how many playoff appearances a player is worth to a randomly chosen team, and which sabermetricians often cite as “proof” that a great peak is more valuable, all else being equal. As it turned out, you’d rather have a good player for many seasons than a great player for fewer years, because beyond four wins above replacement, each additional win adds less and less.
The response to that article was overwhelming, which was surprising given that I invoked the word “derivative” multiple times in the piece. My sharp-thinking readers made many interesting suggestions, two of which I would like to explore here.
How Times Have Changed
The first suggestion I received was that perhaps the value of each additional win, in terms of playoff appearances, and ultimately, world championships, has changed in recent years. When I calculated Pennants Added in last week’s column, I only used data from 1995-2005, and perhaps the value of high-win players has declined in recent years, when eight teams out of 30 make the playoffs.
I think that’s a reasonable suggestion, so I decided to go back further in the data, all the way back to 1904, the first year in which every team played at least 154 games. I split up the data based on the number of teams and playoff spots in the league, and ended up with the following combinations:
8 Teams, 1 Playoff Spot: 1904-1960 AL and NL, 1961 NL
10 Teams, 1 Playoff Spot: 1962-1968 AL and NL, 1961 AL
12 Teams, 2 Playoff Spots: 1969-1976 AL and NL, 1977-1992 NL
14 Teams, 2 Playoff Spots: 1993 AL and NL, 1977-1992 AL
14 Teams, 4 Playoff Spots: 1996-1997 AL and NL, 1998-2005 AL
16 Teams, 4 Playoff Spots: 1998-2005 NL
As you may have noticed, I removed the strike years of 1972, 1981, 1994, and 1995, because the smaller number of games played in those seasons results in a wider distribution of wins.
I then expressed each team’s record as wins per 162 games, and ran six separate binary logistic regressions to figure out a team’s probability of making the playoffs based on the number of games it wins:
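For the curious, a regression of this sort can be sketched in a few lines. Everything below is illustrative: the (wins, made playoffs) pairs are invented, and the fit uses plain gradient ascent on the log-likelihood rather than whatever statistical package was actually used for the column.

```python
import math

# Illustrative sketch of one of the six regressions. The season data are
# invented; the column used real team-seasons, with wins scaled to 162 games.
seasons = [(70, 0), (78, 0), (84, 0), (86, 0), (88, 1), (89, 0),
           (90, 1), (92, 0), (95, 1), (97, 1), (98, 1), (101, 1)]

def fit_logistic(data, lr=0.005, steps=50_000):
    """Fit p = 1 / (1 + exp(-(a + b*x))) by gradient ascent on the
    log-likelihood, with wins centered at 90 for numerical stability."""
    a, b = 0.0, 0.0
    for _ in range(steps):
        grad_a = grad_b = 0.0
        for wins, made in data:
            x = wins - 90
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            grad_a += made - p        # gradient w.r.t. intercept
            grad_b += (made - p) * x  # gradient w.r.t. slope
        a += lr * grad_a
        b += lr * grad_b
    return a, b

a, b = fit_logistic(seasons)

def playoff_prob(wins):
    """Fitted probability of making the playoffs at a given win total."""
    return 1.0 / (1.0 + math.exp(-(a + b * (wins - 90))))
```

The fitted slope is positive, so the curve rises smoothly from near zero for bad teams to near one for great ones, which is all the method needs.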
The results jibe with our expectations: more teams make the playoffs harder to reach, as do fewer playoff spots. Just to put this all in perspective, here is how many games a team would have to win in each of these leagues to have a better than 50% chance of going to the postseason:
8 Teams, 1 Playoff Spot: 100
10 Teams, 1 Playoff Spot: 98
12 Teams, 2 Playoff Spots: 93
14 Teams, 2 Playoff Spots: 96
14 Teams, 4 Playoff Spots: 90
16 Teams, 4 Playoff Spots: 90
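Those cutoffs fall straight out of the fitted curves: with p(playoffs) = 1 / (1 + exp(-(a + b · wins))), the probability hits 0.5 exactly where a + b · wins = 0. A sketch with hypothetical coefficients (chosen so the cutoff lands at the modern leagues’ 90-win mark, not taken from the actual fits):

```python
import math

# Hypothetical intercept and slope for one league's logistic curve.
a, b = -27.0, 0.30

def playoff_prob(wins):
    return 1.0 / (1.0 + math.exp(-(a + b * wins)))

# The 50/50 point is where the exponent is zero: a + b * wins = 0.
cutoff = -a / b  # wins needed for an even-money shot at the postseason
print(round(cutoff))  # → 90
```

The same one-liner, applied to each of the six fitted curves, produces the table above.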
So it’s easier to make the playoffs these days than at any other point in baseball history; think that has anything to do with the record profits major league teams are raking in?
There are some other interesting points on this graph, but I’ll leave those for other commentators to discuss. What’s most important is that the critics’ theory seems right: If it’s easier to make the playoffs today than ever before, the value of a great player is mitigated by the fact that his team’s chances of winning it all were already not so bad.
In fact, let’s calculate just how many World Series championships a player would add to a randomly chosen team in each of these six leagues, by taking playoff appearances added and simply dividing by the number of playoff teams. (Technically, this is not quite right, because better teams have a better chance of winning the World Series, but the difference is trivial, so I won’t complicate things any more than I have to.)
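The conversion described above is a single division, under the simplifying assumption that every playoff team has an equal shot at the title. The inputs below are illustrative, not the article’s actual figures:

```python
# Titles added = playoff appearances added x (1 / number of playoff teams),
# assuming each playoff team is equally likely to win it all.
def world_series_added(pennants_added, playoff_teams):
    return pennants_added / playoff_teams

# The same pennant value is worth half as much in a league with twice
# as many playoff teams (hypothetical numbers).
two_spot = world_series_added(0.20, 2)
four_spot = world_series_added(0.20, 4)
```

This is why expansion of the playoff field, on its own, shrinks every player’s championship impact.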
The reason you only see five lines is that the World Series Added (WSA, for short) of a player in a 14-team league with four playoff spots is nearly identical to the WSA of a player in a 16-team league with four playoff spots, so one line covers the other.
As you can see, the value of an individual player, especially a great one, has fallen as baseball has expanded and added playoff teams. Babe Ruth in his prime could increase a randomly chosen team’s chances of winning it all by 35%; Barry Bonds helped by merely 7%. Strong evidence indeed that a great player has become less valuable in recent years.
But does that mean that in earlier times we would prefer a few seasons of greatness over many years of merely good play? To answer that question, we have to find the derivative of these lines; to make that easier, I fitted a polynomial equation to each line and differentiated that. In layman’s terms, the derivative tells us the change in value at each point: where it rises, each additional win is worth more and more in terms of World Series Added, and vice versa.
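The fit-and-differentiate step looks roughly like this. The WSA values below are a made-up S-shaped curve standing in for the real data, steepest near seven wins; only the mechanics are the point:

```python
import numpy as np

# Hypothetical WSA-by-wins curve: an S-shape with its steep part near 7.
wins = np.arange(0, 13, dtype=float)          # player wins above replacement
wsa = 0.35 / (1.0 + np.exp(-(wins - 7.0)))    # invented WSA values

fit = np.poly1d(np.polyfit(wins, wsa, 3))     # cubic fit to the curve
marginal = fit.deriv()                        # its derivative: value of each extra win

peak = wins[np.argmax(marginal(wins))]        # where one more win is worth the most
```

The derivative is a downward-opening parabola, so the marginal value of a win rises to a peak somewhere in mid-range and falls off on either side, which is exactly the shape being read off the graphs.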
Would you look at that? Through 1968, the value of each additional win peaked at about seven wins above replacement (WAR), but as the leagues expanded, that peak dropped to four or five. Indeed, in earlier days a great peak did carry more value than it does today, though I would be careful not to overstate it.
Addition or Subtraction?
One reader suggested that perhaps Pennants Added was not the correct approach, and that we should instead be looking at Pennants Subtracted. The idea is the same, but instead of looking at how much a team’s playoff chances improve when we add a player, we look at how much they decline when we subtract that player.
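The distinction can be sketched numerically. Everything below is my own reconstruction under two assumptions not in the column: team win totals roughly bell-shaped around 81, and playoff odds following a hypothetical logistic curve centered near 90 wins. Adding a player moves teams up the curve’s steep, convex lower half; subtracting him from the same distribution of teams moves them down it, and the two averages come apart most for the biggest win totals:

```python
import math

# Hypothetical playoff curve and team-strength distribution (illustrative).
def playoff_prob(wins, a=-27.0, b=0.30):
    return 1.0 / (1.0 + math.exp(-(a + b * wins)))

def team_weight(wins, mean=81.0, sd=9.0):
    """Unnormalized bell-curve weight for a team with this many wins."""
    return math.exp(-0.5 * ((wins - mean) / sd) ** 2)

teams = range(50, 111)
total = sum(team_weight(w) for w in teams)

def pennants_added(war):
    """Average jump in playoff odds from adding the player to each team."""
    return sum(team_weight(w) * (playoff_prob(w + war) - playoff_prob(w))
               for w in teams) / total

def pennants_subtracted(war):
    """Average drop in playoff odds from removing the player from each team."""
    return sum(team_weight(w) * (playoff_prob(w) - playoff_prob(w - war))
               for w in teams) / total
```

Under these assumptions, an eight-win player’s Pennants Subtracted comes out below his Pennants Added, and the shortfall grows with player quality, which is one way to see why the subtraction method treats great players more harshly.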
I thought it was an interesting idea, so I decided to take a look. The following graph shows the original Pennants Added graph that appeared in last week’s column (blue), and a graph of Pennants Subtracted (red):
Needless to say, Pennants Subtracted punishes great players far more than Pennants Added does (which is fairly neutral, actually). I am convinced this is not the right method, and I am certain that the people who voted Sandy Koufax into the Hall of Fame would agree.