Are Teams Benefiting From Relievers Pitching Less?

Yesterday, Brian Kenny and I spent a few minutes talking about relief pitchers on Clubhouse Confidential, and specifically, about the differences in the role of a middle reliever versus a closer. Both Kenny and I believe that the idea of a “closer mentality” is mostly a myth, but we do spend some time talking about why some guys aren’t cut out for the traditional closer role. If you want to watch the segment, I’ve embedded it after the jump, and will expand on one of the things I said on the show below that.

When Kenny asked me what the best way to gain value from a relief ace was, I pointed out that I preferred the method of bullpen usage that was in place before the rise of the save as a statistic of importance. Obviously, the structure of the bullpen has changed a lot over the last 30 years, and the adoption of specialist relievers and one-inning stints has led to larger pitching staffs and far more frequent pitching changes. Managers are more aggressive than ever in exploiting platoon advantages and limiting the number of innings their relievers work in order to increase their effectiveness when they do pitch.

Back before the creation of the modern bullpen, it wasn’t at all unusual to see a reliever throw 100+ innings in a season. In fact, Bob Stanley’s 1982 season is one of the most interesting years a pitcher has had in quite a while – he appeared in 48 games, went 12-7, racked up 14 saves, and threw 168 1/3 innings in the process. He didn’t start a single game the whole year, but he finished fourth on the team in total innings pitched and threw just seven fewer innings than Mike Torrez, who started 31 games for the Red Sox that season. For comparison, Stanley faced 694 batters in ’82, while Jeff Samardzija led all Major League relievers in 2011 with 380 batters faced.

The change in bullpen usage is the biggest difference in the sport now compared to 30 years ago. For reference, here’s the average number of batters faced per relief appearance for each year since 1982:

Season BF/G
1982 7.23
1983 7.09
1984 6.94
1985 6.74
1986 6.63
1987 6.51
1988 6.38
1989 6.31
1990 6.08
1991 5.80
1992 5.54
1993 5.39
1994 5.40
1995 5.38
1996 5.39
1997 5.14
1998 5.08
1999 5.17
2000 5.19
2001 4.91
2002 4.92
2003 4.93
2004 4.85
2005 4.65
2006 4.71
2007 4.60
2008 4.65
2009 4.58
2010 4.41
2011 4.37
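
If you wanted to reproduce a figure like this yourself, here is a minimal sketch of the calculation, assuming a pandas DataFrame of pitcher-game logs; the column names (season, games_started, batters_faced) are hypothetical stand-ins for whatever your play-by-play source provides:

```python
import pandas as pd

def bf_per_relief_appearance(logs: pd.DataFrame) -> pd.Series:
    """Average batters faced per relief appearance, grouped by season."""
    relief = logs[logs["games_started"] == 0]  # keep relief appearances only
    return relief.groupby("season")["batters_faced"].mean().round(2)

# Tiny made-up rows, just to show the shape of the calculation:
logs = pd.DataFrame({
    "season":        [1982, 1982, 2011, 2011],
    "games_started": [0, 0, 0, 0],
    "batters_faced": [9, 6, 5, 4],
})
print(bf_per_relief_appearance(logs))  # 1982 -> 7.5, 2011 -> 4.5
```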

The downward trend is so strong that the average is lower than the previous year’s in almost every pair of consecutive seasons in the timeframe. And, despite the fact that modern bullpen roles have been well established for quite a while, the dwindling rate of batters faced per appearance shows no signs of slowing down. While the drop from 1982-1991 was the most extreme, each of the last two decades has seen the league shed an additional half a batter per relief appearance, and given that we’ve now seen teams carry 13 pitchers at times, there seems to be no end in sight to this trend.

Teams have transitioned away from a few pitchers carrying large loads (the 1982 Red Sox got 1,453 innings by using just 14 different pitchers, while the 2011 Red Sox got 1,457 innings from 27 pitchers) to a model where a lot of pitchers carry significantly smaller loads. In 1982, there were 64 relievers who faced 300 batters or more – in 2011, there were 25. And yet, the share of innings pitched by relievers as a whole hasn’t changed all that much, increasing from 30.6% in 1982 to 32.6% last year. The change in bullpen management has been much more about redistributing relief innings from few pitchers to many, rather than asking bullpens to carry a larger share of the load, even as they take up a larger portion of the roster.
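
The two redistribution measures cited here are easy to compute from the same kind of hypothetical game-log frame sketched above; innings and pitcher_id are again assumed column names, not fields from any particular data source:

```python
import pandas as pd

def relief_share_of_innings(logs: pd.DataFrame, season: int) -> float:
    """Fraction of a season's innings that were thrown in relief."""
    year = logs[logs["season"] == season]
    relief_ip = year.loc[year["games_started"] == 0, "innings"].sum()
    return relief_ip / year["innings"].sum()

def heavy_workload_relievers(logs: pd.DataFrame, season: int,
                             min_bf: int = 300) -> int:
    """Count relievers who faced at least min_bf batters in a season."""
    relief = logs[(logs["season"] == season) & (logs["games_started"] == 0)]
    per_pitcher = relief.groupby("pitcher_id")["batters_faced"].sum()
    return int((per_pitcher >= min_bf).sum())
```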

Since teams have dedicated more roster spots to relief pitchers in order to facilitate situational match-ups and minimize the wear and tear on their best arms, they’re clearly expecting some kind of return on that investment. To justify the extra roster spots and the redistribution of innings, they’d need to see a performance improvement that makes the change pay off. And, really, with pitchers facing fewer batters, you’d expect them to be able to throw harder and exploit platoon advantages for better results overall. The trade-off should be more quality for less quantity.

But, looking at the numbers, we don’t really see much evidence that the modern bullpen has helped relievers perform better at all.

Season BB% K% HR/9 BABIP LOB% ERA- FIP-
1982 9.1% 14.7% 0.68 0.276 73.9% 89 95
1983 9.3% 14.7% 0.70 0.279 73.1% 93 97
1984 9.2% 15.2% 0.72 0.279 74.0% 91 98
1985 9.6% 15.1% 0.80 0.276 72.5% 96 99
1986 9.9% 16.2% 0.81 0.285 73.3% 95 98
1987 9.7% 16.9% 0.96 0.287 72.3% 94 96
1988 9.2% 15.9% 0.71 0.281 73.5% 95 99
1989 9.3% 16.7% 0.65 0.277 73.5% 92 95
1990 9.5% 16.1% 0.74 0.283 73.5% 93 98
1991 9.6% 16.5% 0.77 0.279 73.1% 95 99
1992 9.7% 16.1% 0.68 0.280 74.4% 94 99
1993 9.7% 16.9% 0.84 0.292 71.7% 96 98
1994 10.1% 17.1% 0.98 0.298 71.1% 98 100
1995 10.3% 17.8% 0.99 0.294 72.3% 96 99
1996 10.5% 18.3% 1.03 0.295 71.9% 94 98
1997 10.0% 18.5% 0.98 0.302 72.4% 97 99
1998 9.8% 18.2% 0.96 0.291 72.2% 95 98
1999 10.6% 18.1% 1.09 0.296 72.0% 95 98
2000 10.7% 17.9% 1.09 0.297 71.3% 96 97
2001 9.4% 19.2% 1.05 0.287 72.9% 93 96
2002 9.8% 18.5% 0.96 0.287 73.3% 94 97
2003 9.5% 18.1% 1.00 0.289 72.8% 94 97
2004 9.5% 18.7% 1.04 0.292 73.1% 93 96
2005 9.4% 18.1% 0.97 0.291 72.8% 96 98
2006 9.4% 18.6% 1.03 0.295 72.8% 93 97
2007 9.6% 18.8% 0.90 0.295 71.7% 94 96
2008 9.9% 19.2% 0.94 0.293 72.8% 95 98
2009 10.1% 19.5% 0.94 0.292 73.3% 94 98
2010 9.6% 20.3% 0.89 0.294 73.5% 96 98
2011 9.4% 20.6% 0.85 0.287 74.3% 94 97
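
For readers unfamiliar with the two index stats in the table: ERA- and FIP- express a pitcher’s ERA and FIP relative to the league, where 100 is average and lower is better. A simplified sketch of the math follows; it ignores the park adjustments FanGraphs applies and treats the seasonal FIP constant as a given input:

```python
def fip(hr: int, bb: int, hbp: int, k: int, ip: float, c_fip: float) -> float:
    """Fielding Independent Pitching; c_fip scales league FIP to league ERA."""
    return (13 * hr + 3 * (bb + hbp) - 2 * k) / ip + c_fip

def minus_stat(value: float, league_value: float) -> float:
    """ERA- or FIP-: 100 is league average, lower is better."""
    return 100 * value / league_value

# e.g. the 2011 reliever ERA- of 94 says relievers prevented runs about 6%
# better than the league as a whole, park effects aside.
print(round(minus_stat(3.50, 3.70)))  # hypothetical 3.50 ERA in a 3.70 league -> 95
```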

Over the last thirty years, walk rates by relievers are essentially unchanged. They went up a bit when the home run barrage took over in the late 1990s, but have gone back down as home runs have become less common. The ratio of walks to home runs has stayed pretty consistent over the last thirty years, and there’s certainly no evidence that the modern-day bullpen has helped pitchers avoid the base on balls.

On the other hand, strikeout rate has skyrocketed, increasing by 40% since 1982. This would seem to support the idea that relievers can be more effective in shorter stints, and that playing the match-ups can help prevent run scoring. However, there’s a problem with that theory – the strikeout rate of starting pitchers has gone up 41% over the same time frame. While strikeout rate has been rising at the same time that the modern bullpen has been evolving, this seems to be a case where correlation is not causation. If starters are seeing the same rise in strikeout rate, that points to a more fundamental shift among hitters – more sluggers swinging for the fences, and organizations increasingly accepting the strikeout as just another out – rather than a specific benefit relievers are getting from their new roles.
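
Those growth figures are simple percentage changes; here is the reliever side, computed from the strikeout column in the table above:

```python
k_1982, k_2011 = 0.147, 0.206   # reliever K% from the table above
print(f"{(k_2011 - k_1982) / k_1982:.0%}")  # -> 40%
```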

Likewise, relievers don’t appear to be generating much of a benefit when hitters do put the bat on the ball. Home run rates have risen at a rate similar to what starting pitchers have experienced, and batting average on balls in play has likewise increased significantly over the years. While relievers do post slightly lower BABIPs than starters (theorized to be the result of being able to throw harder in shorter stints), this was true even when relief pitchers were throwing multiple innings and carrying much heavier workloads.

In fact, if you look at the sum of the components (in the table above, that’s ERA- and FIP-), there’s just no evidence that bullpens are preventing runs at a better rate now than they were before the current roster construction norms came along. Any improvement in performance by the elite relievers has been offset by the fact that more innings are now being given to inferior arms, so the trade-off has produced no real net benefit.

We’re told that defined roles are supposed to make a reliever’s job easier by giving him a usage pattern he can adapt to. This makes sense from an intuitive standpoint, but the results don’t really show much of an effect. Teams have essentially taken two roster spots away from position players and handed them to the bullpen without seeing a tangible improvement in performance from their relievers overall.

The ROI on the modern bullpen just isn’t there. At some point, someone is going to get back to using relievers the way they were used 30 years ago. The current paradigm takes up too many roster spots and simply shifts innings from your best arms to your worst ones. There’s a time and place for playing the match-ups, but if you have a guy who can get batters out from both sides of the plate, there’s no reason he should only be used for an inning at a time. Instead of ridiculing the Braves for how they used Jonny Venters and Craig Kimbrel, perhaps we should be applauding them for refusing to give in to a trend of roster usage that just hasn’t provided any real sustained benefit.




Dave is a co-founder of USSMariner.com and contributes to the Wall Street Journal.


86 Responses to “Are Teams Benefiting From Relievers Pitching Less?”

  1. Marty says:

    This stuff really shows me that MLBN is starting to “get it” in a way that ESPN and others absolutely do not. Great segment.

  2. bgrosnick says:

    Dave, I think this article is tremendous. I really enjoyed it, and would love to see continued delving into how relief pitching can deliver more ROI.

    I do, however, have a small question regarding Atlanta’s use of Venters and Kimbrel. In 2011, Venters faced 4.2 batters per game, which is less than league average. Kimbrel faced about 3.9, also less.

    Though these two pitchers pitched in a lot of games, they didn’t actually face a lot of batters, comparatively. Is it more important that elite relief pitchers be used in a large number of games, or face a large number of hitters?

    • Marty says:

      This is an interesting point because it seems intuitively like there really should be a difference between throwing 100 innings in 100 games and 100 innings in 50 games because of days of rest, etc.

    • Dave Cameron says:

      Right, I’d rather see longer appearances than more appearances. But, my point in bringing those guys up was more about the reaction to increasing the workload on your best relievers and being willing to lean on them in a more significant way than we’ve seen recently, which was met with a large round of criticism. I think that kind of move should be applauded rather than laughed at.

      • Jason says:

        I don’t think you can control for SO% by comparing to starting pitchers, because SPs have also seen their usage decline, haven’t they? It’s possible that both SPs and RPs are missing more bats due to reduced usage.

  3. TheOneWhoKnocks says:

    The benefits I was looking for were things you didn’t explore in your research. Such as: does it keep your good relievers fresher down the stretch (Sept-October)? Does it keep players like Venters and Kimbrel healthier long term? Does it mean you are using your worst relievers in the lowest leverage games and situations (games where you are winning big or losing big), meaning you get to use your best pitchers in close games? This shows us that bullpens as a whole are performing similarly, but I don’t think it’s anywhere close to enough information to say there has been no benefit.

    • Dave Cameron says:

      The health thing is something we just can’t really answer, since no one has an historical injury database that predates the modern bullpen. We’d have to use proxies like career longevity to stand in for health, but even then, we’d be dealing with issues of differing medical technologies, advances in surgeries, and other structural changes that we can’t really quantify. And, unfortunately, it’s not obvious one way or another whether pitchers are healthier now than they used to be. Clearly, we still have a lot of pitchers going down with arm problems, so the modern bullpen hasn’t fixed that issue.

      In terms of leverage, that’s actually been declining for a while as well. The average LI for when a reliever entered the game in 1982 was 1.42, but it was just 1.17 last year. The same trend holds true even when you filter by relief pitcher type. For example, I grabbed all reliever seasons since 1982 with 40+ IP and an ERA- of 85 or better, then clustered them together by year. In 1982, the average gmLI of that group was 1.58, while last year, it was 1.39.
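
      (A rough sketch of that filter, for anyone who wants to replicate it; the column names are hypothetical stand-ins for FanGraphs leaderboard fields:)

      ```python
      import pandas as pd

      def elite_reliever_gmli(seasons: pd.DataFrame) -> pd.Series:
          """Average entry leverage (gmLI) of elite reliever seasons, by year."""
          elite = seasons[(seasons["ip"] >= 40) & (seasons["era_minus"] <= 85)]
          return elite.groupby("season")["gmli"].mean().round(2)
      ```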

      • Phils_Goodman says:

        “The average LI for when a reliever entered the game in 1982 was 1.42, but it was just 1.17 last year.”

        Couldn’t that lower leverage be thanks to the effectiveness of the previous middle reliever, and the tendency of bullpens to run into high-leverage jams with less frequency nowadays?

      • B N says:

        It might be worthwhile to look at it in terms of pitcher wear vs. performance: e.g. does a modern bullpen pitcher maintain his effectiveness longer into the season than the older style did. That should be possible by doing some general trend tests (e.g. Kendall) and seeing if the modern pen is significantly more effective in the latter parts of the season than the non-modern pen.

        It would be an interesting analysis, though I can’t say I’d expect the results to show any significant differences. Even if there were significant differences, that might just indicate that a playoff club might be best suited by babying their relievers for the first half of the season and riding them harder at the end. To an extent, this already happens as you tend to see playoff-contention and playoffs having longer stints by the better relievers (e.g. Mariano will do 2 innings in the playoffs, but never in the regular season).

  4. Rob says:

    This is the best article I’ve ever seen on this website, and there have been hundreds of good ones.

  5. Bryce says:

    One possible explanation is that the top relievers really are helped by throwing fewer innings, but in the aggregate data, this effect is washed out by the resultant increase in the use of bad pitchers. If the top relievers are appearing in the highest-leverage situations, this transition could be beneficial on net.

    One way to measure this would be to look at the variance in individual reliever performance: if it has increased (while the average remained roughly the same), it would seem like a good bet that high-leverage reliever performance has improved.

    • J-Dog says:

      I think this is a good point. I wonder if reliever performance in high leverage situations has improved over the years. This might suggest that the trend might create value after all.

    • Brendan says:

      An excellent point, getting the same stats out of a group that includes a lower average talent level is improvement.

      • B N says:

        I don’t know if that’s the case though. Getting the same stats out of more people is unlikely to improve your team. If I could roster one pitcher and he could pitch 9 innings every game, performing as well as my entire pitching staff used to… well, I’d have a lot of roster spots for OF platoons, I’ll tell you that.

        I think the aggregate performance is really what is important. I could care less if you’re maximizing the value of worse pitchers: this isn’t a tee-ball game where everyone should get a chance to play. If I could get equal total performance and not play those worse pitchers, I’d be a fool to play them just because I can get them to play as well as the guys I already have.

  6. Richie says:

    Yes, leverage is the first thing to examine here. Better performance from your relief stars traded for the same amount of worse performance from your relief dregs is actually a productive tradeoff.

    • JMag043 says:

      I agree. I think it would be important to study how much work the best two or three relievers on a team have gotten. They are going to get the high leverage situations, whereas it is pretty irrelevant to count the stats of your worst reliever coming into a game down by 6 runs and giving up 4 more. There must be some sort of minimum leverage cutoff.

      If your closer is protecting one run leads and your mop-up guy has a 5.00 ERA, you’re better off with them than having two guys who fall somewhere in the middle stat wise.

      • Cidron says:

        might be hard to do, as “the best” may fluctuate during the course of the season due to injuries, matchups, and development, among other reasons.

  7. Will says:

    But, what if shorter appearances are leading to greater career longevity? If true, those two extra pitchers wouldn’t necessarily be intended to improve bullpen performance on a seasonal basis, but rather to extend the useful life of effective relievers. In other words, having a few lower-end relievers may cost better relievers innings in the short-term, but end up allowing them to pitch more over their careers.

    • Richie says:

      That should still show up in better relief performance over the years. Which we aren’t seeing in this data. And given how often setup men change teams, I’m not sure how important it is to Manager Richie to longevize the career of a guy just as likely to be pitching against me a few years hence.

      • Will says:

        Not necessarily, especially if the better relievers are being saved for higher leverage. It’s kind of like throwing Boone Logan to the wolves so Mariano Rivera can hang back and protect the flock. Also, in this day and age of proactive GMs and influential agents, I don’t think managers have the leeway to overuse relievers.

      • The Real Neal says:

        We are seeing better relief performance, a 40% increase in strikeout rate. It just so happens that the hitters have also increased their performance.

      • The Real Neal says:

        I should have also added that if relievers are treading water over the last 30 years, it is while combating the efforts of MLB to make the game “more exciting” by increasing the scoring environment.

        It’s not just that hitters have decided to go with a rake-and-take approach; with the advent of weight lifting and kiln-dried maple bats, that approach becomes increasingly easier to pursue.

        Even if the strike zone is the same size as it was 30 years ago, I am willing to bet a large amount of money that batted ball speeds are substantially higher than they once were.

    • Dave Cameron says:

      The shelf life of a premier reliever is still pretty short, and with the way that salaries have dramatically increased for closers due to the rise of the save, this wouldn’t be a very good trade-off even if it was true. A team would be much better off getting 100 innings from a relief ace when he’s making the league minimum than getting 50 that year and then paying him $10 million to give them 50 more in six years when he’s a “proven closer”.

      • chuckb says:

        Right, with the way relievers have been paid and the manner in which they move from team to team, what incentive does a team have to try to promote career longevity at the expense of immediate quality? In other words, why save a reliever’s innings for when he’ll be more expensive and/or with another team?

      • Will says:

        I wonder if the shelf life is still as short as it has been in the past. Also, I think with relievers, there is a premium for consistency, which is why I think it can be better to pay more for a sure thing. The chances of finding a Papelbon at the league minimum aren’t great, so that’s why teams pay for the real thing. Do they wind up spending more? Sure. But they also mitigate the risk of winding up with an awful reliever. I don’t think you can analyze this question from an aggregate basis, because with a late inning reliever, the specific is relevant.

    • It’s great how an excellent article spawns so much intellectual discussion.

    • Baltar says:

      Managing players for career longevity is a sucker’s move, maybe unless you’re the Yankees. After 6 years, he’s going someplace else anyway.
      In the case of relievers, it makes even less sense, as hardly any of them have long, successful careers anyway.

      • Barkey Walker says:

        You do realize that how the team uses their players is broadcast on TV, right? Sometimes you have to feed the long term goal at the expense of the short term goal.

  8. Richie says:

    Also, if starters have actually performed worse than 30 years ago while relievers have stayed the same, that would suggest that the new usage pattern is helping relievers better stave off the greater offensive forces.

  9. Fred says:

    After extensive statistical research I think I found the perfect comp for Dave Cameron

    http://www.thehunchblog.com/wp-content/uploads/2011/05/d-quasi.png

    • Kupe says:

      LOL, poor Dave’s not exactly an Adonis. Remarkable similarity.

      • Parksie says:

        Too true! Also, what’s up with Brian Kenny’s grey, ruddy grill? Don’t get me wrong – compared to Cameron he’s basically a young Peter O’Toole – but some of these people are relegated to the internet & talk radio for a reason.

  10. jim says:

    good work here; loving you on CHC, definitely the best baseball show on television

  11. Neil says:

    Essentially it comes down to a simple question: which is more harmful to arms, pitching multiple innings in fewer games, or fewer innings in more games?

    I think it is the latter, as warming up in the bullpen must be factored into the process.

    Great article Dave.

    • hk says:

      Agreed. I wonder how many pitches a typical reliever throws in the bullpen to get ready to enter a game and the amount of effort that goes into a warm-up pitch relative to a game pitch.

  12. Bill@TPA says:

    I agree that this is great work, but this is a really interesting response by Tango: http://www.insidethebook.com/ee/index.php/site/comments/do_relievers_today_perform_better_because_they_have_shorter_outings/

    Seems like we’d need to isolate the top relievers somehow, to make sure it’s not just all the extra innings the fringy guys get to throw that’s dragging the modern numbers down.

  13. razor says:

    The first thing I thought of after reading this article is how much money starting pitchers have made (on average) over the last 30 years compared to their relief counterparts. On a percentage basis, has the salary divide between starting pitchers and relievers grown?

    I suspect organizations are also trying to protect their bigger investments. I’m not saying I think it’s justifiable, I just think maybe that also has something to do with bullpen usage. Just a hunch really. I’d love to see the salary data.

  14. Dan says:

    Dave, fantastic article. Wouldn’t your thesis actually justify contracts like those given to Papelbon, IFF the Phillies use him like a pre-1982 reliever? Or if the Yanks used Soriano in a similar way?

    The smaller the bullpens get, and the more innings relievers pitch, the bigger premium on effective relievers, right?

  15. chuckb says:

    This is truly great stuff, without a doubt one of the best posts at fangraphs.

    I wonder if the explanation for this sort of usage might correspond to position players’ roster spots. So we know that relievers, by being used in these roles, aren’t performing any better. However, they’re not performing any worse either. The natural tendency, therefore, is to suggest that teams are misallocating their roster spots by having so many relievers and so few position players. I tend to believe this is probably the case.

    What if, however, the position players that would take the last few roster spots (if teams had 10, rather than 12 or 13, pitchers, for example) are worse than they were 30 years ago? In other words, by allocating more playing time to these 2 or 3 extra position players, teams perform worse than by allocating these roster spots to the additional pitchers. Having fewer position players forces teams to use their best players more often so maybe the “gain” from using the team’s bullpen in this manner is really in the fact that teams’ best players play more defensive innings and get more PAs. I’m just speculating here so I might be very wrong.

  16. The Book says:

    I covered all of this several years ago. Not news.

  17. Jesse says:

    Seems like a tough way to study this issue: with fewer batters faced per pitcher, you gotta have more bodies in the pen. Ceteris paribus, that should decrease the talent pool somewhat.

    The rest could impact the performance of individual relievers as your title seems to inquire about.

    • Snapper says:

      That’s the point. You use your good relievers less, so even if they improve their per-IP performance, it gets washed out by the crappy 6th and 7th RPs you have to use.

  18. Dave says:

    Surely graphic representations of the numbers would serve better than tables

  19. BDF says:

    Just so cool to see you on TV, Dave.

  20. BoSoxFan says:

    Dave, Carson noticed that you almost never blink http://www.fangraphs.com/not/index.php/dave-cameron-blink-watch/
    I was watching, and you did not blink in this video. What is your freakish capability to never blink?

  21. Bath says:

    What about the fact that the league has become larger since the early ’80s? With the addition of the 5th rotation slot and a handful of new organizations, the talent level that appears in bullpens has to be at least a little watered down.

    • JMag043 says:

      Could it be that the best are getting better and the worst (who are used in lower leverage situations) are getting worse? That would leave the aggregate about the same as a whole.

  22. Alfredo says:

    Interesting, one thing I’d wonder is if there’s any difference in how often relievers are getting hurt in the last 30 years. Teams might be incentivized to pursue this new strategy if they think it will keep their best relievers healthier. Obviously that shouldn’t matter if the results are the same, but it might explain why it’s happened this way.

  23. Mike Green says:

    Well done. The pens of 2011 have another natural advantage over the pens of 1982 beyond the shorter stints. The fact that they are longer means that benches are shorter, which means that there are fewer ways that managers can gain platoon advantages against a pen. In 1982, it was common for a manager to bring on a LHH as a pinch-hitter for a weak right-handed hitter against a right-handed reliever, for the opposing manager to come on with a lefty and for the first manager then to pinch-hit for the pinch-hitter with a RH pinch-hitter. You could call it an Earl Weaver special. This happens very rarely now because of short benches and hence the prevalence of LOOGies…

  24. DrBGiantsfan says:

    Are SPs facing fewer batters per start, and is this a causative factor in their increased K rate?

  25. Cidron says:

    To properly finish this writeup, your next article should be along the lines of those two roster spots given to relief pitchers, and not the dregs of the bench. What is the lost bench benefit, production, etc of those two (or more) spots. Take it from the batters/bench angle. Less positional flexibility as well. A pair of articles (pitching angle here, and batters) would paint a more complete picture.

  26. joser says:

    Great article. This leads me to wonder about what we might call the corollary: if the rise of large bullpens has cost teams bench spots for position players, to what extent has that harmed teams? If we assume all teams have more or less matched each other (over time) in robbing bench spots to make room for more relievers, have we seen a reduction in late-inning scoring due to the lack of bench bats or defensive replacements? Or have those thin benches been balanced out by fringe-y relievers on the other side?

  27. Luke says:

    I knew Dave Cameron would look just like he did on TV. Congrats, DC; one day someone suave and stat-centric will come in and handle things from here on out.

  28. cowdisciple says:

    I wonder about pitch counts. The average number of pitches per PA has also increased over the same time period, right? How much of the variance is explained by that?

    • cowdisciple says:

      According to B-R, pitches per game increased by a bit over 8% from ’88 to ’09. It makes sense that teams would need more pitchers, because they need more pitches thrown.

  29. Scott Batura says:

    Normalizing the relievers’ K% against the starters’ K% might actually be drowning out one of the advantages of modern pitcher use. In general, more innings are being pitched by relievers, which reduces the strain on starters and could certainly raise their K%. So maybe the shorter relief outings raised the relievers’ K% as well.

    Hokey, I admit, but simply stating that the matched increases in K% between starters and relievers makes them a product of pitching environment might be missing something.

  30. steve-o says:

    wow, some real sycophants haunting these parts.

  31. reillocity says:

    I keep thinking that we’ll soon see a renegade team experiment with going back in the other direction. I could see that team first go with an 11-man pitching staff and 14 position players, with one of those position players acting as the emergency mop-up man given his recent background as a pitcher (think Brian Bogusevic or Adam Loewen). If that club were smart, they’d use that extra position player spot to create a formidable platoon at a position where they didn’t have a solid everyday guy (think something along the lines of Jeff Francoeur and Seth Smith sharing one corner outfield spot). The thinking would be that you could get such a bad-splits player 3 plate appearances with the platoon advantage, and then you’d have a deep enough bench to take that player down for an opposite-handed bat if and when the opposing manager wanted to recapture the platoon advantage.

  32. Rob says:

    The thing that bugs me most about how managers use their bullpen is they will NOT bring in the closer anymore before the 9th inning or their primary setup guy before the 8th (although you’ll see it happen some in playoffs). What if the game situation is more critical in the 7th inning, or 8th? Then you get to watch your 3rd or 4th best reliever blow the game wide open. ’84 Tigers essentially had a 2 man bullpen and Sparky would use Lopez or Hernandez wherever he felt he needed to in order to win. Of course he overdid it a little and their arms fell off later, but he won a World Series.

    • Baltar says:

      A very good point, Rob.
      I just cannot understand why managers have made themselves slaves to the save rule.
      In a game a couple of years ago, the TV cameras would show the closer every time he got up and started warming up and every time he would sit down again. Whenever the save rule applied, the closer got up; when it no longer did, he sat down.
      The announcers acted like that was smart managing, but to me it was like a clown act in a circus.
      In a football game, if a team’s starting QB got injured or was having a lousy day, would the coach put in his 3rd-string QB until the last 2 minutes, then bring in the 2nd-string QB (and only if his team was leading)?

  33. Socrates says:

    Really good article. Very interesting, and it clearly makes the case against a change in baseball that everyone pretty much just assumes is beneficial. I do have two questions (for the record, I have not read all the comments, so these points may have already been addressed).

    1) the increase in K% is correctly discounted based on a similar increase from starters. I would be interested to see if BB% not changing much implies any change compared to starters.

    2) I am curious if the overall rates are affected by the fact that in 2011 most teams have pitchers who pitch ONLY in games where there is virtually no chance they are going to win. This is another level of specialization; it is rare that any one reliever throws more than 4 innings in a game, but there are relievers who “suck” on every team and never pitch in close games. I am not sure these guys existed in 1982. For instance, clearly Kimbrel and Venters were VERY good in 2011, but is the improvement the Braves got from using them in a limited role offset by the fact that they gave innings to guys like Proctor and Linebrink, which in turn makes it look like there was no value in preserving their better relievers? (Most teams have their versions of good and bad relievers.)

    • Socrates says:

      I ran the numbers on point #1.
      1982 Starters: BB% 7.91
      2011 Starters: BB% 7.46

      So BB% has improved among starters about as much as it improved among relievers.

      Of course after that exercise I realize that the only real important stat is run prevention vs league. ERA- and FIP- pretty much sum up the argument.

      On point #2, I am still curious if we can figure out innings in close games vs innings in games that are out of reach. Essentially, I wonder if we are sacrificing quality in games that are out of reach so that we can have higher quality in games that are close.

  34. Phils_Goodman says:

    I would have liked to see a WPA column on your tables, just in case there were real benefits to recent relief strategies that were being offset in the aggregate by “mop-up” innings.

  35. Luke M. says:

    “Instead of ridiculing the Braves for how they used Jonny Venters and Craig Kimbrel, perhaps we should be applauding them for refusing to give into a trend of roster usage that just hasn’t provided any real sustained benefit. ”

    I might agree with this if the complete opposite weren’t true. Fredi does let the current trend of roster usage dictate his bullpen management. He refuses to let Venters or O’Flaherty pitch more than one inning at a time. He is a slave to the 7-8-9 delegation and refuses to deviate. I don’t mind the number of appearances between those three guys, but I do mind that Fredi used all three on the same night when he could have gotten away with only using two (or none) of them.

  36. Nick says:

    Following up on Luke’s excellent post, Fredo mismanaged the bullpen to the extent that it cost the Braves a playoff berth. While I will defer to Mr. Cameron regarding which stats and sample sizes are most meaningful, I think it’s worth looking at Venters’ and Kimbrel’s WPA for September relative to the rest of the season.

    The numbers don’t need to tell the story, as it was obvious to anyone who actually watched the Braves play that both of these guys were completely worn out due to overuse by the manager.

    One of these days Mr. Cameron might actually see the error in his opinion about how these guys were used in 2011 – maybe, but I won’t hold my breath.

    • bstar says:

      No, the Braves team cost themselves a playoff berth. Their starting pitchers compiled a 4.55 ERA in Sep/Oct, their worst month of the year. The hitters compiled a .657 OPS in that same time period, also their worst of the year. To your point, the bullpen was just as bad: a 4.17 ERA, another monthly worst. What about O’Flaherty/Venters/Kimbrel? Their ERA combined was 3.12. While this number is off the spectacular results they produced the rest of the year, it kinda points to the three best relievers being the least of the Braves problems that last month.

      Great article, Dave.

    • Cidron says:

      No. Just as saying a basketball team that missed the final bucket “lost the game on the final bucket” isn’t right, neither is that statement. They had to play just badly enough to put themselves in a position to lose it in that manner. Had they put more distance between themselves and the next team… Can’t blame a 162-game season on a few outs, basically.

      • Luke M. says:

        I think losing Jurrjens and Hanson, plus having Lowe turn into a lemon was more detrimental than the tired bullpen, but I will not completely discount the fact that the three top bullpen arms were noticeably gassed.

      • bstar says:

        Uhhhhhh, O’Flaherty didn’t allow a run the last month of the season.

  37. SouthPawRyno says:

    Excellent article; definitely one of the best on this site (which is saying something). I’m also glad to see MLBN is finally starting to have more comprehensive shows on baseball, rather than being an ESPN channel focused on baseball!

  38. Interesting article. I’d like to make this point:

    You use the rising SO rate of both relievers and starters as part of your argument that nothing has changed for the latter, so why should it for the former. Yet starters have been pitching far fewer innings over the last two decades too, just like relievers. Isn’t it more likely that the changes in pitching, in general, including pitchers throwing harder as a group and increasing their SOs, led to a decrease in innings pitched for both starters AND relievers?

  39. night_manimal says:

    Dave, I was wondering whether how deep starters have gone into games over the years has anything to do with the reduction in batters faced by relief pitchers. If starters are averaging three fewer batters faced each game, that’s an additional 21 batters somebody needs to face each week. Maybe because starters aren’t lasting as long as they did in 1982, or finishing games completely (734 CG in ’82 vs 173 in 2011), it’s actually become just as much a necessity over a long season as it is a strategy.

    It would also be interesting to see ERA- and FIP- broken down inning by inning: 6th+7th vs the 8th vs the 9th. Usually there are more changes made in the later innings to maximize match-ups in these high leverage situations. Maybe WPA is the way to go instead, since that does a good job of breaking things down. Are the long relief/swing men of today not up to snuff compared to the long relief/swing men of the 1980s? Are the extra pitchers used for high leverage situations in the 8th and 9th better today than they were 30 years ago?

  40. Doug says:

    I think the elimination of the bunt in baseball has had an impact on the pitching. Teams don’t see the advantage to bunting anymore. I think the use of the relief pitcher today is a positive thing; you can’t compare across generations with any kind of accuracy. Teams that have won the Series the last couple of years had good, improved pens. The Cards and Giants beat the Phillies by outlasting them, not by their starting pitching. There is no guarantee, though; the Rangers beefed up their pen and the Series came down to the last pitch twice. Ask them if relief pitching is important. There’s only one Mo, and he’s ready to retire. Off the subject, the biggest change they should make in baseball is sliding into home. They have changed the bean ball rule for pitchers, throwing them out after a warning, so why not this rule?

  41. Guy says:

    Isn’t WPA a good way to determine if relievers are producing more value now than in the past? Over the past 5 years, according to this site, relievers in MLB have generated an average WPA of about +50 per season. If we go back 30 years, and look at 1977-1981, relievers generated only about +10 WPA per season. I don’t know how reliable your historical WPA data is, but it sure looks like the current usage is generating a lot of additional value.

  42. Guy says:

    Following up, Fangraphs has WPA data going back to 1974. This is the approximate average reliever WPA for MLB per season:
    1974-86: +12
    1987-99: +24
    2000-11: +47
    Assuming this data is correct, then Dave’s conclusion that changes in reliever usage have not benefitted teams is false. It could still be true that adding more bench position players would be even more valuable than having a 12th or 13th pitcher (though I tend to doubt it), but shorter appearances do appear to have made relievers more valuable.

  43. kick me in the GO NATS says:

    What you have failed to mention is that teams have far better information on pitchers today than in the past. This better information is a huge change that has been offset by further specialization by pitchers. The fact that relief pitchers are not worse than they were 30 years ago is surprising to me. When you have a bigger roster of pitchers, you are more likely to have less talented pitchers in your pen. Better information allows teams to better exploit the pitchers they face.

    This article would be more interesting if you looked at the 5 best relief pitchers on teams: are they better or worse than in years past? I suspect those pitchers are on average far more effective. It’s the long relievers who are far worse, regardless of specialization.

  44. Andy says:

    Dave,

    Excellent analysis as always, but I think your analysis could be enhanced and more easily understood with better charts and graphs. This would aid the reader in understanding the story you’re trying to tell. Without your commentary, there’s no way anyone can read through the table and get those insights. This forces them to take your word for it. The human brain simply cannot find trends in a table of numbers.

    I’ve taken a stab at it for you here:

    http://vizwiz.blogspot.com/2012/03/are-teams-benefiting-from-relievers.html

    BTW, I ran into Theo Epstein on his first day as GM with the Cubs (we were both staying at the same hotel in Chicago). I recommended more visual analysis to him as well. I’m still waiting for his phone call.

    Keep up the great work!
