It’s no secret that strikeouts are up, league-wide. And they aren’t only up from where they were in the past — they continue to rise, having risen eight consecutive years. The average batter last year struck out about once per five trips to the plate. Compared to 2005, last year there were five more strikeouts per two games. There is, of course, a limit to how high the strikeout rate can go, and strikeouts in 2013 were barely up from where they were in 2012, but still, it’s an important trend, and for some people it’s a worrisome one. I know that former colleague Rob Neyer has expressed much concern over how many strikeouts there are in the game today.
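That last figure is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch — the league strikeout rates and the plate-appearance count below are my own approximations, not numbers taken from this article's dataset:

```python
# Rough check: how many extra strikeouts per two games does the
# 2005 -> 2013 rise in strikeout rate imply?
# Approximate league-wide figures (assumptions, not the article's data):
k_rate_2005 = 0.164   # ~16.4% of plate appearances ended in a strikeout
k_rate_2013 = 0.199   # ~19.9% in 2013 (about one K per five trips)
pa_per_game = 76      # ~76 plate appearances per game, both teams combined

extra_k_per_game = (k_rate_2013 - k_rate_2005) * pa_per_game
print(round(2 * extra_k_per_game, 1))  # prints 5.3
```

That lands right around five extra strikeouts per two games, consistent with the comparison above.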
No one disagrees about the fact. Strikeouts are recorded by record-keepers, and the frequency of strikeouts is up. The real question concerns the explanation. Plenty of people have provided plenty of possible reasons for why the game has gone as it has. For example, a year ago, Dave noted a connection between the recent rise in strikeouts and the roll-out of PITCHf/x. The system allows for an accurate evaluation of the zone, and as it turns out, the zone has been growing. That’s some fascinating research, right there, but I wonder: how might we be able to isolate the strikeout effect of PITCHf/x? We can’t, really, but we can try. We just need to look somewhere else.
And by “somewhere else”, I mean we can look to the minor leagues. It’s obviously far from a perfect comparison, because it’s not like the minor leagues are the same as the majors, only without one variable. They’re very different player pools, and as we know, what a guy can do in Triple-A might not mean much when it comes to determining what he can do in the bigs. Also, the umpires are different. But, importantly: every park in the majors has a PITCHf/x setup. Few parks in the minors can say the same, and there’s certainly no harm in seeing what the numbers might tell us.
Below, a couple of graphs. I plotted the big-league strikeout rate for non-pitchers since 1996. I've also plotted the strikeout rates for Triple-A and Double-A since 1996, since I consider those the high minors; they ought to be the closest approximations. Maybe there'd be something interesting in examining the data from Single-A, but that's too far removed for my liking.
That y-axis ranges from 0% to 25%. Let's pull things in a little bit more, to get a better look at the differences or the similarities:
Obviously, we already knew strikeouts have gone up in the majors. The increase was steady through 2011, then it suddenly jumped more than a full percentage point. Before PITCHf/x was widely installed, strikeouts had stayed more or less the same for a number of years. It’s certainly a curious link.
Quickly, of note: check out 2001. That year, big-league umpires were instructed to call the rulebook strike zone, with fewer strikes off the plate and more strikes up, above the belt. It generated a lot of conversation in spring training, and there was a spike in big-league strikeout rate. Interestingly, there was a bigger spike in minor-league strikeout rate, but then everything normalized back in 2002. It was certainly a short-lived adjustment, as PITCHf/x would later confirm in its initial season.
Anyway, back to the point. You can see a recent increase in strikeouts in the majors, and in Triple-A, and in Double-A. In 2009, the rates were all basically identical. Since then, at all three levels, rates have gone up about two percentage points, give or take a little bit. Viewed in that way, you’d think PITCHf/x has just about nothing to do with this at all. How could we blame PITCHf/x, if there’s been something similar going on in places without it?
But look at the data before PITCHf/x. Strikeout rates were higher in the high minors than they were in the majors. We’ve had reliable, widespread PITCHf/x for six years, so let’s look at a few six-year windows.
Before PITCHf/x — Majors: 16.2% strikeouts
With PITCHf/x — Majors: 18.2% strikeouts
Averages don’t capture the fact that strikeouts continue to rise. They do, however, capture the fact that, before PITCHf/x, there was a higher strikeout baseline in the high minors than there was in the majors. Strikeout rates are comparable now, but a few years ago players struck out more in the high minors than in the majors, which means the increase in the majors has been more dramatic. Seems to me that’s probably the fault of a bigger zone. Seems to me that’s probably the fault of umpires getting regular strike-zone feedback from their superiors.
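The windowed comparison above amounts to taking simple means of a season-level rate over six-season spans. A minimal sketch of that calculation — the season-by-season rates here are hypothetical placeholders for illustration, not the article's actual data:

```python
# Compare the average strikeout rate across two six-year windows.
# Hypothetical per-season rates (placeholders, NOT the article's data):
k_rate = {
    2002: 0.166, 2003: 0.163, 2004: 0.165, 2005: 0.164,
    2006: 0.167, 2007: 0.172,  # pre-PITCHf/x window
    2008: 0.177, 2009: 0.180, 2010: 0.184, 2011: 0.185,
    2012: 0.197, 2013: 0.199,  # PITCHf/x window
}

def window_mean(rates, start, end):
    """Simple mean of a season-level rate over seasons start..end inclusive."""
    return sum(rates[y] for y in range(start, end + 1)) / (end - start + 1)

before = window_mean(k_rate, 2002, 2007)
after = window_mean(k_rate, 2008, 2013)
print(f"{before:.1%} -> {after:.1%}")
```

A simple mean treats every season equally, which is exactly why, as noted above, it flattens out a continuing year-over-year rise within each window.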
A strikeout-rate increase in the high minors preceded the strikeout-rate increase in the majors. Between 1996 and 2005, strikeouts in the majors stayed the same. In Triple-A and Double-A over the same span, they were up 1.4 percentage points. It makes sense that we’d see something follow in the majors — some of those players striking out more get promoted, or, alternatively, some of those players doing the striking out from the mound get promoted. The whole idea is that there should be players in common, here, so we should be able to spot the same trends, in theory.
The trend is that strikeouts are up in the minors. They’re up more in the majors, especially since the introduction of PITCHf/x. The explanation, as you probably could’ve guessed, is complicated, with multiple parts. I think it’s possible for everyone to be right. Umpires have changed their zones, probably on account of the computer measurements. Pitchers are throwing harder. Pitchers might be throwing more sliders. Batters might be more willing to swing and miss, or be more patient, and teams might be more willing to tolerate strikeouts from guys at the plate. Teams have a good understanding of the importance of strikeouts for pitchers, so they’ll select guys who make batters go away, and the result is what we have today. I can’t call it the “end result”, because we don’t know if we’re at the end. Probably not. I imagine strikeouts will only continue to go up. It’s more a matter of identifying the cap. A cap does have to exist, someplace. And umpire strike zones can improve only so much.