One of the most interesting fields of study in baseball over the last few years has been that of pitch-framing, or pitch-receiving, or pitch-stealing, or whatever you want to call it. This is the stuff that’s made Jose Molina nerd-famous, and it’s drawing more attention with every passing month. Framing has been discussed on ESPN. It’s been discussed on MLB Network. It’s been the subject of countless player interviews, and what’s been revealed is that a great amount of thought and technique goes into how a catcher catches a pitch. Catchers don’t just catch the baseball. They catch the baseball with a purpose.
Research has uncovered a few outliers, like Molina and Jonathan Lucroy and, say, Jesus Montero and Ryan Doumit in the other direction. It’s interesting these guys can be given such different strike zones, since the strike zone is supposed to be consistent for everybody. And it’s interesting that, as much as people come up with run values in the dozens, it’s hard to identify the actual effect. For example, Rays pitchers this year have allowed a higher OPS throwing to Molina than when throwing to Jose Lobaton, the other guy. Last year, Molina again had the worst numbers. It reminds me too much of Catcher ERA for my tastes, but you’d think you’d see something. Instead, you see little. Where is the value going?
In this post, I’ll ask more than I answer, and this should be a jumping-off point for other, better researchers. It seems to me the effect of pitch-receiving is often overstated, and I’ve been searching for explanations why. I do have one theory, and before we get to that, a refresher on the same old familiar home-brewed stat. The stat is named Diff/1000, and it’s the difference between actual strikes and expected strikes per 1,000 called pitches, as derived from plate-discipline data available on FanGraphs. A positive Diff/1000 means a player or team got more strikes than expected. A negative Diff/1000 means the opposite. Whenever I calculate Diff/1000 now, I adjust it to set the league average at zero, since the unadjusted league average is always below zero, by a decent amount.
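For concreteness, here’s a sketch of how a stat like this could be computed from the FanGraphs plate-discipline columns. The expected-strike rule below (every in-zone pitch, plus every swing at an out-of-zone pitch, is treated as an expected strike) is a simplifying assumption, as are the function names; treat it as a sketch, not necessarily the exact rule behind the numbers in this post:

```python
def diff_per_1000(pitches, zone_pct, o_swing_pct, swing_pct, strike_pct):
    """All rates as fractions, e.g. 0.493 for a 49.3% Zone%.

    Assumed rule: expected strikes = in-zone pitches + out-of-zone swings.
    """
    expected = pitches * zone_pct + pitches * (1 - zone_pct) * o_swing_pct
    actual = pitches * strike_pct
    called = pitches * (1 - swing_pct)  # pitches the umpire had to call
    return 1000 * (actual - expected) / called

def adjust_to_league(diffs):
    """Shift a list of raw Diff/1000 values so the league average is zero."""
    mean = sum(diffs) / len(diffs)
    return [d - mean for d in diffs]
```

The league adjustment matters because, as noted above, the raw league average comes out below zero.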
I went back and collected all team pitching data from between 2008 and 2013, and I calculated Diff/1000. Then I plotted Zone% against that stat, yielding the following graph:
As teams have been better at receiving — that is, as they’ve generated a greater rate of extra strikes — they’ve thrown a lower rate of pitches in the strike zone. The relationship is remarkably strong — given what we’re dealing with — and the bottom fifth in Diff/1000 have an average Zone% of 51.3%. The upper fifth in Diff/1000 have an average Zone% of 48.4%. Broken into quintiles of 36 teams:
- Quintile 1: 48.4% Zone%
- Quintile 2: 49.4%
- Quintile 3: 49.8%
- Quintile 4: 51.0%
- Quintile 5: 51.3%
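The quintile averages above can be reproduced mechanically. Here’s a generic sketch (field names are placeholders, and the underlying team-season data isn’t included here):

```python
def quintile_means(rows, key, value, n_bins=5):
    """Sort rows (dicts) by `key`, best first, and return the mean of
    `value` within each of `n_bins` equal-sized groups."""
    ordered = sorted(rows, key=lambda r: r[key], reverse=True)
    size = len(ordered) // n_bins
    return [
        sum(r[value] for r in ordered[i * size:(i + 1) * size]) / size
        for i in range(n_bins)
    ]

# e.g., with rows like {"diff_1000": ..., "zone_pct": ...} for each of
# the 180 team-seasons, quintile_means(rows, "diff_1000", "zone_pct")
# would give the five Zone% averages listed above.
```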
I think there are three possibilities: The first is the most obvious. If a team knows it can get strikes off the edge, it’ll pitch to the edge more often. A pitcher will be more likely to miss the strike zone because the catcher might set up on the fringes, knowing he’s capable of turning some of those pitches into strikes. A strike on a pitch on the edge of the zone — or out of it — is a good strike, because those are difficult pitches to hit. Pitchers won’t stay in the zone if they don’t have to.
Another explanation goes in the reverse direction. Maybe pitchers have been getting extra calls because they’ve been pitching to spots out of the zone. Maybe instead of framing earning pitches, pitches have “earned” framing. But then, it’s been demonstrated that there are different receiving techniques, and some are better than others, so the causation can’t run only one way. The first two explanations are kind of related to one another.
And then there’s the possibility that this is just circular. Zone% is used in the calculation to derive Diff/1000, so maybe that’s just what I’m picking up. I am posting this knowing I might get exposed as an idiot, but at least then I could learn something from being called out. I noted before that I’m mostly asking instead of answering. I’d love to know if I’m screwing something up, because I’m interested in this though I’m not all that intellectually powerful.
Let’s move forward. Teams that have been better at receiving have thrown a lower rate of pitches in the zone. So even though they’re getting extra strikes on pitches off the edge, the lower zone rate should offset some of those gains in the overall strike rate. Let’s plot team strike rate against Diff/1000:
Here, we see almost no relationship. There’s a slight positive slope, but here are those same quintiles, from best to worst Diff/1000:
- Quintile 1: 63.3% Strike%
- Quintile 2: 63.0%
- Quintile 3: 62.7%
- Quintile 4: 63.0%
- Quintile 5: 62.8%
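The “almost no relationship” claim is checkable with a simple least-squares slope; here’s a minimal sketch, no external libraries required:

```python
def ols_slope(xs, ys):
    """Least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var
```

Run over the 180 team-seasons with Diff/1000 as x and Strike% as y, a slope near zero would say the extra receiving strikes don’t show up in the overall strike rate; the first plot, with Zone% as y, should show a clearly negative slope.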
In terms of actually getting strikes, the best receiving teams have barely been better than the worst receiving teams. This is because the worst receiving teams have thrown more pitches in the zone than the best receiving teams. This seems like it could be where a lot of the value is going. Good receivers have generated good numbers of extra strikes, based on the pitches they’ve caught, but they’ve caught a lot of pitches out of the zone so the overall strike rate still looks fairly normal. Which could explain why we don’t observe that much of a boost.
Of course, a strike that’s called is better than a strike that isn’t called, because a non-called strike could be a ball in play, and balls in play can be bad. While there’s little difference in overall strike rate, there’s a greater difference in called-strike rate, and that’s why it would be better to be good at receiving than bad. But I’m still skeptical that we’re dealing with differences of dozens upon dozens of runs. It looks like a chunk of the added value is lost from throwing more pitches out of the zone, but I’m open to other ideas. This is not the last word on anything.
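To make the offsetting concrete, here’s a toy calculation. The two Zone% figures are the quintile averages from above; every other rate is invented purely for illustration:

```python
# Toy arithmetic: invented strike rates, real quintile Zone% figures.
# A good-receiving team throws fewer pitches in the zone but squeezes
# extra called strikes out of the ones off the edge; the two effects
# can nearly cancel in the overall strike rate.
pitches = 1000
good_zone, bad_zone = 0.484, 0.513  # quintile Zone% from above
in_zone_strike = 0.80               # assumed strike rate on in-zone pitches
oz_strike = 0.46                    # assumed out-of-zone strike rate
framing_bump = 0.03                 # assumed receiving bonus off the edge

bad_strikes = pitches * (bad_zone * in_zone_strike
                         + (1 - bad_zone) * oz_strike)
good_strikes = pitches * (good_zone * in_zone_strike
                          + (1 - good_zone) * (oz_strike + framing_bump))

print(round(bad_strikes, 1), round(good_strikes, 1))  # similar totals
```

With these made-up rates, the two teams end up within a handful of strikes per 1,000 pitches of each other, despite the good receivers earning extra calls on every out-of-zone pitch.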
As a quick case study, Derek Lowe got a ton of extra strikes from 2008 to 2011, when Russell Martin and Brian McCann were his primary catchers. In 2012, those extra strikes went away, as Lowe joined the Indians. Between 2011 and 2012, Lowe’s Zone% increased from 37% to 47%. You can see some differences in the following heat maps, in which 2012 is on top and 2011 is below:
Lowe couldn’t pitch off the edge as much, which cost him out-of-zone strikes. But he just pitched in the zone more often, such that his overall strike rate barely changed. Of course, Lowe was also terrible in 2012, so this could be a case where Lowe depended on quality receiving. Maybe, in this case, framing was a huge help. As a different case study, there’s Kyle Lohse, who’s joined this year’s Brewers, albeit after pitching to Yadier Molina. The Brewers lead baseball this year in Diff/1000, and Lohse’s Zone% is down from 51% to 49%. His strike rate is exactly the same. And so on and so forth.
I hope that others keep writing and keep researching, because this stuff is interesting. Good and bad pitch-receiving, unquestionably, makes a difference. I just badly want to know how much of a difference it makes. I’m unsatisfied that we have the real answer right now.