Ode to the ‘80s

Yeah, yeah, I know it was the era of big hair, synthesized drums, St. Elmo’s Fire, and James G. Watt. Believe me, I remember all that entirely too well.

But not everything about the 1980s is worthy of being lampooned on VH1. The Major League Baseball being played in the 1980s was terrific: not just a high quality of play, but an especially compelling style of play as well. For my money, the style of play generally presented in the 1980s was as good as it gets, a more interesting mode of baseball than that which we see today.

Preference in style of play is a completely subjective matter, of course. It’s entirely a question of taste to prefer a high-scoring game or a low-scoring game, to prefer to see a stolen base or a home run. There is no objectively right or wrong about this kind of thing.

So if you happen to prefer the kind of baseball we see in the current era over that which we saw 20 years ago, you’re entitled to that opinion. You aren’t wrong. (You’re misguided, maybe … but if you think St. Elmo’s Fire is a good movie, then you really are wrong.) Nevertheless, I’d like to present some comparisons that demonstrate exactly how the style of play has changed since the 1980s, and offer up my reasons for preferring the earlier mode.

Let’s compare the mode of Major League Baseball presented in the past four seasons (2002 through 2005) with that of the four seasons exactly 20 years earlier (1982 through 1985).


Period	R/G
1982-85	4.30
2002-05	4.69
Delta	+ 9%

A 9% increase in overall major league scoring rates may not seem like a lot, but in historical terms it truly is. The run scoring of the early 1980s was very close to the historical average of all seasons since 1901: 4.38 runs per team per game. The scoring of the early 2000s, even though it’s come down from the 1994-2000 period, is far higher than the historical major league norm. Only 30 of the 105 individual seasons since 1901 have featured higher rates of scoring than the average of the past four seasons. The kind of scoring we’re experiencing now is not at all typical in historical terms; indeed, it’s extremely high.
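For what it’s worth, the delta rows in these tables are plain percentage changes; a minimal sketch of the arithmetic, using the values from the scoring table:

```python
# The delta rows in these tables are simple percentage changes
# from the 1982-85 figure to the 2002-05 figure.

def pct_delta(old, new):
    """Percentage change, rounded to the nearest whole percent."""
    return round((new - old) / old * 100)

rpg_1980s = 4.30   # runs per team per game, 1982-85 (from the table)
rpg_2000s = 4.69   # runs per team per game, 2002-05 (from the table)

print(pct_delta(rpg_1980s, rpg_2000s))  # -> 9
```

The same computation produces every delta figure that follows.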

Being extreme isn’t a bad thing in and of itself. But extremely high scoring, while it may be seen as providing extremely great action for fans, also by definition presents an extremely atypical balance between runs and outs, an atypical presentation of the tension surrounding the expectancy of scoring. If 4.69 runs per game is a better brand of baseball than 4.30, then wouldn’t 5.69 be better still? Wouldn’t each team scoring an average of 10 runs per game be even better than that? How about 20?

If you agree with me that more runs doesn’t automatically mean more interesting baseball, then you agree with me that there is a concept of an optimal balance between the action of offensive success and the tension of its uncertain achievement. You agree with me that, fun and exciting as run scoring is—I sure wouldn’t be happy with a mode of play in which the typical score was 1-0—there is a point at which high scoring becomes too much scoring, when run production becomes so frequent as to become less interesting. More runs mean more blowouts, fewer close games, and comebacks that are less thrilling because they’re less surprising.

So then the question becomes simply one of whether 4.30 or 4.69 is closer to the point of optimal balance, of providing the most interesting kind of game. From my perspective, 4.69 is a little too high. I don’t feel really strongly about it, but I think the types of games one typically sees in a 4.30 scoring environment are more interesting, demonstrating a better balance between runs and outs, than those typical of a 4.69-run landscape. Personally, I think the sweet spot is somewhere around 4.40. For what it’s worth, major league seasons since World War II in which scoring was between 4.30 and 4.50 have been 1991, 1986, 1985, 1983, 1979, 1977, 1970, 1962, 1960, 1959, 1957, 1956, 1955, 1954, and 1947.


Okay, so how about a comparison of the frequency and kind of hits batters produced in the early 1980s versus that of the early 2000s:

Period	H/G	1B/G	2B/G	3B/G	HR/G
1982-85	8.86	6.31	1.51	0.23	0.80
2002-05	9.05	6.07	1.82	0.19	1.07
Delta	+ 2%	- 4%	+ 20%	- 20%	+ 33%

Hits are occurring at a similar rate now versus 20 years ago, only about 2% higher in the current era. But the form the hits are taking is dramatically different: nowadays we’re seeing fewer singles, significantly fewer triples, significantly more doubles, and, most vividly of all, a lot more home runs.

Make no mistake: I love home runs. The home run is a very thrilling play, a tremendous athletic feat that provides instant game-changing impact. Home runs are great.

But home runs bring with them a few downsides. First is that the over-the-fence version of the home run (which constitutes about 99.9% of modern home runs) is a play in which the ball is not touched by a fielder. It involves no catch, no throw, no relay, no slide, no tag, no baserunning decision or activity of any kind.

Let’s contrast this with the triple: the ball is hit (generally struck extremely well, in fact) into the field of play. At least one outfielder engages in frantic pursuit of the ball as it freely bounces and ricochets about. An outfielder finally collects the ball, and a sequence of relay throws unfolds. The batter, sprinting around the basepaths (in stark contrast with the slowly jogging batter on a home run), makes a split-second on-the-fly decision (the slightest hesitation will likely doom him) to try to make third base. The relay man makes a split-second decision to hold the ball or fire it to third, and usually he makes the throw. The runner steaming into third makes a split-second decision to go in standing or sliding (and may be frantically instructed in this regard by his third base coach, making a split-second decision). The third baseman usually braces himself to receive the relay throw, often a tricky short hop, with the runner usually sliding in, churning up a cloud of dust. The umpire positions himself and makes a call, which may be very close.

Which play features more action, is more exciting to watch? Much as I love home runs, to me this one’s a no-brainer: the triple is vastly more interesting. And yet we’re seeing distinctly fewer triples now than 20 years ago; indeed, we’re seeing fewer triples now than at any point in history. From a fan’s perspective, this hasn’t been a good development.

Doubles are up, and that’s a good thing. Doubles aren’t quite as fun as triples, but they’re terrific. But singles are down slightly. All in all, you have to love home runs even more than I do to see the decidedly double- and home run-centric current-day hitting profile as clearly positive. That’s because all these home runs bring along with them some other downsides …

Balls and Strikes

Period	BB/G	SO/G
1982-85	3.20	5.22
2002-05	3.27	6.42
Delta	+ 2%	+ 23%

Okay, we all know that walks are a key component of on-base percentage, and OBP is highly correlated with scoring runs. I love it when the team I’m rooting for draws lots of walks.

But come on, let’s face it: walks are dull. If the batter takes the bat off his shoulder at all, the best he produces is a foul ball. The pitcher fails to find the strike zone on four pitches at which the batter declines to swing. After sitting through this, we’re rewarded with the scintillating spectacle of the batter dropping his bat and jogging (if not strolling) down to first base. Meanwhile, the fielders are all standing around, scratching their you-know-whats. As positive offensive events go, this one is as far as it could possibly be from, oh, say, the triple.

And walks are, of course, a side effect of home runs. It’s actually a bit surprising to me that their frequency is only up as modestly as it is from 20 years ago, but it is up. This isn’t an exciting development.

Another side effect of home runs is, of course, strikeouts: batters who swing for power do so at the risk of striking out a lot. This can be a very sensible tradeoff, and teams properly concern themselves with maximizing run production rather than aesthetics. But the fact remains: with the dramatic increase in home runs has come a dramatic increase in strikeouts.

And like walks, strikeouts are big-time dull. The ball is never put into play; every defender other than the pitcher and catcher is a spectator. Neither fielding nor baserunning takes place. Just as the walk is the dullest of positive offensive outcomes, the strikeout is the dullest of positive defensive outcomes. And in the current era, we’re witnessing more strikeouts than ever before, as well as more strikeouts and walks combined than ever before. On this count, give me the 1980s anytime.

And this isn’t the only downside of all the home runs.


Period	SBA/G	SB%
1982-85	1.12	67%
2002-05	0.78	70%
Delta	- 31%	+ 4%

The more likely your team is to hit a home run, the less cost-beneficial the stolen base becomes. A home run scores every baserunner, from first base just as surely as from second or third. So in a home run-heavy offense, the benefit of extra bases is minimized, and the cost of losing baserunners is as great as ever. The dramatic reduction in steal attempts in the current era compared with the 1980s is entirely logical, and the fact that increasingly only the most proficient stealers are making attempts explains the improved success rate.
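This cost-benefit logic can be sketched as a break-even calculation. The run-expectancy values below are illustrative round approximations, not figures from this article; the point is the structure of the tradeoff, not the exact numbers:

```python
# Break-even stolen-base success rate, sketched with illustrative
# run-expectancy values. These are rough approximations for the relevant
# base-out states, not figures taken from the article's data.

re_first_0out = 0.85   # expected runs: runner on 1st, nobody out
re_second_0out = 1.10  # expected runs: runner on 2nd, nobody out
re_empty_1out = 0.27   # expected runs: bases empty, one out

gain = re_second_0out - re_first_0out   # value of a successful steal
loss = re_first_0out - re_empty_1out    # cost of being caught stealing

# Break-even success rate p satisfies: p * gain = (1 - p) * loss
breakeven = loss / (gain + loss)
print(round(breakeven, 2))
```

Under these assumed values the break-even rate lands around 70%, which is consistent with the success rates in the table above: attempts that clear that bar barely pay, and the bigger the home run looms, the less the marginal base is worth.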

But whether attempting a stolen base is smart or not, one thing remains constant: stolen base attempts are extremely fun. Beginning with the will-he-or-won’t-he-go intrigue, through the cat-and-mouse of the pitcher attempting to hold the aggressive runner close (and perhaps pick him off!), to the attempt itself, with the catcher forced to quickly unload a hard and accurate throw, with the infielder forced to cover the bag, to receive the throw and make the tag. The chance of an errant throw, and yet another extra base, is very real. Safe or out, the stolen base attempt is a wonderfully exciting play. And we’re seeing more than 30% fewer of them nowadays than we did 20 years ago. There’s simply no getting around that this development has taken a lot of interesting action out of the game.

Moreover, while I don’t have the data to quantify it, it’s very reasonable to deduce that, as stolen base attempts have dramatically diminished, so have extra-base attempts on batted balls: the cost-benefit equation is exactly equivalent to that of stealing bases. We’re witnessing less daring, less challenging baserunning in today’s baseball. While I respect the logic behind this, it’s a way in which the game has become less compelling to watch.

We can’t leave the discussion of baserunning without addressing the issue of artificial turf. I won’t beat around the bush here: I hate artificial turf, and its far more prevalent deployment in the 1980s than today is a big demerit against the baseball of the 1980s. But as somebody once wrote (I’m not sure, but I think it was Bill James in one of his 1980s Abstracts), I don’t like artificial turf itself, but I love what it does to the game. Speed is fun, triples are a blast, and aggressive baserunning is really great to watch. To the extent that the artificial turf of the 1980s encouraged that mode of play, that was the one good thing about the ugly, unpleasant, stupid fake grass.


In the early 1980s, the complete game reached an historic low. Fewer than one start in five was finished.

Period	CG/G	SV/G
1982-85	0.16	0.23
2002-05	0.04	0.25
Delta	- 76%	+ 10%

Little did we know, however, that compared to a couple of decades later, the 1980s would be the Golden Age of the complete game. In the current era, complete games are occurring just a quarter as often as they were then, veering perilously close to complete extinction.

A complete game for its own sake isn’t very sensible. There are obvious reasons to remove a starting pitcher from a game (though there are also good reasons to preserve the bullpen). But the achievement of a complete game can be very fun to watch. Seeing a starter working a strong performance into the seventh, into the eighth, into the ninth, wondering if he’ll be able to sustain his effectiveness through 27 outs, is great theater. The current-day mode of virtual certainty that he’ll be removed has largely removed a fun element from the game.

And as we’ve examined many times, it isn’t just the starting pitcher in today’s game who routinely works a shorter stint than before. Relievers as well, in all their various modern guises—LOOGY, setup man, closer, and so on—work shorter stints too. Bullpens are bigger than they were 20 years ago, and they’re deployed more liberally as well: one is more likely to witness a pitching change today than ever before. Indeed, a game without at least two pitching changes per team, often mid-inning pitching changes, is a rarity today.

Setting aside the wisdom of this development, let’s consider it from the standpoint of pure entertainment. When you talk about the highlights of last night’s ball game with your friends, do you ever mention how thrilling those pitching changes were? Do you ever say, man, those eight warmup tosses the second LOOGY made were killer? Didn’t think so. Pitching changes are a good excuse to run to the bathroom, but that’s about the best that can be said for them for spectator appeal.

And, as to the wisdom of this frantic bullpen deployment and specialization … vastly fewer complete games, therefore many more opportunities for saves, yet the elaborate closer-centered bullpens of the 2000s are producing saves just 10% more often than the smaller, simpler bullpens of 20 years ago. Consider me underwhelmed.


Fielding remains the most elusively difficult element of the game for statistical analysts to master. It is also, in my opinion, generally the least appreciated element of the game. We fans marvel at the feats of major league hitters and major league pitchers, and we understand the crucial importance of their contributions. But too often, I think, we take for granted the tremendous skill of major league fielders and overlook the degree to which successful teams depend upon their work.

The fielding-specific differences between the major league game of today and that of 20 years ago are, perhaps fittingly, the most difficult for us to get a handle on, but they also might be the most interesting.

Period	BIP/G	DER	BABIP	E/G	DP/G
1982-85	28.1	.702	.287	0.82	0.94
2002-05	26.8	.693	.298	0.65	0.94
Delta	- 5%	- 1%	+ 4%	- 21%	0

First of all, let’s consider the raison d’être of fielders: balls batted into the field of play. The increase in both home runs and strikeouts over the past 20 years has yielded a meaningful reduction in balls in play. Current-day teams have to contend with only about 95% of the volume of batted balls handled by their counterparts of the early 1980s.

And yet, despite handling more balls in play, those defenses of a generation ago achieved a slightly higher rate of Defensive Efficiency (the proportion of batted balls converted into outs). Looking at it another way, the defenses of the 2000s surrender a slightly higher Batting Average on Balls in Play (hits minus home runs, divided by at-bats minus home runs and strikeouts). Likely there are two key explanations for this. One is that current-day hitters, obviously more powerful than those of 20 years ago, hit the ball harder (when they aren’t striking out). Another is that current-day fielders, bulkier than those of the 1980s, selected more for power than for speed, don’t exhibit quite as much defensive range as those of the 1980s.
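The two rate stats just defined can be written out directly. The season totals below are invented round numbers, used purely to exercise the formulas:

```python
# BABIP and Defensive Efficiency (DER), per the definitions in the text.
# The totals below are hypothetical round numbers, not real season data.

ab, h, hr, so = 5500, 1450, 170, 1000   # hypothetical team-season totals

# Batting Average on Balls in Play: hits minus home runs,
# divided by at-bats minus home runs and strikeouts.
babip = (h - hr) / (ab - hr - so)

# DER is the share of balls in play converted into outs; setting aside
# errors and other details, it is roughly the complement of BABIP,
# which is why the two columns in the table mirror each other.
der_approx = 1 - babip

print(round(babip, 3))       # BABIP allowed by this hypothetical defense
print(round(der_approx, 3))  # its approximate Defensive Efficiency
```

The mirroring isn’t exact in the real table (.702 vs. 1 − .287 = .713) precisely because errors and reached-on-error plays sit in the gap between the two measures.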

The clearest difference between the fielding of the two periods is the rate of errors: fielders of today commit dramatically fewer. There is simply no way to assess this fact without concluding that fielders of the 2000s are exhibiting superior sure-handedness and throwing accuracy over their 1980s counterparts, as a distinct illustration of improved quality of play. To the extent that this is the case—and it may well be—this is a clear argument in favor of the 2000s.

But I strongly suspect there’s more to it than just this. For one thing, there are 5% fewer balls being hit into play in the 2000s than in the 1980s—right there is a reason for fewer errors to be committed. Second, as we saw, significantly fewer runners are attempting to steal (and presumably just as many fewer are attempting to take extra bases on batted balls), which has created a significantly lessened opportunity for throwing errors. I have no trouble believing that the overall quality of play today is better than it was in the early 1980s, but there’s no way in the world that the quality of play has increased by over 20% in 20 years, as the sheer reduction in error rate might indicate. This is an intriguing issue.

Interesting as well is the fact that through all this, the rate of double plays between the two periods is identical. More stealing (and presumably more hit-and-running and bunting) in the 1980s would make double plays more difficult to come by in those years, but more strikeouts and home runs in the 2000s would appear to have made them equally more difficult to come by in the current era.

So, Baseball Is Like a Box of Pizza

To the extent, whatever it is, that the athleticism and skill excellence of players today is better than at any point in the past, then in that regard we’re treated today to the best possible baseball-watching experience. But style of play is distinct from quality of play. Quality issues aside, I assert that the type of baseball the major leagues presented in the early 1980s was more compelling and more interesting than the type of baseball we’ve seen in recent years: it offered a better balance between runs and outs, a better balance between power and speed, a livelier, more challenging game for fielders, and a more entertaining, less tedious deployment of pitchers. And the quality of play we saw in the 1980s, while it might well have been lower than today’s, wasn’t much lower. Twenty years isn’t a very long time in that regard.

Don’t get me wrong. I’m not saying I don’t like the baseball of today. Far from it: I love today’s game. The issue here is similar to the agreement my son and I have on the subject of pizza. He and I take our pizza quite seriously, and we engage in endless debates about the superiority of this crust, that sauce, these toppings. But, we always say, even the worst pizza is still great. It’s pizza, after all.

References & Resources
Why did I choose to compare the four seasons of 2002 through 2005 with the four seasons of 1982 through 1985? I’d have included 2001 and 1981 as well, and made it a nice clean half-decade to half-decade comparison, but 1981 was, alas, a severely strike-shortened season, and its statistics were a little bit messed up. So a four-year span to four-year span it is instead, and the essential issues come through anyway.
