Personally, I think athletes in all fields have gotten much better over the last 100 years and will continue to do so. I’m not sure Babe Ruth would crack a starting roster nowadays, and I think someone like Davis might have hit 80 home runs back in 1925. Neither assertion can be proved, however.

The underlying issue here is the comparison to the league average, which is an attractive but misleading benchmark. If the statistical distribution of home runs changes from year to year, the average will mean a completely different thing each year. To get a consistent benchmark, you would have to model the statistical distribution and determine how statistically rare a particular batter is relative to that distribution, not just relative to the average. By using the league average you are essentially taking this distribution, chopping it off at an arbitrary point (i.e. the threshold for making the majors), and taking the average of what remains. The error introduced by that method, especially when the distribution is a sharply skewed one (as it certainly is with home runs), will be huge. Of course this method would be a lot more work, but as you say, if you want to go part of the way toward benchmarking, you may as well contemplate what a full solution would look like.
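To make the point concrete, here is a minimal sketch of the two benchmarks side by side. The home-run totals below are made-up numbers purely for illustration, and the cutoff mimics the arbitrary majors threshold; the idea is just that a ratio to the truncated average and a z-score against the full modeled distribution answer different questions.

```python
import statistics

# Hypothetical HR totals for a full talent pool (invented numbers
# for illustration only), including the sub-majors tail.
pool = [0, 0, 1, 1, 2, 2, 3, 3, 4, 5, 6, 7, 9, 11, 14, 19, 29, 54]
star = 54  # the batter we want to benchmark

# Benchmark 1: ratio to the average of players above an arbitrary
# cutoff -- this is what "compared to league average" effectively does:
# it chops the distribution and averages what remains.
cutoff = 3
league = [hr for hr in pool if hr >= cutoff]
ratio_to_league_avg = star / statistics.mean(league)

# Benchmark 2: statistical rarity against the full modeled
# distribution (here crudely summarized by a z-score; a real model
# would fit a skewed distribution rather than assume normality).
mu = statistics.mean(pool)
sigma = statistics.pstdev(pool)
z_score = (star - mu) / sigma
```

Note that the ratio moves whenever the cutoff moves, while the z-score depends only on the modeled distribution, which is the consistency the comment is after.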

1) They are more consistent in the production of baseballs, i.e. they wind them tighter with more regularity. It has been documented that tighter-wound balls travel farther.

This also goes for bats.

2) Presumed lack of steroids in the 1920s. Bonds definitely juiced. Babe? Not likely.

These are accounted for in Dave’s calculation of HR+, and I believe they are significant. I just cannot account for his conclusion, which is contrary to his own statistics.
