One thing I haven't seen much on, however, is David Gassko's interesting attempt to quantify differences between managers. The effects of managers (and coaches) are often discussed by the public and in the media. See, for example, all the prognostications of the Dodgers' resurgence this year because of the arrival of Joe Torre. But the actual effects of managers are hard to quantify, because their performance is confounded with the performance of their players. As a result, few studies have done a good job of quantifying manager effects, and even those that have tend to report fairly minimal ones.
In Gassko's article, he succeeds by keeping his focus narrow: all he tries to do is address the question of whether some managers get their players to perform better than others. He does not address issues of strategy. He uses a fairly simple projection system to predict player performance each year (it includes an aging curve, park factors, league adjustments, past performance, etc.). Next, he compares how players actually did, relative to their projections, to get a "runs above self" statistic for each player season. Finally, he sums up the runs above self statistics for all players playing for each manager in history. The result is a statistic that allows one to detect differences in manager performance.
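To make the aggregation step concrete, here's a minimal sketch with invented player names and numbers: each player-season carries a projected and an actual runs total, "runs above self" is the difference, and those differences are summed per manager. The 10-runs-per-win conversion is the standard sabermetric rule of thumb, assumed here rather than taken from Gassko's article.

```python
# Sketch of the per-manager "runs above self" aggregation.
# All players, managers, and numbers below are hypothetical.
from collections import defaultdict

# (player, manager, projected_runs, actual_runs)
seasons = [
    ("Player A", "Manager X", 60.0, 72.0),
    ("Player B", "Manager X", 45.0, 41.0),
    ("Player C", "Manager Y", 80.0, 77.0),
]

runs_above_self = defaultdict(float)
for player, manager, projected, actual in seasons:
    # positive = player beat his projection under this manager
    runs_above_self[manager] += actual - projected

# Convert runs to wins with the usual ~10 runs-per-win heuristic
wins_above_self = {m: r / 10.0 for m, r in runs_above_self.items()}
print(wins_above_self)
```

In the real study this sum would be normalized to a per-162-games rate and regressed to the mean, as discussed below.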
Technical notes: Gassko does regress to the mean. He also confirmed that, within each manager's roster, the performances of players whose last names fall in the first half of the alphabet were correlated with those of players in the second half, which indicates that there's some real signal present here and he's not just describing random variation...though as a side note, I'd much rather he split players within managers into two groups using a truly random process (e.g. rand() in Excel). An alphabet-based split is a classic example of a sampling decision that's intended to be random but isn't, and is something that's been discussed in several stat classes I've taken. Still, I'm sure that the answer would be the same either way, so it's a minor point.
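For what it's worth, the truly-random version of that split-half check is easy to sketch. The simulated data below is entirely my own (a made-up per-manager effect plus player noise); the point is just the mechanics: randomly halve each manager's players, average each half, and correlate the two sets of per-manager averages. Real signal shows up as a positive correlation.

```python
# Split-half reliability check with a truly random split (simulated data).
import random

random.seed(42)

# Simulate 20 managers, each with a true "runs above self" effect,
# observed through 30 noisy player-seasons apiece.
managers = {}
for m in range(20):
    true_effect = random.uniform(-10, 10)
    managers[f"mgr{m}"] = [true_effect + random.gauss(0, 1) for _ in range(30)]

half_a, half_b = [], []
for players in managers.values():
    shuffled = players[:]
    random.shuffle(shuffled)  # random split, not alphabetical
    mid = len(shuffled) // 2
    half_a.append(sum(shuffled[:mid]) / mid)
    half_b.append(sum(shuffled[mid:]) / (len(shuffled) - mid))

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(half_a, half_b)
print(f"split-half correlation: {r:.3f}")
```

Because the simulated manager effects here are large relative to the player noise, the correlation comes out strongly positive; with no real manager effect it would hover around zero.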
One of the surprising findings was that Dusty Baker comes out as the 7th best manager of all time, in terms of his "skill" (or, historical tendency) to get players to play above their projections, with a rating of +1.65 wins per 162 games. In fact, while he's been nothing special with pitchers, Gassko reports that he has been the best manager of all time in terms of getting his hitters to perform above their projections. Derrek Lee may be helping him a bit, but not enough to vault him to the best of all time.
I know there's a lot of Dusty hate out there. I personally put myself into the category of "concerned," and I've written about my worries with respect to his record several times. But if there's one thing that I've heard time and time again from insiders about Dusty Baker, it's that players love to play for him, and that he gets the most out of what you give him. Gassko's study provides data that supports that contention.
Update: Commenter David (not to be confused with the study author David Gassko, or Dave from Louisville who also commented!) pointed out that part of Baker's excellence might be due to the fact that he presided over several of Barry Bonds' Ridiculous Years. David Gassko very kindly forwarded me a spreadsheet with all of Baker's players and their "runs above self" statistics so I could break this down.
I found that when removing Barry Bonds from the dataset, Baker's score dropped from +1.7 wins to +1.0 wins per 162 games (+2.7 wins from hitters, -1.7 from pitchers; I'm using a 50% regression to the mean, which Gassko indicated is appropriate in this case). That would pull Baker off the top-10 all-time list, but he still would rank pretty darn high, and would certainly be among the highest scores of currently-employed and living managers. I'd also point out, however, that it's really unfair to Baker to do this--we should pull the best hitter out of each manager's score to make this a fair comparison. If we did that, Baker would probably still fall in the overall rankings a bit (Barry was ridiculous, after all), but not as much as he does in this comparison.
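The arithmetic behind that adjusted figure can be laid out explicitly. The +2.7 (hitters) and -1.7 (pitchers) components come from my calculation above; the helper function and its raw-runs example are my own illustration of how a 50% regression to the mean and a runs-to-wins conversion (the usual ~10 runs-per-win heuristic, not something stated in the article) would combine.

```python
# Sketch of the regression-and-conversion arithmetic.
def regressed_wins_per_162(raw_runs_per_162, regression=0.5, runs_per_win=10.0):
    # Regress the raw rate toward a mean of zero, then convert runs to wins
    return raw_runs_per_162 * (1.0 - regression) / runs_per_win

# Baker with Bonds removed, per the figures above:
hitters, pitchers = 2.7, -1.7
total = hitters + pitchers  # +1.0 wins per 162 games

# Hypothetical example: a raw +34 runs per 162 games regresses to +1.7 wins
example = regressed_wins_per_162(34.0)
print(total, example)
```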
If you're interested, here are the top-10 and bottom-10 seasons contributing to Baker's outstanding "skill" with hitters. You'll see that most of his "best" player seasons occurred in San Francisco, but many of his "worst" player seasons also occurred there. Interestingly, some guys show up on both lists. :)
Top 10 Baker hitter seasons in runs above self:
Bottom 10 Baker hitter seasons in runs above self:
Thanks to David Gassko for not only conducting this interesting study, but also generously sharing his data upon request.