As I watched last night's game, with the Phillies repeatedly facing Manny Ramirez in the worst possible positions, I had to wonder - why aren't they pitching to him the way the Yankees pitched to Joe Mauer? That is - basically avoid him. Be happy with walking him. And the answer repeatedly was:
the last thing you want to do is put the winning run on base....
I hear that. Matt Kemp is no slouch. But I wondered, perhaps inanely, just what was the best plan from a purely mathematical standpoint. At what point, when holding a one-run lead, does adding another baserunner change the "expected runs" for an inning from under one run to over one run?
And since I hadn't published anything yesterday, and didn't have any better ideas, I thought I'd punish you folks with the results.
So here's what we're gonna do. We're gonna start with Palmer and Thorn's Run Expectancy Matrix. (Easy does it. It's math, but don't freak out. I'll explain it, I promise. Just stick with me through the next paragraph.) Then we'll see at what point adding a runner increases the expected runs from less than one to more than one. And we'll see if it jibes with the common knowledge that you don't want to put the winning run on base.
So first, what is Palmer and Thorn's Run Expectancy Matrix? It's simple. It's a neat grid that shows, given a certain number of outs and people on base, the average number of runs that should score that inning, based on 75 years of major league games. It was published in The Hidden Game of Baseball by Pete Palmer and John Thorn. It looks like this:
| OUTS | None | 1st | 2nd | 3rd | 1st & 2nd | 1st & 3rd | 2nd & 3rd | Full |
|------|------|------|------|------|-----------|-----------|-----------|------|
| 0 | .454 | .783 | 1.068 | 1.277 | 1.380 | 1.639 | 1.946 | 2.254 |
| 1 | .249 | .478 | .699 | .897 | .888 | 1.088 | 1.371 | 1.546 |
| 2 | .095 | .209 | .348 | .382 | .457 | .480 | .771 | .798 |
How does it work? Let's do an example. See that number 1.068 in the first row? That means that in innings that had a runner on 2nd base (the column) and zero outs (the row), the team (over those 75 years of major league games) averaged scoring 1.068 runs from that point to the end of the inning.
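As a lookup table in code, the matrix is nothing fancier than a dictionary keyed on the base state and the out count. Here's a minimal sketch in Python, filled in with only the cells this post quotes:

```python
# A few cells of the run-expectancy matrix, keyed by (bases occupied, outs).
# Only the values quoted in this post are included here.
run_expectancy = {
    (("1st",), 0): 0.783,        # runner on first, nobody out
    (("2nd",), 0): 1.068,        # runner on second, nobody out
    (("1st", "2nd"), 0): 1.380,  # runners on first and second, nobody out
}

# The example above: runner on 2nd, zero outs.
print(run_expectancy[(("2nd",), 0)])  # prints 1.068
```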
Now, it's important to understand that this matrix doesn't know who is pitching or who is batting next. It's essentially nothing more than a baseline. But if you understand that, you can use this little chart for all kinds of stuff. And I want to know at what point walking a batter puts the expected runs over the value of 1. Here's what it says:
With 0 outs - You don't want to walk a batter when you only have a guy on first base. Doing so increases the expected runs from .783 to 1.38. In any other situation you're OK: with the bases empty a walk keeps you under one run (.454 to .783), and everywhere else you're already expecting to give up over one run, provided you don't force an existing runner to the next base. That seems to follow common baseball wisdom.
With 1 out - You don't want to walk a batter if there is a runner on third base. (But it's OK if there is a runner on 1st or 2nd base, strangely enough.) You also don't want to walk a batter to load the bases when there are runners on first and second. This is NOT baseball wisdom, and points out how silly this exercise might be, in my opinion.
With 2 outs - You walk the batter and you don't much care what else the situation is, provided you don't walk in a run. The expected runs never get over one, unless you're silly enough to walk a run in and leave exactly the same situation for the next batter.
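The three cases above can be checked with a short sketch. The .783, 1.068, and 1.38 cells are the ones quoted in this post; the remaining entries below are my transcription of the published Palmer and Thorn table, so treat them as an assumption:

```python
# Palmer & Thorn run-expectancy matrix, keyed by (bases occupied, outs).
# The .783, 1.068, and 1.380 cells are quoted in the post; the rest are
# transcribed from The Hidden Game of Baseball (assumed correct).
RE = {
    ((), 0): 0.454, ((1,), 0): 0.783, ((2,), 0): 1.068, ((3,), 0): 1.277,
    ((1, 2), 0): 1.380, ((1, 3), 0): 1.639, ((2, 3), 0): 1.946, ((1, 2, 3), 0): 2.254,
    ((), 1): 0.249, ((1,), 1): 0.478, ((2,), 1): 0.699, ((3,), 1): 0.897,
    ((1, 2), 1): 0.888, ((1, 3), 1): 1.088, ((2, 3), 1): 1.371, ((1, 2, 3), 1): 1.546,
    ((), 2): 0.095, ((1,), 2): 0.209, ((2,), 2): 0.348, ((3,), 2): 0.382,
    ((1, 2), 2): 0.457, ((1, 3), 2): 0.480, ((2, 3), 2): 0.771, ((1, 2, 3), 2): 0.798,
}

def after_walk(bases):
    """Base state after an intentional walk: batter to first, forced runners move up."""
    b = set(bases)
    forced_in = 1 if b == {1, 2, 3} else 0  # bases loaded: a run is walked in
    if 1 in b and 2 in b:
        b.add(3)   # runner on second is forced to third
    if 1 in b:
        b.add(2)   # runner on first is forced to second
    b.add(1)       # the walked batter takes first
    return tuple(sorted(b)), forced_in

# Which walks push the expected runs for the inning from under 1 to over 1?
for (bases, outs), before in sorted(RE.items(), key=lambda kv: (kv[0][1], kv[0][0])):
    new_bases, forced_in = after_walk(bases)
    after = RE[(new_bases, outs)] + forced_in
    if before < 1.0 <= after:
        print(f"{outs} out(s), runners on {bases or 'none'}: {before:.3f} -> {after:.3f}")
```

Run it and exactly four states get flagged: a man on first with nobody out, a man on third with one out, men on first and second with one out, and walking in a run with the bases loaded and two outs - the same four cases described above.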
So the lesson is that the prevailing wisdom isn't totally correct.
- If there are no outs, walk the guy provided you're not moving another into scoring position.
- If there is one out, don't walk a guy to put runners on the corners. That seems totally counter-intuitive, since it also sets up the double-play. But there it is.
- If there are two outs, walk the guy. Tempting fate isn't a bad option. Especially if the next batter is Jason Kubel.
(I keeeeed. I keeeeeed, Jason. I make funny.)
Have a good weekend folks. See you Monday.