Pondering a First Inning Mystery
Inning | Away wOBA | Home wOBA | HFA
---|---|---|---
1 | .318 | .340 | .022
2 | .304 | .314 | .010
3 | .311 | .322 | .011
4 | .323 | .330 | .007
5 | .314 | .330 | .016
6 | .319 | .329 | .010
7 | .308 | .317 | .009
8 | .302 | .308 | .006
9+ | .296 | .297 | .001
The first inning has the biggest gap, with only the fifth coming even close. It's a consistent effect year to year, and it's a big deal: a 22-point edge in wOBA, sustained over a full game, works out to roughly three-quarters of a run, which translates to something like a .570 winning percentage, significantly higher than the actual home field advantage. If you could bottle that edge and apply it to every inning, baseball would look very different.
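Here's a rough, back-of-the-envelope version of that math. The constants are my own assumptions (a wOBA scale of about 1.2, roughly 38 plate appearances per team per game, and the usual ~10 runs per win shortcut), not anything pulled from a specific run estimator:

```python
# Back-of-the-envelope: what a 22-point wOBA edge would be worth if it
# persisted for a whole game. All constants below are rough assumptions.
WOBA_EDGE = 0.022      # home wOBA minus away wOBA in the first inning
WOBA_SCALE = 1.2       # dividing a wOBA gap by this gives runs per PA
PA_PER_GAME = 38       # plate appearances per team per game, roughly
RUNS_PER_WIN = 10      # standard shortcut
GAMES = 162

runs_per_game = WOBA_EDGE / WOBA_SCALE * PA_PER_GAME
extra_wins = runs_per_game * GAMES / RUNS_PER_WIN
win_pct = (GAMES / 2 + extra_wins) / GAMES

print(f"{runs_per_game:.2f} runs per game")   # ~0.70
print(f"{extra_wins:.1f} extra wins")         # ~11
print(f"{win_pct:.3f} winning percentage")    # ~0.570
```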
A theory that makes more sense to me is that home pitchers have a unique advantage in the first inning. In that inning, and that inning alone, they can predict exactly when they'll be needed on the mound. Have a perfect warmup routine? You can finish it just before first pitch, then transition directly into the game. Visiting pitchers are at the mercy of the game. Start too late, and you won't be ready in time for the bottom of the first. Start too early, and an extended top of the first might leave you cold by the time you take the mound.
After the first inning, this advantage disappears. Both pitchers have to wait an indeterminate amount of time between pitching, with no rhyme or reason to whose timing will be disrupted more. For that one half-inning, though, it’s an uneven playing field. Seems like as reasonable a guess as any to me. Before we get to it, though, let’s rule out one other possibility.
I was curious to see whether the weirdness of 2020 did anything to the first-inning effect. It’s a natural experiment, of sorts: Any crowd-related boost — the initial adrenaline of hearing the fans, say — should be gone in the 2020 data. The first inning still showed the biggest home edge:
Inning | Away wOBA | Home wOBA | HFA
---|---|---|---
1 | .304 | .339 | .035
2 | .303 | .315 | .012
3 | .321 | .345 | .024
4 | .315 | .343 | .028
5 | .329 | .322 | -.007
6 | .328 | .341 | .013
7 | .307 | .315 | .008
8 | .294 | .316 | .022
9+ | .303 | .293 | -.010
With a smaller sample, the data is necessarily noisier, but I’m happy saying it isn’t a crowd effect. Ruling that out gave me two ideas, both related to the theory that visiting pitchers might have a tough time warming up. First, I ran a simple test: Do the same pitchers throw harder when they’re pitching in the top of the first rather than the bottom of the first?
As an example, Andrew Heaney threw 39 four-seam fastballs in the top of the first inning this year, averaging 93.1 mph. In the bottom of the first, he threw 68 fastballs and averaged 91.4 mph. We did it! A 1.7 mph differential would go a long way to explain the gap.
Just one problem: Heaney showed the second-largest differential in the majors (Shohei Ohtani was first, but with only four pitches in the top of the first). On the other side of the coin, German Marquez sat at an even 95 mph on 49 top-of-the-first pitches and 96.7 mph on 69 bottom-of-the-first pitches. A 1.7 mph differential the other way? That's doing us no favors.
To get an overall sense of the size of this effect, I weighted each pitcher by the lesser of his fastball counts in the top and bottom of the first, then took the average differential using those weights. Pitchers did throw harder in the top of the first, but only by 0.13 mph on average.
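Here's a minimal sketch of that velocity test, assuming a Statcast-style pitch-level DataFrame with columns pitcher, inning, inning_topbot ('Top'/'Bot'), pitch_type, and release_speed; the function name and the exact column handling are mine, not the code behind the article:

```python
import pandas as pd

def first_inning_velo_gap(pitches: pd.DataFrame) -> float:
    """Weighted average of (top-of-1st minus bottom-of-1st) fastball velocity.

    Home pitchers throw the top of the first, so a positive number means
    pitchers threw harder at home.
    """
    ff = pitches[(pitches["inning"] == 1) & (pitches["pitch_type"] == "FF")]
    by_half = (
        ff.groupby(["pitcher", "inning_topbot"])["release_speed"]
          .agg(["mean", "size"])
          .unstack("inning_topbot")
          .dropna()  # keep pitchers with first-inning fastballs in both halves
    )
    diff = by_half[("mean", "Top")] - by_half[("mean", "Bot")]
    # Weight each pitcher by the lesser of his top- and bottom-half counts.
    weight = by_half[("size", "Top")].combine(by_half[("size", "Bot")], min)
    return (diff * weight).sum() / weight.sum()
```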
The differential vanished completely after the first; from that point forward, pitchers actually threw very slightly harder in road games, though by only about 0.05 mph on average. In other words, there really does appear to be a special velocity boost that home pitchers get when they can time the completion of their warmups and immediately take the mound.
That tiny bump isn’t enough to account for the wOBA differential. An old but still-excellent study by Mike Fast looked at pitchers who gain or lose velocity from one year to the next, and while that’s a different effect, it’s a fine first-order way to look at things. Fast found a difference of 0.28 RA/9 per one mph of velocity, which means our 0.13 mph differential over a single inning would be worth something like four-thousandths of a run, or roughly one point of wOBA spread over an inning’s worth of batters. Maybe something else is going on.
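For what it's worth, here's that conversion spelled out, with my own rough assumptions for batters per inning and the wOBA scale:

```python
# Converting the 0.13 mph first-inning bump into runs and wOBA,
# using Fast's 0.28 RA/9 per mph. The last two constants are assumptions.
RA9_PER_MPH = 0.28
VELO_BUMP = 0.13          # mph, home pitchers in the top of the first
BATTERS_PER_INNING = 4.3  # rough average
WOBA_SCALE = 1.2

runs_per_inning = RA9_PER_MPH * VELO_BUMP / 9
woba_points_per_batter = runs_per_inning * WOBA_SCALE / BATTERS_PER_INNING * 1000

print(f"{runs_per_inning:.4f} runs per first inning")         # ~0.0040
print(f"{woba_points_per_batter:.1f} points of wOBA per PA")  # ~1.1
```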
My second test used the same methodology as the first but with a different observation. Instead of looking at fastball velocity, I looked at fastball zone rate on 0-0 pitches. That means a much smaller sample, but it's unavoidable if we want to control for count, and I think we absolutely do; treating first-pitch fastballs and 3-0 fastballs the same is asking for a biased sample.
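The query mirrors the velocity sketch above, just swapping the observation; the zone coding (1 through 9 meaning inside the strike zone) follows the usual Statcast convention, and the column names are again my assumptions:

```python
import pandas as pd

def first_inning_zone_gap(pitches: pd.DataFrame) -> float:
    """Weighted gap in 0-0 fastball zone rate, top of the 1st vs. bottom."""
    ff00 = pitches[
        (pitches["inning"] == 1)
        & (pitches["pitch_type"] == "FF")
        & (pitches["balls"] == 0)
        & (pitches["strikes"] == 0)
    ].copy()
    ff00["in_zone"] = ff00["zone"].between(1, 9)  # 1-9 = in the strike zone
    by_half = (
        ff00.groupby(["pitcher", "inning_topbot"])["in_zone"]
            .agg(["mean", "size"])
            .unstack("inning_topbot")
            .dropna()
    )
    diff = by_half[("mean", "Top")] - by_half[("mean", "Bot")]
    weight = by_half[("size", "Top")].combine(by_half[("size", "Bot")], min)
    return (diff * weight).sum() / weight.sum()
```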
Here, I found essentially no effect. To be precise, I found a 0.9 percentage point decrease in zone rate for home pitchers, but with such a small sample, that's statistically indistinguishable from zero. Whatever edge a predictable start time confers seems to show up in velocity, not location.
In the end, I’ll leave you with a question: what do you think explains this first-inning effect? It’s a puzzle I’d love to crack, though I haven’t gotten there yet, and the first team to do so will reap … well, they’ll reap a small benefit. Home teams win roughly 54% of their games, so home field advantage is worth about three wins across 81 home dates; if eliminating the first-inning edge erases half of that, we’re talking somewhere between one and two wins a year. It’s not figuring out aging or finding a hitherto unknown player development technique, but it’s worthwhile nonetheless, and what is the offseason for if not looking for puzzles like this one?