I’m reading an academic paper about cricket at the moment, and this came up.
“The usual batting average is defined as the total number of runs scored by a batsman (both out and not-out) divided by the number of outs (total number of completed innings). Under this definition, it is possible that players who often end the innings not out may get inflated batting averages.”
Really!? I’d have hoped researchers would know better. This is a popular myth about batting averages that is not just wrong but actually the opposite of the truth. Getting a lot of not outs deflates your average. I will attempt to explain why without getting too technical.
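To pin down the definition the paper is using, here's a minimal sketch in Python (the career data in it is made up for illustration):

```python
def batting_average(innings):
    """Conventional batting average: total runs scored (in both out
    and not-out innings) divided by the number of dismissals."""
    total_runs = sum(runs for runs, _ in innings)
    dismissals = sum(1 for _, out in innings if out)
    return total_runs / dismissals

# (runs, was_dismissed) pairs: a made-up career for illustration
career = [(12, True), (30, False), (7, True), (55, True)]
print(batting_average(career))  # (12 + 30 + 7 + 55) / 3 = 34.67
```

Note how the 30 not out adds to the numerator but not the denominator, which is exactly why the paper's authors suspect inflation.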
It is well known that batters “get their eye in”. When first at the crease, batting is more difficult; once the batter gets used to the conditions (the pitch, the bowlers, their technique, etc.) they get better. This is not just folklore: the effect can be detected in data. A batter who is fresh at the crease (i.e. on a low score) is not as good as one who has been batting for a while.
Consider two batters. Batter 1 scores 40, 10, 72, 3, 38, and 5. Batter 2 scores 18, 20, 73, 19, 20, and 18. Both batters scored 168 runs and were dismissed 6 times, for an average of 28. But who is better? Well, it looks like Batter 1 had three fairly good innings (40, 72, and 38) where they were able to get their eye in. But Batter 2 had only one innings where they scored above 20. Batter 2 scored their runs in more difficult circumstances, while not “warmed up”. Yet they still performed as well as Batter 1. Therefore, Batter 2 is probably better.
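You can check the arithmetic on those (invented) score lists directly:

```python
batter1 = [40, 10, 72, 3, 38, 5]    # three good innings, three cheap dismissals
batter2 = [18, 20, 73, 19, 20, 18]  # only one score above 20

# Every innings here ends in a dismissal, so the average is just
# total runs divided by the number of innings.
for scores in (batter1, batter2):
    print(sum(scores), sum(scores) / len(scores))  # 168 28.0, both times
```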
Another way to think of this is to imagine that you are on 5 not out (say). If you were able to complete your innings, how many extra runs would you expect to score? Since you now have your eye in a little bit, the expected number of extra runs is greater than your batting average. And adding a better-than-average score, along with the eventual dismissal, to your career record would pull your average up, not down. Not outs deflate your average because they rob you of the opportunity to score an expected number of extra runs that is greater than your average, precisely because you already have your eye in.
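To put numbers on this, here is a minimal Monte Carlo sketch. The model and the dismissal probabilities are entirely made up; the only feature that matters is that the risk of getting out drops once you are set:

```python
import random

def hazard(score):
    # Made-up chance of being dismissed before adding the next run:
    # high while "fresh", lower once the eye is in.
    return 0.10 if score < 10 else 0.03

def runs_until_out(start):
    """Simulate the rest of an innings from a given score; return runs added."""
    score = start
    while random.random() > hazard(score):
        score += 1
    return score - start

random.seed(1)
N = 200_000
average = sum(runs_until_out(0) for _ in range(N)) / N
extra_from_five = sum(runs_until_out(5) for _ in range(N)) / N
print(average, extra_from_five)
```

With these made-up hazards, the average completed innings is about 17, but the expected extra runs from 5 not out are about 23. Finishing those interrupted innings would, on average, have pulled the average up, and a not out denies exactly that opportunity.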