Math Baseball Mindfuck

Unarmed Gunman
May 2, 2007
Fractions are a huge part of baseball stats, but something doesn't make sense.

If a batter gets two hits in five at bats, he has a 2/5 = .400 batting average.

If he goes one for four in the next game, he is now at 3/9 = .333.

This is because you add up his total hits (2+1) and divide by total at bats (5+4) to get 3/9. But when I try to add 2/5 + 1/4 (grade school style), I actually get 8/20 + 5/20 = 13/20, which would be a batting average of .650.
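Here's a quick Python sketch of the two calculations side by side (the variable names are mine; Python's Fraction type keeps the arithmetic exact):

```python
from fractions import Fraction

game1_hits, game1_ab = 2, 5   # 2-for-5, a .400 game
game2_hits, game2_ab = 1, 4   # 1-for-4, a .250 game

# Grade-school fraction addition: 2/5 + 1/4 = 8/20 + 5/20 = 13/20
fraction_sum = Fraction(game1_hits, game1_ab) + Fraction(game2_hits, game2_ab)
print(fraction_sum, float(fraction_sum))   # 13/20 0.65

# Batting average: total hits over total at bats = 3/9 (reduces to 1/3)
batting_avg = Fraction(game1_hits + game2_hits, game1_ab + game2_ab)
print(batting_avg, float(batting_avg))     # 1/3 0.3333333333333333
```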

[Image: divide-by-zero meme]


What the fuck am I doing wrong? Is this a major math fail on my part, or have I possibly discovered the secrets of time travel, division by zero, and the answer to life, the universe, and everything in it? INB4 42.
 


You're not calculating it right. Rewriting the games as 8/20 and 5/20 puts each one over 20 at-bats, so that would be 40 at-bats total.

13/40 = .325
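A couple of lines to make the hidden assumption visible (numbers straight from the post above):

```python
# Putting both games over a denominator of 20 quietly turns them into
# two 20-at-bat games, so the pooled sample becomes 13 hits in 40 at bats.
hits, at_bats = 8 + 5, 20 + 20
print(hits / at_bats)  # 0.325
```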
 
Yes, you are doing it wrong.

To get a batting (or any) average, you need to divide the totals: total hits by total at bats.

With your reasoning, if I hit 3 of 6 balls in one game (3/6 or 1/2 = .5) and 2 of 4 the next game (2/4 or 1/2 = .5), I would have a batting average of 100 percent:
1/2 + 1/2 = 2/2 = 1

Whereas really, you need to take the total number of tries (6+4 = 10) and the successful hits (3+2 = 5) to get the real number (5/10 or 1/2 or .5).

::emp::
 
To find the average of multiple percentages you don't just add them together; you add them and then divide by the number of items. (And if the games have different numbers of at bats, you also have to weight each game by its at bats, which is exactly what dividing total hits by total at bats does.)
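A short sketch of the difference (the game data is the OP's 2-for-5 and 1-for-4):

```python
games = [(2, 5), (1, 4)]  # (hits, at_bats) per game

# Straight average of the per-game rates: (.400 + .250) / 2
simple_avg = sum(h / ab for h, ab in games) / len(games)

# Pooled, at-bat-weighted average: (2 + 1) / (5 + 4)
pooled_avg = sum(h for h, _ in games) / sum(ab for _, ab in games)

print(round(simple_avg, 3), round(pooled_avg, 3))  # 0.325 0.333
```

The two only agree when every game has the same number of at bats.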
 
1/4 + 2/5 = 13/20 in math, but

1 quarter + 4 dimes = 65 cents (.65 of 1 dollar)

1 dollar is always 1 dollar, and 1 at bat is always 1 at bat, but the total number of at bats keeps changing, and that total is what the average is measured against.

65 cents is always 65 cents, but it would be .325 of 2 dollars, .217 of 3 dollars, .163 of 4 dollars...
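The same shrinking-fraction idea in a couple of lines:

```python
# 65 cents is a fixed amount; the fraction it represents shrinks
# as the total it's compared against grows.
cents = 65
for dollars in (1, 2, 3, 4):
    print(dollars, cents / (dollars * 100))
# 1 0.65
# 2 0.325
# 3 0.21666666666666667
# 4 0.1625
```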