How Not to Be Wrong: A book review

May 31, 2015

How Not to Be Wrong by Jordan Ellenberg

How Not to Be Wrong: The Power of Mathematical Thinking by Jordan Ellenberg is the story of how we can use mathematics to make decisions and better understand the world around us. Mathematics, as Ellenberg puts it, is “extending our common sense by other means.”

More great quotes from the book:

“Dividing one number by another number is mere computation; figuring out what you should divide by what is mathematics.” p. 84

“Once your laptop can do it, it’s not mathematics anymore.” p. 283

“When you’re working hard on a theorem, you should try to prove it by day and disprove it by night.” p. 433

When to celebrate and when not to celebrate

Be careful about the inferences you make from small data sets. If you are a teacher or principal at a small school and your EQAO scores shoot way up, or way down, this is more likely the numbers doing their thing than something wonderful (or terrible) that you did. Here is the point Ellenberg makes so well: small samples have huge variability. The results of a small group can swing wildly from year to year. If your school saw a huge improvement, quietly remind yourself that it could just as easily have been a huge downturn. In these cases the numbers are really only valuable when you look at changes over a longer period of time. So, if you are a principal or teacher in a small school, don’t let a single year’s data fool you; look at the bigger picture.

Similarly, if you work at a huge school, there is something to keep in mind too. Are you an administrator or teacher in a giant school? Are you finding that your EQAO scores barely change year to year? That’s the flip side of the same phenomenon: with a large sample, the law of large numbers keeps the results close to the underlying average, so big swings are rare. Again, look at the overall trends.

In both cases, look at the big picture. Don’t let one number from a small (or large) sample convince you of things that aren’t true.
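You can see this effect with a quick, made-up simulation (Python here is my choice, and the schools, pass rates, and numbers are invented for illustration): every student at both schools passes with the same underlying 70% probability, yet the small school’s yearly rate bounces around far more than the big school’s.

```python
import random

random.seed(0)

def yearly_pass_rates(n_students, true_rate=0.7, n_years=8):
    """Simulate a school's yearly pass rate: every student passes
    with the same underlying probability, year after year."""
    rates = []
    for _ in range(n_years):
        passes = sum(random.random() < true_rate for _ in range(n_students))
        rates.append(passes / n_students)
    return rates

small = yearly_pass_rates(n_students=20)     # a small school
large = yearly_pass_rates(n_students=2000)   # a giant school

def spread(rates):
    """Difference between the best and worst year."""
    return max(rates) - min(rates)

print("small school rates:", [round(r, 2) for r in small])
print("large school rates:", [round(r, 2) for r in large])
print("small school spread:", round(spread(small), 2))
print("large school spread:", round(spread(large), 2))
```

Nothing about either school changed between years; only the sample size differs, and that alone is enough to make the small school look like it is surging and collapsing.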

Counter-intuitive Thinking and Missing Bullet Holes  

Abraham Wald was a mathematician who contributed to the war effort in WWII from within Columbia University as part of the Statistical Research Group (SRG). He and a group of brilliant mathematicians were working on how to armour planes so that they would not be shot down. The problem was that adding armour added weight, which consumed more fuel and made the planes less maneuverable (arguably cancelling out any benefit of adding the armour in the first place).

So, they looked at the data. Where were the planes being hit? The plan was to add armour to those parts of the plane. The researchers found that the fuselages of returning planes were riddled with bullet holes, while there were far fewer holes on their engines.

So, where would you put the extra armour? On the fuselage! No, that is not the right idea. Wald said the armour shouldn’t go where the bullet holes are thickest, but where there are no bullet holes at all: on the engines. Wald saw that the planes that did not make it back for inspection were probably the ones that got hit in the engine. The planes the researchers were inspecting had made it back, so their bullet holes had not caused fatal damage.

Where are the missing holes? The missing holes are on the planes that were shot down!  Wald was able to ask: What assumptions are you making? And are they justified?
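Survivorship bias is easy to reproduce in a toy simulation (the sections, hit counts, and the assumption that only engine hits are fatal are all invented for illustration): hits land uniformly over the whole plane, but because engine hits down the plane, the inspectors at the airfield never see a single engine hole.

```python
import random

random.seed(1)

SECTIONS = ["fuselage", "engine", "wings", "tail"]
FATAL = {"engine"}  # assumed: only an engine hit downs the plane

def fly_mission(n_planes=1000, hits_per_plane=3):
    """Hits land uniformly at random over the plane's sections.
    Return how many planes came home, and the holes observed on them."""
    observed = {s: 0 for s in SECTIONS}
    returned = 0
    for _ in range(n_planes):
        hits = [random.choice(SECTIONS) for _ in range(hits_per_plane)]
        if any(h in FATAL for h in hits):
            continue  # shot down: its holes are never inspected
        returned += 1
        for h in hits:
            observed[h] += 1
    return returned, observed

returned, observed = fly_mission()
print(f"{returned} of 1000 planes returned")
print("holes seen on returning planes:", observed)
```

Even though the engine was hit just as often as any other section, the observed count for the engine is zero, exactly the pattern that fooled everyone but Wald.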

This connects to the quote above about proving your ideas by day and disproving them at night. The numbers of bullet holes, or any set of data, must be interpreted, and that is where mathematical thinking is most essential. Here we use math to guide our intuitions down a more structured path so they don’t lead us astray.

More is not always Better and Ferris Bueller Explained

More is not always better. We know this, but time and time again we assume data runs in a straight line, going up or down. More often, however, data is “straight locally and curved globally.” When we stand back, the shape of the data starts to look different.

Remember Ferris Bueller’s Day Off? Bueller? Bueller? The super boring economics teacher was talking about the Laffer Curve, taxation and voodoo economics. There is actually some great math in this segment! You see, governments thought that if raising taxes a little raised a little revenue, then raising taxes a lot would raise A LOT of revenue. Turns out it’s not a straight line, it’s a curve, a Laffer Curve. There is a point at which you increase taxes so much that revenue starts to drop, because people either hide their money or stop working. If you watch the Ferris Bueller clip, the diagram below is the one the teacher is sketching while the sleeping teen drools on his desk.

[Diagram: the Laffer Curve]
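A tiny toy model makes the curve concrete (the formula is my own simplest-possible assumption, not Laffer’s or Ellenberg’s): suppose the taxable base shrinks linearly as the rate rises, so revenue is rate × base × (1 − rate). Revenue climbs, peaks, then falls.

```python
def revenue(rate, base=100.0):
    """Toy Laffer model: taxable activity shrinks linearly as the
    rate rises, so revenue = rate * base * (1 - rate)."""
    return rate * base * (1 - rate)

rates = [i / 100 for i in range(101)]  # 0% to 100%
best = max(rates, key=revenue)

print("revenue at 10%:", revenue(0.10))
print("revenue at 50%:", revenue(0.50))
print("revenue at 90%:", revenue(0.90))
print("revenue-maximizing rate:", best)
```

In this toy model a 90% rate raises exactly as little as a 10% rate, which is the whole point: past the peak, raising taxes lowers revenue. The real-world peak is of course not 50%; that number is an artifact of the deliberately simple formula.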

This also applies to giving advice, writing a blog, making mistakes in the classroom and missing flights!

Yes, missing flights. If you are always on time for your flight and you never miss one, you are probably spending too much time at the airport. Mind you, this probably only applies to frequent flyers. (Ellenberg frames the trade-off in “utils,” the economist’s unit for the utility something gives you.)

“George Stigler, the 1982 Nobelist in economics, used to say, ‘If you never miss the plane, you’re spending too much time in airports.’” p. 232

So, other lessons of the Laffer Curve:

If you worry too much about giving good advice, you probably don’t give enough advice.

If you spend too much time trying to be productive, you are less productive.


