Steve Yegge’s article, “math for programmers,” has been making the rounds. His thesis is that mathematical breadth in pre-college education would be more valuable than the attempt to provide mathematical depth in a few apparently arbitrary areas (like geometry). He argues that specific math skills taught in grade school (like long division) are not necessarily of tremendous use to the average programmer. He writes:

Which is why I think they’re teaching math wrong. They’re doing it wrong in several ways. They’re focusing on specializations that aren’t proving empirically to be useful to most high-school graduates, and they’re teaching those specializations backwards. You should learn how to count, and how to program, before you learn how to take derivatives and perform integration.

I don’t really agree with Steve. First, programmers or not, the main problem is that people aren’t taught to think clearly and to abstract. Second, depth is necessary for learning, and an early emphasis on breadth would weaken understanding.

From my experience helping my classmates with math from high school through college, I think that many students have one key problem: they don’t know basic abstract mathematical concepts and how they build upon each other. As a result, their knowledge of math consists largely of patterns and rules; their ability to do math rests in their ability to match a given problem to the correct rule. They don’t see that math is a framework of abstractions.

When you learn theoretical math, you start from axioms and definitions. From these you prove theorems. From theorems you develop frameworks. Elementary education starts math with groups and fields: addition, multiplication, and their inverses. It’s probably tough to teach second graders to think abstractly about this, so groups and fields are glossed over in favor of concrete examples: 2 + 2 = 4. With any luck, concrete knowledge results in an ability to manipulate natural numbers. Then come the rationals or, as fourth grade calls them, fractions. Soon variables (“x”) and equations are introduced. Of course, variables are just an abstraction of rules that were taught earlier; they reveal yet more structure.

Ideally, you build abstractions to match this structure and to provide intuition about how mathematical objects behave. Eventually you think in terms of those objects, without worrying about the foundation. Without the abstraction, though, the concepts start to grow unmanageable. Fractions seem like something special, not an extension of division. Exponents and logarithms seem like their own subjects, not just another way of talking about multiplication. Without abstraction, you have to remember how to handle everything separately. Further, the result of missing abstractions gets worse with time: there are a lot of rules, and each year they teach you more. Instead of building confidence, they build fear.
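To make the exponent-and-logarithm point concrete (a sketch of my own, with invented function names): an integer exponent really is repeated multiplication, and a logarithm just asks the inverse question, how many times the base was multiplied. A student who sees that connection doesn’t have to memorize exponent rules and log rules as two unrelated lists.

```python
from math import log2

def power(base, n):
    """Integer exponentiation, unrolled into repeated multiplication."""
    result = 1
    for _ in range(n):
        result *= base
    return result

# The logarithm undoes it: "how many times was 2 multiplied?"
assert power(2, 10) == 1024
assert log2(1024) == 10.0
```

Rules like log(ab) = log(a) + log(b) then stop being arbitrary: they are just bookkeeping for how many multiplications happened.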

Good students abstract. They don’t have to study by memorizing dozens of special cases and specific formulas. When I took probability, Professor Rota used to say about statistics, “it’s just balls into boxes.” Bose-Einstein, Maxwell-Boltzmann, it’s “balls into boxes.” The difference is perhaps whether the balls are the same color, or how big the boxes are, but the concept is the same. More generally, you can get pretty far in engineering if you really understand conservation of momentum. But if you don’t understand the underlying abstraction, the key idea, your brain has to work a lot harder.
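Rota’s slogan can be made concrete with the standard counting formulas (this is textbook combinatorics, not anything specific to his course): with indistinguishable balls, the Bose-Einstein count of ways to put n balls into k boxes is C(n + k − 1, n); with distinguishable balls, the Maxwell-Boltzmann count is k^n, since each ball independently picks a box.

```python
from math import comb

def bose_einstein(n, k):
    """Indistinguishable balls into k boxes: C(n + k - 1, n) arrangements."""
    return comb(n + k - 1, n)

def maxwell_boltzmann(n, k):
    """Distinguishable balls into k boxes: each ball picks a box independently."""
    return k ** n

# 3 balls into 2 boxes:
print(bose_einstein(3, 2))      # 4: the box counts (0,3), (1,2), (2,1), (3,0)
print(maxwell_boltzmann(3, 2))  # 8: each of 3 labeled balls chooses one of 2 boxes
```

One mental model, “balls into boxes,” plus one question, “are the balls distinguishable?”, replaces two formulas memorized separately.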

The problem is that our early education doesn’t emphasize building mental models and frameworks of abstraction. Naturally, computer programmers (and computer scientists) tend to be good at this. (Or perhaps people who are good at this make good programmers.) The concept of abstraction is critical to computer science. Build the right abstractions in your software and you can develop more complex programs. If you build the right math abstractions in your mind, you can solve more complex math problems.
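As a hypothetical illustration of that parallel (the classes here are invented for the example, not taken from any real codebase): once a Shape abstraction exists, code that totals areas doesn’t care which concrete shapes it is handed, just as a student who grasps “inverse operation” doesn’t treat subtraction and division as two unrelated rules.

```python
from abc import ABC, abstractmethod
from math import pi

class Shape(ABC):
    """The abstraction: every shape knows its own area."""
    @abstractmethod
    def area(self): ...

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return pi * self.r ** 2

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

def total_area(shapes):
    # Works for any mix of shapes, present or future: the caller reasons
    # at the level of the abstraction, not the individual special cases.
    return sum(s.area() for s in shapes)

print(total_area([Square(2), Square(3)]))  # 13
```

Adding a Triangle later changes nothing in total_area; the abstraction absorbs the new case, which is exactly what a good mathematical abstraction does for a new problem.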

My second thought is that depth is important for learning. Mastery of concepts takes a long time; Peter Norvig has an excellent essay arguing that it takes about ten years to develop expertise in any field. Getting good at math is no different: it takes practice. To learn how to do fractions, you have to do fractions. To learn calculus, you have to do calculus. As much as I love MIT and its firehose style of education, I feel that I retained a lot more from high school because it was repetitive and in-depth. MIT doesn’t always give the repetition and time needed to really internalize the mechanics.

Depth gives you a better understanding of details. In programming, abstraction hides details but it is often necessary to work with those hidden details. When you encounter something counter-intuitive or perhaps simply non-obvious, it helps to be able to dig down and debug the problem. This happens just as much in math or physics: instead of debugging, we “go back to first principles.” Being able to do this helps you find mistakes and solve problems that you’re not too sure about.

Steve is right: programming early is important. But I think I started the other way around: the ability to abstract and think clearly that I developed through math helped me learn to program and debug. To me, focusing on these skills in math class will help much more than learning the names and broad facts of many mathematical subdisciplines. Once you have these skills, learning something specific will be that much easier, whether or not you are a programmer.