Math notation is a mess, and that mess is a major source of difficulty in math education. The reason for the mess is that the writing system for mathematics grew organically, without planning or pruning. This problem is hardly unique to mathematics — think of English spelling or verb conjugation, or the redundant system of upper and lower letterforms.
Some of the more egregious irregularities in math notation include
- use of the same symbol for negative numbers and subtraction
- the myriad notations for division and multiplication (including completely omitting the multiplication sign)
- the wildly irregular notation for differentiation and integration in calculus (what does “dx” even mean by itself?)
- logarithm notation, which completely obscures the inverse relationship between powers and logs (one proposed alternative: an up-arrow for powers and a down-arrow for logs)
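To see what the arrow proposal buys you, here is a minimal sketch in Python. The function names `up` and `down` are my own illustration of the idea, not a standard notation; the point is that the two operations read as mirror images.

```python
import math

# Hypothetical "arrow" notation: up-arrow for powers, down-arrow for logs.
# The names `up` and `down` are illustrative only, not a standard.
def up(base, exponent):
    """base "up-arrow" exponent: raise base to a power."""
    return base ** exponent

def down(base, value):
    """base "down-arrow" value: the inverse of up().
    Which exponent turns base into value?"""
    return math.log(value, base)

print(up(2, 10))      # 1024
print(down(2, 1024))  # approximately 10.0
```

Written this way, `down` is visibly the question that `up` answers, which is exactly the relationship the conventional "log" notation hides.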
An irregularity in mathematics that hurts young students is number names in English. The numbers “one” through “ten” are decent, but then things get strange. “Eleven” and “twelve” are only distantly related to “one” and “two”, “thirteen” should really be “ten three”, and “twenty” should really be “two ten”. In Chinese (and Korean and Japanese) numbers are named in an entirely rational, systematic way — the equivalent of counting “one”, “two”, “three”…“ten”, “ten one”, “ten two”…“ten nine”, “two ten”, “two ten one”, “two ten two” and so on. The result is that students taught in Chinese master base 10 place value concepts faster and more reliably than students taught in English.
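The naming scheme described above is simple enough to state as a short program. Here is a sketch in Python that renders the systematic Chinese-style names in English words for 1 through 99 (the function name `rational_name` is my own):

```python
# The systematic "Chinese-style" number names, rendered in English words.
ONES = ["", "one", "two", "three", "four", "five",
        "six", "seven", "eight", "nine"]

def rational_name(n):
    """Name n (1-99) the systematic way: 37 -> 'three ten seven'."""
    if not 1 <= n <= 99:
        raise ValueError("this sketch covers 1-99 only")
    tens, ones = divmod(n, 10)
    parts = []
    if tens == 1:
        parts.append("ten")            # 13 -> "ten three"
    elif tens > 1:
        parts.append(ONES[tens] + " ten")  # 20 -> "two ten"
    if ones:
        parts.append(ONES[ones])
    return " ".join(parts)

print(rational_name(12))  # ten two
print(rational_name(37))  # three ten seven
```

That the whole system fits in a dozen lines is the point: the English names, with their special cases for eleven through nineteen and the irregular tens, would need a far longer table.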
It is, of course, impractical to change the names of numbers in English, but teachers could use the rational number names as an alternate scaffolding notation during teaching. Elementary teachers commonly use physical base 10 blocks to teach place value — physical “manipulatives” like these can be considered an alternate notation.
The most developed alternate notations for math can be found in programming languages. APL, developed by Ken Iverson around 1960, started not as a programming language, but as a notation for working with arrays. APL rests on two key ideas: every operation follows the same simple syntax (a function takes one argument on its right, or one argument on each side), and every operation applies to arrays of numbers as readily as to individual numbers.
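The second of those ideas can be sketched outside APL itself. Here is a toy version in Python (the helper name `lift` is my own): a single wrapper extends any scalar operation so that it works elementwise on lists, which is roughly how APL lets one symbol cover both `3+4` and whole-array addition.

```python
# A toy sketch of APL-style array lifting, in plain Python.
def lift(op):
    """Extend a scalar binary operation to apply elementwise to lists."""
    def apply(a, b):
        if isinstance(a, list) and isinstance(b, list):
            return [apply(x, y) for x, y in zip(a, b)]
        if isinstance(a, list):                   # scalar on the right
            return [apply(x, b) for x in a]
        if isinstance(b, list):                   # scalar on the left
            return [apply(a, y) for y in b]
        return op(a, b)                           # both scalars
    return apply

add = lift(lambda x, y: x + y)
print(add(3, 4))              # 7
print(add([1, 2, 3], 10))     # [11, 12, 13]
print(add([1, 2], [10, 20]))  # [11, 22]
```

In real APL this lifting is built into every primitive, not bolted on, which is what gives the notation its uniformity.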
The Wolfram Language — the core of Mathematica — also qualifies as a rethinking of math notation that irons out these syntactic irregularities. The Wolfram Language goes far beyond arrays — all mathematical objects are swept into its domain of discourse, including proofs, derivation sequences, and much more.
But the biggest problem with mathematics notation is not syntactic irregularities, but rather all the things that mathematics does not even attempt to notate. For instance, how do you ask “what is 3+4” in purely mathematical notation? You can’t. You must use a mixture of mathematical notation and English. Schoolchildren come to read the incomplete phrase “3+4=” as asking for an answer, with the equals sign meaning “produces the answer”. No wonder students are confused by mathematical notation.
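Programming languages, by contrast, do distinguish asking from asserting. A quick sketch in Python:

```python
# "What is 3+4?" is simply the expression itself; evaluating it IS the question.
answer = 3 + 4       # the single "=" here names the result; it asks nothing
print(answer)        # 7

# The equality test is a separate symbol with a separate meaning:
# a question whose answer is true or false.
print(3 + 4 == 7)    # True
print(3 + 4 == 8)    # False
```

The schoolchild's reading of “=” as “produces the answer” corresponds to evaluation; the mathematician's reading corresponds to `==`. The language keeps them apart with distinct symbols, where mathematical notation overloads one.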
A similar dilemma faces students who want to express the idea of a function that maps x to x+3. You can write “f(x)=x+3”. But is that statement an equation declaring that two things are equal, an expression that evaluates to true or false, or the definition of a function called “f”? Computer languages explicitly distinguish between these ideas, whereas mathematical notation does not.
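The same disentangling is visible for the function example. In Python the three readings of “f(x) = x + 3” get three different notations:

```python
# Reading 1: a definition, introducing a function named f.
def f(x):
    return x + 3

# Reading 2: an expression that evaluates to true or false.
print(f(5) == 5 + 3)   # True

# Reading 3: an equation to be solved for x has no built-in notation
# in plain Python at all; it would need a symbolic-math library.
print(f(0))            # 3
```

A student reading the code never has to guess which of the three is meant, because each is written differently.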
I consider it strange that mathematics, supposedly the pinnacle of rational thought, is conducted in an unruly mixture of English, symbols, and references to other papers. Worse still, most mathematicians do not seem to realize that their house needs cleaning. A few mathematicians do seek to formalize everything, and succeed, but their notation is even less readable than APL's. To my mind, mathematicians have a lot to learn from computer science — cleaning house would not just make math easier to learn, it would make many deep truths easier to perceive. And to be clear, the reverse is true too…computer scientists have much to learn from mathematicians.