Another math illiteracy moment
August 15, 2009. Posted by lumixedia in General, history of mathematics, math education, number theory.
Tags: math illiteracy, number theory
I was recently informed that the Goldbach conjecture is popularly known in China as the “1+1=2” conjecture. As in, “every positive even number can be written as the sum of two primes. For example, 1+1=2.” [Edit--I was told this by a Chinese person who might nevertheless not be representative of how this nickname is understood--see comments.]
When I mentioned that this nickname is not in fact accurate, the person who so informed me got rather annoyed with my pointless pedantry. Why shouldn’t 1 be prime? Why not define a “prime” to be a positive integer with at most two distinct divisors, rather than a positive integer with exactly two distinct divisors? Clearly the “1+1=2” conjecture sounds way cooler than the “2+2=4” conjecture to a layman, and we are talking about popular mathematics here, so why not?
Okay, I guess it might not be immediately obvious why current notation is preferable. Maybe. From a certain perspective. It is also admittedly true, according to Wikipedia, that 1 was indeed widely considered to be prime by mathematicians up to a few hundred years ago. Fine. So let’s temporarily redefine “prime” to mean a positive integer with at most two distinct divisors, and see if it’s acceptable today.
Well, first of all, the conjecture that started this discussion completely changes for even numbers of the form 1+p where p is a prime: it becomes trivial for those numbers, where (as far as I know) it previously was not. If we want to preserve the original Goldbach conjecture, we should state it this way: every even number greater than 2 can be written as the sum of two primes which are not 1.
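To make the triviality concrete, here is a quick sketch (my own illustration, not from the original post) that enumerates Goldbach decompositions of a small even number under both definitions of "prime":

```python
def is_prime(n):
    """Standard definition: n has exactly two distinct positive divisors."""
    return n > 1 and all(n % d for d in range(2, int(n**0.5) + 1))

def goldbach_pairs(n, one_is_prime=False):
    """All ways to write even n as p + q with p <= q, under either definition."""
    ok = lambda k: is_prime(k) or (one_is_prime and k == 1)
    return [(p, n - p) for p in range(1, n // 2 + 1) if ok(p) and ok(n - p)]

print(goldbach_pairs(4))                     # [(2, 2)]
print(goldbach_pairs(4, one_is_prime=True))  # [(1, 3), (2, 2)]
```

With the usual definition, finding a decomposition of 1+p requires actual work; if 1 counted as prime, the pair (1, p) would always be available for free.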
That’s not too bad, I guess. We can live with that. And the Goldbach conjecture is not (as far as I know) fundamental in the study of number theory, so adding the four words “which are not 1” is no big deal.
Let’s look at something fundamental, then. Let’s look at the Fundamental Theorem of Arithmetic. It is no longer possible to say this: every integer greater than 1 has a unique prime factorization. We should now say this: every positive integer has a unique prime factorization up to powers of 1.
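Here is a small sketch (my own, with hypothetical helper names) of the factorization the theorem guarantees, and of how padding with 1s would break uniqueness:

```python
from math import prod

def factorize(n):
    """Prime factorization of n > 1 as a sorted list of primes (all > 1)."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factorize(12))  # [2, 2, 3]
# If 1 were prime, [1, 2, 2, 3] and [1, 1, 2, 2, 3] would be equally valid
# factorizations of 12 -- uniqueness would only hold "up to powers of 1".
assert prod([2, 2, 3]) == prod([1, 2, 2, 3]) == 12
```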
Oh, well. It’s only adding three words (five if you remove the “greater than 1” in the original and consider the prime factorization of 1 to be the empty product). Besides, as the person who inspired this post retorted, who cares about unique prime factorization? Only head-in-the-clouds number theorists, that’s who.
I can’t technically disagree with that. Yeah, only people interested in math care about unique prime factorization. But considering that “prime” is a math term, you’d think people interested in math should be the only ones whose opinions matter anyway. I’m getting off track, though. The question here is why we care.
The answer is obvious. At the same time, it’s probably sufficiently deep and multifaceted that I won’t actually give it to the satisfaction of anyone who reads this post, but that’s fine. I look forward to much better explanations in the comments.
The first point is that “unique” is much, much better, just conceptually, than “unique up to powers of 1”. I mean, seriously, do you really want to consider 1^n · 2^2 · 3 (for your favorite n) to be an acceptable prime factorization of 12? It would completely defeat the purpose of the prime factorization of a number as a decomposition into, in a sense, simplest possible parts. The moment you start trying to compute basic number-theoretic functions, such as the number of positive divisors or the sum of all positive divisors of a given integer, you want to exclude the possibly present power of 1 in the integer’s prime factorization. And that’s only the beginning of what prime factorization is used for.
Maybe I’m being unfair here. The Fundamental Theorem of Arithmetic can be easily rescued by restating it like this: every integer greater than 1 has a factorization into primes greater than 1. This might be followed by a sentence like “we define this factorization to be the integer’s basic factorization”, or whatever. Now we just search-and-replace “prime factorization” by “basic factorization” in any number theory works that mention the former, and we’re okay.
That’s not all we have to do, though. This is a consequence of the Fundamental Theorem of Arithmetic: if n = p1^a1 · p2^a2 ⋯ pk^ak is the prime factorization of n, then the number of positive divisors of n is (a1+1)(a2+1)⋯(ak+1).
To be precise, we should restate it like this: if n = p1^a1 · p2^a2 ⋯ pk^ak, where each pi is a prime greater than 1, then the number of positive divisors of n is (a1+1)(a2+1)⋯(ak+1).
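A sketch (my own illustration) of why the qualifier matters in practice: the standard formulas for the divisor-count and divisor-sum functions only work when every prime factor exceeds 1. Note that the divisor-sum formula literally divides by p − 1, so a “prime factor” of 1 would mean dividing by zero:

```python
from collections import Counter
from math import prod

def factor_exponents(n):
    """Map each prime p > 1 dividing n to its exponent in n's factorization."""
    exps, d = Counter(), 2
    while d * d <= n:
        while n % d == 0:
            exps[d] += 1
            n //= d
        d += 1
    if n > 1:
        exps[n] += 1
    return exps

def num_divisors(n):
    # d(n) = (a1+1)(a2+1)...(ak+1); a power of 1 would inflate this arbitrarily.
    return prod(a + 1 for a in factor_exponents(n).values())

def sum_divisors(n):
    # sigma(n) = prod (p^(a+1) - 1) / (p - 1); undefined if p = 1 were allowed.
    return prod((p**(a + 1) - 1) // (p - 1) for p, a in factor_exponents(n).items())

# Cross-check against brute force for n = 12:
assert num_divisors(12) == len([d for d in range(1, 13) if 12 % d == 0])  # 6
assert sum_divisors(12) == sum(d for d in range(1, 13) if 12 % d == 0)    # 28
```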
In general, every time Wikipedia’s list of arithmetic functions deals implicitly or explicitly with either a fixed prime or a set of primes, we have to add the qualifier “greater than 1”. (Or, I suppose, we could instead opt to break all the additivity and multiplicativity properties. I’d rather not, though.) In specific cases we can probably include 1 or restate things more efficiently, but still it gets to the point where we’d really be better off coming up with a specific term for all primes other than 1. Let’s do that. Let’s define an “oink” to be a number with exactly two distinct divisors—that is, any prime greater than 1.
Fast forward a reasonable amount of time in which this becomes accepted notation. Schoolchildren and any non-mathematical adults become hopelessly confused over the difference between a prime and an oink. People who attempt to explain that 1 is not an oink are yelled at for being pointlessly pedantic. Mathematicians have stopped using the term “prime” altogether since “oink” makes more sense in every context you might care to name. Eventually prime drops out of the mathematical vocabulary, oink takes its place, and we have this entire conversation all over again. And again. And again.
Moral of the post: mathematical pedantry usually exists for a really, really good reason.