Much Ado About Nothing

Arguing definitions in mathematics is a silly thing to do. Definitions stick around because they are useful, and they are as flexible as we like them to be. Define already what you want to talk about and start talking about it. We will understand. Whether a ring should be commutative or not, whether a groupoid is a term in category theory or a synonym for a magma, whether infinity is a number or not... Really, there are better things to worry about than terminology, which is best left to the philosophers who like to argue about nothing. The rest of us just like to move on with the mathematics and have our fun.

Nevertheless. The question of whether zero is a natural number is bound to arise eventually. Parties will take sides, arguments will begin, people will push their own agendas, and tempers will rise. All about nothing.

It is understandable that a controversy exists, because zero has always been strange and unique for a number of reasons. Historians of mathematics love the question of when zero was first conceived and ascribe special mathematical sophistication to cultures that adopted the use of zero, an exaggeration in my opinion. Zero is nowhere near as important as the discovery of π, and if a particular culture didn't define zero, it was simply because they didn't need it, not because they were somehow mathematically underdeveloped. We create mathematics as we need it and as it amuses us.

Natural Numbers

The natural numbers, as you will observe from following the pipelinked title of this section, are variously defined to be either the set {0, 1, 2, 3, ...} or the set {1, 2, 3, ...}. The idea is that they are the most fundamental kind of number, the counting numbers, the source of all mathematics. What is more natural than counting? There is even some evidence that pea-brained animals like birds can count.

In this vein, the question arises: is it natural to start counting from zero or not? Some say it isn't and cite as evidence the early cultures that did have symbols for other numbers but not for zero. Nothingness is a strange concept, they believe. Others counter that nothing is indeed a natural concept, and the argument starts while the onlookers and I roll our eyes.

The problem is that zero is indeed different from the other numbers in a very special way. The separation is not merely philosophical but very practical and mathematical indeed. At the same time, zero is very much like the other natural numbers. Whether you believe the differences to be greater than the similarities will influence your willingness to include zero among the other natural numbers.

Nothing is weirder than nothing!

Zero, if included among the natural numbers, would be the only natural number that isn't positive. This creates some uncomfortable situations, because positive numbers have some very desirable properties. Not only are they more easily visualised by children learning to count, they are also pleasing arithmetically. You can multiply both sides of an inequality by a positive number without changing the truth of the inequality, but multiplying both sides by zero collapses it into an equality. Adding a positive number always makes things bigger, but adding zero doesn't. Equal positive factors can be cancelled from both sides of an equation to yield another valid equation, but zero can't be cancelled in the same way. And so on.
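If you prefer your failures in symbols, here they are, with a, b, and c standing for arbitrary natural numbers:

\[ a < b \text{ and } c > 0 \implies ac < bc, \qquad \text{but} \qquad a < b \implies a \cdot 0 = b \cdot 0. \]

\[ ac = bc \text{ and } c > 0 \implies a = b, \qquad \text{but} \qquad 3 \cdot 0 = 5 \cdot 0 \text{ while } 3 \neq 5. \]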

Zero would be a very strange number to include among the other natural numbers! There's also the whole pesky business about division and how division by zero is mostly meaningless. This alone is reason enough for some mathematicians fond of division to banish zero from the realm of natural numbers. They're completely within their rights to do so.
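The meaninglessness takes one line to verify, since division is defined in terms of multiplication: if a/0 were to have a value x, then

\[ \frac{a}{0} = x \iff a = 0 \cdot x = 0, \]

so no x can possibly work when a ≠ 0, and every x works equally well when a = 0. Either way, a/0 picks out nothing.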

Zero is as natural as one, two, three

Zero is also very much like the other natural numbers in that it is a nonnegative integer. The possible weirdness of zero pales in comparison to the weirdness of negative numbers. Read any reputable algebra text of fifteenth-century Europe and note the great pains it takes to avoid mentioning negative numbers, and you will see what I mean. Even as late as the nineteenth century, kooks like Leopold Kronecker devised roundabout ways to avoid mentioning negative numbers (I don't believe he ever specified whether he included zero when he said "God created the natural numbers; everything else is the work of Man.").

Zero's primitiveness is another good reason to include it among the natural numbers. It's the additive identity, for crying out loud; anything plus zero stays the same. If you include zero among the natural numbers, you have a monoid, one of the minimally interesting algebraic structures. Furthermore, when building the numbers out of set theory, as mathematicians enjoyed doing in the early twentieth century, the only set whose existence seems certain enough to merit its own axiom is the empty set. It's the only natural place to start, and zero of course gets identified with the empty set. All the other natural numbers, as sets, can be built out of the empty set! Zero, if it belongs anywhere, belongs with the natural numbers.
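The construction I mean is the usual von Neumann one, in which each number is simply the set of all the numbers before it, conjured out of nothing at all:

\[ 0 = \varnothing, \qquad 1 = \{0\} = \{\varnothing\}, \qquad 2 = \{0, 1\} = \{\varnothing, \{\varnothing\}\}, \qquad n + 1 = n \cup \{n\}. \]

And the monoid is just addition with zero as its identity: n + 0 = 0 + n = n for every n. Leave zero out and the identity leaves with it, demoting {1, 2, 3, ...} under addition to a mere semigroup.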

Enough pussyfooting. What's the verdict?

There is no verdict. Zero is in a class of its own. Pick a definition and stick with it. Or let someone else define it whichever way they want and follow their lead. If you want to be all unambiguous and haughty yourself, then simply refrain from using the term "natural number" because people won't ever be sure which one you mean. Instead, say "positive integers" or "nonnegative integers". Or say at the very beginning "zero is/isn't a natural number", and move on.

Most modern mathematicians are well aware of the ambiguity of "natural number" and either avoid the term or define it precisely when they first use it. Bourbaki tells us zero is a natural number; computer scientists like Dijkstra and analysts like Cauchy love to start counting from zero; number theorists live in the realm of multiplicativity, where zero just messes up too many things to be included in their definitions; and logicians are much too proud of the foundation they built out of nothing to consider it unnatural. If you want to do mathematics, you'll have to get used to all of these traditions eventually, so stop writhing in discomfort any time anyone defines zero to be or not to be a natural number.

Also, please don't do what I just did, and please step away from the argument. It is by no means interesting, I assure you.
