# Numbers are Weird, Man

Numbers are weird, man. I have an amateur interest in the history of maths*That is to say, I can't be arsed to actually invest time and energy to do serious research about it, but I am interested enough to retain knowledge about the history of maths that I read - to the point that in my book about JavaScript, I had a small section devoted to the history of numerical representation., and I’ve been thinking about numbers lately whilst on a flight.

## From Logical Underpinnings...

The concept of numbers we’re familiar with today arises from logic. Using logic, we can “trace” the lineage of numbers from natural numbers to integers, then to rationals, then to real numbers, and then to complex numbers and beyond (hypercomplex and other “unreal” numbers).

### Naturals

We start with a set of axioms - underlying premises and assumptions upon which our ideas of numbers are built - the Peano axioms. Using the Peano axioms we can define an infinite set of numbers which we call the natural numbers, ℕ. ℕ is the smallest possible infinite set*If you're worried about the idea of a "smallest" infinite set, don't worry, just know that some infinities can be bigger than others., and it also gives us the simplest way of introducing and somewhat encapsulating the principle of induction*I'm not sure if the naturals fully encapsulate the principle of induction; I should check.

In broad strokes, the Peano axioms say this: 0 is a natural number, and every natural number has a unique successor. That gives us:

$$\mathbb{N} = \{0, 1, 2, 3, 4 ...\}$$

Note that the axioms define the successor of 0 to be 1, and the successor of 1 to be 2. We can write this as $latex 1 = succ(0)$, $latex 2 = succ(succ(0))$, and so on and so forth. This way we also define the relationship of each natural number to the next.
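To make the successor idea concrete, here is a tiny Python sketch. The tuple-based encoding is my own choice for illustration, nothing canonical: a natural number is just a chain of wrappings around zero.

```python
# Represent a Peano natural as a chain of successors:
# 0 is the empty tuple, and succ(n) wraps n one level deeper.
ZERO = ()

def succ(n):
    """The successor of n: one more wrapping layer."""
    return (n,)

def to_int(n):
    """Count the wrapping layers to recover an ordinary int."""
    count = 0
    while n != ZERO:
        (n,) = n  # peel off one layer
        count += 1
    return count

one = succ(ZERO)          # 1 = succ(0)
two = succ(succ(ZERO))    # 2 = succ(succ(0))
print(to_int(one), to_int(two))  # → 1 2
```

Notice there's no arithmetic anywhere: a number is nothing but how many times `succ` has been applied to zero.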

Since the Peano axioms also define the succession of each number, we can now define operations on the natural numbers. This is the definition of addition:

$$a + 0 = a \qquad (1)\\ a + succ(b) = succ(a+b) \qquad (2)$$

Now that addition is defined, here’s a quick and very very very loose proof of 1 + 1 = 2:

We’ve defined $latex 1 = succ(0)$. By the symmetry of equality, the reverse is also true: $latex succ(0) = 1$. This way, we can replace “1” with “succ(0)” and vice-versa.

Using Rule (2) from above, we replace the left of the equation:

$$let\:a = 1;\;succ(b) = 1\;in\;(2)\\ 1 + 1 = succ(a+b)$$

There is only one solution for what $latex b$ can be: 0, because by definition $latex succ(0) = 1$. Now we are ready to rewrite the right-hand side of the equation. We replace $latex a$ with 1, and $latex b$ with 0:

$$1 + 1 = succ(1 + 0)$$

Now we can apply Rule (1) of the definition of addition:

$$1 + 1 = succ(1)$$

Since $latex succ(1)$ is defined as $latex 2$, we have successfully proven that 1 + 1 = 2*note: this is a very non-rigorous proof; there are a lot of hidden assumptions that were not stated. To state them all would take hundreds of pages, which is exactly what Russell and Whitehead did in Principia Mathematica.
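The derivation above can be replayed mechanically. Here's a minimal Python sketch (again using a home-made tuple encoding of successors, my own choice for illustration) where addition is nothing more than the two rules transcribed:

```python
ZERO = ()

def succ(n):
    """succ(n) wraps n in one more tuple layer."""
    return (n,)

def add(a, b):
    """Peano addition, straight from the two rules:
    (1) a + 0       = a
    (2) a + succ(b) = succ(a + b)
    """
    if b == ZERO:                 # rule (1)
        return a
    (b_pred,) = b                 # b = succ(b_pred)
    return succ(add(a, b_pred))   # rule (2)

one = succ(ZERO)
two = succ(one)
assert add(one, one) == two  # 1 + 1 = 2, by rules (1) and (2) alone
```

The recursion terminates because rule (2) keeps peeling successors off `b` until rule (1) applies - exactly the shape of the proof above.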

### Integers

You may not have noticed this, but when we were replacing slots in the equation with values, we introduced simple algebra. The introduction of algebra leads to a variety of new definitions. For example, if we know $latex a$ is $latex 5$ and the result of the addition with $latex b$ is $latex 7$, then $latex b$ would be $latex 2$:

$$a + b = c \qquad (3)\\ let\:a = 5;\;c = 7\;in\;(3)\\ \implies 5 + b = 7$$

The obvious logical deduction is that $latex b$ is 2. But then we come to a conundrum: what if $latex c$ were smaller than $latex a$?

$$let\:a = 5;\;c = 2\;in\;(3)\\ \implies b = undef$$

There isn’t a single natural number that could be deduced to fill $latex b$. And so we invented negative numbers. The set of all natural numbers together with their negatives is called the integers, ℤ*If you wondered about how some infinities can be larger than others, careful: ℤ looks like it should hold an entire infinity more numbers than ℕ, but the two sets can actually be paired off one-to-one, so they're the same size of infinite. It's the reals, coming up later, that are genuinely bigger..
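For what it’s worth, logicians don’t conjure negatives out of thin air. The standard construction (not spelled out here, but worth a peek) builds each integer from a pair of naturals read as a difference:

$$(a, b) \sim (c, d) \iff a + d = b + c$$

Under this reading, $latex -3$ is the equivalence class containing $latex (0, 3)$, $latex (1, 4)$, $latex (2, 5)$, and so on - negatives, without ever leaving the naturals.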

### Rationals

Multiplication can be defined without defining addition first*and as far as I can tell, only for natural numbers, but for now we’ll define multiplication as being built on top of addition. With multiplication having been defined, we run into badly-behaved equations in the same way that led to the invention of integers. What if $latex c$ weren’t a multiple of $latex a$?:

$$a \times b = c \qquad (4)\\ let\:a = 5;\;c = 2\;in\;(4)\\ \implies b = undef$$

And so, in order to fix the problem of what number to fill in $latex b$ with, we invent yet another type of number. We call them the rationals, ℚ. Rationals are numbers that can be written as a ratio of two integers (with a non-zero denominator).
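The same pairing trick used to build ℤ from ℕ builds ℚ from ℤ (again, this is the standard construction rather than anything derived above): a rational is a pair of integers read as a fraction, with cross-multiplication deciding equality:

$$(p, q) \sim (r, s) \iff p \cdot s = r \cdot q \qquad (q, s \neq 0)$$

That’s why $latex 1/2$ and $latex 2/4$ name the same rational: $latex 1 \cdot 4 = 2 \cdot 2$.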

### Reals

If you start doing a lot of division and dealing with rationals, you may keep coming across certain numbers which can’t seem to fit into a ratio of two integers. This happens if you deal with circles a lot. A disturbing thought echoes out in your mind, “What if there exist some numbers that cannot be defined as a ratio of two integers?”. Shit’s about to get real, yo*Geddit, geddit? Puns..

Then a further thought makes you shudder. If there exist numbers that cannot be defined as a ratio of two integers - call them irrational numbers - then the combined set of irrational and rational numbers must be bigger than the set of rationals. And here the infinities genuinely grow: the rationals, despite being built on top of the integers, can still be paired off one-to-one with the naturals, but Cantor showed the combined set cannot - there are strictly, uncountably more numbers once the irrationals join in. Thinking about that alone would make one feel absolutely infinitesimal and will surely send you into an existential crisis.

We call this combined set the real numbers, ℝ, mainly because René Descartes was an arsehole (you’ll see why in the next section). We can convince ourselves that real numbers are… real by learning how to construct them. The easiest method, in my opinion, is the Dedekind cut.
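To give a flavour of the construction (my own phrasing of the standard definition - any real-analysis text has the careful version): a Dedekind cut defines a real number purely as a set of rationals, so we never have to write the irrational number itself down. For example:

$$\sqrt{2} := \{\, q \in \mathbb{Q} \;:\; q < 0 \;\lor\; q^2 < 2 \,\}$$

Every element of that set is an honest rational; only the boundary of the set is irrational. The real number *is* the cut.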

### Complexes

To this day, a large majority of maths work is done in the space of real numbers alone. But the reals still leave us with problems that they cannot solve. For example:

$$\sqrt{-1} = undef$$
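Programming languages mirror this split, as a quick illustrative check: Python's real-valued `math.sqrt` refuses the question outright, while `cmath.sqrt` answers it in the complex plane.

```python
import cmath
import math

# Real-valued square root: no real number squares to -1.
try:
    math.sqrt(-1)
except ValueError as e:
    print("math.sqrt(-1):", e)  # raises "math domain error"

# Complex square root: perfectly well-defined.
print(cmath.sqrt(-1))  # → 1j
```

The reals really are closed off here; the moment you allow one extra number, $latex i$, the question has an answer.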

And so, in a bid to answer the question, we invented imaginary numbers. Cardano first postulated the possibility of the existence of imaginary numbers in Ars Magna, but he called them “useless” and “fictitious”*because they only appeared in the working, not in the problem nor the answer. Later, René Descartes ridiculed the notion of these numbers by calling them “imaginary” numbers. Descartes insisted that the real numbers were the “real” and only numbers.

But Descartes was wrong. Complex numbers are in fact useful. Sure, complex numbers are a lot more abstract than real numbers, and abstract concepts take a long time to find uses. And this wasn’t the first time a class of numbers had been dismissed as imaginary and useless: long before imaginary numbers, negative numbers were derided the same way.

This brings us to the weird part about numbers.

## Historically...

So far we’ve managed to trace the “lineage” of numbers from first principles. This is where it gets weird. We started with logical axioms, then invented natural numbers, integers, rationals, reals, complex numbers and beyond, all because we weren’t able to solve a particular class of problems using existing numbers.

The historical discovery of these numbers, on the other hand, is a little less even and less linear. The single most well-researched source on the history of numbers is Georges Ifrah’s The Universal History of Numbers*It's a book whose introduction was so riveting to me because I shared a similar experience that I proceeded to finish the rest of the book in a single sitting.. Here I’ve adapted the key ideas from that book and compiled a brief timeline of the history of numbers, as well as of the invention of various techniques that led to particular types of numbers.

**Early humanity**
- Natural numbers: I personally believe humanity took a leap forwards when we abandoned unary tally systems and started to represent numbers with abstractions.

**3000 BCE**
- Rudimentary (non-positional, additive) numeral systems - Egyptian numerals and the like; Roman numerals, much later, follow the same additive idea.

**2000 BCE**
- Base-60 positional numeral system invented in ancient Babylon.

**5th century BCE**
- Irrational numbers: Hippasus of Metapontum allegedly drowned for suggesting that irrational numbers might exist.

**300 BCE**
- Base-10 positional numeral systems invented across the globe*lolwut, humanity invented a base-60 positional system before inventing a base-10 system?? I highly suspect our ancestors were aliens with 60 fingers. I shall don my tinfoil hat..
- Rational numbers invented/discovered.

**200 BCE**
- Negative numbers invented in China, used for taxation purposes. Never left China.

**300 CE**
- Negative numbers invented in Alexandria by Diophantus. The West is exposed to negative numbers. This is the de facto story of negative numbers we learn at school*pretty white-centric, imo. Diophantus called negative numbers "absurd", because they appeared only while solving certain linear equations. They were promptly dismissed.

**600 CE**
- Negative numbers invented in ancient India by Brahmagupta. First known rules for arithmetic dealing with negative numbers.

**16th century CE**
- Imaginary numbers invented. Immediately deemed useless outside solving a particular set of problems.

**20th century CE**
- Explosion of number types (and classifications thereof).

Now, the reason I think it’s weird: history doesn’t follow the narrative we built up from logic earlier. Instead, it’s a series of independent inventions/discoveries that somehow led to a coherent understanding of numbers. The part that weirds me out is that the narratives for all the types of numbers according to formal logic don’t even feel retconned in, as you would expect when narratives are mixed and matched.

Or maybe it’s because I’m writing this from a pub after drinks and discussions.

Who knows.

Tell me what you think.