Of course it all depends on what numbers are taken to be. Numbers are not real. You cannot see or touch or smell them. They are labels (words) with associated symbols (numerals). They are neither nouns nor adjectives, though in some contexts they can be used as nouns. Philosophy calls them abstract objects: abstractions perceived in the real world of existence. But they are abstractions which display relationships and patterns among themselves quite independently of the applications from which they are discerned. What lies at the source of numbers? How did they come to be? I see numbers as a means, a language, for describing the underlying patterns connecting countable things in the existing universe.
It starts with one.
There are four abstract concepts, I suggest, which lie at the source not only of all numbers and numbering systems but of mathematics in general (beginning with arithmetic and geometry). It seems to me that these four concepts are necessary and sufficient.
- Oneness (1)
- Set
- Sum
- Arithmetical addition (+)
If there were nothing to count, we would not have numbers.
Of the things that exist, even abstract things, human cognition distinguishes between countable things and uncountable things. The necessary conditions for things to be considered countable are, first, that they exist, and second, that each has a unique, discrete identity. Things that are uncountable are those perceived as being part of a continuum and therefore neither discrete nor unique. (The word uncountable is also used for countable things which are too numerous for the human mind to contemplate, but it is not that meaning I use here.) Thus apples, oranges, grains of sand, people and atoms are all countable. Things and concepts that cannot be divided into discrete elements are uncountable. Space, sky, water, air, fire and earth are all perceptions of continua, which are uncountable. Nouns for generalisations are uncountable (furniture, music, art, …). The distinction applies to abstract things as well. Discrete thoughts or specific ideas can be counted. But abstractions which lack a discrete, unique identity (shapes, information, news, advice, …) are within the ranks of the uncountable. Types of emotion are countable, but emotion is not.
Being discrete and unique gives substance to identity. Existence (a Great Mystery) comes first, of course. To have identity is to have some distinguishing characteristic which enables the quality of “oneness”. Note that the quality of being identical (similar) does not disturb identity. Two, or many, things may be identical, but the identity of each remains inviolate. An atom of hydrogen here may be identical to an atom of hydrogen elsewhere, but the identity of each remains undisturbed. It is estimated that there are between 10⁷⁸ and 10⁸² atoms in the observable universe. Each one distinct from all the others. Each one having identity.
We use the word identity in many contexts. In the philosophical sense, which includes the context of counting, my definition of identity is then:
identity – oneness; the distinguishing character of a thing that constitutes the objective reality of that thing
It is the discreteness and uniqueness contained in identity which gives rise to the concept of oneness as a quality of a thing which makes that thing countable. It is having the concept of oneness which allows us to define a concept of number, label it as “one” and give it a symbol (1). How the concept of identity (oneness) emerged in the species is another one of the Great Mysteries of life and consciousness.
But the concept of identity alone is not sufficient to generate the need to count and the invention of numbers. Having defined one (1), something else is still needed to generate a number system. A social, behavioural component is also required: it is cooperation and interaction with others which leads to the need to count. It probably emerged when humans created social groupings and things were accumulated for rainy days. The notion of addition as the accumulation of things belonging to a set is also needed. An ancient human may have gathered an apple, an orange and a goat, and so accumulated many things, but would probably not have thought of those things as belonging to the set of things. If he had gathered only apples and oranges, he may well have recognised that he had accumulated a set of things identified as fruit. And someone, at some time in our prehistory, did note that his accumulation of individual goats all belonged to the set of things identified as goats. We cannot now know how our ancestors first came to a numbering system and the concept of addition with numbers, but it must certainly have been at around the same time that the need for counting emerged.
To get from merely observing the accumulation of things in the real world to the concept of arithmetical addition was a major intellectual leap. That journey also required that the concepts of a set and of a sum be in place. We can only speculate on how that emergence and conjunction of the necessary concepts took place. It would surely have been noticed that there was a common, underlying pattern (rule) which applied to the accumulation – separately – of, say, apples and/or goats. But it would also have been noticed that the pattern did not apply when dealing with mixtures of apples and goats together. Accumulating an apple and an apple exhibits the same underlying pattern as accumulating a goat and a goat. But a goat and an apple followed the same rule only when they were considered, not as goats or apples, but as things belonging to a greater class (set) of things.
1 apple + 1 apple follows the same abstract, underlying pattern as 1 goat + 1 goat or 1 thing + 1 thing, but the rule breaks down at 1 apple + 1 goat.
Set – a multiplicity of similar countable things which together can assume a separate identity (unique and discrete)
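This pattern, and its breakdown at mixed sets, can be made concrete in code. What follows is only an illustrative sketch of my own (the Quantity class and its set labels are invented for the purpose): addition is defined within a set, fails across sets, and reasserts itself once both members are re-labelled as members of the greater set of things.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    """A count of members of one named set, e.g. Quantity(3, "apple")."""
    count: int
    of_set: str

    def __add__(self, other: "Quantity") -> "Quantity":
        # Arithmetical addition applies only within a single set;
        # a sum across different sets has no meaning.
        if self.of_set != other.of_set:
            raise TypeError(f"no sum for {self.of_set} + {other.of_set}")
        return Quantity(self.count + other.count, self.of_set)

    def as_thing(self) -> "Quantity":
        # Re-label as a member of the greater set of countable things.
        return Quantity(self.count, "thing")

apple, goat = Quantity(1, "apple"), Quantity(1, "goat")
print(apple + apple)                        # Quantity(count=2, of_set='apple')
print(goat + goat)                          # Quantity(count=2, of_set='goat')
print(apple.as_thing() + goat.as_thing())   # Quantity(count=2, of_set='thing')
# apple + goat would raise TypeError: the rule breaks down at mixed sets
```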
It was probably then that they realised that the accumulation of things could be represented by abstract rules (patterns) which were independent of the set of things being accumulated. The rule of arithmetical addition (+), they would have found, applied in a like manner to accumulations of members of any set, and a common name could be given to the result (sum) of the accumulation.
Sum – the result of an accumulation
But they would also have found that the rule (pattern) of accumulation and counting broke down when dealing with mixed sets of things. Whereas one apple and one apple gave the same sum as one goat and one goat, that sum had no meaning if one apple was accumulated with one goat. However, the summation rule reasserted itself when the apple and the goat were each considered simply as things: one thing (apple) accumulated with one thing (goat) gave two things. This general, but abstract, rule of the summation operation was arithmetical addition.
Arithmetical addition (+) – the accumulation of one number with another number, giving a sum as the result
Maintaining identity remained crucial. They would have noted that the abstract rule did not apply if the things being accumulated lost identity (their oneness) during the operation. One goat and one lion gave one lion. One bubble and one bubble could merge to give one bubble. But they would also have noted that uncountable things were not capable of being accumulated.
Given the abstract concepts of identity (oneness, 1) and arithmetical addition (+), all natural numbers inevitably follow. With a 1 and with a +, and the concept of a set and a sum, all the natural numbers can be generated.
1 + 1 + 1 + 1 + …
Having invented a label and a symbol for oneness (one, 1), new labels and symbols were then invented for the abstract sums. The chosen base (binary, decimal, sexagesimal, vigesimal, …) determines the number of labels and symbols to be invented.
1 + 1 gave 2, 1 + 2 gave 3, … and so on
And all the natural numbers were born.
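A minimal sketch of that generative process, assuming nothing beyond a starting 1 and the + operation (the function name and the limit are mine, purely for illustration):

```python
def naturals(limit):
    """Generate the natural numbers using nothing but 1 and +:
    each new number is the previous sum accumulated with one more 1."""
    n = 1                 # oneness: where the numbers start
    while n <= limit:
        yield n
        n = n + 1         # 1 + 1 gave 2, 1 + 2 gave 3, and so on

print(list(naturals(10)))  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```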
The reverse of accumulation, the giving away or lessening of things, led to the abstraction of arithmetical subtraction (-), and that gave us zero and all the negative integers. Note that oneness and one (1) must come first before the concept of zero (0) makes any sense. Zero is a very specific quality of nothingness, but zero is not nothingness. In the observed world an absence of apples (0 apples) is not quite the same thing as an absence of goats (0 goats), but the number abstraction (0) of both is the same. As a number, zero is special and does not follow all the rules discovered connecting, and applying to, the other numbers. (Zero also functions as a placeholder in our notations, but that is a different matter from its properties as a number.) Adding zero to another number does not create a new number, as adding any other number does. Division by any other number is allowed, but division by zero is undefined. One (1), not zero (0), is where numbers start from. Zero is merely a consequence of removing (subtracting) one (1) from one (1).
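A few lines of Python are enough to illustrate these properties of zero (a sketch of my own, not a proof):

```python
n = 7
assert 0 + n == n      # adding zero creates no new number: 0 is the identity for +
assert 1 - 1 == 0      # zero as a consequence of subtracting one (1) from one (1)

try:
    n / 0              # division by zero is undefined
except ZeroDivisionError as err:
    print(err)         # prints: division by zero
```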
Multiplication is just recursive addition. Recursive subtraction leads to division, and division in turn generates the rational numbers. Applying numbers to shapes (geometry) led to the discovery of irrational and transcendental numbers. Number theory is about studying and discovering relationships between numbers. But all these discovered relationships exist only because numbers exist as they do. All the relationships exist as soon as the concepts of oneness (1) and addition (+) are fixed, together with the concepts of a set and a sum. Discoveries of the relationships can come much later. Numbers depend on counting, and number theory depends upon the numbers.
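The first two claims can be written down directly. This sketch assumes non-negative integers and illustrates only the recursions, not a historical reconstruction; the remainder returned by divide is what, once named, points towards the fractions:

```python
def multiply(a, b):
    """Multiplication as recursive addition: a accumulated b times."""
    return 0 if b == 0 else a + multiply(a, b - 1)

def divide(a, b):
    """Division as recursive subtraction: count how many times b can be
    removed from a; whatever remains that is smaller than b is the remainder."""
    if b == 0:
        raise ZeroDivisionError("division by zero is undefined")
    if a < b:
        return 0, a                  # quotient, remainder
    q, r = divide(a - b, b)
    return q + 1, r

print(multiply(4, 3))   # 12 (= 4 + 4 + 4)
print(divide(14, 4))    # (3, 2): 4 goes into 14 three times, remainder 2
```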
Numbers start with one (1), and without a one (1) there can be no numbers.
Numbers, ultimately, rest on the concept of identity (oneness).