Numbers are abstract. They do not have any physical existence. That much, at least, is fairly obvious and uncontroversial.

Are numbers even real? The concept of numbers is real, but reason flounders when considering the reality of any particular number. All *“rational”* numbers (positive or negative) are considered *“real numbers”*. But in this usage, *“real”* is a label, not an adjective. *“Rational”* and *“irrational”* are likewise labels when attached to the word number, not adjectives describing the abstractions involved. The phrase *“imaginary numbers”* is not a comment about reality. *“Imaginary”* is again a label for a particular class of the concept that is numbers. Linguistically we use the words for numbers both as nouns and as adjectives. When used as a noun, meaning is imparted to the word only by an attached context, implied or explicit. *“A ten”* has no meaning unless the context tells us it is a *“ten of something”*, a *“count of some things”*, a *“measurement in some units”*, or a *“position on some scale”*. As nouns, numbers are not very pliable; they cannot be modified by adjectives. There is a mathematical abstraction for *“three”*, but there is no conceptual, mathematical difference between a *“fat three”* and a *“hungry three”*. Numbers are not very good as adjectives either. *“Three apples”* says nothing about the apple. *“60 minutes”* or *“3,600 seconds”* does not describe the minutes or the seconds.

The number of apples on a tree or the number of atoms in the universe are not dependent upon the observer. But number is dependent upon a brain in which the concept of number has some meaning. All of number theory, and therefore all of mathematics, builds on the concept and the definition of *one*. And *one* depends, existentially, on the concept of *identity*.

From *Croutons in the soup of existence*:

The properties of one are prescribed by the assumptions (the “grammar”) of the language. One (1, unity), by this “grammar” of mathematics, is the first non-zero natural number. It is the integer which follows zero. It precedes the number two by the same “mathematical distance” by which it follows zero. It is the “purest” number. Any number multiplied by one or divided by one remains that number. It is its own factorial. It is its own square and square root; cube and cube root; ad infinitum. One is enabled by existence and identity, but thereafter its properties are defined, not discovered.
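The properties listed in that passage can be checked mechanically. A minimal Python sketch (the variable name `one` and the sample values are illustrative, not from the text):

```python
from math import factorial, sqrt

one = 1

# Any number multiplied by one or divided by one remains that number.
for n in [2, -7, 0.5, 1_000_000]:
    assert n * one == n
    assert n / one == n

# One is its own factorial.
assert factorial(one) == one

# One is its own square and square root, cube and cube root, ad infinitum.
assert one ** 2 == one
assert sqrt(one) == one
assert one ** 3 == one
assert one ** (1 / 3) == one

# One follows zero by the same "mathematical distance" by which it precedes two.
assert one - 0 == 2 - one
```

None of these checks discover anything; they merely restate the definitions, which is the passage’s point.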

The question of identity is a philosophical and a metaphysical quicksand. Identity is *the relation everything has to itself and nothing else*. But what does that mean? Identity confers uniqueness. (Identical implies sameness, but identity requires uniqueness.) The concept of one of anything requires that the concept of identity already be in place, and emerges from it. It is the uniqueness of identity which enables the concept of a one.

Things exist. A class of similar things can be called apples. Every apple, though, is unique and has its own identity within that class of things. Now, and only now, can you count the apples. First comes existence, then comes identity along with uniqueness, and from that emerges the concept of one. Only then can the concept of numbers appear; where a two is a distance of one away from one, and a three is a distance of one away from two. It is also only then that a negative can be defined as distance away in the other direction. Zero cannot exist without one being first defined. It only appears as a movement of one away from one, in the opposite direction to that needed to reach two. Negative numbers were once thought to be unreal. But the concept of negative numbers is just as real as the concept of numbers itself. The negative sign is merely a commentary about relative direction; borrowing (+) and lending (−) are just a commentary about direction.
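The construction in this paragraph, numbers as repeated movements of one away from an origin, with sign as direction, can be made concrete. A minimal Python sketch, where the function name `step` is an illustrative assumption, not standard terminology:

```python
def step(n, direction=+1):
    """Move a distance of one in the given direction."""
    return n + direction

one = 1                                # enabled by identity; properties then defined
two = step(one)                        # a two is a distance of one away from one
three = step(two)                      # a three is a distance of one away from two
zero = step(one, direction=-1)         # zero: one away from one, away from two
minus_one = step(zero, direction=-1)   # negatives: same steps, other direction

assert (two, three) == (2, 3)
assert (zero, minus_one) == (0, -1)
```

The sign never appears as a property of any number here; it is only the `direction` argument, which matches the essay’s claim that the negative sign is a commentary about direction.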

But identity comes first and numbers are a concept which emerges from identity.