Archive for the ‘Mathematics’ Category

The paradox of one: Identity leads to the concept of one but oneness kills identity

October 25, 2023

It is the bedrock of my understanding of the world around me. I think it applies not only to every other human but also to every other living thing which has any semblance of cognition.

Any “thing” which exists cannot, at the same time, be “some other thing”. A thing may be identical in properties to some other things, but the various things each remain uniquely identified in space-time.

Claims of violation of this uniqueness (by some proponents / propagandists of quantum theory) are just Edward Lear types of nonsense physics. Statements about single particles existing simultaneously in different spaces or times are using a nonsense definition of existence. Nonsense mathematics (whether for quantum entanglement or time travel or the Ramanujan summation) is as nonsensical as the Dong with a Luminous Nose. Nonsense is nonsense in any language. No matter.

There are – it is said – “10⁷⁸ to 10⁸² atoms in the known universe”. Of course this statement is about the 5% of the known universe which is matter. It is silent about the dark energy and dark matter which make up the remaining 95%. No matter. It is entirely silent about what may or may not be in the universe (or universes) that are not known. No matter.

… roughly 68% of the universe is dark energy. Dark matter makes up about 27%. The rest – everything on Earth, everything ever observed with all of our instruments, all normal matter – adds up to less than 5% of the universe.

Nevertheless, every known atom in the known universe has a unique identity – known or unknown.

In a sense, every atom in the universe can be considered to have its own identity. Atoms are the basic building blocks of matter and are distinguished by their unique combination of protons, neutrons, and electrons. Each element on the periodic table is composed of atoms with a specific number of protons, defining its atomic number and giving it a distinct identity. For example, all hydrogen atoms have one proton, while all carbon atoms have six protons. Moreover, quantum mechanics suggests that each individual atom can have its own unique quantum state, determining its behavior and properties. This means that even identical atoms can be differentiated based on their quantum states.

It has been guesstimated that there may be roughly 10⁹⁷ fundamental particles making up the 10⁷⁸ to 10⁸² atoms in the known universe. These particles too, and not only atoms, have unique identities.

In fact, every fundamental particle has its own separate identity. Fundamental particles are the smallest known building blocks of matter and are classified into various types, such as quarks, leptons, and gauge bosons. Each fundamental particle has distinct properties that define its identity, including its mass, electric charge, spin, and interactions with other particles. For example, an electron is a fundamental particle with a specific mass, charge of -1, and spin of 1/2. Furthermore, in quantum mechanics, each particle is associated with its own unique quantum state, which determines its behavior and properties. These quantum states can differ even among particles of the same type, leading to their individual identities. 

It is the concept of the existence of things, each having a unique identity, which allows us to define a concept of one (1, unity). The concept of identity and the concept of one are inseparable. It is the concept of one which leads to all numbers. The concepts of identity and of “one” are inseparable in both the philosophical and spiritual contexts. But the notion of identity ultimately emerges from the fundamental interconnectedness of the existence of things. And this creates the paradox. Things have separate identities but have the same kind of existence. It is the parameters of existence which exhibit oneness. But this oneness of existence negates the separate, unique identity of existing things.

Unique identity of existing things gives the concept of one (1) whereas a unifying oneness eradicates what is unique and kills identity.

Identity and oneness can be seen as interconnected in certain philosophical and spiritual perspectives. While they may appear contradictory at first, a deeper exploration reveals a nuanced relationship between the two. Identity refers to the distinguishing characteristics and qualities that make something or someone unique and separate from others. It is the sense of individuality and selfhood that we associate with ourselves and the objects or beings around us. Identity implies a sense of boundaries and distinctions, where each entity is defined by its own set of attributes and properties. On the other hand, the concept of oneness suggests a fundamental unity or interconnectedness that transcends individual identities. It suggests that all things and beings are ultimately interconnected, part of a larger whole or cosmic unity. This perspective emphasizes the underlying unity of existence, where the boundaries and distinctions that define individual identities are seen as illusory or superficial.

In some philosophical and spiritual traditions, the concept of oneness is understood as the ultimate reality or truth, while individual identities are considered temporary manifestations or expressions of this underlying oneness. From this viewpoint, individual identities are like waves in the ocean—distinct for a time but ultimately inseparable from the ocean itself. However, it’s important to recognize that these concepts can be understood and interpreted differently across various philosophical and spiritual frameworks. Some may place more emphasis on the individual identities and the uniqueness of each entity, while others may emphasize the interconnectedness and unity of all things. Ultimately, whether one considers identity and oneness as inseparable or not depends on their philosophical, spiritual, or cultural perspectives. Both concepts offer valuable insights into understanding our place in the world and our relationship with others, and exploring the interplay between identity and oneness can lead to profound philosophical and existential contemplations.

This perspective can be found in various philosophical and spiritual traditions, certain forms of mysticism, and some interpretations of quantum physics. It suggests that the perceived boundaries and separations that define individual identities are illusory, and that the true nature of reality is the oneness that transcends these apparent divisions. Identity and one (1) could well be illusory, just as all the numbers, and the mathematics which follows, are not real.

However, it’s important to note that not all philosophical or spiritual perspectives hold this inseparability between identity and oneness. Other perspectives may emphasize the importance of individual identities, personal autonomy, and the uniqueness of each entity. As with any philosophical or metaphysical concept, the relationship between identity and oneness can be a matter of interpretation and personal belief. Different perspectives offer diverse insights into the nature of reality, and individuals may resonate with different understandings based on their experiences, cultural backgrounds, and philosophical inclinations.




Numbers and mathematics are possible only because time flows

August 13, 2022

It is probably just a consequence of ageing that I am increasingly captivated (obsessed?) by the origin of things. And of these things, I find the origins of counting, numbers and mathematics (in that order) particularly fascinating. In that order because I am convinced that these developed within human cognition – and could only develop – in that order: first counting, then numbers and then mathematics. The entire field of what is called number theory, which studies the patterns and relationships between numbers, exists because numbers are what they are. All the patterns and relationships discovered in the last c. 10,000 years existed – were already there – as soon as the concept of numbers crystallised. Whereas counting and numbers were invented, all the wonders of the patterns and relationships that make up number theory were – and are still being – discovered. And what I find even more astonishing is that the entire edifice of numbers is built upon just one little foundation stone: the concept of identity, which gives the concept of oneness.

Croutons in the soup of existence

The essence of identity lies in oneness. There can only be one of any thing once that thing has identity. Once a thing is a thing there is only one of it. Half that thing is no longer that thing. There can be many of such things but every other such thing is still something else.

Numbers are abstract and do not exist in the physical world. They are objects (“words”) within the invented language of mathematics to help us describe the physical world. They enable counting and measuring. The logical one or the philosophical one or the mathematical one all emerge from existence and identity. Neither logic nor philosophy nor mathematics can explain what one is, except that it is. Every explanation or definition attempted ends up being circular. It is what it is.

Given one (1), all other numbers follow.

Where numbers come from

Numbers start with one (1), and without a one (1) there can be no numbers. … Given the abstract concepts of identity (oneness, 1) and arithmetical addition (+), all natural numbers inevitably follow. With a 1 and with a +, and the concept of a set and a sum, all the natural numbers can be generated.

1 + 1 + 1 + 1 ……

…. Numbers, ultimately, rest on the concept of identity (oneness).

Equally fascinating are the questions that existence, time and causality are answers to. I am coming to the conclusion that the flow of time (whatever time is) does not emerge from existence but, in fact, enables existence.

Revising Genesis

…. What time is remains a mystery, but the first act of creation is to set it flowing. Note that the flow of time does not need existence. To be, however, requires that time be flowing. Time itself, whatever it is, is a prerequisite for the flow of time, and the flow of time is a prerequisite for existence. … For even the concept of existence to be imaginable, it needs the flow of time to be ongoing. It needs to be present as a permanent moving backdrop. The potential for some particular kind of existence then appears, or is created, only when some particular rules of existence are defined and implemented. These rules of existence must therefore also be in place before the concept of things, whether abstract or material or otherwise, can be conjured up.


It is inevitable that my views have evolved and they may well evolve further but my current conclusion is that for mathematics to exist time needs to be flowing.

The bottom line:

  1. All branches of mathematics, though abstract, are existentially dependent upon the concept of numbers.
  2. Numbers depend on the concept of counting.
  3. Counting derives from the concept of oneness (1).
  4. Oneness depends upon the concept of a unique identity.
  5. The existence of a unique identity requires a begin-time.
  6. Beginnings require time to be flowing.
  7. Existence is enabled by the flow of time.

Therefore

Numbers and mathematics are possible only because time flows


Where numbers come from

June 6, 2022

Of course it all depends on what numbers are taken to be. Numbers are not real. You cannot see or touch or smell them. They are labels (words) with associated symbols (numerals). They are neither nouns nor adjectives, though in some contexts they can be used as nouns. Philosophy calls them abstract objects. They are perceived as abstractions in the real world of existence. But they are abstractions which display relationships and patterns among themselves quite independent of the applications from which they are discerned. What lies at the source of numbers? How did they come to be? I see numbers as a means, a language, for describing the underlying patterns connecting countable things in the existing universe.

It starts with one. 

There are four abstract concepts, I suggest, which lie at the source not only of all numbers and numbering systems but of mathematics in general (beginning with arithmetic and geometry). It seems to me that these four concepts are necessary and sufficient.

    1. Oneness (1)
    2. Set
    3. Sum
    4. Arithmetical addition (+)


If there were nothing to count, we would not have numbers.

Of things that exist, even abstract things, human cognition distinguishes between countable things and uncountable things. The necessary conditions for things to be considered countable are, first, that they exist, and second, that each has a unique, discrete identity. Things that are uncountable are those perceived as being part of a continuum and therefore not discrete or unique. (The word uncountable is also used for countable things which are too numerous for the human mind to contemplate, but it is not that meaning that I use here). Thus apples, oranges, grains of sand, people and atoms are all countable. Things and concepts that cannot be divided into discrete elements are uncountable. Space, sky, water, air, fire and earth are all perceptions of continua which are uncountable. Nouns for generalisations are uncountable (furniture, music, art, ….). The distinction applies to abstract things as well. Discrete thoughts or specific ideas can be counted. But abstractions (shapes, information, news, advice, ….) which lack a discrete, unique identity are within the ranks of the uncountable. Types of emotion are countable, but emotion is not.

Being discrete and unique gives substance to identity. Existence (a Great Mystery) comes first, of course. To have identity is to have some distinguishing characteristic which enables the quality of “oneness”. Note that the quality of being identical (similar) does not disturb identity. Two, or many, things may be identical, but the identity of each remains inviolate. An atom of hydrogen here may be identical to an atom of hydrogen elsewhere, but the identity of each remains undisturbed. It is estimated that there are between 10⁷⁸ and 10⁸² atoms existing in the observable universe. Each one distinct from all the others. Each one having identity.
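The distinction between being identical and having identity has a neat analogue in programming. A minimal Python sketch (my illustration, not the author's): equality of properties is not sameness of identity.

```python
# A small sketch of "identical but not the same": Python's == compares
# properties, while "is" compares identity.
a = ["hydrogen"]      # one thing
b = ["hydrogen"]      # an identical, but different, thing

print(a == b)         # True  -- identical in every property
print(a is b)         # False -- each retains its own unique identity
```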

We use the word identity in many contexts. In the philosophical sense, which includes the context of counting, my definition of identity is then:

identity: oneness; the distinguishing character of a thing that constitutes the objective reality of that thing

It is the discreteness and uniqueness contained in identity which gives rise to the concept of oneness as a quality of a thing which makes that thing countable. It is having the concept of oneness which allows us to define a concept of number, label it as “one” and give it a symbol (1). How the concept of identity (oneness) emerged in the species is another one of the Great Mysteries of life and consciousness.

But the concept of identity alone is not sufficient to generate the need to count and the invention of numbers. Having defined one (1), something else is still needed to generate a number system. A social, behavioural component is also required; it is cooperation and interaction with others which leads to the need to count. It probably emerged when humans created social groupings and things were accumulated for rainy days. The notion of addition as the accumulation of things belonging to a set is also needed. An ancient human may have gathered an apple, an orange and a goat, and accumulated many things, but would probably not have thought of those things as belonging to the set of things. If he had gathered only apples and oranges, he may well have recognised that he had accumulated a set of things identified as fruit. And someone, at some time in our prehistory, did note that his accumulation of individual goats all belonged to the set of things identified as goats. We cannot now know how our ancestors first came to a numbering system and the concept of addition with numbers, but it must certainly have been at around the same time that the need for counting emerged.

To get from just observing the accumulation of things in the real world to the concept of arithmetical addition was a major intellectual leap. That journey also required that the concepts of a set and of a sum be in place. We can only speculate on how that emergence and conjunction of the necessary concepts took place. It would surely have been noticed that there was a common, underlying pattern (rule) which applied to the accumulation – separately – of, say, apples and / or goats. But it would also have been noticed that the pattern did not apply when dealing with mixtures of apples and goats together. Accumulating an apple and an apple exhibits the same underlying pattern as accumulating a goat and a goat. But a goat and an apple followed the same rule only when they were considered, not as goats or apples, but as things belonging to a greater class (set) of things.

1 apple + 1 apple follows the same abstract, underlying pattern as 1 goat + 1 goat or 1 thing + 1 thing, but the rule breaks down at 1 apple + 1 goat.

A set of things: a multiplicity of similar countable things which together can assume a separate identity (unique and discrete)

It is likely that it was then that they realised that the accumulation of things could be represented by abstract rules (patterns) which were independent of the set of things being accumulated. The rule of arithmetical addition (+), they would have found, applied in a like manner to accumulations of members of any set, and that a common name could be given to the result (sum) of the accumulation.

Sum: the result of an accumulation

But they would also have found that the rule (pattern) of accumulation and counting broke down when dealing with mixed sets of things. Whereas one apple and one apple gave the same sum as one goat and one goat, that sum had no meaning if one apple was accumulated with one goat. However, the summation rule reasserts itself when considering the sum of things: the accumulation of one thing (an apple) and one thing (a goat) gives two things. This general, but abstract, rule of the summation operation was arithmetical addition.

Arithmetical addition (+): the accumulation of one number to another number, giving a sum as the result

Maintaining identity remained crucial. They would have noted that the abstract rule did not apply if the things being accumulated lost identity (their oneness) during the operation. One goat and one lion gave one lion. One bubble and one bubble could merge to give one bubble. But they would also have noted that uncountable things were not capable of being accumulated.
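That pattern can be made concrete. A minimal Python sketch (my own illustration, not the post's): counting works within a set, loses meaning across sets, and reasserts itself in the wider set of things.

```python
# Sketch: accumulation is meaningful only within a set, and the rule
# reasserts itself in the wider set of "things".
apples = ["apple"]
goats = ["goat"]

print(len(apples + apples))   # 2 -- a meaningful sum of apples
print(len(goats + goats))     # 2 -- the same abstract pattern for goats

mixed = apples + goats        # no meaning as apples or as goats...
print(len(mixed))             # 2 -- ...but a valid sum of *things*
```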

Given the abstract concepts of identity (oneness, 1) and arithmetical addition (+), all natural numbers inevitably follow. With a 1 and with a +, and the concept of a set and a sum, all the natural numbers can be generated.

1 + 1 + 1 + 1 ……

Having invented a label and a symbol for oneness (one, 1), new labels and symbols were then invented for the abstract sums. The chosen base (binary, decimal, sexagesimal, vigesimal, ….) determines the number of labels and symbols to be invented.

1 + 1 gave 2, 1 + 2 gave 3, … and so on

And all the natural numbers were born.
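A minimal Python sketch of that generation (my own, not the post's): a number represented as nothing but an accumulation of ones, with addition as accumulation, and the decimal labels invented afterwards.

```python
# Sketch: a number is just an accumulation of ones; addition is accumulation.
ONE = "|"                     # the symbol invented for oneness

def add(a, b):
    """Arithmetical addition as pure accumulation of tallies."""
    return a + b

two = add(ONE, ONE)           # ||
three = add(two, ONE)         # |||
four = add(three, ONE)        # ||||

# Decimal labels are then just invented names for the accumulated sums:
for label, tally in [("2", two), ("3", three), ("4", four)]:
    print(label, tally)       # e.g. "2 ||", "3 |||", "4 ||||"
```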

The reverse of accumulation, the giving away or lessening of things, led to the abstraction of arithmetical subtraction (-), and that gave us zero and all the negative integers. Note that oneness and one (1) must come first before the concept of zero (0) makes any sense. Zero is a very specific quality of nothingness, but zero is not nothingness. In the observed world an absence of apples (0 apples) is not quite the same thing as an absence of goats (0 goats), but the number abstraction (0) from both is the same. As a number, zero is special and does not follow all the rules discovered connecting, and applying to, the other numbers. (Zero also functions as a placeholder in our notations, but that is a different matter to its properties as a number). Adding zero to a number does not create a new number, as adding any other number does. Division by any number is allowed, except by zero. Division by zero is undefined. One (1), not zero (0), is where numbers start from. Zero is merely a consequence of removing (subtracting) one (1) from one (1).

Multiplication is just recursive addition. Recursive subtraction leads to division, and that generates the rational numbers. Applying numbers to shapes (geometry) led to the discovery of irrational and transcendental numbers. Number theory is about studying and discovering relationships between numbers. But all these discovered relationships exist only because numbers exist as they do. All the relationships exist as soon as the concepts of oneness (1) and addition (+) are fixed, together with the concepts of a set and a sum. Discoveries of the relationships can come much later. Numbers depend on counting, and number theory depends upon the numbers.
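A short Python sketch of that recursion (my own construction; the function names are invented for illustration): multiplication as repeated addition, division as repeated subtraction, with zero excluded as a divisor.

```python
# Sketch: multiplication as recursive addition, division as recursive
# subtraction (for non-negative integers).
def multiply(a, b):
    """a * b as b accumulations of a."""
    total = 0
    for _ in range(b):
        total += a        # repeated addition
    return total

def divide(a, b):
    """How many times b can be subtracted from a (b != 0)."""
    if b == 0:
        raise ZeroDivisionError("division by zero is undefined")
    count = 0
    while a >= b:
        a -= b            # repeated subtraction
        count += 1
    return count, a       # quotient and remainder

print(multiply(4, 3))     # 12
print(divide(13, 4))      # (3, 1)
```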

Numbers start with one (1), and without a one (1) there can be no numbers.

Numbers, ultimately, rest on the concept of identity (oneness).


Discovery versus invention and why mathematics is just another language

October 31, 2021

Languages are invented, and words are invented. We can assign whatever meaning we can commonly agree on to any word. Discovery and invention are such words and are labels, placeholders, for some intended meaning. Confusion with words arises because the nuances of meaning associated with a word may not be as common as we think. This post is about mathematics being a language, but I need to start with what I understand to be discovery and how it differs from invention.


Discoveries and inventions

I do not quite understand the uncertainty sometimes expressed about whether things are inventions or discoveries. I find no ambiguity between “discovery” and “invention”. It is not as if everything in the world must be either discovered or invented. There is no need for any epistemological issues here. Discovery consists of the first finding of that which exists but is unknown. It includes what may be newly inferred either by deduction or by induction. What has been discovered by others can still be discovered by someone else as new knowledge. The discoverer, however, does not impact what is discovered. Discovery is always about finding. Invention, on the other hand, is always about creation. It creates something (material or immaterial) that did not exist before. The act of invention requires purpose and always results in a construct. The inventor defines and creates the invention. Once invented by someone, something may be discovered by others. An invention may be so complex that even its inventor has to discover some of its detailed characteristics or workings. It may be rediscovered if it has been forgotten (if immaterial) or has fallen into disuse (if material). Creation is not necessarily or always invention. A copy or even an improvement of an invention by someone else may involve creation but is neither a discovery nor an invention. Both finding and creation imply a doer, though it is not necessary for the doer to be human.

Of course everything discovered to exist must – so our logic tells us – have had a beginning. Whether such a beginning was a creation event (an invention), or random happenchance, or something else, lies at the heart of the most intractable philosophical question of all – the mystery of existence. I take the view that existence encompasses both material and immaterial things, and that the immaterial does not necessarily require a mind within which it is perceived to exist. The flow of time, causality and the laws of nature are, to my mind, examples of such immaterial existence. Thoughts, however, are also immaterial and exist but they need a mind within which to reside.

  • we discover our existence, we invent our names
  • emotions are discovered, their descriptive names are invented
  • hate is discovered, war is invented
  • a thought is discovered, a story is invented
  • the laws of nature are discovered, the laws of man are invented
  • knowledge is discovered, the Koran and the Bible are invented
  • human capabilities are discovered, actual behaviour can often be invented
  • the cognitive ability to relate and recognise patterns in sound is discovered, a piece of music is invented
  • language ability is discovered, languages are invented
  • new individual experiences (emotions, colours, feelings) are discovered, naming them may be inventions
  • individual learning of what is knowable but not known is discovery, creating a new word is invention
  • (Pooh learning about elephants was an invented discovery, naming them heffalumps was invention by Milne)
  • the cognitive ability to tell lies is discovered, fiction is invented
  • the human need to seek explanations is discovered, gods are invented
  • the emotional, human need for spiritualism is discovered, religions are invented
  • logic is discovered, argument is invented
  • relationships and patterns in the universe are discovered, languages to describe these are invented
  • where no man has gone before leads to discovery, naming or mapping the where is invention
  • my need for coffee is a discovery, my mug of coffee is a creation but is no invention
  • human behaviour may be discovered, “social science” is always invention
  • new geography is usually discovery and no geography can ever be invented
  • the past is immutable, history is a patchwork of discovered islands in a sea of invented narrative
  • news is discovered, fake news is invented

As a generalisation there is scientific discovery and there is artistic invention. The process of science is one of discovery and there is little ambiguity about that. The search for knowledge is about discovery and never about the invention of knowledge. That tools and instruments used for this process of discovery are often inventions is also apparent. Some ambiguity can be introduced – though I think unnecessarily – by treating postulates and hypotheses and theories as inventions in their own right. Thoughts are discovered, never invented. Any hypothesis or theory – before being put to the test – is a discovered thought, whether based on observations or not. Theories are discovered thoughts, but a conspiracy is always invented. Of course, observation (empiricism) may lead to conjecture, which may then take the form of a postulate or a theory. But there is no need for such conjectural thought to be equated with invention (though it often is).  Science discovers, engineering often invents. A good scientist discovers and uses inventions to further his discoveries. A good scientist may also be, and often is, an inventor. (Of course a great many calling themselves scientists are just bean counters and neither discover nor invent). Composers and authors and lawyers often invent. Doctors discover a patient’s ailments, a quack invents them. Most pharmaceutical companies invent drugs to suit discovered illnesses, some less ethical ones invent illnesses to suit their discovered compounds. Talent is discovered, a celebrity is invented.

Then there is the question of whether mathematics is discovered or invented. But before that we must define what it is that we are considering.

What mathematics is


Mathematics started in prehistory with counting and the study of shapes

January 8, 2021
[Image: compass box]

Mathematics today is classified into more than 40 different areas by journals for publication. I probably heard about the 3 R’s (Reading, Riting and Rithmetic) first at primary school level. At high school, which was 60 years ago, mathematics consisted of arithmetic, geometry, algebra, trigonometry, statistics and logic – in that order. I remember my first class where trigonometry was introduced as a “marriage of algebra and geometry”. Calculus was touched on as advanced algebra. Some numerical methods were taught as a part of statistics. If I take my own progression, it starts with arithmetic, moves on to geometry, algebra and trigonometry and only then to calculus and statistics, symbolic logic and computer science. It was a big deal when, at 10, I got my first “compass box” for geometry, and another big deal, at 13, with my first copy of trigonometric tables. At university in the 70s, Pure Mathematics was distinguished from Applied Engineering Mathematics and from Computing. In my worldview, Mathematics and Physics Departments were where the specialist, esoteric mysteries of such things as topology, number theory, quantum algebra, non-Euclidean geometry and combinatorics could be studied.

I don’t doubt that many animals can distinguish between more and less. It is sometimes claimed that some primates have the ability to count up to about 3. Perhaps they do, but except in the studies reporting such abilities, they never actually do count. No animals apply counting. Nor do they exhibit any explicit understanding of geometrical shapes or structures, though birds, bees, ants and gorillas seem to apply some structural principles, intuitively, when building their nests. Humans, as a species, are unique in not only imagining but also in applying mathematics. We couldn’t count when we left the trees. We had no tools then and we built no shelters. So how did it all begin?

Sometimes Arithmetic, Geometry and Algebra are considered the three core areas of mathematics. But I would contend that it must all start with counting and with shapes – which later developed into Arithmetic and Geometry. Algebra and its abstractions came much later. Counting and the study of shapes must lie at the heart of how prehistoric humans first came to mathematics. But I would also contend that counting and observing the relationship between shapes would have started separately and independently. They both require a certain level of cognition but they differ in that the study of shapes is based on observations of physical surroundings while counting requires invention of concepts in the abstract plane. They may have been contemporaneous but they must, I think, have originated separately.

No circle of standing stones would have been possible without some arithmetic (rather than merely counting) and some geometry. No pyramid, however simple, was ever built without both. No weight was dragged or rolled up an inclined plane without some understanding of both shapes and numbers. No water channel was ever dug that did not involve some arithmetic and some geometry. Already by the time of Sumer and Babylon, and certainly by the time of the Egyptians and the Harappans, the practical application of arithmetic, geometry and even trigonometry in trade, surveying, town planning, time-keeping and building was well established. The sophisticated management of water that we can now glimpse in the ancient civilizations needed both arithmetic and geometry. There is not much recorded history available before the Greeks. Arithmetic and Geometry were well established by the time we come to the Greeks, who even conducted a vigorous discourse about the nobility (or divinity) of the one versus the other. Pythagoras was not happy with arithmetic since numbers could not give him – exactly – the hypotenuse of a right triangle with sides of equal length (√2). Which he could so easily draw. Numbers could not exactly reflect all that he could achieve with a straight edge and a compass. The circle could not be squared. The ratio of circumference to diameter was irrational. The irrationality of the numbers needed to reflect geometrical shapes was, for the purists, vulgar and an abomination. But the application of geometry and arithmetic was common long, long before the Greeks. There is a great distance before counting becomes arithmetic and the study of shapes becomes geometry, but the roots of mathematics lie there. That takes us back to well before the Neolithic (c. 12,000 years ago).

That geometry derives from the study of shapes and the patterns and relationships between shapes, given some threshold level of cognition, seems both obvious and inevitable. Shapes are real and ubiquitous. They can be seen in all aspects of the natural world and can be mimicked and constructed. The arc of the sun curving across the sky creates a shape. Shadows create shapes. Light creates straight lines as the elevation of the sun creates angles. Shapes can be observed. And constructed. A taut string to give a straight line and the calm surface of a pond to give a level plane. A string and a weight to give the vertical. A liquid level to give the horizontal. Sticks and shadows. A human turning around to observe the surroundings created a circle. Strings and compasses. Cave paintings from c. 30,000 years ago contain regular shapes. Circles and triangles and squares. Humans started not only observing, but also applying, the relationships between shapes a very long time ago.

Numbers are more mystical. They don’t exist in the physical world. But counting the days from new moon to new moon for a lunar month, or the days in a year, was known at least 30,000 years ago. Ancient tally sticks counting to 29 testify to that. It would seem that the origins of arithmetic (and numbers) lie in our ancient prehistory, probably more than 50,000 years ago.

Counting, the use of specific sounds as representations of abstract numbers, and number systems are made possible only by first having a concept of identity which allows the definition of one. Dealing with identity and the nature of existence takes us before and beyond the realms of philosophy or even theology, into the metaphysical world. The metaphysics of existence remains mystical and mysterious and beyond human cognition, as much today as in prehistoric times. Nevertheless, it is the cognitive capability of having the concept of a unique identity which enables the concept of one. That one day is distinguishable from the next. That one person, one fruit, one animal or one thing is uniquely different from another. That unique things, similar or dissimilar, can be grouped to create a new identity. That one grouping (us) is distinguishable from another group (them).

Numbers are not physically observable. They are all abstract concepts. Linguistically they are sometimes bad nouns and sometimes bad adjectives. The concept of one does not, by itself, lead automatically to a number system. That needs, in addition, a logic system and invention (a creation of something new, which presupposes a certain cognitive capacity). It is by definition, and not by logic or reason or inevitability, that two is defined as one more than the identity represented by one, and three is defined as one more than two, and so on. Note that without the concept of identity and the uniqueness of things setting a constraint, a three does not have to be separated from a two by the same separation as a two from a one. The inherent logic is not itself invented but emerges from the concept of identity and uniqueness. That 1 + 1 = 2 is a definition, not a discovery. It assumes that addition is possible.

It is also significant that nothingness is a much wider (and more mysterious and mystical) concept than the number zero. Zero derives, not from nothingness, but from the assumption of subtraction, being defined as one less than one. That in turn generalises to zero being any thing less than itself. Negative numbers emerge by extending that definition. The properties of zero are conferred by convention and by definition. Numbers and number systems are thus a matter of “invention by definition”, but constrained by the inherent logic which emerges from the concept of identity. The patterns and relationships between numbers have been the heady stuff of number theory and a matter of great wonder when they are discovered, but they are all consequent to the existence of the one, the invention of numerals and the subsequent definition that 1 + 1 = 2. Number theory exists only because the numbers are defined as they are. Whereas the concept of identity provides the basis for one and all the integers, a further cognitive step is needed to imagine that the one is not indivisible and then to consider the infinite parts of one.

Mere counting is sometimes disparaged, but it is, of course, the most rudimentary form of a rigorous arithmetic with its commutative, associative and distributive laws.

Laws of arithmetic
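For reference, those laws, stated for any numbers a, b and c:

a + b = b + a and a × b = b × a (commutative)
(a + b) + c = a + (b + c) and (a × b) × c = a × (b × c) (associative)
a × (b + c) = (a × b) + (a × c) (distributive)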

The cognitive step of getting to count in the first place is a huge leap compared to the almost inevitable evolution of counting into numbers and then into an arithmetic with rigorous laws. We will never know when our ancestors began to count but it seems to me – in comparison with primates of today – that it must have come after a cognitive threshold had been achieved. Quite possibly with the control of fire and after the brain size of the species had undergone a step change. That takes us back to the time of homo erectus and perhaps around a million years ago.

Nearly all animals have shape recognition to some extent. Some primates can even recognise patterns in similar shapes. It is plausible that recognition of patterns and relationships between shapes only took off when our human ancestors began construction, either of tools or of rudimentary dwellings. The earliest tools (after the use of clubs) were probably cutting edges, and these are first seen around 1.8 million years ago. The simplest constructed shelters would have been lean-to structures of some kind. The construction of both tools and shelters lends itself naturally to the observation of many geometrical shapes: rectangles, polygons, cones, triangles, similar triangles and the rules of proportion between similar shapes. Arches may also have first emerged with the earliest shelters. More sophisticated tools and very simple dwellings take us back to around 400,000 years ago, and certainly to a time before anatomically modern humans had appeared (c. 200,000 years ago). Both rudimentary counting and a sense of shapes would have been present by then. It would have been much later that circles and the properties of circles were observed and discovered. (Our earliest evidence of a wheel goes back some 8,000 years and is the application of a much older mathematics). Possibly the interest in the circle came after a greater interest in time-keeping had emerged, perhaps from the first “astronomical” observations of sunrise and sunset and the motion of the moon and the seasons. Certainly our ancestors were well versed with circles and spheres and their intersections and relationships by the time they became potters (earlier than c. 30,000 years ago).

I suspect it was the blossoming of trade – rather than the growth of astronomy – which probably helped take counting to number systems and arithmetic. The combination of counting and shapes starts, I think, with the invention of tools and the construction of dwellings. By the time we come to the Neolithic, with settlements and agriculture and fortified settlements, arithmetic and geometry and applied mathematics are an established reality. Counting could have started around a million years ago. The study of shapes may have started even earlier. But if we take the origin of “mathematics” to be when counting ability was first combined with a sense of shapes, then we certainly have to step back to at least 50,000 years ago.


Numbers emerge from the concept of identity

December 18, 2020

Numbers are abstract. They do not have any physical existence. That much, at least, is fairly obvious and uncontroversial.

Are numbers even real? The concept of numbers is real, but reason flounders when considering the reality of any particular number. All “rational” numbers (positive or negative) are considered “real numbers”. But in this usage, “real” is a label, not an adjective. “Rational” and “irrational” are also labels when attached to the word number and are not adjectives describing the abstractions involved. The phrase “imaginary numbers” is not a comment about reality. “Imaginary” is again a label for a particular class of the concept that is numbers. Linguistically we use the words for numbers both as nouns and as adjectives. When used as a noun, meaning is imparted to the word only by an attached context – implied or explicit. “A ten” has no meaning unless the context tells us it is a “ten of something”, or a “count of some things”, or a “measurement in some units”, or a “position on some scale”. As nouns, numbers are not very pliable; they cannot be modified by adjectives. There is a mathematical abstraction for “three”, but there is no conceptual, mathematical difference between a “fat three” and a “hungry three”. Numbers are not very good as adjectives either. “Three apples” says nothing about the apple. “60” minutes or “3,600” seconds do not describe the minutes or the seconds.

The number of apples on a tree or the number of atoms in the universe are not dependent upon the observer. But number is dependent upon a brain in which the concept of number has some meaning. All of number theory, and therefore all of mathematics, builds on the concept and the definition of one.  And one depends, existentially, on the concept of identity.

From Croutons in the soup of existence

The properties of one are prescribed by the assumptions (the “grammar”) of the language. One (1, unity), by this “grammar” of mathematics, is the first non-zero natural number. It is the integer which follows zero. It precedes the number two by the same “mathematical distance” by which it follows zero. It is the “purest” number. Any number multiplied by one or divided by one remains that number. It is its own factorial. It is its own square or square root; cube or cube root; ad infinitum. One is enabled by existence and identity but thereafter its properties are defined, not discovered.

The question of identity is a philosophical and a metaphysical quicksand. Identity is the relation everything has to itself and nothing else. But what does that mean? Identity confers uniqueness. (Identical implies sameness but identity requires uniqueness). The concept of one of anything requires that the concept of identity already be in place and emerges from it. It is the uniqueness of identity which enables the concept of a one.

Things exist. A class of similar things can be called apples. Every apple though is unique and has its own identity within that class of things. Now, and only now, can you count the apples. First comes existence, then comes identity along with uniqueness and from that emerges the concept of one. Only then can the concept of numbers appear; where a two is the distance of one away from one, and a three is a distance of one away from two. It is also only then that a negative can be defined as distance away in the other direction. Zero cannot exist without one being first defined. It only appears as a movement of one away from one in the opposite direction to that needed to reach two. Negative numbers were once thought to be unreal. But the concept of negative numbers is just as real as the concept for numbers themselves. The negative sign is merely a commentary about relative direction. Borrowing (+) and lending (-) are just a commentary about direction. 

But identity comes first and numbers are a concept which emerges from identity.


Why did we start to count?

October 12, 2020

Counting and the invention of numbers and the abstractions enabling mathematics are surely cognitive abilities. Counting itself involves an abstract ability. The simple act of raising two fingers to denote the number of stones or lions or stars implies, first, the abstract ability to describe an observed quantity and, second, the desire to communicate that observation.

What led humans to counting and when?

Before an intelligence can turn to counting it must first have some concept of numbers. When and how did our ancient ancestors first develop a concept of numbers and then start counting? …

It seems clear that many animals do distinguish – in a primitive and elementary way – between “more” and “less”, “few” and “many”, and “bigger” and “smaller”, and even manage to distinguish between simple number counts. They show a sophisticated use of hierarchy and precedence.

Some primates show some primitive abilities when tested by humans

…..  Rhesus monkeys appear to understand that 1 + 1 = 2. They also seem to understand that 2 + 1 = 3, 2 – 1 = 1, and 3 – 1 = 2—but fail, however, to understand that 2 + 2 = 4. ……

But even chimpanzees and monkeys rarely, if ever, use counts or counting in interactions among themselves. The abilities for language and counting are not necessarily connected genetically (though it is probable), but they are both certainly abilities which appear gradually as cognition increases. Mathematics is, of course, just another language for describing the world around us. Number systems, like all invented languages, require that the system and its rules be shared before any communication is feasible. It is very likely that the expressions of the abilities to count and to have language follow much the same timeline. The invention of specific sounds or gestures to signify words surely coincided with the invention of gestures or sounds to signify numbers. The step change in the size of brains along the evolutionary path of humans is very likely closely connected with the expression of the language and counting abilities.

The ability to have language surely preceded the invention of languages just as the ability to count preceded the expressions of counting and numbering. It is not implausible that the first member of a homo erectus descendant who used his fingers to indicate one of something, or four of something else, to one of his peers, made a far, far greater discovery – relatively – than Newton or Einstein ever did.

We must have started counting and using counts (using gestures) long before we invented words to represent counts. Of course, it is the desire to communicate which is the driving force which takes us from having abilities to expressions of those abilities. The “cooperation gene” goes back to before the development of bipedalism and before the split with chimpanzees or even gorillas (at least 9 million years ago).

The simple answer to the question “Why did we start to count?” is: because we could conceive of a count, observed it and wished to communicate it. But this presupposes the ability to count. Just as with language, the ability, and the expression of the ability, are a consequence of the rapid increase in brain size which happened between 3 million and 1 million years ago.

I am persuaded that that rapid change was due to the control of fire and the change to eating cooked food and especially cooked meat. The digestion of many nutrients becomes possible only with cooked food and is the most plausible driver for the rapid increase in brain size.

Raw Food not enough to feed big brains

………. our brains would still be the size of an ape’s if H. erectus hadn’t played with fire: “Gorillas are stuck with this limitation of how much they can eat in a day; orangutans are stuck there; H. erectus would be stuck there if they had not invented cooking,” she says. “The more I think about it, the more I bow to my kitchen. It’s the reason we are here.”


And then came counting

August 25, 2018

The origins of human cognitive development go back a lot longer than was once thought. Our first bipedal ancestors, who came down from the trees more than 5 million years ago, already had some concept of “more” and “less” and perhaps even of rudimentary numbers up to 3 (as rhesus monkeys have today). Genetic analysis of ancient bones is showing that the origin and development of modern humans needs to take the Neanderthals and the Denisovans into account, and perhaps has to go back at least to the time of a common ancestor over 1 million years ago. Just considering the last 200,000 years is no longer enough.

I have no doubt that the mastery of fire, the eating of cooked meat, the growth of the brain and, above all, the increased need for cooperation were interconnected and drove human cognitive development. Whether developments happened in one place and spread, or happened at many places and perhaps at many times, will always be a matter of speculation. But it is not so difficult to come up with a not implausible timeline of the key developments which gave us first counting, then tallying and arithmetic and geometry, and now complex number theory. The oldest evidence we have of counting is tally sticks from over 50,000 years ago. But counting surely started long before that.


Related:

What led humans to counting and when?

The origins of base 60



Pareto’s 80/20 rule is ubiquitous

June 11, 2018

I first came across and learned to use the Pareto principle in the 70s as a young engineer. It was the starting point for fault analysis of any kind.  Root cause analysis always started with a “Pareto diagram”. It was an extremely powerful tool not only for fault analysis but also then in all quality improvement actions.

The Pareto principle (also known as the 80/20 rule, the law of the vital few, or the principle of factor sparsity) states that, for many events, roughly 80% of the effects come from 20% of the causes. – Wikipedia

Pareto showed in 1896 that 80% of land in Italy was owned by 20% of the population and thus was born the 80/20 rule. It has now become almost a cliche in all business processes and in financial and economic analysis to describe the relationship where a minority of causes lead to a majority of the result.

The 80-20 rule is a business rule of thumb that states that 80% of outcomes can be attributed to 20% of all causes for a given event. In business, the 80-20 rule is often used to point out that 80% of a company’s revenue is generated by 20% of its total customers. Therefore, the rule is used to help managers identify and determine which operating factors are most important and should receive the most attention based on an efficient use of resources. – Investopedia

The 80/20 rule seems to apply in almost all fields. It applies in wealth distribution, in personal and public finance, in all kinds of manufacturing, in quality control, in experimentation and in disease control.

It is not perhaps so surprising.

Wherever a phenomenon is subject to a power-law probability distribution, the 80/20 rule will apply, and a power-law probability distribution is perhaps the most common form of probability distribution that occurs in nature and in man-made processes. Put differently, it is not at all surprising.

Of all the possible causes of an observed effect, a minority of the possible causes are usually responsible for a majority of the observed effect.

Perhaps we should be surprised only if the 80/20 “rule” does not apply. The “20%” and the “80%” should be taken as symbols for a “minority” and a “majority” respectively and then the 80/20 rule is ubiquitous.
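The claim is easy to check with a minimal simulation sketch (my own, not from the post; it assumes a Pareto distribution with shape parameter ≈ 1.16, the value for which the top 20% hold exactly 80% in theory):

```python
# Sketch: sample from a power law and measure the share of the total
# accounted for by the top 20% of samples.
import random

random.seed(42)
ALPHA = 1.16   # Pareto shape for which the 80/20 split holds exactly

samples = sorted(
    (random.paretovariate(ALPHA) for _ in range(100_000)), reverse=True
)
top_fifth = samples[: len(samples) // 5]

share = sum(top_fifth) / sum(samples)
print(f"Top 20% of causes carry {share:.0%} of the effect")  # roughly 80%
```

Other shapes give other splits (90/10, 70/30), but always a minority of the causes carrying a majority of the effect.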



Decimals are too simple

April 22, 2018

Of course all attempts to create a 10 hour day with 100 minutes to each hour and 100 seconds to each minute have failed. Similarly all attempts to divide the circle into 100 parts have not caught on.

The use of 60 is almost as basic as the use of 10.

The origins of base 60

All the non-decimal systems I learnt were embedded in memory before I was 20. I don’t expect many will remember these.

As a child I learned the use of 12 and 60 from my grandmother. The use of 12 came automatically with 4 and 3: three pies to a pice, four pice to an anna and sixteen annas to the rupee. When driving around India with my father, miles and furlongs and yards and feet came naturally. Bushels and pecks and gallons and quarts and pints came later, as an apprentice in England.

Decimals are simple. But they are also simplistic.

Perhaps too simple.

Rupee, anna, pice, pies

Pounds, shillings, pence, farthings

Ton, hundredweights, pounds, ounces

Mile, furlongs, yards, feet

Bushel, pecks, gallons, quarts, pints
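Those mixed radices are easy to compute with. A small Python sketch (my own; the unit ratios are the pre-decimal Indian currency described above – three pies to a pice, four pice to an anna, sixteen annas to a rupee):

```python
# Sketch of mixed-radix counting: break a raw count of pies into
# rupees, annas, pice and pies.
UNITS = [("rupees", 16 * 4 * 3), ("annas", 4 * 3), ("pice", 3), ("pies", 1)]

def to_mixed(total_pies):
    """Convert a count of pies into the mixed rupee-anna-pice-pie form."""
    result = {}
    for name, size_in_pies in UNITS:
        result[name], total_pies = divmod(total_pies, size_in_pies)
    return result

print(to_mixed(500))  # {'rupees': 2, 'annas': 9, 'pice': 2, 'pies': 2}
```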