Archive for the ‘Mathematics’ Category

Mathematics started in prehistory with counting and the study of shapes

January 8, 2021
Compass box

Mathematics today is classified into more than 40 different areas by journals for publication. I probably heard about the 3 R’s (Reading, Riting and Rithmetic) first at primary school level. At high school, which was 60 years ago, mathematics consisted of arithmetic, geometry, algebra, trigonometry, statistics and logic – in that order. I remember my first class where trigonometry was introduced as a “marriage of algebra and geometry”. Calculus was touched on as advanced algebra. Some numerical methods were taught as a part of statistics. If I take my own progression, it starts with arithmetic, moves on to geometry, algebra and trigonometry and only then to calculus and statistics, symbolic logic and computer science. It was a big deal when, at 10, I got my first “compass box” for geometry, and another big deal, at 13, with my first copy of trigonometric tables. At university in the 70s, Pure Mathematics was distinguished from Applied Engineering Mathematics and from Computing. In my worldview, Mathematics and Physics Departments were where the specialist, esoteric mysteries of such things as topology, number theory, quantum algebra, non-Euclidean geometry and combinatorics could be studied.

I don’t doubt that many animals can distinguish between more and less. It is sometimes claimed that some primates have the ability to count up to about 3. Perhaps they do, but outside of the studies reporting such abilities, they never actually do count. No animals apply counting. They don’t exhibit any explicit understanding of geometrical shapes or structures, though birds, bees, ants and gorillas seem to apply some structural principles, intuitively, when building their nests. Humans, as a species, are unique not only in imagining but also in applying mathematics. We couldn’t count when we left the trees. We had no tools then and we built no shelters. So how did it all begin?

Sometimes Arithmetic, Geometry and Algebra are considered the three core areas of mathematics. But I would contend that it must all start with counting and with shapes – which later developed into Arithmetic and Geometry. Algebra and its abstractions came much later. Counting and the study of shapes must lie at the heart of how prehistoric humans first came to mathematics. But I would also contend that counting and observing the relationship between shapes would have started separately and independently. They both require a certain level of cognition but they differ in that the study of shapes is based on observations of physical surroundings while counting requires invention of concepts in the abstract plane. They may have been contemporaneous but they must, I think, have originated separately.

No circle of standing stones would have been possible without some arithmetic (rather than merely counting) and some geometry. No pyramid, however simple, was ever built without both. No weight was dragged or rolled up an inclined plane without some understanding of both shapes and numbers. No water channel was ever dug without some arithmetic and some geometry. Already by the time of Sumer and Babylon, and certainly by the time of the Egyptians and the Harappans, the practical application of arithmetic and geometry and even trigonometry in trade, surveying, town planning, time-keeping and building was well established. The sophisticated management of water that we can now glimpse in the ancient civilizations needed both arithmetic and geometry. Not much recorded history is available from before the Greeks. Arithmetic and Geometry were well established by the time we come to the Greeks, who even conducted a vigorous discourse about the nobility (or divinity) of the one versus the other. Pythagoras was not happy with arithmetic since numbers could not give him – exactly – the hypotenuse of a right triangle with sides of equal length (√2), which he could so easily draw. Numbers could not exactly reflect all that he could achieve with a straight edge and a compass. The circle could not be squared. The ratio of circumference to diameter was irrational. The irrationality of the numbers needed to reflect geometrical shapes was, for the purists, vulgar and an abomination. But the application of geometry and arithmetic was common long, long before the Greeks. There is a great distance before counting becomes arithmetic and the study of shapes becomes geometry, but the roots of mathematics lie there. That takes us back to well before the Neolithic (c. 12,000 years ago).
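
The difficulty that so troubled Pythagoras can be stated compactly. A sketch of the classical argument (a standard proof, added here only for concreteness) that no ratio of whole numbers gives the diagonal of a unit square:

\[
\sqrt{2} = \frac{p}{q} \ (\text{in lowest terms}) \;\Rightarrow\; p^2 = 2q^2 \;\Rightarrow\; p \text{ is even} \;\Rightarrow\; p = 2k \;\Rightarrow\; q^2 = 2k^2 \;\Rightarrow\; q \text{ is even},
\]

which contradicts “lowest terms”. So no fraction, however fine, measures what the straight edge and compass draw so easily.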

That geometry derives from the study of shapes and the patterns and relationships between shapes, given some threshold level of cognition, seems both obvious and inevitable. Shapes are real and ubiquitous. They can be seen in all aspects of the natural world and can be mimicked and constructed. The arc of the sun curving across the sky creates a shape. Shadows create shapes. Light creates straight lines as the elevation of the sun creates angles. Shapes can be observed. And constructed. A taut string to give a straight line and the calm surface of a pond to give a level plane. A string and a weight to give the vertical. A liquid level to give the horizontal. Sticks and shadows. A human turning around to observe the surroundings creates a circle. Strings and compasses. Cave paintings from c. 30,000 years ago contain regular shapes. Circles and triangles and squares. Humans started not only observing, but also applying, the relationships between shapes a very long time ago.

Numbers are more mystical. They don’t exist in the physical world. But counting the days from new moon to new moon for a lunar month, or the days in a year, was known at least 30,000 years ago. Ancient tally sticks counting to 29 testify to that. It would seem that the origins of arithmetic (and numbers) lie in our ancient prehistory, probably more than 50,000 years ago.

Counting, the use of specific sounds as the representation of abstract numbers, and number systems are made possible only by first having a concept of identity which allows the definition of one. Dealing with identity and the nature of existence takes us before and beyond the realms of philosophy or even theology and into the metaphysical world. The metaphysics of existence remains mystical and mysterious and beyond human cognition, as much today as in prehistoric times. Nevertheless, it is the cognitive capability of having the concept of a unique identity which enables the concept of one. That one day is distinguishable from the next. That one person, one fruit, one animal or one thing is uniquely different from another. That unique things, similar or dissimilar, can be grouped to create a new identity. That one grouping (us) is distinguishable from another group (them).

Numbers are not physically observable. They are all abstract concepts. Linguistically they are sometimes bad nouns and sometimes bad adjectives. The concept of one does not, by itself, lead automatically to a number system. That needs, in addition, a logic system and invention (the creation of something new, which presupposes a certain cognitive capacity). It is by definition, and not by logic or reason or inevitability, that two is defined as one more than the identity represented by one, and three is defined as one more than two, and so on. Note that without the concept of identity and the uniqueness of things setting a constraint, a three does not have to be separated from a two by the same separation as a two is from a one. The inherent logic is not itself invented but emerges from the concept of identity and uniqueness. That 1 + 1 = 2 is a definition, not a discovery. It assumes that addition is possible.

It is also significant that nothingness is a much wider (and more mysterious and mystical) concept than the number zero. Zero derives, not from nothingness, but from the assumption of subtraction and then of being defined as one less than one. That in turn generalises to zero being any thing less than itself. Negative numbers emerge by extending that definition. The properties of zero are conferred by convention and by definition.

Numbers and number systems are thus a matter of “invention by definition”, but constrained by the inherent logic which emerges from the concept of identity. The patterns and relationships between numbers have been the heady stuff of number theory and a matter of great wonder when they are discovered, but they are all consequent to the existence of the one, the invention of numerals and the subsequent definition that 1 + 1 = 2. Number theory exists only because the numbers are defined as they are. Whereas the concept of identity provides the basis for one and all integers, a further cognitive step is needed to imagine that the one is not indivisible and then to consider the infinite parts of one.
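
That 1 + 1 = 2 is definitional can be made explicit with a Peano-style construction (the notation is mine, one standard way of writing it, not something from the original post): each numeral is just a name for a repeated “one more”, and addition is defined to respect that. Here S is the successor (“one more”) operation; that the construction starts from a zero is a convenience of the modern notation.

\[
1 := S(0), \quad 2 := S(1), \quad 3 := S(2), \qquad a + 0 := a, \quad a + S(b) := S(a + b),
\]

\[
\text{so that} \quad 1 + 1 = 1 + S(0) = S(1 + 0) = S(1) = 2 .
\]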

Mere counting is sometimes disparaged, but it is, of course, the most rudimentary form of a rigorous arithmetic with its commutative, associative and distributive laws.

Laws of arithmetic
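
Spelled out, the laws referred to above say that, for any numbers a, b and c:

\[
a + b = b + a, \qquad ab = ba \qquad \text{(commutative)}
\]

\[
(a + b) + c = a + (b + c), \qquad (ab)c = a(bc) \qquad \text{(associative)}
\]

\[
a(b + c) = ab + ac \qquad \text{(distributive)}
\]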

The cognitive step of getting to count in the first place is a huge leap compared to the almost inevitable evolution of counting into numbers and then into an arithmetic with rigorous laws. We will never know when our ancestors began to count but it seems to me – in comparison with primates of today – that it must have come after a cognitive threshold had been achieved. Quite possibly with the control of fire and after the brain size of the species had undergone a step change. That takes us back to the time of Homo erectus and perhaps around a million years ago.

Nearly all animals have shape recognition to some extent. Some primates can even recognise patterns in similar shapes. It is plausible that recognition of patterns and relationships between shapes only took off when our human ancestors began construction either of tools or of rudimentary dwellings. The earliest tools (after the use of clubs) were probably cutting edges and these are first seen around 1.8 million years ago. The simplest constructed shelters would have been lean-to structures of some kind. Construction of both tools and shelters lends itself naturally to the observation of many geometrical shapes: rectangles, polygons, cones, triangles, similar triangles and the rules of proportion between similar shapes. Arches may also have first emerged with the earliest shelters. More sophisticated tools and very simple dwellings take us back to around 400,000 years ago and certainly to a time before anatomically modern humans had appeared (c. 200,000 years ago). Both rudimentary counting and a sense of shapes would have been present by then. It would have been much later that circles and properties of circles were observed and discovered. (Our earliest evidence of a wheel goes back some 8,000 years and is the application of a much older mathematics). Possibly the interest in the circle came after a greater interest in time keeping had emerged. Perhaps from the first “astronomical” observations of sunrise and sunset and the motion of the moon and the seasons. Certainly our ancestors were well versed in circles and spheres and their intersections and relationships by the time they became potters (earlier than c. 30,000 years ago).

I suspect it was the blossoming of trade – rather than the growth of astronomy – which probably helped take counting to number systems and arithmetic. The combination of counting and shapes starts, I think, with the invention of tools and the construction of dwellings. By the time we come to the Neolithic, with settlements and agriculture and fortifications, arithmetic and geometry and applied mathematics are an established reality. Counting could have started around a million years ago. The study of shapes may have started even earlier. But if we take the origin of “mathematics” to be when counting ability was first combined with a sense of shapes, then we certainly have to step back to at least 50,000 years ago.


Numbers emerge from the concept of identity

December 18, 2020

Numbers are abstract. They do not have any physical existence. That much, at least, is fairly obvious and uncontroversial.

Are numbers even real? The concept of numbers is real but reason flounders when considering the reality of any particular number. All “rational” numbers (positive or negative) are considered “real numbers”. But in this usage, “real” is a label, not an adjective. “Rational” and “irrational” are also labels when attached to the word number and are not adjectives describing the abstractions involved. The phrase “imaginary numbers” is not a comment about reality. “Imaginary” is again a label for a particular class of the concept that is numbers. Linguistically we use the words for numbers both as nouns and as adjectives. When used as a noun, meaning is imparted to the word only because of an attached context – implied or explicit. “A ten” has no meaning unless the context tells us it is a “ten of something” or a “count of some things” or a “measurement in some units” or a “position on some scale”. As nouns, numbers are not very pliable; they cannot be modified by adjectives. There is a mathematical abstraction for “three” but there is no conceptual, mathematical difference between a “fat three” and a “hungry three”. They are not very good as adjectives either. “Three apples” says nothing about the apples. “60” minutes or “3,600” seconds do not describe the minutes or the seconds.

The number of apples on a tree or the number of atoms in the universe are not dependent upon the observer. But number is dependent upon a brain in which the concept of number has some meaning. All of number theory, and therefore all of mathematics, builds on the concept and the definition of one.  And one depends, existentially, on the concept of identity.

From Croutons in the soup of existence

The properties of one are prescribed by the assumptions (the “grammar”) of the language. One (1, unity), by this “grammar” of mathematics, is the first non-zero natural number. It is the integer which follows zero. It precedes the number two by the same “mathematical distance” by which it follows zero. It is the “purest” number. Any number multiplied by one or divided by one remains that number. It is its own factorial. It is its own square or square root; cube or cube root; ad infinitum. One is enabled by existence and identity but thereafter its properties are defined, not discovered.
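
Written compactly, and in modern notation, the properties listed above are:

\[
n \times 1 = n, \qquad \frac{n}{1} = n, \qquad 1! = 1, \qquad 1^k = 1, \qquad \sqrt[k]{1} = 1 \quad \text{for every } k.
\]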

The question of identity is a philosophical and a metaphysical quicksand. Identity is the relation everything has to itself and nothing else. But what does that mean? Identity confers uniqueness. (Identical implies sameness but identity requires uniqueness). The concept of one of anything requires that the concept of identity already be in place and emerges from it. It is the uniqueness of identity which enables the concept of a one.

Things exist. A class of similar things can be called apples. Every apple though is unique and has its own identity within that class of things. Now, and only now, can you count the apples. First comes existence, then comes identity along with uniqueness and from that emerges the concept of one. Only then can the concept of numbers appear; where a two is the distance of one away from one, and a three is a distance of one away from two. It is also only then that a negative can be defined as distance away in the other direction. Zero cannot exist without one being first defined. It only appears as a movement of one away from one in the opposite direction to that needed to reach two. Negative numbers were once thought to be unreal. But the concept of negative numbers is just as real as the concept for numbers themselves. The negative sign is merely a commentary about relative direction. Borrowing (+) and lending (-) are just a commentary about direction. 

But identity comes first and numbers are a concept which emerges from identity.


Why did we start to count?

October 12, 2020

Counting and the invention of numbers and the abstractions enabling mathematics are surely cognitive abilities. Counting itself involves an abstract ability. The simple act of raising two fingers to denote the number of stones or lions or stars implies, first, the abstract ability to describe an observed quantity and, second, the desire to communicate that observation.

What led humans to counting and when?

Before an intelligence can turn to counting it must first have some concept of numbers. When and how did our ancient ancestors  first develop a concept of numbers and then start counting? …….. 

It seems clear that many animals do distinguish – in a primitive and elementary way – between “more” and “less”, and “few” and “many”, and “bigger” and “smaller”, and even manage to distinguish between simple number counts. They show a sophisticated use of hierarchy and precedence.

Some primates show some primitive abilities when tested by humans

…..  Rhesus monkeys appear to understand that 1 + 1 = 2. They also seem to understand that 2 + 1 = 3, 2 – 1 = 1, and 3 – 1 = 2—but fail, however, to understand that 2 + 2 = 4. ……

But even chimpanzees and monkeys rarely, if ever, use counts or counting in interactions among themselves. The abilities for language and counting are not necessarily connected genetically (though it is probable that they are), but they are both certainly abilities which appear gradually as cognition increases. Mathematics is, of course, just another language for describing the world around us. Number systems, like all invented languages, need the system and its rules to be shared before any communication is feasible. It is very likely that the expressions of the abilities to count and to have language follow much the same timeline. The invention of specific sounds or gestures to signify words surely coincided with the invention of gestures or sounds to signify numbers. The step change in the size of brains along the evolutionary path of humans is very likely closely connected with the expression of both the language and the counting abilities.

The ability to have language surely preceded the invention of languages just as the ability to count preceded the expressions of counting and numbering. It is not implausible that the first descendant of Homo erectus who used his fingers to indicate one of something, or four of something else, to one of his peers made a far, far greater discovery – relatively speaking – than Newton or Einstein ever did.

We must have started counting and using counts (using gestures) long before we invented words to represent counts. Of course, it is the desire to communicate which is the driving force that takes us from having abilities to expressing those abilities. The “cooperation gene” goes back to before the development of bipedalism and before the split with chimpanzees or even gorillas (at least 9 million years ago).

The simple answer to the question “Why did we start to count?” is because we could conceive of a count, observed it and wished to communicate it. But this presupposes the ability to count. Just as with language, the ability and the expression of the ability are a consequence of the rapid increase in brain size which happened between 3 million and 1 million years ago.

I am persuaded that that rapid change was due to the control of fire and the change to eating cooked food and especially cooked meat. The digestion of many nutrients becomes possible only with cooked food and is the most plausible driver for the rapid increase in brain size.

Raw Food not enough to feed big brains

………. our brains would still be the size of an ape’s if H. erectus hadn’t played with fire: “Gorillas are stuck with this limitation of how much they can eat in a day; orangutans are stuck there; H. erectus would be stuck there if they had not invented cooking,” she says. “The more I think about it, the more I bow to my kitchen. It’s the reason we are here.”


And then came counting

August 25, 2018

Origins of human cognitive development go back a lot longer than was once thought. Our first bipedal ancestors, who came down from the trees more than 5 million years ago, already had some concept of “more” and “less” and perhaps even of rudimentary numbers up to 3 (as rhesus monkeys have today). Genetic analysis of ancient bones is showing that the origin and development of modern humans needs to take the Neanderthals and the Denisovans into account and perhaps has to go back at least to the time of a common ancestor from over 1 million years ago. Just considering the last 200,000 years is no longer enough.

I have no doubt that the mastery of fire, the eating of cooked meat, the growth of the brain and, above all, the increased need for cooperation were interconnected and drove human cognitive development. Whether developments happened in one place and spread or happened at many places, perhaps at many times, will always be a matter of speculation. But it is not so difficult to come up with a not implausible time-line of the key developments which gave us first counting and then tallying and arithmetic and geometry and now complex number theory. The oldest evidence we have of counting is tally sticks from over 50,000 years ago. But counting surely started long before that.


Related:

What led humans to counting and when?

The origins of base 60


 

Pareto’s 80/20 rule is ubiquitous

June 11, 2018

I first came across and learned to use the Pareto principle in the 70s as a young engineer. It was the starting point for fault analysis of any kind. Root cause analysis always started with a “Pareto diagram”. It was an extremely powerful tool, not only for fault analysis but also in all quality improvement actions.

The Pareto principle (also known as the 80/20 rule, the law of the vital few, or the principle of factor sparsity) states that, for many events, roughly 80% of the effects come from 20% of the causes. – Wikipedia

Pareto showed in 1896 that 80% of land in Italy was owned by 20% of the population, and thus was born the 80/20 rule. It has now become almost a cliché in all business processes and in financial and economic analysis to describe the relationship where a minority of causes leads to a majority of the result.

The 80-20 rule is a business rule of thumb that states that 80% of outcomes can be attributed to 20% of all causes for a given event. In business, the 80-20 rule is often used to point out that 80% of a company’s revenue is generated by 20% of its total customers. Therefore, the rule is used to help managers identify and determine which operating factors are most important and should receive the most attention based on an efficient use of resources. – Investopedia

The 80/20 rule seems to apply in almost all fields. It applies in wealth distribution, in personal and public finance, in all kinds of manufacturing, in quality control, in experimentation and in disease control.

It is not perhaps so surprising.

Wherever a phenomenon is subject to a power-law probability distribution, the 80/20 rule will apply, and a power-law probability distribution is perhaps the most common form of probability distribution that occurs in nature and in man-made processes. Put differently, it is not at all surprising.
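
That connection can be made precise. For a Pareto (power-law) distribution with tail index α > 1, the share of the total accounted for by the largest fraction p of the causes is a standard result (stated here only for concreteness):

\[
\text{top-}p \text{ share} = p^{\,1 - 1/\alpha}, \qquad 0.2^{\,1 - 1/\alpha} = 0.8 \;\Rightarrow\; \alpha = \frac{\log 5}{\log 4} \approx 1.16 ,
\]

so an exact 80/20 split corresponds to a tail index of about 1.16, and any heavy-tailed distribution in that neighbourhood gives something close to it.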

Of all the possible causes of an observed effect, a minority of the possible causes are usually responsible for a majority of the observed effect.

Perhaps we should be surprised only if the 80/20 “rule” does not apply. The “20%” and the “80%” should be taken as symbols for a “minority” and a “majority” respectively and then the 80/20 rule is ubiquitous.
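
A minimal simulation makes the same point. The sketch below (Python with numpy; the tail index of 1.16 and the sample size are assumptions chosen to match the discussion above, and the result wobbles from run to run because the tail is heavy) draws power-law “causes” and checks what share of the total the top 20% contribute:

    import numpy as np

    # Draw samples from a Pareto distribution with minimum value 1.
    # A tail index of ~1.16 corresponds to an exact 80/20 split.
    rng = np.random.default_rng(0)
    alpha = 1.16
    samples = 1.0 + rng.pareto(alpha, size=100_000)

    # Sort ascending and take the largest 20% of the "causes".
    samples.sort()
    top20 = samples[int(0.8 * samples.size):]

    share = top20.sum() / samples.sum()
    print(f"Share of the total contributed by the top 20%: {share:.0%}")  # roughly 80%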


 

Decimals are too simple

April 22, 2018

Of course all attempts to create a 10 hour day with 100 minutes to each hour and 100 seconds to each minute have failed. Similarly all attempts to divide the circle into 100 parts have not caught on.

The use of 60 is almost as basic as the use of 10.

The origins of base 60

All the non-decimal systems I learnt were embedded in memory before I was 20. I don’t expect many will remember these.

As a child I learned the use of 12 and 60 from my grandmother. The use of 12 was automatic with 4 and 3. Three pies to a pice. Four pice to an anna and 16 annas to the rupee. When driving around India with my father, miles and furlongs and yards and feet came naturally. Bushels and pecks and gallons and quarts and pints came later as an apprentice in England.

Decimals are simple. But they are also simplistic.

Perhaps too simple.

Rupee, anna, pice, pies

Pounds, shillings, pence, farthings

Ton, hundredweights, pounds, ounces

Mile, furlongs, yards, feet

Bushel, pecks, gallons, quarts, pints


 

Counting on fingers leads naturally also to base-60

August 19, 2017

We have forgotten what it was like to count on our fingers. We have forgotten that counting itself was a mystery long before the mysteries of manipulation of numbers and the magic of mathematics. Yet the use of base-60 lies deep in our psyches. We still use it for time measurement and for geographical and spatial measurements. Attempts to use decimals for time and angle measurement have all failed miserably. Sixty still occurs in ancient Chinese and Indian calendars.

Today the use of 60 still predominates for time, for navigation and geometry. But generally only for units already defined in antiquity. A base of 10 is used for units found to be necessary in more recent times. Subdivision of a second of time or a second of arc is always by the decimal system rather than by the duodecimal or the sexagesimal system.
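
The mixture is easy to see in how an angle is still written down. A small sketch (the function name is mine, purely for illustration): the whole degrees, minutes and seconds are sexagesimal, and only the final remainder is left as a decimal fraction.

    def to_sexagesimal(decimal_degrees: float) -> tuple[int, int, float]:
        """Split a decimal angle into degrees, minutes and seconds (base 60)."""
        degrees = int(decimal_degrees)
        remainder = (decimal_degrees - degrees) * 60
        minutes = int(remainder)
        seconds = (remainder - minutes) * 60  # any further subdivision stays decimal
        return degrees, minutes, seconds

    print(to_sexagesimal(12.3456))  # -> (12, 20, 44.16...)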

Usually the origin of sexagesimal systems of counting is traced back to the Babylonians (c. 1,800 BCE) and even to the Sumerians (c. 3,000 – 2,500 BCE). But I suspect that it goes back much further and that base-60 long precedes the Babylonians and the Sumerians.

I observe that twelve and then sixty come naturally from three factors:

  1. using fingers for counting,
  2. maximising the count with only one hand free, and
  3. the growth of trade and the need for counts of greater than 20

That five comes naturally from the fingers of one hand is self-evident. With only one free hand, a count to twelve using the thumb and the finger-bones of the other four fingers is also self-evident. I saw my great grandmother, and my grandmother after her, regularly count to twelve using only one hand. Sixty comes naturally from a hand of five and a hand of twelve. Counting to five and twelve would have been well known to our hunter-gatherer ancestors. It seems very plausible that a hunter would look to maximise the count on a single hand. So, it is not necessary to look for the origins of base-60 in the skies or in the length of the year or the number of its divisors or the beginnings of geometry. If the origins of counting lie some 50,000 years ago, the use of twelve and then of sixty probably goes back some 20,000 years.
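
The arithmetic behind this finger-reckoning is as simple as it gets:

\[
3 \text{ bones} \times 4 \text{ fingers} = 12 \text{ on one hand}, \qquad 12 \times 5 = 60 \text{ with the other}, \qquad 12 \times 12 = 144 \text{ with both}.
\]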

The origins of base 60

I like 60. Equilaterals. Hexagons. Easy to divide by almost anything. Simple integers for halves, quarters, thirds, fifths, sixths, tenths, 12ths, 15ths and 30ths. 3600. 60Hz. Proportions pleasing to the eye. Recurring patterns. Harmonic. Harmony.

The origins of the use of base 60 are lost in the ancient past. By the time the Sumerians used it, around 2,500 BCE, it was already well established and continued through the Babylonians. But the origin lies much earlier.

hand of 5

I speculate that counting – in any form more complex than “one, two, many….” – probably goes back around 50,000 years. I have little doubt that the fingers of one hand were the first counting aids that were ever used, and that the base 10 given by two hands came to dominate. Why then would the base 60 even come into being?

The answer, I think, still lies in one hand. Hunter-gatherers when required to count would prefer to use only one hand and they must – quite early on and quite often – have had the need for counting to numbers greater than five. And of course using the thumb as pointer one gets to 12 by reckoning up the 3 bones on each of the other 4 fingers.

a hand of 12 – image sweetscience

My great-grandmother used to count this way when checking the numbers of vegetables (onions, bananas, aubergines) bought by her maid at market. Counting up to 12 usually sufficed for this. When I was a little older, I remember my grandmother using both hands to check off bags of rice brought in from the fields – and of course with two hands she could get to 144. The counting of 12s most likely developed in parallel with counting in base 10 (5, 10, 50, 100). The advantageous properties of 12 as a number were fortuitous rather than intentional. But certainly the advantages helped in the persistence of using 12 as a base. And so we still have a dozen (12) and a gross (12×12) and even a great gross (12x12x12) being used today. Possibly different groups of ancient man used one or other of the systems predominantly. But as groups met and mixed and warred or traded with each other the systems coalesced.

hands for 60

And then 60 becomes inevitable. Your hand of 5, with my hand of 12, gives the 60 which also persists into the present. (There is one theory that 60 developed as 3 x 20, but I think finger counting and the 5 x 12 it leads to is far more compelling). But it is also fairly obvious that the use of 12 must have been prevalent before the 60 could appear. Though the use of 60 seconds and 60 minutes is all-pervasive, it is worth noting that they can only come after each day and each night is divided into 12 hours.

While the use of base 10 and 12 probably came first with the need for counting generally and then for trade purposes (animals, skins, weapons, tools…..), the 12 and the 60 came together to dominate the measuring and reckoning of time. Twelve months to a year with 30 days to a month. Twelve hours to a day or a night and 60 parts to the hour and 60 parts to those minutes. There must have been a connection – in time as well as in the concepts of cycles – between the “invention” of the calendar and the geometrical properties of the circle. The number 12 has great significance in Hinduism, in Judaism, in Christianity and in Islam. The 12 Adityas, the 12 tribes of Israel, the 12 days of Christmas, the 12 Imams are just examples. My theory is that simple sun and moon-based religions gave way to more complex religions only after symbols and writing appeared and gave rise to symbolism. ……… 

If we had six fingers on each hand the decimal system would never have seen the light of day. A millisecond would then be 1/1,728th of a second (12 × 12 × 12 = 1,728). It is a good thing we don’t have 7 fingers on each hand, or – even worse – one hand with 6 fingers and one with 7. Arithmetic with a tridecimal system of base 13 does not entice me. But if I were saddled with 13 digits on my hands I would probably think differently.

 


 

The inherent logic of the universe – but not language – was established by the Big Bang

June 16, 2017

You could call this the First Law of Everything.

Logic is embedded in the universe.

At the Big Bang we have no idea what the prevailing laws were. Physicists merely call it a singularity where the known laws of physics did not apply. It was just another Creation Event. But thereafter – after the Big Bang – everything we observe in the universe is logical. We take logic to be inherent in the Universe around us. We discover facets of this embedded logic empirically and intuitively (and intuition is merely the synthesis of empiricism). We do not invent logic – we discover it. If logic was ever created it was created at the time of the Big Bang.

Language, on the other hand, is invented by man to describe and communicate the world around us. We build into the framework of our languages, rules of logic such that the use of language is consistent with the embedded logic of the universe. But language is not always equal to the task of describing the universe around us. “I have not the words to describe ….”. And then we imbue old words with new meanings or invent new words, or new grammar. But we never make changes which are not consistent with the logic of the universe.

Reasoning with language is then constrained to lie within the logical framework so constructed and is therefore also always consistent with our empirical observations of the universe around us. Given certain assumptions – as expressed by language – we always arrive at the same logical inferences – also as described by that language. Such inferring, or reasoning, works and – within our observable universe – is a powerful way of extrapolating from the known to the not-yet-known. The logical framework itself ensures that the inferences drawn remain consistent with the logic of the universe.

In the sentence “If A is bigger than B, and if B is bigger than C, then A is bigger than C”, it is the logic framework of the language which constrains if, then and bigger to have meanings which are consistent with what we can observe. The logic framework is not the grammar of the language. Grammar would allow me to say: “If A is bigger than B, and if B is bigger than C, then A is smaller/louder/faster/heavier than C”, but the embedded logic framework of the language is what makes it ridiculous. The validity of the reasoning or of inferring requires that the logic framework of the language not be infringed. “If A is bigger than B, and if B is bigger than C, then A is smaller than C” is grammatically correct but logically invalid (incorrect). However, the statement “If A is bigger than B, and if B is bigger than C, then A is heavier than C” is grammatically correct, logically invalid but not necessarily incorrect.

Mathematics (including Symbolic Logic) also contains many languages which provide means of describing facets of the universe which other languages cannot. But they all contain a logic framework consistent with the embedded logic of the universe. That 1 + 1 = 2 is a discovery – not an invention. That 2H2 + O2 = 2H2O is also a discovery, not an invention. The rules for mathematical operations in the different branches of mathematics must always remain consistent with the embedded logic of the universe – even if the language invented has still to find actual application. Imaginary numbers and the square root of -1, though invented much earlier, were pushed into everyday use by the needs of the electrical engineers. Set theory, however, was only used in physics and computing long after it was “invented”.

Languages (including mathematics) are invented but each must have a logical framework which itself is consistent with the inherent logic of the universe.


 

Number theory was probably more dependent upon live goats than on raindrops

June 14, 2017

It used to be called arithmetic but it sounds so much more modern and scientific when it is called number theory. It is the branch of mathematics which deals with the integers and the relationships between them. Its origins (whether one wants to call it a discovery or an invention) lie with the invention of counting itself. It is the root from which all the various branches of mathematics derive. The origin of counting can be said to lie with the naming of the integers, and is intimately tied to the development of language and of writing, and perhaps goes back some 50,000 years (since the oldest known tally sticks date from some 30,000 years ago).

How and why did the naming of the integers come about?  Why were they found necessary (necessity being the cause of the invention)? Integers are whole numbers, indivisible, complete in themselves. Integers don’t recognise a continuum between themselves. There are no partials allowed here. They are separate and discrete and number theory could as well be called quantum counting.

Quite possibly the need came from counting their livestock or their prey. If arithmetic took off in the Fertile Crescent it may well have been the need for trading their live goats among themselves (integral goats for integral numbers of wives or beads or whatever else they traded) which generated the need for counting integers. Counting would have come about to fit their empirical observations. Live goats rather than carcasses, I think, because a carcass can be cut into bits and is not quite so dependent upon integers. Quanta of live goat, however, would not permit fractions. It might have been that they needed integers to count living people (number of children, number of wives …..) where fractions of a person were not politically correct.

The rules of arithmetic – the logic – could only be discovered after the integers had been named and counting could go forth. The commutative, associative and distributive properties of integers inevitably followed. And the rest is history.

But I wonder how mathematics would have developed if the need had been to count raindrops.

After all:

2 goats + 2 goats = 4 goats, and it then follows that

2 short people + 2 short people = 4 short people.

But if instead counting had been inspired by counting raindrops, they would have observed that

2 little raindrops + 2 little raindrops = 1 big raindrop.

They might then have concluded that

2 short people + 2 short people = one tall person

and history would then have been very different.


 

Counting was an invention

March 19, 2017

A new book is just out and it seems to be one I have to get. I am waiting to get hold of an electronic version.

Number concepts are a human invention―a tool, much like the wheel, developed and refined over millennia. Numbers allow us to grasp quantities precisely, but they are not innate. Recent research confirms that most specific quantities are not perceived in the absence of a number system. In fact, without the use of numbers, we cannot precisely grasp quantities greater than three; our minds can only estimate beyond this surprisingly minuscule limit.

Numbers fascinate me and especially how they came to be.

The earliest evidence we have of humans having counting ability is ancient tally sticks made of bone and dating up to 50,000 years ago. An ability to tally at least up to 55 is evident. One of the tally sticks may have been a form of lunar calendar. By this time apparently they had a well-developed concept of time. And concepts of time lead immediately and inevitably to the identification of recurring time periods. By 50,000 years ago our ancestors counted days and months and probably years. Counting numbers of people would have been child’s play. They had clearly developed some sophistication not only in “numbering” by this time but had also progressed from sounds and gestures into speech. They were well into the beginnings of language.

Marks on a tally stick tell us a great deal. The practice must have been developed in response to a need. Vocalisations – words – must have existed to describe the tally marks. These marks were inherently symbolic of something else. They are evidence of the ability to symbolise and to think in abstract terms. Perhaps they represented numbers of days or a count of cattle or of items of food or of number of people in the tribe. But their very existence suggests that the concept of ownership of property – by the individual or by the tribe – was already in place. Quite probably a system of trading with other tribes and protocols for such trade were also in place. At 50,000 years ago our ancestors were clearly on the threshold of using symbols not just on tally sticks or in cave paintings but in a general way and that would have been the start of developing a written language. …….

My time-line then becomes:

  • 8 million YBP           Human Chimpanzee divergence
  • 6 million YBP           Rudimentary counting among Archaic humans (1, 2, 3, many)
  • 2 million YBP           Stone tools
  • 600,000 YBP          Archaic Human – Neanderthal divergence
  • 400,000 YBP          Physiological and genetic capability for speech?
  • 150,000 YBP           Speech and counting develop together
  • 50,000   YBP           Verbal language, counting, trading, calendars in place (tally sticks)
  • 30,000   YBP           Beginnings of written language?

Clearly our counting is dominated by base 10, alongside our penchant for 12-based systems. The joints on the fingers of one hand allow us to count to 12 and that, together with the five fingers of the other hand, clearly led to our many 60-based counting systems.

I like 60. Equilaterals. Hexagons. Easy to divide by almost anything. Simple integers for halves, quarters, thirds, fifths, sixths, tenths, 12ths, 15ths and 30ths. 3600. 60Hz. Proportions pleasing to the eye. Recurring patterns. Harmonic. Harmony.

The origins of the use of base 60 are lost in the ancient past. By the time the Sumerians used it, around 2,500 BCE, it was already well established and continued through the Babylonians. But the origin lies much earlier. ……

Why then would the base 60 even come into being?

image sweet science

The answer, I think, still lies in one hand. Hunter-gatherers when required to count would prefer to use only one hand and they must – quite early on and quite often – have had the need for counting to numbers greater than five. And of course using the thumb as pointer one gets to 12 by reckoning up the 3 bones on each of the other 4 fingers. 

My great-grandmother used to count this way when checking the numbers of vegetables (onions, bananas, aubergines) bought by her maid at market. Counting up to 12 usually sufficed for this. When I was a little older, I remember my grandmother using both hands to check off bags of rice brought in from the fields – and of course with two hands she could get to 144. The counting of 12s most likely developed in parallel with counting in base 10 (5, 10, 50, 100). The advantageous properties of 12 as a number were fortuitous rather than intentional. But certainly the advantages helped in the persistence of using 12 as a base. And so we still have a dozen (12) and a gross (12×12) and even a great gross (12x12x12) being used today. Possibly different groups of ancient man used one or other of the systems predominantly. But as groups met and mixed and warred or traded with each other the systems coalesced.

If we had 4 bones on each finger we would be using 5 x 16 = 80 rather than 60.


 

