Archive for the ‘History’ Category

Taking credit for your ancestors

May 14, 2022

I am always rather amused when people in the now try and bask in the past glories (usually exaggerated and always presumed) of their ancestors. Especially when someone claims descent from some very famous person. As if they chose them. To be proud of a famous father or grandfather is perfectly reasonable but to claim credit in the now for their deeds in the past is illogical. To claim credit for ancestors even further back in time verges on the ridiculous.

I find it especially silly when someone proudly declaims an ancestor’s presumed qualities or famous deeds and misses that they themselves suffer by comparison. I am equally unimpressed when someone proudly claims a long line of descent. Every single one of the nearly 8 billion alive today (poor-man, rich-man, beggar-man, thief) has exactly the same number of ancestors as everyone else. One can now bask not only in famous ancestors but even in their past shame or misery. Nowadays it has become fashionable to try and gain “victimhood credits” for the sufferings and failings of long-gone ancestors. Entitlement culture has now given us “victimhood privilege” as a new phenomenon of the 21st century.

Nationalist groups in many countries who are insecure about their own identities often bask in the presumed past glories of ancient civilizations. The one common feature of all these “great civilizations” is, of course, that they all failed. It applies to all the classical “great civilizations” in Egypt, China, India, Greece and Rome. Some lasted much longer than others but they all eventually collapsed. Civilizations and societies which succumbed to others give rise to claims of current victimhood credits for the sufferings and the failings of their ancestors. To be descended from the Phoenicians or the Mayans or Aztecs is now creditable. To be descended from slaves of 200 years ago or from the colonised 500 years ago allows victimhood credits to be claimed in the present. Nowadays, in India, the Hindu right tries to take credit for the exaggerated, and often quite dubious, wonders of past “golden ages”, some two or five (or even ten!) thousand years ago. Never mind that the “golden ages” collapsed due to their own stresses, faults and imperfections. Never mind that the “golden ages” were always followed by millennia of “dark and dismal ages”. Never mind that glorious ages were followed by inglorious times because the glorious ages all led to decadence and depravity. Never mind that the “dark ages” and their misery were a direct consequence of the preceding “golden ages”.

Every person alive today had some ancestor who was a thief, a murderer, a cheat, a ruler or a slave. That includes every claimed descendant of Genghis Khan (40 generations) or Confucius (80 generations), and every current member of any “aristocracy” or “royal lineage” (the Norwegian House of Schleswig-Holstein-Sonderburg-Glücksburg from 1106 CE is probably the oldest recorded). There is nobody alive today who can even presume to trace a direct line of descent for more than about 40 generations. Even the most detailed line of descent leaves out more ancestors than are included. In practice nobody has a record of all their ancestors for more than about 10 generations and very few for more than 5. And if we want to go back to the heyday of ancient Greece (500 BCE) we would need 125 generations. And to reach back to the first cities ever we would need 500 – 600 generations. Modern humans started around 10,000 generations ago.
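The generation arithmetic above can be checked in a few lines (a minimal sketch; the 20-year generation length is my assumption, inferred from the post’s own figure of 125 generations back to 500 BCE, and real generation intervals vary):

```python
# Back-of-the-envelope check of the generation counts in the text.
# Assumption: a nominal generation length of 20 years (roughly what
# "125 generations back to 500 BCE" implies); real intervals vary.

GENERATION_YEARS = 20
PRESENT_YEAR = 2022  # date of the post


def generations_since(year: int) -> int:
    """Approximate generations between a given year (negative = BCE) and now."""
    return (PRESENT_YEAR - year) // GENERATION_YEARS


# Heyday of ancient Greece, ~500 BCE -> about 125 generations
print(generations_since(-500))

# The nominal count of ancestral "slots" doubles every generation back,
# so 40 generations gives over a trillion slots -- vastly more people
# than have ever lived. Ancestral lines must therefore overlap massively
# ("pedigree collapse"), which is why no genealogy can name them all.
print(2 ** 40)
```

At the same nominal rate, the first cities of ten to twelve thousand years ago sit 500 to 600 generations back, consistent with the figures in the paragraph above.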

Every person alive today has more ancestors who were quite ordinary and forgettable than famous ones. There are more villains in each person’s ancestry than there are “good guys”. Basking in the fame or the shame of ancestors is about as silly as the human mind allows. There is no person alive today who does not have an ancestor who was an illiterate, speechless, murderous, selfish, tree-swinging ape.


On the paradox of purpose and veracity in histories

May 11, 2022

I find that all our histories have one of only two purposes. The first is to satisfy the intellectual need to know and the second is the desire to influence future behaviour. The first is part of epistemic curiosity and the second is just politics. All histories are, of course, stories about the past. All histories consist of a selection of “facts” and a narrative (always speculative to some degree) connecting the chosen “facts”. Much of the study of history as an academic discipline (whether archaeology or genetics or anthropology ….) is about the selection and justification of what is to be considered “fact”. It is never, and can never be, “the truth, the whole truth and nothing but the truth”.

Two purposes seem also to be the academic view, though with slightly different labels.

In a discussion of the History Manifesto in 2015, Prof. Johann Neem wrote:

Underlying the entire conversation was a tension between the two purposes of history, the philosophical or scientific, and the civic. The philosophical or scientific perspective considers the pursuit of historical truth to be of highest value. Like any organized scientific activity, historical research is corrupted when oriented to immediate public ends. Its public value ultimately depends on its autonomy.

The civic purpose of history, on the other hand, is to help a community—a nation, a religious or ethnic group—understand the present in ways that orient that group to the future. The questions asked, and the answers offered, will be ones relevant to the community at large rather than a scholarly community of inquiry.

We need both; in fact the civic depends on the scientific if history is to avoid becoming propaganda or having the preferences of the reading public drive the discipline’s priorities. Before historians can engage the public, they need good knowledge, and thus basic research.

What Neem calls “philosophical or scientific” purpose is just epistemic curiosity and what he calls the “civic” purpose boils down to politics. All epistemic curiosity in humans is ultimately just idle curiosity.

From my book Before Time Began (in preparation)

And then there is intellectual curiosity – epistemic curiosity. This only comes into play (a la Maslow) once basic survival needs are met. A recognizable brain is needed. It is the exercise of mind which epitomizes being an individual. Whereas perceptual curiosity is the drive to know enough to ensure survival, epistemic curiosity is the drive to know more. It seeks knowledge for the sake of knowledge. It gives us pleasure. It fuels study and learning and the accumulation of knowledge for its own sake. It gives us science and art. It drives language and literature and music and the other higher-order needs for self-actualization. It drives gossip and it drives play. ……….. Epistemic curiosity lies on an open-ended scale. It can never be satiated and we remain curious no matter what we discover. We are even curious about why we are curious. ………..

Epistemic curiosity, one could say, is the curiosity of idle minds.

Of these two purposes, one is a search for knowledge (truths) and is an end in itself. The other, the civic, political purpose, is as a tool for some other agenda. The curious thing is that whereas the veracity demanded by curiosity is absolute (since truths are needed to be considered knowledge) the use of history as a tool to influence future behaviour requires only perceived veracity. The use of a story about the past as a tool only works if others (those whose behaviour is to be influenced) perceive the history to be true. The actual truth value of the story may even be zero as long as the perceived veracity is high. A complete fiction becomes a history if it is perceived to be true.

Thus, actual veracity is irrelevant in a history to be used for social purposes. Only the perceived veracity matters. Actual veracity – the truth – is relevant only in satisfying epistemic curiosity.

When the purpose has no impact the truth is sought assiduously. Where the purpose is to have an impact, the actual truth is irrelevant and only perceived truth matters.

Perhaps it is just my cynicism but I find this somewhat of a paradox.


History, heroes, villains and the Jesus/ Judas story

April 19, 2022

Of course history is always just a story. It always contains the biases and prejudices of the historian and always cherry-picks “facts” and speculates as necessary to suit the historian’s agenda. It is, I think, largely unjustified that writing labeled as “history” is considered more “truthful” than works of fiction.

Stories need their good guys and bad guys to be available for the reader to identify with. Very often the plot collapses without the villain. No murder mystery can work unless we first have a murderer. Sometimes the author is actually the villain. A case in point is Edward Gibbon’s “Decline and Fall of the Roman Empire”. Edward Gibbon was not a nice man and his own peculiarities are now invisibly, but permanently, enshrined in his work. Most histories written during the 20th century are distorted by the political positions of their authors. But, not to worry. They are, after all, just works of fiction.

I observe that the Bible, like any other story, needs its villains for the plot to function. Easter week is just over and I started writing a post about history, the Bible and fiction. But I found I had already written about this 6 years ago, which I reproduce below. (One forgets what one has written).

The Easter timeline suggests Judas was eliminated

But I have always been a little doubtful about the way in which poor Judas Iscariot is portrayed. It is not just coincidence that Easter week is a week of mystery.

Without the Resurrection, Christianity could still be a religion and a body of teachings with Jesus as a “great teacher”. But he would not then have demonstrated his divinity. He would not qualify to be the Son of God.

The capture of Jesus, in the plot of the Bible story, is a fundamental and necessary step for the Passion and the Crucifixion and the Resurrection. The role of Judas is utterly crucial to demonstrating the divinity of Jesus, but the Bible story is not very forthcoming as to his motivations. He is a traitor who “fingers” Jesus because Satan enters him. In some Gnostic writings he is a great soul who sacrificed himself for the necessary capture of Jesus – necessary for Jesus’ purposes. Judas was the cashier for the apostles and was entrusted with keeping all their monies. That thirty pieces of silver would be the motive for the betrayal does not convince.

The Bible story is somewhat unsatisfactory also in its details of the death (usually presumed to be suicide) of Judas. From the Bible story he either hanged himself or he fell into a field and burst such that he was disembowelled. The Gospel of Judas – found in the 1970s and dated to 280 AD – is considered a Gnostic text and is not accepted as being part of the Bible. Here Judas has visions of being stoned to death by the other apostles. It is only in the Gospel of Judas that we are told the story from the viewpoint of Judas and that Judas was actually acting on instructions from Jesus.

Consider the timeline of Holy Week in the Bible story.

  1. Day 1: Palm Sunday: Jesus triumphantly enters Jerusalem with all his apostles, riding humbly (?) on a donkey. Spends Sunday night at Bethany a little to the east of Jerusalem at the home of Mary, Martha and Lazarus.
  2. Day 2: Monday: Returns to Jerusalem. Along the way he curses a poor fig tree because it had failed to bear any fruit. The tree withers. He enters the Temple to find it filled with money changers (forex dealers since the Temple only accepted Tyrian shekels) and merchants selling animals for sacrifice. He chases them out with much ado. He returns to Bethany to spend the night.
  3. Day 3: Tuesday: Jesus returned to the Temple in Jerusalem and played hide-and-seek with the priests who challenged his authority and tried to apprehend him. But he evaded them. In the afternoon he and his disciples climbed the Mount of Olives and he made prophecies about the destruction of Jerusalem. He spent the night again in Bethany. Matthew reports that Judas negotiated his deal with the Sanhedrin on this day.
  4. Day 4: Wednesday: The Bible is silent about this day. It is presumed Jesus and his disciples stayed in Bethany and took it easy.
  5. Day 5: Thursday: Jesus sent Peter and John to “prepare” (presumably to reserve it as well) the Upper Room in Jerusalem (The Cenacle) for the Passover feast which would begin at twilight and continue on Friday. At twilight he washed the feet of his disciples and then began the Passover meal – the Last Supper. He prophesies that he will be betrayed by one of his disciples – which they each in turn deny. He identifies the traitor as being Judas by giving him a piece of bread soaked in the dish and as soon as he does so, “Satan enters Judas” (?). From the Upper Room they all went to the Garden of Gethsemane. Here, late that evening, he is betrayed by Judas and arrested by the Sanhedrin and taken to the home of Caiaphas where the Sanhedrin Council have gathered.
  6. Day 6: Friday: Early on Friday morning, Judas is found dead. By the third hour (9 am) the trial of Jesus has started. He is found guilty and forced to carry his cross to Calvary where he is crucified. By the ninth hour (3 pm) he is dead. Around the twelfth hour (6 pm) his body is removed from the cross and is laid in a tomb guarded by Roman soldiers.
  7. Day 7: Saturday: The tomb is guarded by Roman soldiers all through the Sabbath day until dusk (twelfth hour, 6 pm). When the Sabbath ends, his body is anointed and prepared for burial by Nicodemus (himself a member of the Sanhedrin Council which found Jesus guilty).
  8. Day 8: Sunday: Early on Sunday several women went to the tomb and found it open and Jesus missing. He “appears” to five people during the day providing “proof” that he has been resurrected.

There are many, many writings by Bible scholars about the whole week. There are many interpretations of the symbolism but there is little controversy about the timeline. It is the timeline itself which makes me think that Judas was murdered. He identifies Jesus for the Sanhedrin on Thursday night and by dawn on Friday he is conveniently dead.

Applying the little grey cells a la Poirot,

  1. Jesus needs someone close to “betray” him.
  2. He picks Judas for that role.
  3. He announces to all the apostles that Judas is the betrayer-to-be.
  4. Judas follows instructions and identifies Jesus for arrest.
  5. Judas dies before Jesus has even been tried and sentenced.

The betrayal, death and resurrection of Jesus was the prophecy that needed to be fulfilled. The story that Judas killed himself in a fit of remorse, before Jesus even came to trial, sounds implausible to me. The accounts of his death also differ too much. Hanging cannot easily be mistaken for falling into a field and bursting. Both hanging and being thrown off a cliff could just as well have been murder as suicide. The parsimonious narrative that fits is that Jesus had to pick somebody – anybody – to be a scapegoat from among his disciples. Just turning himself in would not do, since it would not create the perception of being a martyr to a cause. He chose Judas to be the “betrayer” and put upon him that burden. However, the martyrdom of Jesus needed a “clean” betrayal; not one in which he was himself complicit. Judas was chosen as the scapegoat and had to be sacrificed to the greater cause. Jesus may well have realised that whoever he chose would incur the wrath of the other disciples. Why else did Jesus identify Judas as the betrayer to the other disciples in advance of being betrayed? And Judas duly betrayed Jesus and incurred the wrath of the others. Before the night was out, and very conveniently, he was dead and the story-line of the betrayal was secure. Possibly Judas had been murdered (executed without trial) by the other disciples for the betrayal and they did not even realise that the story-line required Judas to die.

And since the Bible story is said to be written by his disciples, it is hardly likely that they would either mention that Judas was sacrificed by Jesus or that they had killed Judas to ensure his silence and protect the story-line. So did Jesus manipulate Judas to be the betrayer or did Judas act in full knowledge of his role? Did Jesus manipulate the other disciples to make sure Judas was silenced after he had played his part? It is not surprising that the Gospel of Judas is not accepted within the Bible. For that would mean that Jesus had orchestrated his own capture.

Poor Judas. He may have just been a dupe chosen by Jesus to be the scapegoat. But if he knowingly sacrificed his life and accepted being remembered in perpetuity as the “betrayer” of Jesus, his was probably a very great soul.


Colonisation and the genocide of the Neanderthals by the even-newer-Africans

November 5, 2021

Colonisation

Colonisation is not anything new. It is a necessary characteristic for any successful species.

Colonisation is the expansion by one community into the physical space being occupied by some other biological community. The invaders may be a whole species or just a particular strain within a species. It is a phenomenon exhibited by every successful biological community from viruses and bacteria and fungi to plants and animals (including humans). A new territory or habitat may be already occupied by other species, or strains of the same species, or unoccupied. The incoming community, if they dominate the new territory, are considered colonists. Newcomers who merely merge into the existing population are immigrants rather than colonists. Any communities already existing in the space and falling under the sway of the newcomers are the colonised. Many attempts at colonisation fail; either because the colonists cannot adapt to the new habitat or because they cannot compete (biologically, culturally, or technologically) with the existing inhabitants.

That living things exist in every conceivable corner of the earth’s land surface and of the oceans is a consequence of colonisation. That living things find it necessary to search for new habitats is driven by the need to grow and to survive changing environments by utilising the inherent genetic diversity available in every species. The greater the diversity available in any species, the greater the possibility that some individuals can adapt quickly to a new habitat. A successful species is one which grows its numbers and expands its habitat. Evolution results when the genetic diversity available allows individuals in a species to cope with changes to its environment. These changes are usually small and quite slow. However, much of evolution has occurred when natural selection has been accelerated as species adapt to the rapid, and sometimes large, changes of environment in newly invaded habitats. Not all succeed and many would-be colonists have failed to navigate the change. Species die out when they move, voluntarily or involuntarily, to new unsuitable habitats. There are a few species which have stagnated in tiny niche habitats, exhibit unusually little genetic diversity and have lost the wherewithal to change. They have become so specialised to fit their habitat that they are incapable of adapting to any other and have reached evolutionary “dead-ends”. Most such species have gone extinct but a few survive. Panda bears and theridiid spiders are examples. They have become incapable of growth or of colonisation and are probably on their slow path to extinction.

The reality is that any living species which does not colonise is doomed either to stagnation in a niche habitat, or to failure and extinction. The most abject species failure would be an in-situ extinction without evolution of any descendant species. Colonisation favours, and even enables, the emergence of succeeding species. The change of habitat and the changes it brings about are essential to the continuation of life. Spaces are colonised when expanding communities invade and bring, or evolve, more competitive abilities, cultures, or technologies than available to the current inhabitants, if any, of that space. The dinosaurs on land had a long run but could not finally cope with the changes in their environment. But some of them took to the air. Many found new sources of food. Some developed a taste for insects and worms and begat a plethora of new species. (Along the way they even wiped out entire species of some flying insects). But the descendants of those adventurous little dinosaurs have become the thousands of bird species to be found thriving today. They have colonised the entire globe while their larger, ancient kin perished miserably a long time ago.

However, success is no longer politically correct. Failure is glorified instead. Failure as a victim of another species is even more highly regarded. In the politically correct version of conservation, even successful species of fauna and flora are condemned for being invasive when they colonise new territories. Failing species, unfit for changed environments, or unable to bear the heat of competition from other species, are artificially, and unnaturally, protected in glorified zoos. However, I do not hear any ornithologists condemning birds for being invasive species or for colonising the planet.

Human colonisation

In today’s politically correct world, colonisation has become a dirty word. Colonists are considered evil. Statues of colonists are seen as a sign of oppression and depravity. The descendants of those colonised in the past claim a self-righteous victimhood in the present. Needless to say, a place in the kingdom of heaven is reserved for the colonised and their descendants. Colonists and their descendants are subject to eternal damnation. Yet, the history of mankind is one of successful communities expanding and colonising. The peopling of the world has been enabled by colonisation. The colonised became colonised usually because their culture was stagnating and their technology could not compete with incoming ones.

It is usually only the successful European colonisations between about 1400 and 1900 CE which have become politically incorrect. But the many attempts that failed are lost from history and get little sympathy. What is conveniently forgotten, though, is that the human populations in Australia and Africa and the Americas, at that time, were ripe for colonisation by any invading community with superior technology. For these populations, their own culture or their technology, or both, were stagnating and they were not growing. During that 500-year period, the native Americans, the inhabitants of S America and Africa and of Australia were themselves neither expanding nor technologically capable of colonising new territories. If they had been more advanced the newcomers from Europe would have been immigrants rather than colonists. If it had not been the Europeans, it would have been someone else. If not the Europeans, the Chinese or Indians would have colonised Australia. If Europe had not had the capability to expand, they would themselves have probably succumbed to the expansion by the Ottomans. The Americas would then have been colonised from China or from Japan or some other region which was growing. Africa would have been colonised probably by the Ottomans (Turks and Arabs) or the Persians in the north, and by Indians and Chinese in the south. The key fact is that the inhabitants of Australia and the Americas and Africa, at that time, were not advancing (in numbers, culture, or technology) sufficiently to drive their own expansions or growth.

Go back another 2,000 years (500 BCE – 1,500 CE) and we find the South Indian colonisation of large tracts of SE Asia. The Greek and Persian expansions gave way to the Romans. The Mongols, the Normans, the Turks, and the Vikings were busy moving into the new territories they could access. The Norse attempts to colonise North America all failed. The Greeks used colonisation for expansion but also as a way of reducing social stress at home. Dissident populations were expelled to go and found colonies and be a nuisance elsewhere. Han Chinese colonies were used as the spearhead and as the means of expanding Empire. They were often guinea-pigs for testing the viability of new territories. And many failed. Where colonies succeeded, they nearly always had superior technology or weapons or organisation to dominate the local inhabitants. They usually used the indigenous population for labour and often as enslaved labour. These early colonisations were probably more brutal and savage than those which came later, but they go back far enough in time to generally escape sanctimonious censure today.

Delving further back into antiquity reveals that colonisation was a major means of expansion even then. On land, those who mastered horses had a crucial advantage – at least for a while. Voyages across oceans were limited by the capability for navigating open seas. But sailing boats were quite well developed even if navigation was not, and coast-hugging allowed many communities to colonise by sea. But these expansions by the Phoenicians, Carthaginians, Greeks, Romans, Egyptians, Persians, Indians, and Han Chinese are far enough in the past to escape moral judgements in the present. If we go back even further to pre-historical times (5,000 – 10,000 years ago), all of hunter-gatherer Europe was colonised by the rampaging Yamnaya and the Anatolians from central Asia. They brought agriculture and cities and civilisation in their wake, and so are now forgiven the conquest and pillage and enslavement they surely utilised along the way.

But let us not forget that human colonisation was started long before antiquity by the Africans.

Colonisation from Africa

When it comes to the origins of human colonisation we need to go back to before we were ever human. I take humans to mean Anatomically Modern Humans – AMH – who appeared around 300,000 years ago. Around 2.5 million years ago, before AMH had evolved, Homo erectus had already spread from Africa and colonised most of Europe and Asia but never reached Australia. These were the oldest-Africans, but they did not then have control of fire. These oldest-Africans were more settlers than colonists.

Some time after we had evolved from apes, and perhaps around 1,000,000 – 800,000 years ago, a common ancestor of AMH, Neanderthals, Denisovans, and a couple of unknown hominin species (call them homo x and homo y), emigrated from Africa and colonised most of Europe, Central Asia, and South-East Asia. These were the old-Africans. The control of fire was achieved sometime during the Homo erectus period and was certainly available to the colonising old-Africans. Most likely, whole populations were motivated to move out of Africa not so much by a shortage of space as by growth, changes of climate, and a shortage of food. No doubt some were just seen as trouble-makers at home and encouraged to leave. These old-Africans were emigrants and probably the first ever hominin colonists. They were not strictly immigrants into existing societies since the territories they moved into had no other AMH inhabitants. But they probably did displace the oldest-Africans they found. There were probably many waves of old-Africans and later waves of emigrants may well have been immigrants into existing communities. By 700,000 years ago old-Africans covered all of Africa, Europe and Asia. Many of the areas they moved into may have had indigenous near-hominin populations. However, their indigenous cultures and technologies were not sufficiently competitive to prevent the wave of old-African colonists establishing themselves. The key distinction in these times was probably the control of fire. The colonisation of the world by the old-Africans led also to the demise of many animal species which could not resist the advanced cultures and technologies they were faced with. Some were hunted to extinction as prey, while others were unable to adapt quickly enough, and still others were just crowded out by the newcomers.

In due course (a small matter of a few hundred thousand years) the old-Africans in Central Asia and Europe evolved to become the Neanderthals. From about 500,000 years ago they were the dominant species, and that continued for about 300,000 years. In South-East Asia, the old-Africans evolved to become the Denisovans. In the rest of Asia (S China, India, and the Middle East), the old-Africans were still around but had evolved to become some as yet unknown hominin species. In Africa, the old-Africans evolved, sidestepped the evolution of any Neanderthal-like species, and eventually gave rise to Anatomically Modern Humans (AMH) by about 300,000 years ago. Let us call them the new-Africans.

Then from about 200,000 years ago there were a number of waves of new-African emigration/colonisation into Europe and Asia. These emigrant waves continued sporadically for 100,000 years while, in Africa, evolution had created the even-newer-Africans. Around 60 – 70,000 years ago, they were responsible for the major wave of emigration now termed the “Out of Africa” event. The new-Africans and the even-newer-Africans found indigenous populations all across the new territories they expanded into. They were sometimes just other new-Africans and sometimes they were blended populations of old-Africans (Neanderthals, Denisovans, …) and new-Africans. In India, for example, the even-newer-Africans arrived after the Toba eruption and mingled genetically with small surviving populations of old-Africans already mingled with new-Africans. By around 50,000 years ago the even-newer-Africans had reached Australia.

Whether there was conflict between indigenous and arriving populations, or whether one culture was gradually displaced by, or submerged into, the more dominant one is unknown. Whether advanced colonists enslaved less advanced populations is unknown but is quite likely to have occurred. What is known is that the arrival of the even-newer-Africans caused the Neanderthals and the Denisovans and any other remaining hominin species around to disappear. By around 50,000 years ago the Denisovans were extinct and by 40,000 years ago there were no Neanderthals left. However, their genes still survive in tiny quantities and live on in us. So some contact leading to gene admixture certainly occurred. However, whether intentional or not, many communities were overwhelmed. Eventually, entire human species were wiped out.

The time-line for African colonists into Europe and Asia:

  • oldest-Africans – c. 2 m years ago
  • old-Africans – c. 1 million years ago >> evolved to be Neanderthals, Denisovans, …
  • new-Africans – c. 500,000 years ago >> evolved, mixed with old-Africans and other strains
  • new-Africans (AMH) – c. 300,000 years ago >> admixture with previous colonists
  • even-newer-Africans – c. 100,000 years ago >> colonised the known world, eradicated the Neanderthals, Denisovans and others

In current-day, politically correct language it could be called the Genocide of the Neanderthals by the even-newer-Africans.


The Genocide of the Neanderthals by the Even Newer Africans

May 13, 2021

In the politically correct and virtue signalling world, where pseudo-morality reigns, colonisation has become a dirty word. Colonists are considered evil. Statues of colonists are seen as even more depraved. The colonised of the past are always considered victims by the present. Needless to say, a place in the kingdom of heaven is reserved for the colonised. The reality is that any living species which does not colonise is doomed either to stagnation in a niche habitat or to failure and extinction. Colonisation is the stuff of life. Geographical spaces are colonised when expanding communities invade and bring more competitive cultures or technologies than exist in that space. Populations are colonised when their culture and technology cannot compete with incoming ones. Strangely, it is only the European colonisations between about 1400 and 1900 CE which have become politically incorrect. But what is conveniently forgotten is that the colonised populations in Australia and N America and even S America at that time were so backward in technology that they were ripe for colonisation by any invading community with superior technology. If not the Europeans, it would have been someone else. It is, of course, politically incorrect to point out that the colonised were once colonists too, and have themselves primarily to blame. Colonisations in antiquity by the Phoenicians, Carthaginians, Greeks, Romans, and Han Chinese are too far in the past for moral judgements in the present. The Mongols, the Normans, and the Vikings generally escape censure today.

But it is worth remembering that human colonisation was started by the Africans.

Colonisation is primarily about the expansion of the physical space being occupied by a biological community. The community may be a whole species or just a particular strain within a species. It is a phenomenon exhibited by every successful biological community from viruses and bacteria and fungi to plants and animals (including humans). A new territory or habitat may already be occupied by other species, or strains of the same species, or unoccupied. The incoming community are the colonists. Any communities already existing in the space are the colonised. Many attempts at colonisation fail; either because the colonists cannot adapt to the new habitat or because they cannot compete (biologically, culturally, or technologically) with the existing inhabitants.

That living things exist in every conceivable corner of the earth’s surface is a consequence of colonisation. That living things find it necessary to search for new habitats is a consequence of surviving changing environments, of growth, and of the genetic diversity inherent in every species. There are a few species which have stagnated in tiny niche habitats, exhibit unusually little genetic diversity and are unable to change. They have become so specialised to fit their habitat that they are incapable of adapting to any other and have reached evolutionary “dead-ends”. Panda bears and theridiid spiders are examples. They have become incapable of growth or of colonisation and are probably on their slow path to extinction.

When it comes to the origins of human colonisation we need to go back to before we were ever human. (I take humans to mean Anatomically Modern Humans, who appeared around 300,000 years ago.) Some little time after we had evolved from hominids to hominins, perhaps around 800,000 years ago, a common hominin ancestor of the Neanderthals, the Denisovans, a couple of unknown hominin species, and AMH emigrated from Africa and colonised most of Europe, Central Asia, and South East Asia. Most likely the movement of whole populations was driven, not by a shortage of space, but by changes of climate and a shortage of food. (Note that immigration is not necessarily colonisation, but colonisation always involves emigration.) These Old Africans were emigrants and the first ever colonists. They were not initially immigrants since the territories they moved into had no other hominin inhabitants. There were probably many waves of Old Africans, and later emigrants may well have been immigrants. Many of the areas they moved into did have indigenous hominid populations. However, the indigenous culture and technology was not sufficiently competitive to prevent the wave of hominin colonisation. Hominins had fire while hominids and other species did not. The colonisation of the world by the Old Africans led to the demise of many species which could not compete against the advanced culture and technology they exhibited. Some were hunted to extinction as prey, while others were unable to adapt quickly enough, and still others were just crowded out by the newcomers.

In due course (a small matter of a few hundred thousand years) the Old Africans in Central Asia and Europe evolved to become the Neanderthals. From about 500,000 years ago they were the dominant species for about 300,000 years. In South East Asia, the Old Africans evolved to become the Denisovans. In the rest of Asia (S China, India, and the Middle East), the Old Africans were still around but had evolved to become some as yet unknown hominin species. In Africa, the Old Africans gave way eventually to Anatomically Modern Humans (AMH) by about 300,000 years ago. Let us call them the New Africans.

Then from about 200,000 years ago there were a number of waves of New African emigration/colonisation into Europe and Asia. These emigrant waves continued sporadically for 100,000 years, culminating in the Even Newer Africans coming “Out of Africa” around 60,000–70,000 years ago. The New Africans and the Even Newer Africans found indigenous hominin populations all across the new territories they expanded into. Sometimes these were just other New Africans, and sometimes they were blended populations of Old Africans (Neanderthals, Denisovans, …) and New Africans. In India, for example, the Even Newer Africans arrived after the Toba eruption and mingled genetically with surviving populations of Old Africans already mingled with New Africans.

Whether there was conflict between indigenous and arriving populations, or whether one culture was gradually submerged into the more dominant one, is unknown. What is known is that the arrival of the Even Newer Africans caused the Neanderthals, the Denisovans, and the other hominin species still around to disappear. By around 50,000 years ago the Denisovans were extinct, and by 40,000 years ago there were no Neanderthals left. However, their genes still survive and live on in us.

In current-day politically correct terms and to signal great virtue in ourselves, it could be called the Genocide of the Neanderthals by the Even Newer Africans.


Should Adolf Hitler be cancelled?

April 20, 2021

It is 132 years since Adolf Hitler was born on 20th April 1889 in Braunau am Inn, Austria.

Of course there are no statues of Hitler around, or so one might think.

But you can buy a bust of Hitler on the net for $59.99.

And there are plenty of paintings of him, physical copies of books by him and about him and any number of photographs. And then we come to all the digital statues of Hitler that exist. Of course he has an entry in Wikipedia. A Google search which took 0.57 seconds generates 67 million results.

It would be politically correct to ensure that anything even faintly positive about Hitler be destroyed. All historical records should be thoroughly cleansed so that only derogatory material about him remains. 

Or maybe erect a statue every year, only to pull it down again. Erect it on 20th April and pull it down with great ceremony on 30th April every year?


Lawyers are to humans as fungi are to trees

April 16, 2021

I have suggested in the past that cooking may be the oldest art form and that chefs may have been members of the oldest profession. However, it may be that lawyers came first.


Reblogged from A short history of lawyers (upcounsel.com)

Imbricate fruiting of Phaeolus schweinitzii. image forestpathology.org

Like the symbiotic relationship between trees and fungus, lawyers and humans have an important, interlocking relationship going back to the dawn of man.

The following is excerpted from “Some Lawyers Are People Too!” by Hugh L. Dewey, Esq. (2009). 

Legal anthropologists have not yet discovered the proverbial first lawyer. No briefs or pleadings remain from the proto-lawyer that is thought to have been in existence more than 5 million years ago.

Chimpanzees, man’s and lawyer’s closest relative, share 99% of the same genes. New research has definitely proven that chimpanzees do not have the special L1a gene that distinguishes lawyers from everyone else. (See Johnson, Dr. Mark. “Lawyers in the Mist?” Science Digest, May 1990: pp. 43-52.) This disproved the famous outcome of the Scopes Monkey Trial in which Clarence Darrow proved that monkeys were also lawyers.

Charles Darwin, Esquire, theorized in the mid-1800s that tribes of lawyers existed as early as 2.5 million years ago. However, in his travels, he found little evidence to support this theory.

Legal anthropology suffered a setback at the turn of the century in the famous Piltdown Lawyer scandal. In order to prove the existence of the missing legal link, a scientist claimed he had found the skull of an ancient lawyer. The skull later turned out to be homemade, combining the large jaw of a modern lawyer with the skull cap of a gorilla. When the hoax was discovered, the science of legal anthropology was set back 50 years.

The first hard scientific proof of the existence of lawyers was discovered by Dr. Margaret Leakey at the Olduvai Gorge in Tanzania. Her find consisted of several legal fragments, but no full case was found intact at the site. Carbon dating has estimated the find at between 1 million and 1.5 million years ago. However, through legal anthropology methods, it has been theorized that the site contains the remains of a fraud trial in which the defendant sought to disprove liability on the basis of his inability to stand erect. The case outcome is unknown, but it coincides with the decline of the Australopithecus and the rise of Homo Erectus in the world. (See Leakey, Margaret A. “The case of erectus hominid.” Legal Anthropology, March 1947: pp. 153.)

In many sites dating from 250,000 to 1,000,000 years ago, legal tools have been uncovered. Unfortunately, the tools are often in fragments, making it difficult to gain much knowledge.

The first complete site discovered has been dated to 150,000 years ago. Stone pictograph briefs were found concerning a land boundary dispute between a tribe of Neanderthals and a tribe of Cro-Magnons. This decision in favor of the Cro-Magnon tribe led to a successive set of cases, spelling the end for the Neanderthal tribe. (See Widget, Dr. John B. “Did Cro-Magnon have better lawyers?” Natural History, June 1926: p. 135. See also Cook, Benjamin. Very Very Early Land Use Cases. Legal Press, 1953.)

Until 10,000 years ago, lawyers wandered around in small tribes, seeking out clients. Finally, small settlements of lawyers began to spring up in the Ur Valley, the birthplace of modern civilization. With settlement came the invention of writing. Previously, lawyers had relied on oral bills for collection of payment, which made collection difficult and meant that if a client died before payment (with life expectancy between 25 and 30 and the death penalty for all cases, most clients died shortly after their case was resolved), the bill would remain uncollected. With written bills, lawyers could continue collection indefinitely.

In the late 1880s, legal anthropologists cracked the legal hieroglyphic language when they were able to determine the meaning of the now famous Rosetta Stone Contract. (See Harrison, Franklin D. The Rosetta Bill. Doubleday, 1989.) The famous first paragraph can be recited verbatim by almost every lawyer:

“In consideration of 20,000 Assyrians workers, 3,512 live goats, and 400,000 hectares of dates, the undersigned hereby conveys all of the undersigned’s right, title, and interest in and to the property commonly known as the Sphinx, more particularly described on Stone A attached hereto and made a part hereof.”

The attempted sale of the Sphinx resulted in the Pharaoh issuing a country-wide purge of all lawyers. Many were slaughtered, and the rest wandered in the desert for years looking for a place to practice.

Greece and Rome saw the revival of the lawyer in society. Lawyers were again allowed to freely practice, and they took full advantage of this opportunity. Many records exist from this classic period. Legal cases ranged from run-of-the-mill goat contract cases to the well-known product liability case documented in the Estate of Socrates vs. Hemlock Wine Company. (See Wilson, Phillips ed. Famous Roman Cases. Houghton, Mifflin publishers, 1949.)

The most famous lawyer of this period was Hammurabi the Lawyer. His code of law gave lawyers hundreds of new business opportunities. By creating a massive legal system, it increased the demand for lawyers ten-fold. In those days, almost any thief or crook could kill a sheep, hang up a sheepskin, and practice law, unlike the highly regulated system today which limits law degrees to only those thieves and crooks who haven’t been convicted of a major felony.

The explosion in the number of lawyers coincided with the development of algebra, the mathematics of legal billing. Pythagoras, a famous Greek lawyer, is revered for his Pythagorean Theorem, which proved the mathematical quandary of double billing. This new development allowed lawyers to become wealthy members of their community, as well as to enter politics, an area previously off-limits to lawyers. Despite the mathematical soundness of double billing, some lawyers went to extremes. Julius Caesar, a Roman lawyer and politician, was murdered by several clients for his record hours billed in late February and early March of 44 B.C. (His murder was the subject of a play by lawyer William Shakespeare. When Caesar discovered that one of his murderers was his law partner Brutus, he murmured the immortal lines, “Et tu Brute,” which can be loosely translated from Latin as “my estate keeps twice the billings.”)

Before the Roman Era, lawyers did not have specific areas of practice. During the period, legal specialists arose to meet the demands of the burgeoning Roman population. Sports lawyers counseled gladiators, admiralty lawyers drafted contracts for the great battles in the Coliseum, international lawyers traveled with the great Roman armies to force native lawyers to sign treaties of adhesion — many of which lasted hundreds of years until they were broken by the barbarian lawyers who descended on Rome from the North and East — and the ever-popular Pro Bono lawyers (Latin for “can’t get a real job”) who represented Christians and lost all their cases for 300 years.

As time went on, the population of lawyers continued to grow until 1 out of every 2 Romans was a lawyer. Soon lawyers were intermarrying. This produced children who were legally entitled to practice Roman law, but with the many defects that such a match produced, the quality of lawyers degenerated, resulting in an ever-increasing defective legal society and the introduction of accountants. Pressured by the legal barbarians from the North with their sign-or-die negotiating skills, Rome fell, and the world entered the Dark Ages.

During the Dark Ages, many of the legal theories and practice developed during the golden age were forgotten. Lawyers lost the art of double billing, the thirty-hour day, the 15-minute phone call, and the conference stone. Instead, lawyers became virtually manual laborers, sharing space with primitive doctor-barbers. Many people sought out magicians and witches instead of lawyers since they were cheaper and easier to understand.

The Dark Ages for lawyers ended in England in 1078. Norman lawyers discovered a loophole in Welsh law that allowed William the Conqueror to foreclose an old French loan and take most of England, Scotland, and Wales. William rewarded the lawyers for their work, and soon lawyers were again accepted in society.

Lawyers became so popular during this period that they were able to heavily influence the kings of Britain, France, and Germany. After a Turkish corporation stiffed the largest and oldest English law firm, the partners of the firm convinced these kings to start a Bill Crusade, sending collection knights all the way to Jerusalem to seek payment.

A major breakthrough for lawyers occurred in the 17th century. Blackstone the Magician, on a trip through Rome, unearthed several dozen ancient Roman legal texts. This new knowledge spread through the legal community like the black plague. Up until that point, lawyers used the local language of the community for their work. Since many smart non-lawyers could then determine what work, if any, the lawyer had done, lawyers often lost clients, and sometimes their head.

Using Blackstone’s finds, lawyers could use Latin to hide what they did so that only other lawyers understood what was happening in any lawsuit. Blackstone was a hero to all lawyers until, of course, he was sued for copyright infringement by another lawyer. Despite his loss, Blackstone is still fondly remembered by most lawyers as the father of legal Latin. “Res ipsa loquitur” was Blackstone’s favorite saying (“my bill speaks for itself”), and it is still heard today.

Many lawyers made history during the Middle Ages. Genghis Kahn, Esq., from a family of Jewish lawyers, Hun & Kahn, pioneered the practice of merging with law offices around Asia Minor at any cost. At one time, the firm was the largest in Asia and Europe. Their success was their downfall. Originally a large personal injury firm (if you didn’t pay their bill, they personally injured you), they became conservative over time and were eventually overwhelmed by lawyers from the West. Vlad Dracul, Esq., a medical malpractice specialist, was renowned for his knowledge of anatomy, and few jurors would side against him for fear of his special bill (his bill was placed atop 20-foot wooden spears on which the non-paying client was placed).

Leonardo di ser Piero da Vinci, Esq., was multi-talented. Besides having a busy law practice, he was an artist and inventor. His most famous case was in defense of himself. M. Lisa vs. da Vinci (Italian Superior Court 1513) involved a product liability suit over a painting da Vinci delivered to the plaintiff. The court, in ruling that the painting was not defective despite the missing eyebrows, issued the famous line, “This court may not know art, but it knows what it likes, and it likes the painting.” This was not surprising since the plaintiff was known for her huge, caterpillar-like eyebrows. Da Vinci was able to convince the court that he was entitled not only to damages but to attorneys’ fees, costs, and punitive damages as well. The court, taking one last look at the plaintiff, granted the request.

A land dispute case in the late 15th century is still studied today for the clever work of Christopher Columbus, Esq. He successfully convinced an Aztec court, in Columbus vs. 1,000,000 Acres that since the Indians did not believe in possession, they could not claim the land in question. Therefore, his claim had to be given priority. Despite the fact that the entire court was sacrificed to the gods, the case held and Spain took an early legal lead in the New World.

As the New World was colonized, England eventually surpassed Spain as the leading colonizer. England began sending all of its criminals and thieves to the New World. This mass dumping of lawyers to the states would come back to haunt England. Eventually, the grandchildren of these pioneer lawyers would successfully defeat King George III in the now famous King George III v. 100 Bags of Tea. England by this time was now dreadfully short of lawyers. The new American lawyers exploited this shortfall and, after a seven-year legal war, defeated the British and created the United States, under the famous motto, “All lawyers are created equal.”

England never forgot this lesson and immediately stopped its practice of sending lawyers to the colonies. This policy left Australia woefully deficient in lawyers.

With stories of legal success common in the late 1700s, more and more people attempted to become lawyers. This process of stealing a shingle worried the more successful lawyers. To stem this tide as well as to create a new profit center, these lawyers passed laws requiring all future lawyers to be restricted from practice unless they went to an approved law school. The model school from which all legal education rules developed was Harvard Law School.

Harvard, established in 1812, set the standard for legal education when, in 1816, it created the standardized system for legal education. This system was based on the Socratic method. At most universities, the students questioned the teacher/professor to gain knowledge. These students would bill their professors, and if the bill went unfulfilled, the students usually hung up their law professor for failure of payment. At Harvard, the tables were turned, with the professors billing the students. This method enriched the professors and remains the standard in use in most law schools in America and England.

As developed by Harvard, law students took a standard set of courses as follows:

  1. Jurisprudence: The history of legal billing, from early Greek and Roman billing methods to modern collection techniques.
  2. Torts: French law term for “you get injury, we keep 40%.” Teaches students ambulance-chasing techniques.
  3. Contracts: Teaches that despite an agreement between two parties (the contract), a lawsuit can still be brought.
  4. Civil Procedure: Teaches the tricky arcane rules of court, which were modernized only 150 years ago in New York.
  5. Criminal Law: Speaks for itself.

These courses continue to be used in most law schools throughout the United States.

Despite the restrictions imposed on the practice of law (a four-year college degree, three years of graduate school, and a state-sponsored examination), the quantity of lawyers continues to increase to the point that three out of every five Americans are lawyers. (In fact, there are over 750,000 lawyers in this country.) Every facet of life today is controlled by lawyers. Even Dan Quayle (a lawyer) claims, surprise, that there are too many lawyers. Yet until limits are imposed on legal birth control, the number of lawyers will continue to increase. Is there any hope? We don’t know and frankly don’t care since the author of this book is a successful, wealthy lawyer, the publishers of this book are lawyers, the cashier at the bookstore is a law student, and your mailman is a lawyer. So instead of complaining, join us and remember, there is no such thing as a one-lawyer town.


Lawyers are members of a parasitic life-form which emerges in the cracks of human society.


How Google search creates Fake News

January 15, 2021

Fake News is created just as much by excluding selected news as by inventing stories. Cancelling news also creates fake news.

Google’s “experiment” in Australia has been exposed recently. However, this is not the first such “experiment” and it won’t be the last. Exclusion is a tool used widely by every news outlet to try and control the narrative (and it is noticeable that every outlet does try to control the narrative). There is no news outlet anymore that does not have its own agenda, and none that does not engage in excluding what is unpalatable. All social media platforms have self-serving agendas. They all indulge in “exclusion” as a tool. Sometimes it is simply to create a false (favourable) picture to increase revenues from advertising. Sometimes it is to be politically correct and avoid legal, political or social sanction. It is the same phenomenon which drives the “cancel culture”. We are all familiar with paid advertising always getting preference in Google searches. Google’s search algorithms are secret and supposedly untouched by human hand, yet they are always changing. Google knows very well that few go beyond the second page of search results. The algorithms are constantly being tweaked. And in every tweak there is some new exclusion and some new Fake News.

Perceived reality has little to do with “facts” and is entirely about the current narrative. History has become (has always been) a servant of the current narrative. Google Search is primarily a tool for the creation of advertising revenue. The bias is built into the algorithm, and the perceived objectivity of the search is secondary to the revenue objective. Fake News has become a major part of the output of the Mainstream Media, and exclusion is just another tool for the creation of a false narrative.


Mathematics started in prehistory with counting and the study of shapes

January 8, 2021
Compass box

Mathematics today is classified into more than 40 different areas by journals for publication. I probably heard about the 3 R’s (Reading, Riting and Rithmetic) first at primary school level. At high school, which was 60 years ago, mathematics consisted of arithmetic, geometry, algebra, trigonometry, statistics and logic – in that order. I remember my first class where trigonometry was introduced as a “marriage of algebra and geometry”. Calculus was touched on as advanced algebra. Some numerical methods were taught as a part of statistics. If I take my own progression, it starts with arithmetic, moves on to geometry, algebra and trigonometry and only then to calculus and statistics, symbolic logic and computer science. It was a big deal when, at 10, I got my first “compass box” for geometry, and another big deal, at 13, with my first copy of trigonometric tables. At university in the 70s, Pure Mathematics was distinguished from Applied Engineering Mathematics and from Computing. In my worldview, Mathematics and Physics Departments were where the specialist, esoteric mysteries of such things as topology, number theory, quantum algebra, non-Euclidean geometry and combinatorics could be studied.

I don’t doubt that many animals can distinguish between more and less. It is sometimes claimed that some primates have the ability to count up to about 3. Perhaps they do, but outside the studies reporting such abilities they never actually do count. No animals apply counting. They don’t exhibit any explicit understanding of geometrical shapes or structures, though birds, bees, ants and gorillas seem to apply some structural principles, intuitively, when building their nests. Humans, as a species, are unique in not only imagining but also in applying mathematics. We couldn’t count when we left the trees. We had no tools then and we built no shelters. So how did it all begin?

Sometimes Arithmetic, Geometry and Algebra are considered the three core areas of mathematics. But I would contend that it must all start with counting and with shapes – which later developed into Arithmetic and Geometry. Algebra and its abstractions came much later. Counting and the study of shapes must lie at the heart of how prehistoric humans first came to mathematics. But I would also contend that counting and observing the relationship between shapes would have started separately and independently. They both require a certain level of cognition but they differ in that the study of shapes is based on observations of physical surroundings while counting requires invention of concepts in the abstract plane. They may have been contemporaneous but they must, I think, have originated separately.

No circle of standing stones would have been possible without some arithmetic (rather than merely counting) and some geometry. No pyramid, however simple, was ever built without both. No weight was dragged or rolled up an inclined plane without some understanding of both shapes and numbers. No water channel was ever dug that did not involve some arithmetic and some geometry. Already by the time of Sumer and Babylon, and certainly by the time of the Egyptians and the Harappans, the practical application of arithmetic and geometry and even trigonometry in trade, surveying, town planning, time-keeping and building was well established. The sophisticated management of water that we can now glimpse in the ancient civilizations needed both arithmetic and geometry. There is not much recorded history available before the Greeks. Arithmetic and Geometry were well established by the time we come to the Greeks, who even conducted a vigorous discourse about the nobility (or divinity) of the one versus the other. Pythagoras was not happy with arithmetic since numbers could not give him – exactly – the hypotenuse of a right triangle with sides of equal length (√2), which he could so easily draw. Numbers could not exactly reflect all that he could achieve with a straight edge and a compass. The circle could not be squared. The ratio of circumference to diameter was irrational. The irrationality of the numbers needed to reflect geometrical shapes was, for the purists, vulgar and an abomination. But the application of geometry and arithmetic was common long, long before the Greeks. There is a great distance between counting and arithmetic, and between the study of shapes and geometry, but the roots of mathematics lie there. That takes us back to well before the Neolithic (c. 12,000 years ago).

That geometry derives from the study of shapes and the patterns and relationships between shapes, given some threshold level of cognition, seems both obvious and inevitable. Shapes are real and ubiquitous. They can be seen in all aspects of the natural world and can be mimicked and constructed. The arc of the sun curving across the sky creates a shape. Shadows create shapes. Light creates straight lines as the elevation of the sun creates angles. Shapes can be observed. And constructed. A taut string to give a straight line and the calm surface of a pond to give a level plane. A string and a weight to give the vertical. A liquid level to give the horizontal. Sticks and shadows. A human turning around to observe the surroundings created a circle. Strings and compasses. Cave paintings from c. 30,000 years ago contain regular shapes. Circles and triangles and squares. Humans started not only observing, but also applying, the relationships between shapes a very long time ago.

Numbers are more mystical. They don’t exist in the physical world. But counting the days from new moon to new moon for a lunar month, or the days in a year, was known at least 30,000 years ago. Ancient tally sticks counting to 29 testify to that. It would seem that the origins of arithmetic (and numbers) lie in our ancient prehistory, probably more than 50,000 years ago. Counting, the use of specific sounds as the representation of abstract numbers, and number systems are made possible only by first having a concept of identity which allows the definition of one. Dealing with identity and the nature of existence takes us before and beyond the realms of philosophy or even theology, into the metaphysical world. The metaphysics of existence remains mystical and mysterious and beyond human cognition, as much today as in prehistoric times. Nevertheless, it is the cognitive capability of having the concept of a unique identity which enables the concept of one. That one day is distinguishable from the next. That one person, one fruit, one animal or one thing is uniquely different to another. That unique things, similar or dissimilar, can be grouped to create a new identity. That one grouping (us) is distinguishable from another group (them). Numbers are not physically observable. They are all abstract concepts. Linguistically they are sometimes bad nouns and sometimes bad adjectives. The concept of one does not, by itself, lead automatically to a number system. That needs, in addition, a logic system and invention (a creation of something new, which presupposes a certain cognitive capacity). It is by definition, and not by logic or reason or inevitability, that two is defined as one more than the identity represented by one, and three is defined as one more than two, and so on. Note that without the concept of identity and the uniqueness of things setting a constraint, a three does not have to be separated from a two by the same separation as from two to one.

The inherent logic is not itself invented but emerges from the concept of identity and uniqueness. That 1 + 1 = 2 is a definition, not a discovery. It assumes that addition is possible. It is also significant that nothingness is a much wider (and more mysterious and mystical) concept than the number zero. Zero derives, not from nothingness, but from the assumption of subtraction, being defined as one less than one. That in turn generalises to zero being the result of subtracting any thing from itself. Negative numbers emerge by extending that definition. The properties of zero are conferred by convention and by definition. Numbers and number systems are thus a matter of “invention by definition”, constrained by the inherent logic which emerges from the concept of identity. The patterns and relationships between numbers have been the heady stuff of number theory and a matter of great wonder when they are discovered, but they are all consequent to the existence of the one, the invention of numerals, and the subsequent definition that 1 + 1 = 2. Number theory exists only because the numbers are defined as they are. Whereas the concept of identity provides the basis for one and all integers, a further cognitive step is needed to imagine that the one is not indivisible and then to consider the infinite parts of one.

Mere counting is sometimes disparaged, but it is, of course, the most rudimentary form of a rigorous arithmetic with its commutative, associative and distributive laws.
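The point can be made concrete with a minimal Python sketch. It is purely illustrative (the names `succ` and `add` are my own, not drawn from any formal system): numbers are built from an identity plus a successor step, addition is defined as repeated succession, so 1 + 1 = 2 holds by definition rather than by discovery, and the laws of arithmetic then emerge from that definition.

```python
def succ(n):
    """One more than n: the only primitive step beyond the identity (one) itself."""
    return n + 1

def add(a, b):
    """Addition defined as b repeated successor steps starting from a."""
    for _ in range(b):
        a = succ(a)
    return a

one = 1
two = add(one, one)            # 2, by definition: one more than one
assert two == 2

# The commutative law is not assumed anywhere above; it emerges:
assert add(3, 5) == add(5, 3)
# So does associativity:
assert add(add(2, 3), 4) == add(2, add(3, 4))
```

Counting, in other words, already carries the laws of arithmetic inside it; the laws only had to be noticed, not added.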

Laws of arithmetic

The cognitive step of getting to count in the first place is a huge leap compared to the almost inevitable evolution of counting into numbers and then into an arithmetic with rigorous laws. We will never know when our ancestors began to count but it seems to me – in comparison with primates of today – that it must have come after a cognitive threshold had been achieved. Quite possibly with the control of fire and after the brain size of the species had undergone a step change. That takes us back to the time of homo erectus and perhaps around a million years ago.

Nearly all animals have shape recognition to some extent. Some primates can even recognise patterns in similar shapes. It is plausible that recognition of patterns and relationships between shapes only took off when our human ancestors began construction either of tools or of rudimentary dwellings. The earliest tools (after the use of clubs) were probably cutting edges, and these are first seen around 1.8 million years ago. The simplest constructed shelters would have been lean-to structures of some kind. Construction of both tools and shelters lends itself naturally to the observation of many geometrical shapes: rectangles, polygons, cones, triangles, similar triangles and the rules of proportion between similar shapes. Arches may also have first emerged with the earliest shelters. More sophisticated tools and very simple dwellings take us back to around 400,000 years ago and certainly to a time before anatomically modern humans had appeared (c. 200,000 years ago). Both rudimentary counting and a sense of shapes would have been present by then. It would have been much later that circles and the properties of circles were observed and discovered. (Our earliest evidence of a wheel goes back some 8,000 years and is the application of a much older mathematics.) Possibly the interest in the circle came after a greater interest in time-keeping had emerged, perhaps from the first “astronomical” observations of sunrise and sunset and the motion of the moon and the seasons. Certainly our ancestors were well-versed with circles and spheres and their intersections and relationships by the time they became potters (earlier than c. 30,000 years ago).

I suspect it was the blossoming of trade – rather than the growth of astronomy – which probably helped take counting to number systems and arithmetic. The combination of counting and shapes starts, I think, with the invention of tools and the construction of dwellings. By the time we come to the Neolithic, with agriculture and fortified settlements, arithmetic and geometry and applied mathematics are established realities. Counting could have started around a million years ago. The study of shapes may have started even earlier. But if we take the origin of “mathematics” to be when counting ability was first combined with a sense of shapes, then we certainly have to step back to at least 50,000 years ago.


The accidental story of two times five and base ten

November 23, 2020

Humans have used many different bases for number systems but the use of base 10 is overwhelmingly dominant. There are instances of the use of base 5, base 6, base 20 and even base 27. In spite of many attempts to replace it by base 10, base 60 has fended off all rationalist suggestions and remnants remain entrenched in our current mapping of time and space. For time periods, base 60 is used exclusively for hours, minutes and seconds, but base 10 for subdivisions of the second. Similarly for spatial coordinates, degrees, minutes and seconds of arc are still used, but subdivisions of the second use base 10. (Some of the other bases that appear in language are listed at the end of this post.)

In terms of mathematics there is no great inherent advantage in the use of one particular number base or another. The utility of a particular choice is a trade-off between size and practicality. The size of the base determines how many unique number symbols are needed (binary needs 2, decimal needs 10 and hexadecimal needs 16). There are many proponents of the advantages of 2, 3, 8, 12 or 16 as our primary number base. Certainly base 12 is the most “fraction friendly”. But all our mathematics could, in reality, be performed in any number base.
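That the base is mere notation is easy to demonstrate. A short Python sketch (the helper `to_base` is my own, hypothetical, not a standard library function) renders one and the same quantity in several bases; the value never changes, only its written form:

```python
def to_base(n, base, digits="0123456789ABCDEF"):
    """Render a non-negative integer n in the given base (2 to 16)."""
    if n == 0:
        return digits[0]
    out = []
    while n:
        n, remainder = divmod(n, base)   # peel off the least significant digit
        out.append(digits[remainder])
    return "".join(reversed(out))

# One hundred of anything, written four ways:
print(to_base(100, 2))    # 1100100  (binary)
print(to_base(100, 9))    # 121      (nonary)
print(to_base(100, 12))   # 84       (duodecimal)
print(to_base(100, 16))   # 64       (hexadecimal)
```

The place-value machinery is identical in every case; only the symbol inventory grows or shrinks with the base.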

At first glance the reasons for the use of base 10 seem blindingly obvious and looking for origins seems trivial. Our use of base 10 comes simply – and inevitably – from two hands times five digits. In recent times other bases (binary – base 2 – and hexadecimal – base 16 – for example) are used more extensively with computers, but base 10 (with some base 60) still predominates in human-human interactions (except when Sheldon is showing off). The use of base 10 predates the use of base 60, which has itself existed for at least 5,000 years.

It is ubiquitous now, but (2 × 5) is not a consequence of design. It derives from a chain of at least three crucial evolutionary accidents which gave us:

  1. four limbs, and
  2. five digits on each limb, and finally
  3. human bipedalism which reserved two limbs for locomotion and left our hands free.

The subsequent evolutionary accidents which led to increased brain size would still have been necessary for the discovery of counting and the invention of number systems. But if, instead of two, we had evolved three limbs free from the responsibilities of locomotion, with three digits on each limb, we might well have had base 9 at the foundations of counting and a nonary number system. The benefits of a place value system and the use of nonecimals would still apply.

It is more difficult to imagine what might have happened if limbs were not symmetrical or the number of digits on each limb were different. Every human society has been predominantly (c. 85%) right-handed. But left-handedness has never been a sufficient handicap to have been eliminated by evolution. Almost certainly right-handedness comes from the asymmetrical functions established in the left and right brains. The distinction between the functions of the two sides of the brain goes back perhaps 500 million years, long before limbs and tetrapods. By the time limbs evolved, the brain functions giving our predilection for right-handedness must already have been established. So it is possible to imagine evolution having led to, say, 6 digits on right fore-limbs and 5 digits on left fore-limbs.

I wonder what a natural base of 11 or 13 would have done to the development of counting and number systems?

Why four limbs?

All land vertebrates (mammals, birds, reptiles and amphibians) derive from tetrapods which have two sets of paired limbs. Even snakes evolved from four-limbed lizards. 

Tetrapods evolved from a group of animals known as the Tetrapodomorpha which, in turn, evolved from ancient sarcopterygians around 390 million years ago in the middle Devonian period; their forms were transitional between lobe-finned fishes and the four-limbed tetrapods. The first tetrapods (from a traditional, apomorphy-based perspective) appeared by the late Devonian, 367.5 million years ago. Wikipedia

It would seem that – by trial and error – a land-based creature, fortuitously possessing two pairs of limbs, just happened to be the one which survived and became the ancestor of all tetrapods. The evolutionary advantage of having 4 limbs (two pairs) – rather than one or three or five pairs – is not at all clear. Insects have evolved three pairs while arachnids have four pairs. Myriapoda are multi-segmented creatures which have a pair of limbs per segment. They can vary from having five segments (10 legs) to about 400 segments (800 legs). The genes that determine the number of limbs determine many other features also, and why two pairs would be particularly advantageous is not understood. It could well be that the two pairs of limbs were incidental and merely followed other survival characteristics. The best bet currently is that

“You could say that the reason we have four limbs is because we have a belly,”

All of us backboned animals — at least the ones who also have jaws — have four fins or limbs, one pair in front and one pair behind. These have been modified dramatically in the course of evolution, into a marvelous variety of fins, legs, arms, flippers, and wings. But how did our earliest ancestors settle into such a consistent arrangement of two pairs of appendages? — Because we have a belly.

According to our hypothesis, the influence of the developing gut suppresses limb initiation along the midgut region and the ventral body wall owing to an “endodermal predominance.” From an evolutionary perspective, the lack of gut regionalization in agnathans reflects the ancestral absence of these conditions, and the elaboration of the gut together with the concomitant changes to the LMD in the gnathostomes could have led to the origin of paired fins.

The critical evolutionary accident then is that the intrepid sea creature which first colonised the land, some 390 million years ago, and gave rise to all tetrapods was one with a developing belly and therefore just happened to have two pairs of appendages.

The tail, however, is an asymmetrical appendage which may also once have been a pair (one on top of the other) but is now generally a solitary appendage. It is, though, controlled by a different gene-set from those which specify limbs. In mammals it has disappeared for some and performs stability functions for others. In some primates it has functions close to that of a fifth limb. But in no case has a tail ever evolved digits.

Why five digits on each limb?

When our ancestor left the oceans and became the origin of all tetrapods, four limbs had appeared but the number of digits on each limb had not then been decided. It took another 50 million years before a split distinguished amphibians from mammals, birds and reptiles. The timeline is thought to be:

  • 390 million years ago – tetrapod ancestor leaves the oceans
  • 360 million years ago – tetrapods with 6, 7 and 8 digits per limb
  • 340 million years ago – amphibians go their separate way
  • 320 million years ago – reptiles slither away on a path giving dinosaurs and birds
  • 280 million years ago – the first mammals appear

SciAm

The condition of having no more than five fingers or toes …. probably evolved before the evolutionary divergence of amphibians (frogs, toads, salamanders and caecilians) and amniotes (birds, mammals, and reptiles in the loosest sense of the term). This event dates to approximately 340 million years ago in the Lower Carboniferous Period. Prior to this split, there is evidence of tetrapods from about 360 million years ago having limbs bearing arrays of six, seven and eight digits. Reduction from these polydactylous patterns to the more familiar arrangements of five or fewer digits accompanied the evolution of sophisticated wrist and ankle joints–both in terms of the number of bones present and the complex articulations among the constituent parts.

By the time we reach the mammals, five digits per limb has become the norm, though many mammals then follow paths which reduce the number of effective digits in play. Moles and pandas evolve an extra sort-of adjunct digit from their wrists but do not (or cannot) create an additional true digit.

…….. Is there really any good evidence that five, rather than, say, four or six, digits was biomechanically preferable for the common ancestor of modern tetrapods? The answer has to be “No,” in part because a whole range of tetrapods have reduced their numbers of digits further still. In addition, we lack any six-digit examples to investigate. This leads to the second part of the answer, which is to note that although digit numbers can be reduced, they very rarely increase. In a general sense this trait reflects the developmental-evolutionary rule that it is easier to lose something than it is to regain it. Even so, given the immensity of evolutionary time and the extraordinary variety of vertebrate bodies, the striking absence of truly six-digit limbs in today’s fauna highlights some sort of constraint. Moles’ paws and pandas’ thumbs are classic instances in which strangely re-modeled wrist bones serve as sixth digits and represent rather baroque solutions to the apparently straightforward task of growing an extra finger.

Five digits is apparently the result of evolutionary trial and error, but as with all things genetic, the selection process was probably selecting for something other than the number of digits. 

Science Focus

All land vertebrates today are descended from a common ancestor that had four legs, with five toes on each foot. This arrangement is known as the pentadactyl limb. Some species have subsequently fused these fingers into hooves or lost them altogether, but every mammal, bird, reptile and amphibian traces its family tree back to a pentadactyl ancestor that lived around 340 million years ago. Before, there were animals with six, seven and even eight toes on each foot, but they all went extinct at the end of the Devonian period, 360 million years ago. These other creatures were more aquatic than the pentadactyl animals. Evidence in the fossil record suggests that their ribs weren’t strong enough to support their lungs out of water and their shoulder and hip joints didn’t allow them to walk effectively on land. 

Five digits on our limbs are an evolutionary happenstance. There is nothing special that we can identify with being five. It could just as well have been six or seven or eight. That the number of digits is the same on each limb is also an evolutionary happenstance, predating the tetrapods. It is more efficient genetically, when multiple limbs are needed, to duplicate the pattern (with some variations for mirror symmetry and for differences between paired sets). When each limb is to carry many digits it is more efficient to follow a base pattern and keep the necessary genetic variations to a minimum.

By 280 million years ago, four limbs with five digits on each limb had become the base pattern for all land-based creatures and the stage was set for base 20. And then came bipedalism.

Why bipedalism?

Bipedalism is not uncommon among land creatures, and even birds. Some dinosaurs exhibited bipedalism. Along the human ancestral line, bipedalism first shows up around 7 million years ago (Sahelanthropus). It may then have disappeared for a while before appearing again around 4 million years ago in a more resilient form (Australopithecus) which has continued through to us. What actually drove us from the trees to bipedalism is a matter of many theories and much conjecture. Whatever the reasons, the large brain evolved only in bipedal hominins who had a straightened spine, and who had kept two limbs for locomotion while freeing up the other two for many other activities. The advantages of being able to carry things, throw things and shape things are considered the drivers of this development. And these two free limbs became the counting limbs.

It seems unlikely that a large brain could have developed in a creature which did not have some limbs freed from the tasks of locomotion. Locomotion itself and the preference for symmetry would have eliminated a three-limbed creature with just one free limb.

Two limbs for counting, rather than 3 of 4 or 4 of 4, is also happenstance. But it may be less accidental than the 4 limbs to begin with and the 5 digits on each limb. An accidental four limbs reduced inevitably to two counting limbs. Together with an accidental five digits they gave us base 10.


Other bases

1. Oksapmin, base-27 body part counting

The Oksapmin people of New Guinea have a base-27 counting system. The words for numbers are the words for the 27 body parts they use for counting, starting at the thumb of one hand, going up to the nose, then down the other side of the body to the pinky of the other hand … ‘One’ is tip^na (thumb), 6 is dopa (wrist), 12 is nata (ear), 16 is tan-nata (ear on the other side), all the way to 27, or tan-h^th^ta (pinky on the other side).

2. Tzotzil, base-20 body part counting

Tzotzil, a Mayan language spoken in Mexico, has a vigesimal, or base-20, counting system. … For numbers above 20, you refer to the digits of the next full man (vinik). …

3. Yoruba, base-20 with subtraction

Yoruba, a Niger-Congo language spoken in West Africa, also has a base-20 system, but it is complicated by the fact that for each 10 numbers you advance, you add for the digits 1-4 and subtract for the digits 5-9. Fourteen (??rinlá) is 10+4 while 17 (eétàdílógún) is 20-3. So, combining base-20 and subtraction means 77 is … (20×4)-3.

4. Traditional Welsh, base-20 with a pivot at 15

Though modern Welsh uses base-10 numbers, the traditional system was base-20, with the added twist of using 15 as a reference point. Once you advance by 15 (pymtheg) you add units to that number. So 16 is un ar bymtheg (one on 15), 36 is un ar bymtheg ar hugain (one on 15 on 20), and so on.

5. Alamblak, numbers built from 1, 2, 5, and 20

In Alamblak, a language of Papua New Guinea, there are only words for 1, 2, 5, and 20, and all other numbers are built out of those. So 14 is (5×2)+2+2, or tir hosfi hosfihosf, and 59 is (20×2)+(5x(2+1))+(2+2) or yima hosfi tir hosfirpati hosfihosf.

6. Ndom, base-6

Ndom, another language of Papua New Guinea, has a base-6, or senary number system. It has basic words for 6, 18, and 36 (mer, tondor, nif) and other numbers are built with reference to those. The number 25 is tondor abo mer abo sas (18+6+1), and 90 is nif thef abo tondor ((36×2)+18).

7. Huli, base-15

The Papua New Guinea language Huli uses a base-15, or pentadecimal system. Numbers which are multiples of 15 are simple words. Where the English word for 225 is quite long, the Huli word is ngui ngui, or 15 15. However 80 in Huli is ngui dau, ngui waragane-gonaga duria ((15×5)+the 5th member of the 6th 15).

8. Bukiyip, base-3 and base-4 together

In Bukiyip, another Papua New Guinea language also known as Mountain Arapesh, there are two counting systems, and which one you use depends on what you are counting. Coconuts, days, and fish are counted in base-3. Betel nuts, bananas, and shields are counted in base-4. The word anauwip means 6 in the base-3 system and 24 in the base-4 system!

9. Supyire, numbers built from 1, 5, 10, 20, 80, and 400

Supyire, a Niger-Congo language spoken in Mali has basic number words for 1, 5, 10, 20, 80 and 400, and builds the rest of the numbers from those. The word for 600 is kàmpwòò ná ?kwuu shuuní ná bééshùùnnì, or 400+(80×2)+(20×2)

10. Danish, forms some multiples of ten with fractions

Danish counting looks pretty familiar until you get to 50, and then things get weird with fractions. The number 50 is halvtreds, a shortening of halv tred sinds tyve (“half third times 20” or 2½x20). The number 70 is 3½x20, and 90 is 4½x20.

11. French, mix of base-10 and base-20

French uses base-10 counting until 70, at which point it transitions to a mixture with base-20. The number 70 is soixante-dix (60+10), 80 is quatre-vingts (4×20), and 90 is quatre-vingt-dix ((4×20)+10).

12. Nimbia, base-12

Even though, as the dozenalists claim, 12 is the best base mathematically, there are relatively few base-12 systems found in the world’s languages. In Nimbia, a dialect of the Gwandara language of Nigeria, multiples of 12 are the basic number words around which everything else is built. The number 29 is gume bi ni biyar ((12×2)+5), and 95 is gume bo’o ni kwada ((12×7)+11).
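However exotic the number words, every system above ultimately reduces to plain arithmetic, so the worked examples can be spot-checked mechanically. A small Python sketch (the dictionary is simply my own tabulation of the decompositions quoted in the list):

```python
# Each entry pairs a system's worked example with the arithmetic it encodes.
checks = {
    "Yoruba 77":   (20 * 4) - 3,                       # subtraction from a score
    "Alamblak 59": (20 * 2) + (5 * (2 + 1)) + (2 + 2), # built from 1, 2, 5, 20
    "Ndom 90":     (36 * 2) + 18,                      # senary, via 36 and 18
    "Danish 50":   int(2.5 * 20),                      # "half third times 20"
    "French 90":   (4 * 20) + 10,                      # quatre-vingt-dix
    "Nimbia 95":   (12 * 7) + 11,                      # duodecimal multiples
}

for name, value in checks.items():
    expected = int(name.split()[-1])   # the number embedded in the label
    assert value == expected, f"{name} decomposes to {value}"

print("all decompositions check out")
```

Different bases, pivots and subtraction tricks notwithstanding, the underlying quantities are the same; only the route to naming them differs.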


