I am always rather amused when people in the now try and bask in the past glories (usually exaggerated and always presumed) of their ancestors. Especially when someone claims descent from some very famous person. As if they chose them. To be proud of a famous father or grandfather is perfectly reasonable but to claim credit in the now for their deeds in the past is illogical. To claim credit for ancestors even further back in time verges on the ridiculous.
I find it especially silly when someone proudly declaims an ancestor’s presumed qualities or famous deeds and misses that they themselves suffer by comparison. I am equally unimpressed when someone proudly claims a long line of descent. Every single one of the 7.3 billion alive today (poor-man, rich-man, beggar-man, thief) has exactly the same number of ancestors as everyone else. One can now bask not only in famous ancestors but even in their past shame or misery. Nowadays it has become fashionable to try and gain “victimhood credits” for the sufferings and failings of long-gone ancestors. Entitlement culture has now given us “victimhood privilege” as a new phenomenon of the 21st century.
Nationalist groups in many countries who are insecure about their own identities often bask in the presumed past glories of ancient civilizations. The one common feature of all these “great civilizations” is, of course, that they all failed. It applies to all the classical “great civilizations” in Egypt, China, India, Greece and Rome. Some lasted much longer than others but they all eventually collapsed. Civilizations and societies which succumbed to others give rise to claims of current victimhood credits for the sufferings and the failings of their ancestors. To be descended from the Phoenicians or the Mayans or the Aztecs is now considered creditable. To be descended from slaves of 200 years ago or from the colonised of 500 years ago allows victimhood credits to be claimed in the present. Nowadays, in India, the Hindu right tries to take credit for the exaggerated, and often quite dubious, wonders of past “golden ages”, some two or five (or even ten!) thousand years ago. Never mind that the “golden ages” collapsed due to their own stresses, faults and imperfections. Never mind that the “golden ages” were always followed by millennia of “dark and dismal ages”. Never mind that glorious ages were followed by inglorious times because the glorious ages all led to decadence and depravity. Never mind that the “dark ages” and their misery were a direct consequence of the preceding “golden ages”.
Every person alive today had some ancestor who was a thief, a murderer, a cheat, a ruler or a slave. That includes every claimed descendant of Genghis Khan (40 generations) or Confucius (80 generations), and every current member of any “aristocracy” or “royal lineage” (the Norwegian House of Schleswig-Holstein-Sonderburg-Glücksburg from 1106 CE is probably the oldest recorded). There is nobody alive today who can even presume to trace a direct line of descent for more than about 40 generations. Even the most detailed line of descent leaves out more ancestors than are included. In practice nobody has a record of all their ancestors for more than about 10 generations and very few for more than 5. And if we want to go back to the heyday of ancient Greece (500 BCE) we would need 125 generations. And to reach back to the first cities ever we would need 500 – 600 generations. Modern humans started around 10,000 generations ago.
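The arithmetic behind this is simple enough to sketch. A naive family tree doubles at every generation, so the number of ancestor "slots" quickly exceeds any plausible human population, which is why every real pedigree must collapse (the same ancestor filling many slots) and why complete records are impossible. The 20 years per generation used below is my assumption, chosen because it matches the 125 generations given above for 500 BCE:

```python
# Naive ancestor count doubles every generation: 2**n slots at generation n.
# Real pedigrees collapse because the same individual fills many slots.

GENERATION_YEARS = 20  # assumed; consistent with 125 generations back to 500 BCE

def naive_ancestors(generations: int) -> int:
    """Ancestor slots at a given generation back, ignoring overlap."""
    return 2 ** generations

def generations_back(years: int) -> int:
    """Rough number of generations spanning a period of years."""
    return years // GENERATION_YEARS

print(naive_ancestors(40))    # over a trillion slots at 40 generations (Genghis Khan's era)
print(generations_back(2500)) # 125 generations back to ancient Greece, c. 500 BCE
```

At 40 generations the naive count is already about a thousand times the number of humans who have ever lived, so "descent from Genghis Khan" says nothing distinctive about anyone.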
Every person alive today has more ancestors who were quite ordinary and forgettable than famous ones. There are more villains in each person’s ancestry than there are “good guys”. Basking in the fame or the shame of ancestors is about as silly as the human mind allows. There is no person alive today who does not have an ancestor who was an illiterate, speechless, murderous, selfish, tree-swinging ape.
Humans have used many different bases for number systems but the use of base 10 is overwhelmingly dominant. There are instances of the use of base 5, base 6, base 20 and even base 27. In spite of many attempts to replace it by base 10, base 60 has fended off all rationalist suggestions and remnants remain entrenched for our current mapping of time and space. For time periods, base 60 is used exclusively for hours, minutes and seconds but base 10 for subdivisions of the second. Similarly for spatial coordinates, degrees, minutes and seconds of arc are still used but subdivisions of the second use base 10. (Some of the other bases that appear in language are listed at the end of this post).
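The mixed convention described above — base 60 for hours, minutes and seconds, base 10 for fractions of a second — can be sketched in a few lines (a minimal illustration, not from the text):

```python
# Splitting a duration into the mixed base-60 / base-10 convention:
# hours, minutes, seconds in base 60; fractions of a second in base 10.

def to_hms(total_seconds: float):
    """Split a duration in seconds into (hours, minutes, seconds)."""
    whole, frac = divmod(total_seconds, 1)   # separate the decimal fraction
    minutes, seconds = divmod(int(whole), 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, seconds + frac

print(to_hms(3725.25))   # (1, 2, 5.25): 1 h, 2 min, 5.25 s
```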
In terms of mathematics there is no great inherent advantage in the use of one particular number base or another. The utility of a particular choice is a trade-off between size and practicality. The size of the base determines how many unique number symbols are needed (binary needs 2, decimal needs 10 and hexadecimal needs 16). There are many proponents of the advantages of 2, 3, 8, 12 or 16 being used as our primary number base. Certainly base 12 is the most “fraction friendly”. But all our mathematics could, in reality, be performed in any number base.
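The point that any base would serve can be made concrete: the same integer rendered in several bases is still the same number, only the symbols change. A short sketch (base 9 is included for the nonary thought-experiment discussed below):

```python
# The same integer in several bases; only the notation differs.

DIGITS = "0123456789ABCDEF"

def to_base(n: int, base: int) -> str:
    """Represent a non-negative integer n in the given base (2 to 16)."""
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, base)
        out.append(DIGITS[r])
    return "".join(reversed(out))

for base in (2, 9, 10, 12, 16):
    print(base, to_base(100, base))
# 100 is 1100100 in binary, 121 in base 9, 84 in base 12, 64 in hexadecimal
```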
At first glance the reasons for the use of base 10 seem blindingly obvious and looking for origins seems trivial. Our use of base 10 comes simply – and inevitably – from two hands times five digits. In recent times other bases (binary – base 2 – and hexadecimal – base 16 – for example) are used more extensively with computers, but base 10 (with some base 60) still predominates in human-human interactions (except when Sheldon is showing off). The use of base 10 predates the use of base 60, which has itself existed for at least 5,000 years.
It is ubiquitous now but (2 x 5) is not a consequence of design. It derives from a chain of at least three crucial, evolutionary accidents which gave us
four limbs, and
five digits on each limb, and finally
human bipedalism which reserved two limbs for locomotion and left our hands free.
The subsequent evolutionary accidents which led to increased brain size would still have been necessary for the discovery of counting and the invention of number systems. But if, instead of two, we had evolved three limbs free from the responsibilities of locomotion, with three digits on each limb, we might well have had base 9 at the foundations of counting and a nonary number system. The benefits of a place value system and the use of nonecimals would still apply.
It is more difficult to imagine what might have happened if limbs were not symmetrical or if the number of digits on each limb were different. Every known human society has been predominantly (c. 85%) right-handed. But left-handedness has never been a sufficient handicap to have been eliminated by evolution. Almost certainly right-handedness comes from the asymmetrical functions established in the left and right brains. The distinction between the functions of the two sides of the brain goes back perhaps 500 million years, long before limbs and tetrapods. By the time limbs evolved, the brain functions giving our predilection for right-handedness must already have been established. So, it is possible to imagine evolution having led to, say, 6 digits on right fore-limbs and 5 digits on left fore-limbs.
I wonder what a natural base of 11 or 13 would have done to the development of counting and number systems?
Why four limbs?
All land vertebrates (mammals, birds, reptiles and amphibians) derive from tetrapods which have two sets of paired limbs. Even snakes evolved from four-limbed lizards.
Tetrapods evolved from a group of animals known as the Tetrapodomorpha which, in turn, evolved from ancient sarcopterygians around 390 million years ago in the middle Devonian period; their forms were transitional between lobe-finned fishes and the four-limbed tetrapods. The first tetrapods (from a traditional, apomorphy-based perspective) appeared by the late Devonian, 367.5 million years ago. – Wikipedia
It would seem that – by trial and error – a land-based creature, fortuitously possessing two pairs of limbs, just happened to be the one which survived and became the ancestor of all tetrapods. The evolutionary advantage of having 4 limbs (two pairs) – rather than one or three or five pairs – is not at all clear. Insects have evolved three pairs while arachnids have four pairs. Myriapoda are multi-segmented creatures which have a pair of limbs per segment. They can vary from having five segments (10 legs) to about 400 segments (800 legs). The genes that determine the number of limbs determine many other features also, and why two pairs would be particularly advantageous is not understood. It could well be that the two pairs of limbs were incidental and merely followed other survival characteristics. The best bet currently is that
All of us backboned animals — at least the ones who also have jaws — have four fins or limbs, one pair in front and one pair behind. These have been modified dramatically in the course of evolution, into a marvelous variety of fins, legs, arms, flippers, and wings. But how did our earliest ancestors settle into such a consistent arrangement of two pairs of appendages? — Because we have a belly.
According to our hypothesis, the influence of the developing gut suppresses limb initiation along the midgut region and the ventral body wall owing to an “endodermal predominance.” From an evolutionary perspective, the lack of gut regionalization in agnathans reflects the ancestral absence of these conditions, and the elaboration of the gut together with the concomitant changes to the LMD in the gnathostomes could have led to the origin of paired fins.
The critical evolutionary accident then is that the intrepid sea creature which first colonised the land, some 390 million years ago, and gave rise to all tetrapods was one with a developing belly and therefore just happened to have two pairs of appendages.
The tail, however, is an asymmetrical appendage which may also once have been a pair (one on top of the other) but is now generally a solitary appendage. But it is controlled by a different gene-set from those which specify limbs. In mammals it has disappeared in some and performs stability functions in others. In some primates it has functions close to that of a fifth limb. But in no case has a tail ever evolved digits.
Why five digits on each limb?
When our ancestor left the oceans and became the origin of all tetrapods, four limbs had appeared but the number of digits on each limb had not then been decided. It took another 50 million years before a split distinguished amphibians from mammals, birds and reptiles. The timeline is thought to be:
390 million years ago – tetrapod ancestor leaves the oceans
360 million years ago – tetrapods with 6,7 and 8 digits per limb
340 million years ago – amphibians go their separate way
320 million years ago – reptiles slither away on a path giving dinosaurs and birds
The condition of having no more than five fingers or toes …. probably evolved before the evolutionary divergence of amphibians (frogs, toads, salamanders and caecilians) and amniotes (birds, mammals, and reptiles in the loosest sense of the term). This event dates to approximately 340 million years ago in the Lower Carboniferous Period. Prior to this split, there is evidence of tetrapods from about 360 million years ago having limbs bearing arrays of six, seven and eight digits. Reduction from these polydactylous patterns to the more familiar arrangements of five or fewer digits accompanied the evolution of sophisticated wrist and ankle joints–both in terms of the number of bones present and the complex articulations among the constituent parts.
By the time we reach the mammals, five digits per limb has become the norm though many mammals then follow paths for the reduction of the number of effective digits in play. Moles and pandas evolve an extra sort-of adjunct digit from their wrists but do not (or cannot) create an additional digit.
…….. Is there really any good evidence that five, rather than, say, four or six, digits was biomechanically preferable for the common ancestor of modern tetrapods? The answer has to be “No,” in part because a whole range of tetrapods have reduced their numbers of digits further still. In addition, we lack any six-digit examples to investigate. This leads to the second part of the answer, which is to note that although digit numbers can be reduced, they very rarely increase. In a general sense this trait reflects the developmental-evolutionary rule that it is easier to lose something than it is to regain it. Even so, given the immensity of evolutionary time and the extraordinary variety of vertebrate bodies, the striking absence of truly six-digit limbs in today’s fauna highlights some sort of constraint. Moles’ paws and pandas’ thumbs are classic instances in which strangely re-modeled wrist bones serve as sixth digits and represent rather baroque solutions to the apparently straightforward task of growing an extra finger.
Five digits is apparently the result of evolutionary trial and error, but as with all things genetic, the selection process was probably selecting for something other than the number of digits.
All land vertebrates today are descended from a common ancestor that had four legs, with five toes on each foot. This arrangement is known as the pentadactyl limb. Some species have subsequently fused these fingers into hooves or lost them altogether, but every mammal, bird, reptile and amphibian traces its family tree back to a pentadactyl ancestor that lived around 340 million years ago. Before, there were animals with six, seven and even eight toes on each foot, but they all went extinct at the end of the Devonian period, 360 million years ago. These other creatures were more aquatic than the pentadactyl animals. Evidence in the fossil record suggests that their ribs weren’t strong enough to support their lungs out of water and their shoulder and hip joints didn’t allow them to walk effectively on land.
Five digits on our limbs are an evolutionary happenstance. There is nothing special that we can identify with the number being five. It could just as well have been six or seven or eight. That the number of digits is the same on each limb is also an evolutionary happenstance predating the tetrapods. It is more efficient genetically, when multiple limbs are needed, to duplicate the pattern (with some variations for mirror symmetry and for differences between paired sets). When each limb is to carry many digits it is more efficient to follow a base pattern and keep the necessary genetic variations to a minimum.
By 280 million years ago, four limbs with five digits on each limb had become the base pattern for all land-based creatures and the stage was set for base 20. And then came bipedalism.
Why bipedalism?
Bipedalism is not uncommon among land creatures and even birds. Some dinosaurs exhibited bipedalism. Along the human ancestral line, bipedalism first shows up around 7 million years ago (Sahelanthropus). It may then have disappeared for a while and then appeared again around 4 million years ago in a more resilient form (Australopithecus) which has continued through to us. What actually drove us from the trees to bipedalism is a matter of many theories and much conjecture. Whatever the reasons, the large brain evolved only in bipedal hominins who had a straightened spine, and who had maintained two limbs for locomotion while freeing up the other two for many other activities. The advantages of being able to carry things and throw things and shape things are considered the drivers for this development. And these two free limbs became the counting limbs.
It seems unlikely that a large brain could have developed in a creature which did not have some limbs freed from the tasks of locomotion. Locomotion itself and the preference for symmetry would have eliminated a three-limbed creature with just one free limb.
Two limbs for counting, rather than 3 of 4 or 4 of 4, is also happenstance. But it may be less accidental than the 4 limbs to begin with and the 5 digits on each limb. An accidental four limbs reduced inevitably to two counting limbs. Together with an accidental five digits they gave us base 10.
1. Oksapmin, base-27 body part counting
The Oksapmin people of New Guinea have a base-27 counting system. The words for numbers are the words for the 27 body parts they use for counting, starting at the thumb of one hand, going up to the nose, then down the other side of the body to the pinky of the other hand …… . ‘One’ is tip^na (thumb), 6 is dopa (wrist), 12 is nata (ear), 16 is tan-nata (ear on the other side), all the way to 27, or tan-h^th^ta (pinky on the other side).
2. Tzotzil, base-20 body part counting
Tzotzil, a Mayan language spoken in Mexico, has a vigesimal, or base-20, counting system. ….. For numbers above 20, you refer to the digits of the next full man (vinik). ..
3. Yoruba, base-20 with subtraction
Yoruba, a Niger-Congo language spoken in West Africa, also has a base-20 system, but it is complicated by the fact that for each 10 numbers you advance, you add for the digits 1-4 and subtract for the digits 5-9. Fourteen (??rinlá) is 10+4 while 17 (eétàdílógún) is 20-3. So, combining base-20 and subtraction means 77 is …. (20×4)-3.
4. Traditional Welsh, base-20 with a pivot at 15
Though modern Welsh uses base-10 numbers, the traditional system was base-20, with the added twist of using 15 as a reference point. Once you advance by 15 (pymtheg) you add units to that number. So 16 is un ar bymtheg (one on 15), 36 is un ar bymtheg ar hugain (one on 15 on 20), and so on.
5. Alamblak, numbers built from 1, 2, 5, and 20
In Alamblak, a language of Papua New Guinea, there are only words for 1, 2, 5, and 20, and all other numbers are built out of those. So 14 is (5×2)+2+2, or tir hosfi hosfihosf, and 59 is (20×2)+(5x(2+1))+(2+2) or yima hosfi tir hosfirpati hosfihosf.
6. Ndom, base-6
Ndom, another language of Papua New Guinea, has a base-6, or senary number system. It has basic words for 6, 18, and 36 (mer, tondor, nif) and other numbers are built with reference to those. The number 25 is tondor abo mer abo sas (18+6+1), and 90 is nif thef abo tondor ((36×2)+18).
7. Huli, base-15
The Papua New Guinea language Huli uses a base-15, or pentadecimal system. Numbers which are multiples of 15 are simple words. Where the English word for 225 is quite long, the Huli word is ngui ngui, or 15 15. However 80 in Huli is ngui dau, ngui waragane-gonaga duria ((15×5)+the 5th member of the 6th 15).
8. Bukiyip, base-3 and base-4 together
In Bukiyip, another Papua New Guinea language also known as Mountain Arapesh, there are two counting systems, and which one you use depends on what you are counting. Coconuts, days, and fish are counted in base-3. Betel nuts, bananas, and shields are counted in base-4. The word anauwip means 6 in the base-3 system and 24 in the base-4 system!
9. Supyire, numbers built from 1, 5, 10, 20, 80, and 400
Supyire, a Niger-Congo language spoken in Mali has basic number words for 1, 5, 10, 20, 80 and 400, and builds the rest of the numbers from those. The word for 600 is kàmpwòò ná ŋkwuu shuuní ná bééshùùnnì, or 400+(80×2)+(20×2)
10. Danish, forms some multiples of ten with fractions
Danish counting looks pretty familiar until you get to 50, and then things get weird with fractions. The number 50 is halvtreds, a shortening of halv tred sinds tyve (“half third times 20” or 2½x20). The number 70 is 3½x20, and 90 is 4½x20.
11. French, mix of base-10 and base-20
French uses base-10 counting until 70, at which point it transitions to a mixture with base-20. The number 70 is soixante-dix (60+10), 80 is quatre-vingts (4×20), and 90 is quatre-vingts-dix ((4×20)+10).
12. Nimbia, base-12
Even though, as the dozenalists claim, 12 is the best base mathematically, there are relatively few base-12 systems found in the world’s languages. In Nimbia, a dialect of the Gwandara language of Nigeria, multiples of 12 are the basic number words around which everything else is built. The number 29 is gume bi ni biyar ((12×2)+5), and 95 is gume bo’o ni kwada ((12×7)+11).
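None of the number words above are mine, but the arithmetic glosses quoted for them can be checked directly. A quick sketch re-computing each claimed construction:

```python
# Re-checking the arithmetic glosses quoted in the list above.
# Each tuple pairs the claimed value with the expression given for it.

claims = [
    ("Yoruba 17 = 20-3",                  17, 20 - 3),
    ("Yoruba 77 = (20x4)-3",              77, (20 * 4) - 3),
    ("Welsh 36 = 1 on 15 on 20",          36, 1 + 15 + 20),
    ("Alamblak 59 = (20x2)+(5x(2+1))+(2+2)", 59, (20 * 2) + (5 * (2 + 1)) + (2 + 2)),
    ("Ndom 90 = (36x2)+18",               90, (36 * 2) + 18),
    ("Danish 50 = 2.5x20",                50, int(2.5 * 20)),
    ("French 90 = (4x20)+10",             90, (4 * 20) + 10),
    ("Nimbia 95 = (12x7)+11",             95, (12 * 7) + 11),
]

for name, value, expr in claims:
    assert value == expr, name
print("all glosses check out")
```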
Once upon a time, a designated chef at an ancient hominin hearth demanded compensation for his culinary art and started the oldest profession. Cooking predates the oldest cave paintings and may well be the oldest human art form.
Preserving is unambiguous but salting is a word that is rarely used anymore. The distinction in language between seasoning and flavouring is not so much ambiguous as wishful thinking. Theoretically, seasoning is considered the use of additives which allegedly enhance existing flavours, whereas flavouring adds different flavours. In practice this is a nonsense distinction. We have our five or possibly seven basic taste receptors (sweet, sour, bitter, salty, umami and maybe pungency and a fatty richness) and our olfactory receptors which can distinguish a myriad smells.
Five basic tastes – sweet, sour, bitter, salty and umami (savory) are universally recognized, although some cultures also include pungency and oleogustus (“fattiness”). The number of food smells is unbounded; a food’s flavor, therefore, can be easily altered by changing its smell while keeping its taste similar.
Any particular flavour we perceive in our brains is then due to a particular combination of activated taste and smell receptors together. With a change in sufficient activated taste or smell receptors our brains recognize a change in flavour. Generally seasoning involves salt (always) and sometimes some pepper and acidic matter (lime, vinegar, ….). Flavouring is considered predominantly to be through the use of herbs and spices. However, the difference between seasoned and unseasoned is a difference of perceived flavour in our brains. No self-respecting chef will ever admit that seasoning is merely a sub-set of flavouring, but even chefs must be allowed their self-aggrandizement. It is entirely false that proper seasoning cannot be tasted. A lack of salt is perceived when there is a lack of an expected activation of salt receptors. Adding salt always changes the combination of activated receptors and is always a change of flavour. Cookbooks generally perpetuate the misconceptions.
Many ingredients are used to enhance the taste of foods. These ingredients can be used to provide both seasoning and flavouring.
Seasoning means to bring out or intensify the natural flavour of the food without changing it. Seasonings are usually added near the end of the cooking period. The most common seasonings are salt, pepper, and acids (such as lemon juice). When seasonings are used properly, they cannot be tasted; their job is to heighten the flavours of the original ingredients.
Flavouring refers to something that changes or modifies the original flavour of the food. Flavouring can be used to contrast a taste such as adding liqueur to a dessert where both the added flavour and the original flavour are perceptible. Or flavourings can be used to create a unique flavour in which it is difficult to discern what the separate flavourings are.
Seasoning is always about changing perceived flavour and is a particular sub-set of flavouring. The story that seasoning originates with food preservation through the use of salt, whereas the use of herbs and spices for flavouring derives from when hunter-gatherers wrapped food in aromatic leaves for transport is plausible but little more than speculation. Salt is inorganic and is not considered a spice but is the major ingredient for seasoning as opposed to flavouring. Herbs and spices are always organic and plant-based. (The proposed use of crushed insects as flavouring can safely be ignored. The use of cochineal insects – E120 – to give a carmine food colouring is not relevant.) Yet the manner we use small quantities of salt with foods is much too similar to the manner we use small quantities of herbs and spices not to have been the role-model and the precursor for the culinary use of herbs and spices.
Though this history is as presented by a purveyor of spices, it is both informative and credible.
Abundant anecdotal information documents the historical use of herbs and spices for their health benefits. Early documentation suggests that hunters and gatherers wrapped meat in the leaves of bushes, accidentally discovering that this process enhanced the taste of the meat, as did certain nuts, seeds, berries, and bark. Over the years, spices and herbs were used for medicinal purposes. Spices and herbs were also used as a way to mask unpleasant tastes and odors of food, and later, to keep food fresh. Ancient civilizations did not distinguish between those spices and herbs used for flavoring from those used for medicinal purposes. When leaves, seeds, roots, or gums had a pleasant taste or agreeable odor, it became in demand and gradually became a norm for that culture as a condiment.
Our taste receptors did not evolve for the purposes of culinary pleasure. Bitterness detection is clearly a defense mechanism. Most animals reject bitter foods as a defense against toxins and poisons. All animals need salt. Mammal brains are designed to prevent a debilitating lack of sodium and have evolved the detection of saltiness as a tool. A craving for salty food has been shown to emerge spontaneously (and not as learned behaviour) with sodium deficiency. This has been shown to apply to many animals including sheep, elephants, moose, and primates who seek out salty food when suffering sodium deficiency. It is very likely that the capability to detect sweetness has also evolved as a way of urgently seeking energy rich foods. Exactly how or why it became important to detect sourness or umaminess is in the realm of speculation. Vegetarian food contains less salt than meat or fish. Our primate ancestors were mainly vegetarian and, like primates today, would have resorted to eating pith and rotting wood to counter sodium deficiencies.
When multicellular organisms evolved and crawled up the beaches to dry land, they had to take the seawater with them in the blood and other body fluids. The mineral content of human blood plasma today is still much like that of the seas of the Precambrian era in which life arose. ….. And the ancestors of man for at least 25 million of the last 30 million years were almost certainly vegetarians, and therefore got little salt in their diets because most plants store little salt. To compensate for the scarcity of a substance vital to life, the brains of our ancestors and those of other mammals developed powerful strategies for getting and keeping salt. Inborn, Not Learned.
….. sudden improvement after one copious salt meal may also help explain the ritual acts of cannibalism once practiced by tribes in the Amazon jungles, the highland regions of New Guinea and elsewhere. Sometimes the body of a fallen foe was eaten in a final act of triumph and to absorb magically the strength of the defeated enemy. In other cultures, bones or other parts of a departed relative were eaten as a final act of devotion and also to gain back the strength of the dead person.
There are those who suggest that human use of salt as seasoning (as opposed to for preservation) only took off in the Neolithic after the advent of agriculture and our diet became more vegetarian. I don’t find this theory entirely plausible. Before hominins and bipedalism (c. 4 million years ago) our ancestors were primarily vegetarian. Meat eating became more prevalent once bipedalism led to a more actively predatory life-style as hunter gatherers. With more meat, diet now included larger amounts of salt and detection of saltiness was needed less for survival and could be diverted to culinary aesthetics. The control of fire appears around 2 million years ago and coincides roughly with a shift to eating cooked meats and the rapid (in evolutionary terms) increase of hominin brain size. I can well imagine a hominin individual – perhaps even a Neanderthal – designated as the chef for the day and being berated for lack of seasoning with the grilled mammoth steak.
In my story, the use of salt with cooked food as seasoning and to enhance flavour must go back – perhaps a million years – to our hunter-gatherer forbears who had shifted to a meat-rich diet. It is thus my contention that it is this shift to cooked meat which released our flavour receptors from survival tasks and enabled them to be diverted to culinary aesthetics. Even the use of herbs and spices comes well before the Neolithic agricultural revolution (around 12,000 years ago). Herbs and spices being organic do not survive long and are very rare in the archaeological record. However, pots from about 25,000 years ago containing residues of cumin and coriander have been found. The theory that hunter-gatherers packaged meats for travel in large leaves and added – by trial and error – other plant-based preservatives or flavourings, is not implausible. The medicinal use of herbs and spices must also have been discovered around this time. In any event, even the first use of herbs and spices purely for flavouring must go back at least 50,000 years. Though diet must have included more vegetarian food after the advent of agriculture, the culinary arts of seasoning and flavouring had already been well established before the Neolithic. By the time we come to the ancient civilizations of 7 – 8,000 years ago, more than 100 herbs and spices were known and regularly used.
Whether first for food-wrapping or for medicinal use or for use as preservatives, the use of salt and herbs and spices entirely and specifically to make food taste better marks the beginning of the culinary art. No doubt there were many cases of trial and accident and failures and error. The failed attempts did not make it to the stories of spices though some are now probably included in the history of poisons. There is a case to be made for the culinary profession to be considered the oldest in the world.
There is much rhetoric about walls these days. Usually about walls at the boundaries of nations.
But the concept of walls (along with fire and the wheel and all that they enabled) was one of the critical developments which enabled humans to differentiate themselves from all other species and enabled human civilisation to develop. It would not be an exaggeration to claim that without walls – first around shelters and then around dwellings, settlements, places of work, and eventually around whole cities and at nation boundaries – civilisation itself would not have been possible. Human civilisation would have been still-born without the ability to create safe, protected enclosures within which to live (and work) defying the elements and any external threats. In fact, walls are integral and necessary to our lives today. We could not live without walls.
Humans control the environment they live and work in. This ability is what allows us to live anywhere in the world irrespective of the prevailing environment. From blistering deserts to the frigid reaches of Antarctica, it is walls which enable roofs and which together allow us to create volumes of controlled environments for ourselves. We not only live within walls, we travel in walled containers which provide enclosed volumes of controlled environments. Our carts, our cars, our trains and boats and planes all rely on walls to create our enclosures. The walls in my home are what give me my controlled environment and my security and my sense of security.
It was always thought that cave dwelling probably preceded the building of huts and dwellings. But modern humans appeared first in areas where caves were not so numerous and primitive walls probably appeared to protect small groups spending the night on open ground. There is some suggestion that some kind of walled shelters were used by Homo erectus – perhaps 500,000 years ago. (Homo erectus had the controlled use of fire as early as 1.5 million years ago). It is not implausible that the earliest walls were fences built to protect an area around a camp-fire.
BBC: Japanese archaeologists have uncovered the remains of what is believed to be the world’s oldest artificial structure, on a hillside at Chichibu, north of Tokyo. The shelter would have been built by an ancient ancestor of humans, Homo erectus, who is known to have used stone tools. The site has been dated to half a million years ago, according to a report in New Scientist. It consists of what appear to be 10 post holes, forming two irregular pentagons which may be the remains of two huts. Thirty stone tools were also found scattered around the site. …… Before the discovery, the oldest remains of a structure were those at Terra Amata in France, from around 200,000 to 400,000 years ago. …….
John Rick, an anthropologist at Stanford University, says that if the find is confirmed it will be interesting because it shows that hominids could conceive of using technology to organise things. “They had the idea of actually making a structure, a place where you might sleep. It represents a conceptual division between inside and outside.”
There is little doubt that while city walls are at most 15-20,000 years old, even hunter-gatherers from 100,000 years ago were no strangers to walls. Even those who used caves in temperate zones probably only used caves as winter quarters. In summers they would have used lightweight, temporary walls.
Aerial view of part of the Great Wall
The idea of a safe, protected enclosure lies deep in the human psyche. Walls are existential. What would we be without our houses, buildings, dams, sea-walls, siege walls, curtain walls, walls around fields, walled enclosures, prisons or walls at nation boundaries? Walls between nations are at least 5,000 years old and probably predate even the definition of nation states.
European Union states have built over 1,000km of border walls since the fall of the Berlin Wall in 1989, a new study into Fortress Europe has found. …… the EU has gone from just two walls in the 1990s to 15 by 2017. ……. Despite celebrations this year that the Berlin Wall had now been down for longer than it was ever up, Europe has now completed the equivalent length of six Berlin walls during the same period.
The tension between the one and the many, the individual and the collective, is the classic dilemma of our age, and it shows up everywhere.
I nearly always tend towards prioritising the one primarily because without the one there cannot be the many, without the local you cannot get to the global, without the national there is no international and without the excellence of the few you cannot get the good of the herd. I prefer the highest multiple possible to the lowest common factor. It colours my politics. I prefer the search for excellence rather than the common mediocrity of socialism. I prefer the internationalism which comes as a consequence of strong nationalism to the bullying of the UN or the EU.
I side with “to each as he deserves” rather than “to each as he desires”. Freedoms flow bottom up from the individual to the group rather than privileges and sanctions flowing from the group down to the oppressed individual.
A footprint found in Crete is probably that of an early hominin and was made 5.7 million years ago.
The single Out-of-Africa theory, sometime around 70,000 years ago, is already obsolete. There were more likely multiple hominin crossings out of Africa into both Europe and Asia, and the expansion of Homo sapiens sapiens was probably in at least two waves out of Africarabia. Homo erectus appeared around 2 million years ago, while Homo sapiens appeared about 1 million years ago. Homo sapiens sapiens then split from Homo sapiens neanderthalensis around 600,000 years ago. So the footprint found is that of a human ancestor after the split with chimpanzees (c. 8-9 million years ago), after the beginning of bipedalism but well before Homo erectus appeared.
….. Ever since the discovery of fossils of Australopithecus in South and East Africa during the middle years of the 20th century, the origin of the human lineage has been thought to lie in Africa. More recent fossil discoveries in the same region, including the iconic 3.7 million year old Laetoli footprints from Tanzania which show human-like feet and upright locomotion, have cemented the idea that hominins (early members of the human lineage) not only originated in Africa but remained isolated there for several million years before dispersing to Europe and Asia. The discovery of approximately 5.7 million year old human-like footprints from Crete, published online this week by an international team of researchers, overthrows this simple picture and suggests a more complex reality. …….
The new footprints, from Trachilos in western Crete, have an unmistakably human-like form. This is especially true of the toes. The big toe is similar to our own in shape, size and position; it is also associated with a distinct ‘ball’ on the sole, which is never present in apes. The sole of the foot is proportionately shorter than in the Laetoli prints, but it has the same general form. In short, the shape of the Trachilos prints indicates unambiguously that they belong to an early hominin, somewhat more primitive than the Laetoli trackmaker. They were made on a sandy seashore, possibly a small river delta, whereas the Laetoli tracks were made in volcanic ash. …….
…… During the time when the Trachilos footprints were made, a period known as the late Miocene, the Sahara Desert did not exist; savannah-like environments extended from North Africa up around the eastern Mediterranean. Furthermore, Crete had not yet detached from the Greek mainland. It is thus not difficult to see how early hominins could have ranged across south-east Europe as well as Africa, and left their footprints on a Mediterranean shore that would one day form part of the island of Crete.
‘This discovery challenges the established narrative of early human evolution head-on and is likely to generate a lot of debate. Whether the human origins research community will accept fossil footprints as conclusive evidence of the presence of hominins in the Miocene of Crete remains to be seen,’ says Per Ahlberg.
Abstract
We describe late Miocene tetrapod footprints (tracks) from the Trachilos locality in western Crete (Greece), which show hominin-like characteristics. They occur in an emergent horizon within an otherwise marginal marine succession of Messinian age (latest Miocene), dated to approximately 5.7 Ma (million years), just prior to the Messinian Salinity Crisis. The tracks indicate that the trackmaker lacked claws, and was bipedal, plantigrade, pentadactyl and strongly entaxonic. The impression of the large and non-divergent first digit (hallux) has a narrow neck and bulbous asymmetrical distal pad. The lateral digit impressions become progressively smaller so that the digital region as a whole is strongly asymmetrical. A large, rounded ball impression is associated with the hallux. Morphometric analysis shows the footprints to have outlines that are distinct from modern non-hominin primates and resemble those of hominins. The interpretation of these footprints is potentially controversial. The print morphology suggests that the trackmaker was a basal member of the clade Hominini, but as Crete is some distance outside the known geographical range of pre-Pleistocene hominins we must also entertain the possibility that they represent a hitherto unknown late Miocene primate that convergently evolved human-like foot anatomy.
Nothing wrong with fantasy of course. It just makes for bad science. The real problem here is that it is very bad science being encouraged by the journal Nature. The whole paper is based on analysing some crushed mastodon bones which were found 25 years ago, a doubtful application of a dating technique, and then the assertion that it was impossible for the bones to have been crushed by anything other than human activity. They performed some bone-crushing experiments and then leap to their fantastic conclusion that the crushing was (was, and not might have been) done by stone tools (of which there are no traces) made by unknown humans (who have also left no other trace).
This is not just fantasy. It is borderline rubbish.
Prof Michael R Waters, from Texas A&M University in College Station, described the new paper as “provocative”. He told BBC News the study “purports to provide evidence of human occupation of the Americas some 115,000 years before the earliest well established evidence”.
Prof Waters explained: “I have no issues with the geological information – although I would like to know more about the broader geological context – and the likely age of the locality. However, I am sceptical of the evidence presented that humans interacted with the mastodon at the Cerutti Mastodon site. …… To demonstrate such early occupation of the Americas requires the presence of unequivocal stone artefacts. There are no unequivocal stone tools associated with the bones… this site is likely just an interesting paleontological locality.”
Prof Tom Dillehay, from Vanderbilt University in Nashville, Tennessee, told BBC News the claim was not plausible. Another authority on early American archaeology, Prof David Meltzer from Southern Methodist University in Dallas, Texas, said: “Nature is mischievous and can break bones and modify stones in a myriad of ways. ……. With evidence as inherently ambiguous as the broken bones and non-descript broken stones described in the paper, it is not enough to demonstrate they could have been broken/modified by humans; one has to demonstrate they could not have been broken by nature. ….. This is an equifinality problem: multiple processes can cause the same product.”
Sometimes, I’ve noticed, I irritate people around me who would rather be sad. I wondered why I was an optimist and always saw the glass half full.
This example came to mind.
It is perhaps not widely known that the world is facing a new crisis. It is an inescapable conclusion if two assumptions are correct. First, that intelligence – however it is defined – is hereditary. Second, that more intelligent people have fewer children. If intelligence is inherited and the intelligent have fewer children, it does not take an Einstein to realise that the world is getting dumber every day.
We know that intelligence is at least partly hereditary. Furthermore, all over the world, the number of children the intelligent have has fallen sharply. It is simple arithmetic that generation after generation, the world must be dumbing down. Or rather, generation after generation, children of the world must have dumber parents.
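The arithmetic of those two assumptions can be sketched with a toy model. This is only an illustration of the reasoning, not real demographics; the group sizes, fertility numbers and the assumption of perfect heritability are all invented:

```python
# Toy two-group model: a fraction p of the population is "more intelligent",
# the trait is (unrealistically) perfectly inherited, and that group has
# fewer children per person. All numbers are hypothetical.
p = 0.5                      # starting fraction of the "more intelligent"
f_smart, f_rest = 1.8, 2.2   # children per person in each group

for generation in range(1, 6):
    smart_children = p * f_smart
    other_children = (1 - p) * f_rest
    p = smart_children / (smart_children + other_children)
    print(f"generation {generation}: 'intelligent' fraction = {p:.3f}")
```

Under these assumptions the fraction shrinks every generation, which is the whole of the "dumbing down" argument in five lines of arithmetic.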
This reasoning has a few flaws. Intelligence is not just hereditary; it also depends on nutrition, education and the environment the child grows up in. Knowledge is also not the same as intelligence, and the measurement of intelligence cannot avoid including some influence of knowledge. It has been calculated that, even though knowledge has increased continuously, human intelligence peaked when we were still hunter-gatherers, about 15,000 years ago. So although human intelligence has probably declined, it has done so very slowly, and the decline is partly compensated for by the increase of knowledge. What is clear, however, is that intelligence is not increasing at the rate it would if it were a survival factor for natural selection.
In any case, if and when an intelligence problem becomes a crisis, we can always solve it with the right choice of tax system. As every politician knows, no problem exists that cannot be solved by an appropriate tax system. So, in the event of a crisis, my solution would be very simple. Income tax would be scrapped and replaced by a tax on intelligence. The tax would increase with intelligence, but those with higher intelligence than the average would have their tax rate reduced for each child, while those who were below average would have their rate increased with every child.
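For what it is worth, the tongue-in-cheek scheme can be captured in a few lines. Every rate, slope and threshold here is, of course, invented purely for illustration:

```python
def intelligence_tax_rate(iq, children, base_rate=0.30, slope=0.005,
                          average_iq=100, per_child=0.02):
    """Hypothetical intelligence tax: the rate rises with intelligence,
    but each child lowers the rate for the above-average and raises it
    for the below-average. All parameters are invented."""
    rate = base_rate + slope * (iq - average_iq)
    if iq > average_iq:
        rate -= per_child * children      # reward the bright for breeding
    elif iq < average_iq:
        rate += per_child * children      # penalise the rest for the same
    return max(0.0, rate)                 # no negative tax, even in satire
```

So a bright parent of three pays less than a bright parent of none, and below the average the incentive runs the other way; exactly the corrective the "crisis" would call for.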
The world would be a very boring place without problems to solve.
Perhaps fiddling with the tax system is not the solution to every problem but, in my worldview, problems exist to be solved. Not a problem in every challenge, but a challenge in every problem.
The Indus-Saraswati Valley civilisation reached its peak around 1,900 BCE. It had been flourishing there for over a millennium, from about 3,300 BCE. But various proto-Harappan cultures had existed in those fertile plains for almost 4,000 years before that (from about 7,000 BCE). At its peak the civilisation occupied the entire Indus-Saraswati Valley and stretched as far as the Indo-Gangetic plain, with some 1,000 settlements and at least 5 “great” cities that we now know of: Mohenjo-Daro, Harappa, Ganweriwala, Rakhigarhi and Dholavira. None of these is truly coastal and it is not improbable that one or perhaps two “great” coastal cities are now submerged and waiting to be discovered. Only about 10% of the known sites have been investigated and the Indus Valley script – which I call Harappan for convenience – has yet to be deciphered.
But by about 1,000 BCE the glories of the civilisation had disappeared; not swept away in one fell swoop by marauding invaders, some great pestilence or some cataclysmic natural catastrophe, but gradually, as cities and settlements were abandoned and the population thinned out to a shadow of its heyday. As the world came out of the ice age around 20,000 years ago, sea levels were almost 100 m lower than today. By 7,000 BCE (9,000 years ago) sea levels were still about 30 m lower than at present and were rising fast, at around 8-10 m/millennium. The settlements in the region were either on the coast or followed the course of the great rivers. It was a 300-500 year process of desertification which saw the Saraswati dry up and the Thar desert form.
Saraswati and Thar Desert
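The sea-level figures above imply a rough timeline. A back-of-envelope check (my own arithmetic, not from any source) shows when the rise would have brought the sea to roughly its present stand:

```python
# At 9,000 years ago the sea was ~30 m below present level, rising at
# 8-10 m per millennium. When would it have reached today's level?
deficit_m = 30
for rate_m_per_millennium in (8, 10):
    millennia = deficit_m / rate_m_per_millennium
    years_before_present = 9000 - millennia * 1000
    print(f"at {rate_m_per_millennium} m/millennium: present level "
          f"reached about {years_before_present:.0f} years ago")
```

That puts the stabilisation of modern sea levels at roughly 5,000-6,000 years ago, well before the desertification described above.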
Where they all went is mainly conjecture but it is likely that they “followed the water”. Some of the sources of the Saraswati would have been diverted to flow into the Ganges. That would have taken some people westwards, back along the coast towards the then fertile Persian Gulf, some eastwards across the Indo-Gangetic plain and some southwards along the coast of the Indian subcontinent. Quite possibly some reached the Bay of Bengal and others reached south India and the Indian Ocean. But they did not move into empty spaces. The Indian subcontinent had been continuously settled from the times of Homo erectus, but by the time of the Toba eruption 74,000 years ago Homo erectus had already been replaced by Homo sapiens. So when the Harappans moved in, modern humans were already there, but not in large numbers. The earlier settlers probably included the few survivors of a pre-Toba wave of expansion who were then absorbed by later settlers (probably in many waves of arrival) over some 50,000 years.
Where the Harappans probably went
In my narrative it is the Harappans and their language which provided the nucleus for, and eventually became, the family of Dravidian languages. In fact it is probable that some of the roots of what became Hinduism also came with them. I would even suggest that the specialisation of functions (administrators, priests, traders, craftsmen and labour) that must have existed in the meticulously planned, water-resourceful, trading cities of the Indus-Saraswati Valley led to the foundation of guilds and a stratified society. That probably laid the foundations of the caste system which, in its perverted form, currently disgraces the subcontinent.
Andrew Robinson looks at the state of the decipherment of the Harappan script in Nature.
Nature 526, 499–501 (22 October 2015) doi:10.1038/526499a.
Indus unicorn on a roughly 4,000-year-old sealstone, found at the Mohenjo-daro site. photo – Robert Harding/Corbis
The Indus civilization flourished for half a millennium from about 2600 bc to 1900 bc. Then it mysteriously declined and vanished from view. It remained invisible for almost 4,000 years until its ruins were discovered by accident in the 1920s by British and Indian archaeologists. Following almost a century of excavation, it is today regarded as a civilization worthy of comparison with those of ancient Egypt and Mesopotamia, as the beginning of Indian civilization and possibly as the origin of Hinduism.
More than a thousand Indus settlements covered at least 800,000 square kilometres of what is now Pakistan and northwestern India. It was the most extensive urban culture of its period, with a population of perhaps 1 million and a vigorous maritime export trade to the Gulf and cities such as Ur in Mesopotamia, where objects inscribed with Indus signs have been discovered. Astonishingly, the culture has left no archaeological evidence of armies or warfare.
Most Indus settlements were villages; some were towns, and at least five were substantial cities … boasted street planning and house drainage worthy of the twentieth century ad. They hosted the world’s first known toilets, along with complex stone weights, elaborately drilled gemstone necklaces and exquisitely carved seal stones featuring one of the world’s stubbornly undeciphered scripts. …
The Indus script is made up of partially pictographic signs and human and animal motifs including a puzzling ‘unicorn’. …..
Whatever their differences, all Indus researchers agree that there is no consensus on the meaning of the script. There are three main problems. First, no firm information is available about its underlying language. Was this an ancestor of Sanskrit or Dravidian, or of some other Indian language family, such as Munda, or was it a language that has disappeared? Linear B was deciphered because the tablets turned out to be in an archaic form of Greek; Mayan glyphs because Mayan languages are still spoken. Second, no names of Indus rulers or personages are known from myths or historical records: no equivalents of Rameses or Ptolemy, who were known to hieroglyphic decipherers from records of ancient Egypt available in Greek. ……
……. Nevertheless, almost every researcher accepts that the script contains too many signs to be either an alphabet or a syllabary (in which signs represent syllables), like Linear B. It is probably a logo-syllabic script — such as Sumerian cuneiform or Mayan glyphs — that is, a mixture of hundreds of logographic signs representing words and concepts, such as &, £ and %, and a much smaller subset representing syllables.
As for the language, the balance of evidence favours a proto-Dravidian language, not Sanskrit. Many scholars have proposed plausible Dravidian meanings for a few groups of characters based on Old Tamil, although none of these ‘translations’ has gained universal acceptance. ……… A minority of researchers query whether the Indus script was capable of expressing a spoken language, mainly because of the brevity of inscriptions. ……. This theory seems unlikely, for various reasons. Notably, sequential ordering and an agreed direction of writing are universal features of writing systems. Such rules are not crucial in symbolic systems. Moreover, the Indus civilization must have been well aware through its trade links of how cuneiform functioned as a full writing system. ……….
What the Harappans wrote and spoke was not Dravidian itself, but it was very likely a proto-Dravidian language, which, with many other influences from what already existed in the South Indian regions they moved into, became the family of Dravidian languages existing today. And it could explain why a Dravidian language can still be found today in what is now Afghanistan.
Over at 6,000 generations I post about the new paper on the 47 human teeth found in Fuyan Cave, Daoxian, China, which are between 80,000 and 120,000 years old.
The 47 human teeth found in Fuyan Cave, Daoxian, China. photo S. XING AND X-J. WU via DiscoveryNews
There were clearly many Out of Africa or Africarabia events starting from 130,000 years ago, both before and after the Toba explosion.
The single Out of Africa event for modern humans is clearly far too simplistic. It is also clear that there were many back to Africa movements as well. Humans expanded sometimes because their old habitats were no longer viable. But, it seems, humans also explored and expanded into new territories from regions of plenty and where they maintained some contact with where they had come from. Probably, just because they could.