Archive for the ‘History’ Category

The Genocide of the Neanderthals by the Even Newer Africans

May 13, 2021

In the politically correct and virtue-signalling world, where pseudo-morality reigns, colonisation has become a dirty word. Colonists are considered evil. Statues of colonists are considered even more depraved. The colonised of the past are always considered victims by the present. Needless to say, a place in the kingdom of heaven is reserved for the colonised. The reality is that any living species which does not colonise is doomed either to stagnation in a niche habitat or to failure and extinction. Colonisation is the stuff of life. Geographical spaces are colonised when expanding communities invade and bring more competitive cultures or technologies than those already existing in that space. Populations are colonised when their culture and technology cannot compete with incoming ones. Strangely, it is only the European colonisations between about 1400 and 1900 CE which have become politically incorrect. But what is conveniently forgotten is that the colonised populations in Australia and N America and even S America at that time were so backward in technology that they were ripe for colonisation by any invading community with superior technology. If not the Europeans, it would have been someone else. It is, of course, politically incorrect to point out that the colonised were once colonists too, and have themselves primarily to blame. Colonisations in antiquity by the Phoenicians, Carthaginians, Greeks, Romans, and Han Chinese are too far in the past for moral judgements in the present. The Mongols, the Normans, and the Vikings generally escape censure today.

But it is worth remembering that human colonisation was started by the Africans.

Colonisation is primarily about the expansion of the physical space being occupied by a biological community. The community may be a whole species or just a particular strain within a species. It is a phenomenon exhibited by every successful biological community from viruses and bacteria and fungi to plants and animals (including humans). A new territory or habitat may already be occupied by other species, or strains of the same species, or unoccupied. The incoming community are the colonists. Any communities already existing in the space are the colonised. Many attempts at colonisation fail; either because the colonists cannot adapt to the new habitat or because they cannot compete (biologically, culturally, or technologically) with the existing inhabitants.

That living things exist in every conceivable corner of the earth’s surface is a consequence of colonisation. That living things find it necessary to search for new habitats is a consequence of surviving changing environments, of growth, and of the genetic diversity inherent in every species. There are a few species which have stagnated in tiny niche habitats, exhibit unusually little genetic diversity and are unable to change. They have become so specialised to fit their habitat that they are incapable of adapting to any other and have reached evolutionary “dead-ends”. Panda bears and theridiid spiders are examples. They have become incapable of growth or of colonisation and are probably on their slow path to extinction.

When it comes to the origins of human colonisation we need to go back to before we were ever human. (I take humans to mean Anatomically Modern Humans (AMH), who appeared around 300,000 years ago). Some little time after we had evolved from hominids to hominins, and perhaps around 800,000 years ago, a common hominin ancestor of Neanderthals, Denisovans, a couple of unknown hominin species and of AMH emigrated from Africa and colonised most of Europe, Central Asia, and South East Asia. Most likely the movement of whole populations was driven, not by a shortage of space, but by changes of climate and a shortage of food. (Note that immigration is not necessarily colonisation, but colonisation always involves emigration). These Old Africans were emigrants and the first ever colonists. They were not initially immigrants since the territories they moved into had no other hominin inhabitants. There were probably many waves of Old Africans, and later emigrants may well have been immigrants. Many of the areas they moved into did have indigenous hominid populations. However, the indigenous culture and technology were not sufficiently competitive to prevent the wave of hominin colonisation. Hominins had fire while hominids and other species did not. The colonisation of the world by the Old Africans led to the demise of many species which could not compete against the advanced culture and technology of the colonists. Some were hunted to extinction as prey, while others were unable to adapt quickly enough, and still others were just crowded out by the newcomers.

In due course (a small matter of a few hundred thousand years) the Old Africans in Central Asia and Europe evolved to become the Neanderthals. From about 500,000 years ago they were the dominant species for about 300,000 years. In South East Asia, the Old Africans evolved to become the Denisovans. In the rest of Asia (S China, India, and the Middle East), the Old Africans were still around but had evolved to become some as yet unknown hominin species. In Africa, the Old Africans gave way eventually to Anatomically Modern Humans (AMH) by about 300,000 years ago. Let us call them the New Africans.

Then from about 200,000 years ago there were a number of waves of New African emigration/colonisation into Europe and Asia. These emigrant waves continued sporadically for 100,000 years culminating in the Even Newer Africans coming “Out of Africa” around 60 – 70,000 years ago. The New Africans and the Even Newer Africans found indigenous hominin populations all across the new territories they expanded into. They were sometimes just other New Africans and sometimes they were blended populations of Old Africans (Neanderthals, Denisovans, …) and New Africans. In India, for example, the Even Newer Africans arrived after the Toba eruption and mingled genetically with surviving populations of Old Africans already mingled with New Africans.

Whether there was conflict between indigenous and arriving populations, or whether one culture was gradually submerged into the more dominant one is unknown. What is known is that the arrival of the Even Newer Africans caused the Neanderthals and the Denisovans and some other hominin species around to disappear. By around 50,000 years ago the Denisovans were extinct and by 40,000 years ago there were no Neanderthals left. However, their genes still survive and live on in us.

In current-day politically correct terms and to signal great virtue in ourselves, it could be called the Genocide of the Neanderthals by the Even Newer Africans.


Should Adolf Hitler be cancelled?

April 20, 2021

It is 132 years since Adolf Hitler was born on 20th April 1889 in Braunau am Inn, Austria.

Of course there are no statues of Hitler around, or so one might think.

But you can buy a bust of Hitler on the net for $59.99.

And there are plenty of paintings of him, physical copies of books by him and about him and any number of photographs. And then we come to all the digital statues of Hitler that exist. Of course he has an entry in Wikipedia. A Google search which took 0.57 seconds generates 67 million results.

It would be politically correct to ensure that anything even faintly positive about Hitler be destroyed. All historical records should be thoroughly cleansed so that only derogatory material about him remains. 

Or maybe erect a statue every year only to pull it down again. Erect it on 20th April and pull it down with great ceremony on 30th April every year?


Lawyers are to humans as fungi are to trees

April 16, 2021

I have suggested in the past that cooking may be the oldest art form and that chefs may have been members of the oldest profession. However it may be that lawyers came first.


Reblogged from A short history of lawyers (upcounsel.com)

Imbricate fruiting of Phaeolus schweinitzii. Image: forestpathology.org

Like the symbiotic relationship between trees and fungus, lawyers and humans have an important, interlocking relationship going back to the dawn of man.

The following is excerpted from “Some Lawyers Are People Too!” by Hugh L. Dewey, Esq. (2009). 

Legal anthropologists have not yet discovered the proverbial first lawyer. No briefs or pleadings remain from the proto-lawyer that is thought to have been in existence more than 5 million years ago.

Chimpanzees, man’s and lawyer’s closest relative, share 99% of the same genes. New research has definitively proven that chimpanzees do not have the special L1a gene that distinguishes lawyers from everyone else. (See Johnson, Dr. Mark. “Lawyers in the Mist?” Science Digest, May 1990: pp. 43-52.) This disproved the famous outcome of the Scopes Monkey Trial in which Clarence Darrow proved that monkeys were also lawyers.

Charles Darwin, Esquire, theorized in the mid-1800s that tribes of lawyers existed as early as 2.5 million years ago. However, in his travels, he found little evidence to support this theory.

Legal anthropology suffered a setback at the turn of the century in the famous Piltdown Lawyer scandal. In order to prove the existence of the missing legal link, a scientist claimed he had found the skull of an ancient lawyer. The skull later turned out to be homemade, combining the large jaw of a modern lawyer with the skull cap of a gorilla. When the hoax was discovered, the science of legal anthropology was set back 50 years.

The first hard scientific proof of the existence of lawyers was discovered by Dr. Margaret Leakey at the Olduvai Gorge in Tanzania. Her find consisted of several legal fragments, but no full case was found intact at the site. Carbon dating has estimated the find at between 1 million and 1.5 million years ago. However, through legal anthropology methods, it has been theorized that the site contains the remains of a fraud trial in which the defendant sought to disprove liability on the basis of his inability to stand erect. The case outcome is unknown, but it coincides with the decline of the Australopithecus and the rise of Homo Erectus in the world. (See Leakey, Margaret A. “The case of erectus hominid.” Legal Anthropology, March 1947: pp. 153.)

In many sites dating from 250,000 to 1,000,000 years ago, legal tools have been uncovered. Unfortunately, the tools are often in fragments, making it difficult to gain much knowledge.

The first complete site discovered has been dated to 150,000 years ago. Stone pictograph briefs were found concerning a land boundary dispute between a tribe of Neanderthals and a tribe of Cro-Magnons. This decision in favor of the Cro-Magnon tribe led to a successive set of cases, spelling the end for the Neanderthal tribe. (See Widget, Dr. John B. “Did Cro-Magnon have better lawyers?” Natural History, June 1926: p. 135. See also Cook, Benjamin. Very Very Early Land Use Cases. Legal Press, 1953.)

Until 10,000 years ago, lawyers wandered around in small tribes, seeking out clients. Finally, small settlements of lawyers began to spring up in the Ur Valley, the birthplace of modern civilization. With settlement came the invention of writing. Previously, lawyers had relied on oral bills for collection of payment, which made collection difficult and meant that if a client died before payment (with life expectancy between 25 and 30 and the death penalty for all cases, most clients died shortly after their case was resolved), the bill would remain uncollected. With written bills, lawyers could continue collection indefinitely.

In the late 1880s, legal anthropologists cracked the legal hieroglyphic language when they were able to determine the meaning of the now famous Rosetta Stone Contract. (See Harrison, Franklin D. The Rosetta Bill. Doubleday, 1989.) The famous first paragraph can be recited verbatim by almost every lawyer:

“In consideration of 20,000 Assyrian workers, 3,512 live goats, and 400,000 hectares of dates, the undersigned hereby conveys all of the undersigned’s right, title, and interest in and to the property commonly known as the Sphinx, more particularly described on Stone A attached hereto and made a part hereof.”

The attempted sale of the Sphinx resulted in the Pharaoh issuing a country-wide purge of all lawyers. Many were slaughtered, and the rest wandered in the desert for years looking for a place to practice.

Greece and Rome saw the revival of the lawyer in society. Lawyers were again allowed to freely practice, and they took full advantage of this opportunity. Many records exist from this classic period. Legal cases ranged from run-of-the-mill goat contract cases to the well-known product liability case documented in the Estate of Socrates vs. Hemlock Wine Company. (See Wilson, Phillips ed. Famous Roman Cases. Houghton, Mifflin publishers, 1949.)

The most famous lawyer of this period was Hammurabi the Lawyer. His code of law gave lawyers hundreds of new business opportunities. By creating a massive legal system, he increased the demand for lawyers ten-fold. In those days, almost any thief or crook could kill a sheep, hang up a sheepskin, and practice law, unlike the highly regulated system today which limits law degrees to only those thieves and crooks who haven’t been convicted of a major felony.

The explosion in the number of lawyers coincided with the development of algebra, the mathematics of legal billing. Pythagoras, a famous Greek lawyer, is revered for his Pythagorean Theorem, which proved the mathematical quandary of double billing. This new development allowed lawyers to become wealthy members of their community, as well as to enter politics, an area previously off-limits to lawyers. Despite the mathematical soundness of double billing, some lawyers went to extremes. Julius Caesar, a Roman lawyer and politician, was murdered by several clients for his record hours billed in late February and early March of 44 B.C. (His murder was the subject of a play by lawyer William Shakespeare. When Caesar discovered that one of his murderers was his law partner Brutus, he murmured the immortal lines, “Et tu Brute,” which can be loosely translated from Latin as “my estate keeps twice the billings.”)

Before the Roman Era, lawyers did not have specific areas of practice. During the period, legal specialists arose to meet the demands of the burgeoning Roman population. Sports lawyers counseled gladiators, admiralty lawyers drafted contracts for the great battles in the Coliseum, international lawyers traveled with the great Roman armies to force native lawyers to sign treaties of adhesion — many of which lasted hundreds of years until they were broken by the barbarian lawyers who descended on Rome from the North and East — and the ever-popular Pro Bono lawyers (Latin for “can’t get a real job”) who represented Christians and lost all their cases for 300 years.

As time went on, the population of lawyers continued to grow until 1 out of every 2 Romans was a lawyer. Soon lawyers were intermarrying. This produced children who were legally entitled to practice Roman law, but with the many defects that such a match produced, the quality of lawyers degenerated, resulting in an ever-increasing defective legal society and the introduction of accountants. Pressured by the legal barbarians from the North with their sign-or-die negotiating skills, Rome fell, and the world entered the Dark Ages.

During the Dark Ages, many of the legal theories and practice developed during the golden age were forgotten. Lawyers lost the art of double billing, the thirty-hour day, the 15-minute phone call, and the conference stone. Instead, lawyers became virtually manual laborers, sharing space with primitive doctor-barbers. Many people sought out magicians and witches instead of lawyers since they were cheaper and easier to understand.

The Dark Ages for lawyers ended in England in 1078. Norman lawyers discovered a loophole in Welsh law that allowed William the Conqueror to foreclose an old French loan and take most of England, Scotland, and Wales. William rewarded the lawyers for their work, and soon lawyers were again accepted in society.

Lawyers became so popular during this period that they were able to heavily influence the kings of Britain, France, and Germany. After a Turkish corporation stiffed the largest and oldest English law firm, the partners of the firm convinced these kings to start a Bill Crusade, sending collection knights all the way to Jerusalem to seek payment.

A major breakthrough for lawyers occurred in the 17th century. Blackstone the Magician, on a trip through Rome, unearthed several dozen ancient Roman legal texts. This new knowledge spread through the legal community like the black plague. Up until that point, lawyers used the local language of the community for their work. Since many smart non-lawyers could then determine what work, if any, the lawyer had done, lawyers often lost clients, and sometimes their head.

Using Blackstone’s finds, lawyers could use Latin to hide what they did so that only other lawyers understood what was happening in any lawsuit. Blackstone was a hero to all lawyers until, of course, he was sued for copyright infringement by another lawyer. Despite his loss, Blackstone is still fondly remembered by most lawyers as the father of legal Latin. “Res ipsa loquitur” was Blackstone’s favorite saying (“my bill speaks for itself”), and it is still heard today.

Many lawyers made history during the Middle Ages. Genghis Kahn, Esq., from a family of Jewish lawyers, Hun & Kahn, pioneered the practice of merging with law offices around Asia Minor at any cost. At one time, the firm was the largest in Asia and Europe. Their success was their downfall. Originally a large personal injury firm (if you didn’t pay their bill, they personally injured you), they became conservative over time and were eventually overwhelmed by lawyers from the West. Vlad Dracul, Esq., a medical malpractice specialist, was renowned for his knowledge of anatomy, and few jurors would side against him for fear of his special bill (his bill was placed atop 20-foot wooden spears on which the non-paying client was placed).

Leonardo di ser Piero da Vinci, Esq., was multi-talented. Besides having a busy law practice, he was an artist and inventor. His most famous case was in defense of himself. M. Lisa vs. da Vinci (Italian Superior Court 1513) involved a product liability suit over a painting da Vinci delivered to the plaintiff. The court, in ruling that the painting was not defective despite the missing eyebrows, issued the famous line, “This court may not know art, but it knows what it likes, and it likes the painting.” This was not surprising since the plaintiff was known for her huge, caterpillar-like eyebrows. Da Vinci was able to convince the court that he was entitled not only to damages but to attorneys’ fees, costs, and punitive damages as well. The court, taking one last look at the plaintiff, granted the request.

A land dispute case in the late 15th century is still studied today for the clever work of Christopher Columbus, Esq. He successfully convinced an Aztec court, in Columbus vs. 1,000,000 Acres, that since the Indians did not believe in possession, they could not claim the land in question. Therefore, his claim had to be given priority. Despite the fact that the entire court was sacrificed to the gods, the case held and Spain took an early legal lead in the New World.

As the New World was colonized, England eventually surpassed Spain as the leading colonizer. England began sending all of its criminals and thieves to the New World. This mass dumping of lawyers to the states would come back to haunt England. Eventually, the grandchildren of these pioneer lawyers would successfully defeat King George III in the now famous King George III v. 100 Bags of Tea. England by this time was now dreadfully short of lawyers. The new American lawyers exploited this shortfall and, after a seven-year legal war, defeated the British and created the United States, under the famous motto, “All lawyers are created equal.”

England never forgot this lesson and immediately stopped its practice of sending lawyers to the colonies. This policy left Australia woefully deficient in lawyers.

With stories of legal success common in the late 1700s, more and more people attempted to become lawyers. This process of stealing a shingle worried the more successful lawyers. To stem this tide as well as to create a new profit center, these lawyers passed laws requiring all future lawyers to be restricted from practice unless they went to an approved law school. The model school from which all legal education rules developed was Harvard Law School.

Harvard, established in 1812, set the standard for legal education when, in 1816, it created the standardized system for legal education. This system was based on the Socratic method. At most universities, the students questioned the teacher/professor to gain knowledge. These students would bill their professors, and if the bill went unpaid, the students usually hung up their law professor for failure of payment. At Harvard, the tables were turned, with the professors billing the students. This method enriched the professors and remains the standard in use in most law schools in America and England.

As developed by Harvard, law students took a standard set of courses as follows:

  1. Jurisprudence: The history of legal billing, from early Greek and Roman billing methods to modern collection techniques.
  2. Torts: French law term for “you get injury, we keep 40%.” Teaches students ambulance-chasing techniques.
  3. Contracts: Teaches that despite an agreement between two parties (the contract), a lawsuit can still be brought.
  4. Civil Procedure: Teaches the tricky arcane rules of court, which were modernized only 150 years ago in New York.
  5. Criminal Law: Speaks for itself.

These courses continue to be used in most law schools throughout the United States.

Despite the restrictions imposed on the practice of law (a four-year college degree, three years of graduate school, and a state-sponsored examination), the quantity of lawyers continues to increase to the point that three out of every five Americans are lawyers. (In fact, there are over 750,000 lawyers in this country.) Every facet of life today is controlled by lawyers. Even Dan Quayle (a lawyer) claims, surprise, that there are too many lawyers. Yet until limits are imposed on legal birth control, the number of lawyers will continue to increase. Is there any hope? We don’t know and frankly don’t care since the author of this book is a successful, wealthy lawyer, the publishers of this book are lawyers, the cashier at the bookstore is a law student, and your mailman is a lawyer. So instead of complaining, join us and remember, there is no such thing as a one-lawyer town.


Lawyers are members of a parasitic life-form which emerges in the cracks of human society.


How Google search creates Fake News

January 15, 2021

Fake News is created just as much by excluding selected news as by inventing stories. Cancelling news also creates fake news.

Google’s “experiment” in Australia has been exposed recently. However, this is not the first such “experiment” and it won’t be the last. But exclusion is a tool used widely by every news outlet to try and control the narrative (and it is noticeable that every outlet does try to control the narrative). There is no news outlet anymore that does not have its own agenda, and none that does not engage in excluding what it finds unpalatable. All social media platforms have self-serving agendas. They all indulge in “exclusion” as a tool. Sometimes it is simply to create a false (favourable) picture to increase revenues from advertising. Sometimes it is to be politically correct and avoid legal, political or social sanction. It is the same phenomenon which drives the “cancel culture”. We are all familiar with paid advertising always getting preference in Google searches. But Google’s search algorithms are secret and supposedly untouched by human hand, yet they are always changing. Google knows very well that few go beyond the second page of search results. The algorithms are constantly being tweaked. And in every tweak there is some new exclusion and some new Fake News.

Perceived reality has little to do with “facts” and is entirely about the current narrative. History has become (has always been) a servant of the current narrative. Google Search is primarily a tool for the creation of advertising revenue. The search is always biased in the algorithm. The perceived objectivity of the search is secondary to the revenue objective. Fake News has become a major part of the output of Mainstream Media and exclusion is just another tool for the creation of a false narrative.


Mathematics started in prehistory with counting and the study of shapes

January 8, 2021
Compass box

Mathematics today is classified into more than 40 different areas by journals for publication. I probably heard about the 3 R’s (Reading, Riting and Rithmetic) first at primary school level. At high school, which was 60 years ago, mathematics consisted of arithmetic, geometry, algebra, trigonometry, statistics and logic – in that order. I remember my first class where trigonometry was introduced as a “marriage of algebra and geometry”. Calculus was touched on as advanced algebra. Some numerical methods were taught as a part of statistics. If I take my own progression, it starts with arithmetic, moves on to geometry, algebra and trigonometry and only then to calculus and statistics, symbolic logic and computer science. It was a big deal when, at 10, I got my first “compass box” for geometry, and another big deal, at 13, with my first copy of trigonometric tables. At university in the 70s, Pure Mathematics was distinguished from Applied Engineering Mathematics and from Computing. In my worldview, Mathematics and Physics Departments were where the specialist, esoteric mysteries of such things as topology, number theory, quantum algebra, non-Euclidean geometry and combinatorics could be studied.

I don’t doubt that many animals can distinguish between more and less. It is sometimes claimed that some primates have the ability to count up to about 3. Perhaps they can, but outside of the studies reporting such abilities, they never actually do count. No animals apply counting. They don’t exhibit any explicit understanding of geometrical shapes or structures, though birds, bees, ants and gorillas seem to apply some structural principles, intuitively, when building their nests. Humans, as a species, are unique in not only imagining but also in applying mathematics. We couldn’t count when we left the trees. We had no tools then and we built no shelters. So how did it all begin?

Sometimes Arithmetic, Geometry and Algebra are considered the three core areas of mathematics. But I would contend that it must all start with counting and with shapes – which later developed into Arithmetic and Geometry. Algebra and its abstractions came much later. Counting and the study of shapes must lie at the heart of how prehistoric humans first came to mathematics. But I would also contend that counting and observing the relationship between shapes would have started separately and independently. They both require a certain level of cognition but they differ in that the study of shapes is based on observations of physical surroundings while counting requires invention of concepts in the abstract plane. They may have been contemporaneous but they must, I think, have originated separately.

No circle of standing stones would have been possible without some arithmetic (rather than merely counting) and some geometry. No pyramid, however simple, was ever built without both. No weight was dragged or rolled up an inclined plane without some understanding of both shapes and numbers. No water channel was ever dug that did not involve some arithmetic and some geometry. Already by the time of Sumer and Babylon, and certainly by the time of the Egyptians and the Harappans, the practical application of arithmetic and geometry and even trigonometry in trade, surveying, town planning, time-keeping and building was well established. The sophisticated management of water that we can now glimpse in the ancient civilizations needed both arithmetic and geometry. There is not much recorded history available before the Greeks. Arithmetic and Geometry were well established by the time we come to the Greeks, who even conducted a vigorous discourse about the nobility (or divinity) of the one versus the other. Pythagoras is not happy with arithmetic since numbers cannot give him – exactly – the hypotenuse of a right triangle with sides of equal length (√2). Which he can so easily draw. Numbers could not exactly reflect all that he could achieve with a straight edge and a compass. The circle could not be squared. The ratio of circumference to diameter was irrational. The irrationality of the numbers needed to reflect geometrical shapes was, for the purists, vulgar and an abomination. But the application of geometry and arithmetic was common long, long before the Greeks. There is a great distance to travel before counting becomes arithmetic and the study of shapes becomes geometry, but the roots of mathematics lie there. That takes us back to well before the Neolithic (c. 12,000 years ago).

That geometry derives from the study of shapes and the patterns and relationships between shapes, given some threshold level of cognition, seems both obvious and inevitable. Shapes are real and ubiquitous. They can be seen in all aspects of the natural world and can be mimicked and constructed. The arc of the sun curving across the sky creates a shape. Shadows create shapes. Light creates straight lines as the elevation of the sun creates angles. Shapes can be observed. And constructed. A taut string to give a straight line and the calm surface of a pond to give a level plane. A string and a weight to give the vertical. A liquid level to give the horizontal. Sticks and shadows. A human turning around to observe the surroundings created a circle. Strings and compasses. Cave paintings from c. 30,000 years ago contain regular shapes. Circles and triangles and squares. Humans started not only observing, but also applying, the relationships between shapes a very long time ago.

Numbers are more mystical. They don’t exist in the physical world. But counting the days from new moon to new moon for a lunar month, or the days in a year, was already known at least 30,000 years ago. Ancient tally sticks that count to 29 testify to that. It would seem that the origins of arithmetic (and numbers) lie in our ancient prehistory, probably more than 50,000 years ago. Counting, the use of specific sounds as the representation of abstract numbers, and number systems are made possible only by first having a concept of identity which allows the definition of one. Dealing with identity and the nature of existence takes us before and beyond the realms of philosophy or even theology and into the metaphysical world. The metaphysics of existence remains mystical and mysterious and beyond human cognition, as much today as in prehistoric times. Nevertheless, it is the cognitive capability of having the concept of a unique identity which enables the concept of one. That one day is distinguishable from the next. That one person, one fruit, one animal or one thing is uniquely different from another. That unique things, similar or dissimilar, can be grouped to create a new identity. That one grouping (us) is distinguishable from another group (them). Numbers are not physically observable. They are all abstract concepts. Linguistically they are sometimes bad nouns and sometimes bad adjectives. The concept of one does not, by itself, lead automatically to a number system. That needs, in addition, a logic system and invention (a creation of something new, which presupposes a certain cognitive capacity). It is by definition, and not by logic or reason or inevitability, that two is defined as one more than the identity represented by one, and three is defined as one more than two, and so on. Note that without the concept of identity and the uniqueness of things setting a constraint, a three does not have to be separated from a two by the same separation as from two to one. The inherent logic is not itself invented but emerges from the concept of identity and uniqueness. That 1 + 1 = 2 is a definition, not a discovery. It assumes that addition is possible. It is also significant that nothingness is a much wider (and more mysterious and mystical) concept than the number zero. Zero derives, not from nothingness, but from the assumption of subtraction and then from being defined as one less than one. That in turn generalises to zero being the result of taking any thing away from itself. Negative numbers emerge by extending that definition. The properties of zero are conferred by convention and by definition. Numbers and number systems are thus a matter of “invention by definition”, but constrained by the inherent logic which emerges from the concept of identity. The patterns and relationships between numbers have been the heady stuff of number theory and a matter of great wonder when they are discovered, but they are all consequent to the existence of the one, the invention of numerals and the subsequent definition that 1 + 1 = 2. Number theory exists only because the numbers are defined as they are. Whereas the concept of identity provides the basis for one and all integers, a further cognitive step is needed to imagine that the one is not indivisible and then to consider the infinite parts of one.
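To make that chain of definitions concrete, here is a minimal sketch in LaTeX notation of the sequence described above (the notation is mine, one conventional way of writing it, and not part of the original argument):

    1 \quad \text{(the unique identity: “one”)}
    2 := 1 + 1,\qquad 3 := 2 + 1,\qquad 4 := 3 + 1,\ \dots \quad \text{(each number defined as one more than the previous)}
    0 := 1 - 1 \quad \text{(zero defined as one less than one)}
    x - x = 0 \ \text{for any thing } x,\qquad -n := 0 - n \quad \text{(negative numbers by extending the definition)}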

Mere counting is sometimes disparaged, but it is, of course, the most rudimentary form of a rigorous arithmetic with its commutative, associative and distributive laws.

Laws of arithmetic
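For reference, the three laws referred to above can be written out (in LaTeX notation), for any numbers a, b and c, as:

    a + b = b + a,\qquad a \times b = b \times a \quad \text{(commutative)}
    (a + b) + c = a + (b + c),\qquad (a \times b) \times c = a \times (b \times c) \quad \text{(associative)}
    a \times (b + c) = a \times b + a \times c \quad \text{(distributive)}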

The cognitive step of getting to count in the first place is a huge leap compared to the almost inevitable evolution of counting into numbers and then into an arithmetic with rigorous laws. We will never know when our ancestors began to count but it seems to me – in comparison with primates of today – that it must have come after a cognitive threshold had been achieved. Quite possibly with the control of fire and after the brain size of the species had undergone a step change. That takes us back to the time of homo erectus and perhaps around a million years ago.

Nearly all animals have shape recognition to some extent. Some primates can even recognise patterns in similar shapes. It is plausible that recognition of patterns and relationships between shapes only took off when our human ancestors began construction either of tools or of rudimentary dwellings. The earliest tools (after the use of clubs) were probably cutting edges and these are first seen around 1.8 million years ago. The simplest constructed shelters would have been lean-to structures of some kind. Construction of both tools and shelters lends itself naturally to the observation of many geometrical shapes: rectangles, polygons, cones, triangles, similar triangles and the rules of proportion between similar shapes. Arches may also have first emerged with the earliest shelters. More sophisticated tools and very simple dwellings take us back to around 400,000 years ago and certainly to a time before anatomically modern humans had appeared (c. 200,000 years ago). Both rudimentary counting and a sense of shapes would have been present by then. It would have been much later that circles and the properties of circles were observed and discovered. (Our earliest evidence of a wheel goes back some 8,000 years and is the application of a much older mathematics.) Possibly the interest in the circle came after a greater interest in time-keeping had emerged, perhaps from the first “astronomical” observations of sunrise and sunset and the motion of the moon and the seasons. Certainly our ancestors were well versed in circles and spheres and their intersections and relationships by the time they became potters (earlier than c. 30,000 years ago).

I suspect it was the blossoming of trade – rather than the growth of astronomy – which probably helped take counting to number systems and arithmetic. The combination of counting and shapes starts, I think, with the invention of tools and the construction of dwellings. By the time we come to the Neolithic, with settlements and agriculture and fortified settlements, arithmetic and geometry and applied mathematics are an established reality. Counting could have started around a million years ago. The study of shapes may have started even earlier. But if we take the origin of “mathematics” to be when counting ability was first combined with a sense of shapes, then we certainly have to step back to at least 50,000 years ago.


The accidental story of two times five and base ten

November 23, 2020

Humans have used many different bases for number systems but the use of base 10 is overwhelmingly dominant. There are instances of the use of base 5, base 6, base 20 and even base 27. In spite of many attempts to replace it with base 10, base 60 has fended off all rationalist suggestions and remnants remain entrenched in our current mapping of time and space. For time periods, base 60 is used exclusively for hours, minutes and seconds, but base 10 for subdivisions of the second. Similarly for spatial coordinates: degrees, minutes and seconds of arc are still used, but subdivisions of the second use base 10. (Some of the other bases that appear in language are listed at the end of this post.)

In terms of mathematics there is no great inherent advantage in the use of one particular number base or another. The utility of a particular choice is a trade-off, first, between size and practicality. The size of the base determines how many unique number symbols are needed (binary needs 2, decimal needs 10 and hexadecimal needs 16). There are many proponents of the advantages of 2, 3, 8, 12 or 16 being used as our primary number base. Certainly base 12 is the most “fraction friendly”. But all our mathematics could, in reality, be performed in any number base.
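As a small illustration of that point, here is a minimal Python sketch (the function name, the sample number and the chosen bases are mine, purely for illustration) which writes the same integer in several bases; the quantity does not change, only its representation:

    # Write a non-negative integer in any base from 2 to 16.
    def to_base(n, base, digits="0123456789ABCDEF"):
        if n == 0:
            return digits[0]
        out = []
        while n > 0:
            n, r = divmod(n, base)   # repeated division yields the digits, least significant first
            out.append(digits[r])
        return "".join(reversed(out))

    n = 1961
    for b in (2, 9, 10, 12, 16):
        print(b, to_base(n, b))      # base 9 (the nonary case) prints 2618, base 16 prints 7A9

    # Base 60 survives in time-keeping: 1961 seconds is 0:32:41.
    h, rem = divmod(n, 3600)
    m, s = divmod(rem, 60)
    print(f"{h}:{m:02d}:{s:02d}")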

At first glance the reasons for the use of base 10 seem blindingly obvious and looking for origins seems trivial. Our use of base 10 comes simply – and inevitably – from two hands times five digits. In recent times other bases (binary – base 2 – and hexadecimal – base 16 – for example) are used more extensively with computers, but base 10 (with some base 60) still predominates in human-human interactions (except when Sheldon is showing off). The use of base 10 predates the use of base 60, which has existed for at least 5,000 years.

It is ubiquitous now but (2 × 5) is not a consequence of design. It derives from a chain of at least three crucial evolutionary accidents which gave us

  1. four limbs, and
  2. five digits on each limb, and finally
  3. human bipedalism which reserved two limbs for locomotion and left our hands free.

The subsequent evolutionary accidents which led to increased brain size would still have been necessary for the discovery of counting and the invention of number systems. But if, instead of two, we had evolved three limbs free from the responsibilities of locomotion, with three digits on each limb, we might well have had base 9 at the foundations of counting and a nonary number system. The benefits of a place value system and the use of “nonecimals” (the base-9 counterpart of decimals) would still apply.

It is more difficult to imagine what might have happened if limbs were not symmetrical or if the number of digits on each limb were different. Every human society has been predominantly (c. 85%) right-handed. But left-handedness has never been a sufficient handicap to have been eliminated by evolution. Almost certainly right-handedness comes from the asymmetrical functions established in the left and right brains. The distinction between the functions of the two sides of the brain goes back perhaps 500 million years, long before limbs and tetrapods. By the time limbs evolved, the brain functions giving our predilection for right-handedness must already have been established. So it is possible to imagine evolution having led to, say, 6 digits on right fore-limbs and 5 digits on left fore-limbs.

I wonder what a natural base of 11 or 13 would have done to the development of counting and number systems?

Why four limbs?

All land vertebrates (mammals, birds, reptiles and amphibians) derive from tetrapods which have two sets of paired limbs. Even snakes evolved from four-limbed lizards. 

Tetrapods evolved from a group of animals known as the Tetrapodomorpha which, in turn, evolved from ancient sarcopterygians around 390 million years ago in the middle Devonian period; their forms were transitional between lobe-finned fishes and the four-limbed tetrapods. The first tetrapods (from a traditional, apomorphy-based perspective) appeared by the late Devonian, 367.5 million years ago. Wikipedia

It would seem that – by trial and error – a land-based creature, fortuitously possessing two pairs of limbs, just happened to be the one which survived and become the ancestor of all tetrapods. The evolutionary advantage of having 4 limbs (two pairs)  – rather than one or three or five pairs – is not at all clear. Insects have evolved three pairs while arachnids have four pairs. Myriapoda are multi-segmented creatures which have a pair of limbs per segment. They can vary from having five segments (10 legs) to about 400 segments (800 legs). The genes that determine the number of limbs determine many other features also and why two pairs would be particularly advantageous is not understood.  It could well be that the two pairs of limbs were incidental and merely followed other survival characteristics. The best bet currently is that

“You could say that the reason we have four limbs is because we have a belly,”

All of us backboned animals — at least the ones who also have jaws — have four fins or limbs, one pair in front and one pair behind. These have been modified dramatically in the course of evolution, into a marvelous variety of fins, legs, arms, flippers, and wings. But how did our earliest ancestors settle into such a consistent arrangement of two pairs of appendages? — Because we have a belly.

According to our hypothesis, the influence of the developing gut suppresses limb initiation along the midgut region and the ventral body wall owing to an “endodermal predominance.” From an evolutionary perspective, the lack of gut regionalization in agnathans reflects the ancestral absence of these conditions, and the elaboration of the gut together with the concomitant changes to the LMD in the gnathostomes could have led to the origin of paired fins.

The critical evolutionary accident then is that the intrepid sea creature which first colonised the land, some 390 million years ago, and gave rise to all tetrapods was one with a developing belly and therefore just happened to have two pairs of appendages.

The tail, however, is an asymmetrical appendage which may also once have been a pair (one on top of the other) but is now generally a solitary appendage. But it is controlled by a different gene-set to those which specify limbs. In mammals it has disappeared for some and performs stability functions for others. In some primates it has functions close to that of a fifth limb. But in no case has a tail ever evolved digits.

Why five digits on each limb?

When our ancestor left the oceans and became the origin of all tetrapods, four limbs had appeared but the number of digits on each limb had not then been decided. It took another 50 million years before a split distinguished amphibians from mammals, birds and reptiles. The timeline is thought to be:

  • 390 million years ago – tetrapod ancestor leaves the oceans
  • 360 million years ago – tetrapods with 6, 7 and 8 digits per limb
  • 340 million years ago – amphibians go their separate way
  • 320 million years ago – reptiles slither away on a path giving dinosaurs and birds
  • 280 million years ago – the first mammals appear

SciAm

The condition of having no more than five fingers or toes …. probably evolved before the evolutionary divergence of amphibians (frogs, toads, salamanders and caecilians) and amniotes (birds, mammals, and reptiles in the loosest sense of the term). This event dates to approximately 340 million years ago in the Lower Carboniferous Period. Prior to this split, there is evidence of tetrapods from about 360 million years ago having limbs bearing arrays of six, seven and eight digits. Reduction from these polydactylous patterns to the more familiar arrangements of five or fewer digits accompanied the evolution of sophisticated wrist and ankle joints–both in terms of the number of bones present and the complex articulations among the constituent parts.

By the time we reach the mammals, five digits per limb has become the norm though many mammals then follow paths for the reduction of the number of effective digits in play. Moles and pandas evolve an extra sort-of adjunct digit from their wrists but do not (or cannot) create an additional digit.

…….. Is there really any good evidence that five, rather than, say, four or six, digits was biomechanically preferable for the common ancestor of modern tetrapods? The answer has to be “No,” in part because a whole range of tetrapods have reduced their numbers of digits further still. In addition, we lack any six-digit examples to investigate. This leads to the second part of the answer, which is to note that although digit numbers can be reduced, they very rarely increase. In a general sense this trait reflects the developmental-evolutionary rule that it is easier to lose something than it is to regain it. Even so, given the immensity of evolutionary time and the extraordinary variety of vertebrate bodies, the striking absence of truly six-digit limbs in today’s fauna highlights some sort of constraint. Moles’ paws and pandas’ thumbs are classic instances in which strangely re-modeled wrist bones serve as sixth digits and represent rather baroque solutions to the apparently straightforward task of growing an extra finger.

Five digits is apparently the result of evolutionary trial and error, but as with all things genetic, the selection process was probably selecting for something other than the number of digits. 

Science Focus

All land vertebrates today are descended from a common ancestor that had four legs, with five toes on each foot. This arrangement is known as the pentadactyl limb. Some species have subsequently fused these fingers into hooves or lost them altogether, but every mammal, bird, reptile and amphibian traces its family tree back to a pentadactyl ancestor that lived around 340 million years ago. Before, there were animals with six, seven and even eight toes on each foot, but they all went extinct at the end of the Devonian period, 360 million years ago. These other creatures were more aquatic than the pentadactyl animals. Evidence in the fossil record suggests that their ribs weren’t strong enough to support their lungs out of water and their shoulder and hip joints didn’t allow them to walk effectively on land. 

Five digits on our limbs are an evolutionary happenstance. There is nothing special that we can identify with the number being five. It could just as well have been six or seven or eight. That the number of digits is the same on each limb is also an evolutionary happenstance predating the tetrapods. It is more efficient genetically, when multiple limbs are needed, to duplicate the pattern (with some variations for mirror symmetry and for differences between paired sets). When each limb is to carry many digits it is more efficient to follow a base pattern and keep the necessary genetic variations to a minimum.

By 280 million years ago, four limbs with five digits on each limb had become the base pattern for all land-based creatures and the stage was set for base 20. And then came bipedalism.

Why bipedalism?

Bipedalism is not uncommon among land creatures and even birds. Some dinosaurs exhibited bipedalism. Along the human ancestral line, bipedalism first shows up around 7 million years ago (Sahelanthropus). It may then have disappeared for a while and then appeared again around 4 million years ago in a more resilient form (Australopithecus) which has continued through till us. What actually drove us from the trees to bipedalism is a matter of many theories and much conjecture. Whatever the reasons, the large brain evolved only in bipedal hominins who had a straightened spine, and who had retained two limbs for locomotion while freeing up the other two for many other activities. The advantages of being able to carry things and throw things and shape things are considered the drivers for this development. And these two free limbs became the counting limbs.

It seems unlikely that a large brain could have developed in a creature which did not have some limbs freed from the tasks of locomotion. Locomotion itself and the preference for symmetry would have eliminated a three-limbed creature with just one free limb.

Two limbs for counting, rather than 3 of 4 or 4 of 4, is also happenstance. But it may be less accidental than the 4 limbs to begin with and the 5 digits on each limb. An accidental four limbs reduced inevitably to two counting limbs. Together with an accidental five digits they gave us base 10.


Other bases

1. Oksapmin, base-27 body part counting

The Oksapmin people of New Guinea have a base-27 counting system. The words for numbers are the words for the 27 body parts they use for counting, starting at the thumb of one hand, going up to the nose, then down the other side of the body to the pinky of the other hand …… . ‘One’ is tip^na (thumb), 6 is dopa (wrist), 12 is nata (ear), 16 is tan-nata (ear on the other side), all the way to 27, or tan-h^th^ta (pinky on the other side).

2. Tzotzil, base-20 body part counting

Tzotzil, a Mayan language spoken in Mexico, has a vigesimal, or base-20, counting system. ….. For numbers above 20, you refer to the digits of the next full man (vinik). ..

3. Yoruba, base-20 with subtraction

Yoruba, a Niger-Congo language spoken in West Africa, also has a base-20 system, but it is complicated by the fact that for each 10 numbers you advance, you add for the digits 1-4 and subtract for the digits 5-9. Fourteen (ẹ̀ẹ́rìnlá) is 10+4 while 17 (eétàdílógún) is 20-3. So, combining base-20 and subtraction means 77 is …. (20×4)-3.

4. Traditional Welsh, base-20 with a pivot at 15

Though modern Welsh uses base-10 numbers, the traditional system was base-20, with the added twist of using 15 as a reference point. Once you advance by 15 (pymtheg) you add units to that number. So 16 is un ar bymtheg (one on 15), 36 is un ar bymtheg ar hugain (one on 15 on 20), and so on.

5. Alamblak, numbers built from 1, 2, 5, and 20

In Alamblak, a language of Papua New Guinea, there are only words for 1, 2, 5, and 20, and all other numbers are built out of those. So 14 is (5×2)+2+2, or tir hosfi hosfihosf, and 59 is (20×2)+(5×(2+1))+(2+2) or yima hosfi tir hosfirpati hosfihosf.

6. Ndom, base-6

Ndom, another language of Papua New Guinea, has a base-6, or senary number system. It has basic words for 6, 18, and 36 (mer, tondor, nif) and other numbers are built with reference to those. The number 25 is tondor abo mer abo sas (18+6+1), and 90 is nif thef abo tondor ((36×2)+18).

7. Huli, base-15

The Papua New Guinea language Huli uses a base-15, or pentadecimal system. Numbers which are multiples of 15 are simple words. Where the English word for 225 is quite long, the Huli word is ngui ngui, or 15 15. However 80 in Huli is ngui dau, ngui waragane-gonaga duria ((15×5)+the 5th member of the 6th 15).

8. Bukiyip, base-3 and base-4 together

In Bukiyip, another Papua New Guinea language also known as Mountain Arapesh, there are two counting systems, and which one you use depends on what you are counting. Coconuts, days, and fish are counted in base-3. Betel nuts, bananas, and shields are counted in base-4. The word anauwip means 6 in the base-3 system and 24 in the base-4 system!

9. Supyire, numbers built from 1, 5, 10, 20, 80, and 400

Supyire, a Niger-Congo language spoken in Mali, has basic number words for 1, 5, 10, 20, 80 and 400, and builds the rest of the numbers from those. The word for 600 is kàmpwòò ná ŋ̀kwuu shuuní ná bééshùùnnì, or 400+(80×2)+(20×2).

10. Danish, forms some multiples of ten with fractions

Danish counting looks pretty familiar until you get to 50, and then things get weird with fractions. The number 50 is halvtreds, a shortening of halv tred sinds tyve (“half third times 20” or 2½x20). The number 70 is 3½x20, and 90 is 4½x20.

11. French, mix of base-10 and base-20

French uses base-10 counting until 70, at which point it transitions to a mixture with base-20. The number 70 is soixante-dix (60+10), 80 is quatre-vingts (4×20), and 90 is quatre-vingt-dix ((4×20)+10).

12. Nimbia, base-12

Even though, as the dozenalists claim, 12 is the best base mathematically, there are relatively few base-12 systems found in the world’s languages. In Nimbia, a dialect of the Gwandara language of Nigeria, multiples of 12 are the basic number words around which everything else is built. The number 29 is gume bi ni biyar ((12×2)+5), and 95 is gume bo’o ni kwada ((12×7)+11).
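The arithmetic behind several of the constructions quoted above is easy to check mechanically. Here is a small Python sketch (the numeric decompositions are taken from the examples in the list; the code itself is mine, for illustration only):

    # Verify a few of the number-word constructions described above.
    examples = {
        "Yoruba 77 = (20*4) - 3":                   (20 * 4) - 3,
        "Alamblak 59 = (20*2) + (5*(2+1)) + (2+2)": (20 * 2) + (5 * (2 + 1)) + (2 + 2),
        "Ndom 90 = (36*2) + 18":                    (36 * 2) + 18,
        "Huli 80 = (15*5) + 5":                     (15 * 5) + 5,
        "Danish 50 = 2.5 * 20":                     int(2.5 * 20),
        "French 90 = (4*20) + 10":                  (4 * 20) + 10,
        "Nimbia 95 = (12*7) + 11":                  (12 * 7) + 11,
    }
    for description, value in examples.items():
        print(f"{description}  ->  {value}")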


Flavouring the seasoning gave us the oldest profession

November 20, 2020

Once upon a time, a designated chef at an ancient hominin hearth demanded compensation for his culinary art and started the oldest profession. Cooking predates the oldest cave paintings and may well be the oldest human art form.

Preserving is unambiguous but salting is a word that is rarely used anymore. The distinction in language between seasoning and flavouring is not so much ambiguous as wishful thinking. Theoretically, seasoning is considered the use of additives which allegedly enhance existing flavours, whereas flavouring adds different flavours. In practice this is a nonsense distinction. We have our five or possibly seven basic taste receptors (sweet, sour, bitter, salty, umami and maybe pungency and a fatty richness) and our olfactory receptors which can distinguish a myriad smells.

Five basic tastes – sweet, sour, bitter, salty and umami (savory) are universally recognized, although some cultures also include pungency and oleogustus (“fattiness”). The number of food smells is unbounded; a food’s flavor, therefore, can be easily altered by changing its smell while keeping its taste similar.

Any particular flavour we perceive in our brains is then due to a particular combination of activated taste and smell receptors together. With a sufficient change in the activated taste or smell receptors, our brains recognize a change in flavour. Generally seasoning involves salt (always) and sometimes some pepper and acidic matter (lime, vinegar, ….). Flavouring is considered predominantly to be through the use of herbs and spices. However, the difference between seasoned and unseasoned is a difference of perceived flavour in our brains. No self-respecting chef will ever admit that seasoning is merely a sub-set of flavouring, but even chefs must be allowed their self-aggrandizement. It is entirely false that proper seasoning cannot be tasted. A lack of salt is perceived when there is a lack of an expected activation of salt receptors. Adding salt always changes the combination of activated receptors and is always a change of flavour. Cookbooks generally perpetuate the misconceptions.

Canadian Baker 

Many ingredients are used to enhance the taste of foods. These ingredients can be used to provide both seasoning and flavouring.

  • Seasoning means to bring out or intensify the natural flavour of the food without changing it. Seasonings are usually added near the end of the cooking period. The most common seasonings are salt, pepper, and acids (such as lemon juice). When seasonings are used properly, they cannot be tasted; their job is to heighten the flavours of the original ingredients.
  • Flavouring refers to something that changes or modifies the original flavour of the food. Flavouring can be used to contrast a taste such as adding liqueur to a dessert where both the added flavour and the original flavour are perceptible. Or flavourings can be used to create a unique flavour in which it is difficult to discern what the separate flavourings are. 

Seasoning is always about changing perceived flavour and is a particular sub-set of flavouring. The story that seasoning originates with food preservation through the use of salt, whereas the use of herbs and spices for flavouring derives from hunter-gatherers wrapping food in aromatic leaves for transport, is plausible but little more than speculation. Salt is inorganic and is not considered a spice but is the major ingredient for seasoning as opposed to flavouring. Herbs and spices are always organic and plant-based. (The proposed use of crushed insects as flavouring can safely be ignored. The use of cochineal insects – E120 – to give a carmine food colouring is not relevant.) Yet the manner in which we use small quantities of salt with foods is much too similar to the manner in which we use small quantities of herbs and spices not to have been the role model and the precursor for the culinary use of herbs and spices.

Though this history is presented by a purveyor of spices, it is both informative and credible.

History of Spices 

Abundant anecdotal information documents the historical use of herbs and spices for their health benefits. Early documentation suggests that hunters and gatherers wrapped meat in the leaves of bushes, accidentally discovering that this process enhanced the taste of the meat, as did certain nuts, seeds, berries, and bark. Over the years, spices and herbs were used for medicinal purposes. Spices and herbs were also used as a way to mask unpleasant tastes and odors of food, and later, to keep food fresh. Ancient civilizations did not distinguish between those spices and herbs used for flavoring from those used for medicinal purposes. When leaves, seeds, roots, or gums had a pleasant taste or agreeable odor, it became in demand and gradually became a norm for that culture as a condiment.

Our taste receptors did not evolve for the purposes of culinary pleasure. Bitterness detection is clearly a defense mechanism. Most animals reject bitter foods as a defense against toxins and poisons. All animals need salt. Mammal brains are designed to prevent a debilitating lack of sodium and have evolved the detection of saltiness as a tool. A craving for salty food has been shown to emerge spontaneously (and not as learned behaviour) with sodium deficiency. This has been shown in many animals, including sheep, elephants, moose, and primates, which seek out salty food when suffering sodium deficiency. It is very likely that the capability to detect sweetness also evolved as a way of urgently seeking energy-rich foods. Exactly how or why it became important to detect sourness or umaminess is in the realm of speculation. Vegetarian food contains less salt than meat or fish. Our primate ancestors were mainly vegetarian and, like primates today, would have resorted to eating pith and rotting wood to counter sodium deficiencies.

Hunger for salt

When multicellular organisms evolved and crawled up the beaches to dry land, they had to take the seawater with them in the blood and other body fluids. The mineral content of human blood plasma today is still much like that of the seas of the Precambrian era in which life arose. …..  And the ancestors of man for at least 25 million of the last 30 million years were almost certainly vegetarians, and therefore got little salt in their diets because most plants store little salt. To compensate for the scarcity of a substance vital to life, the brains of our ancestors and those of other mammals developed powerful strategies for getting and keeping salt. Inborn, Not Learned.

….. sudden improvement after one copious salt meal may also help explain the ritual acts of cannibalism once practiced by tribes in the Amazon jungles, the highland regions of New Guinea and elsewhere. Sometimes the body of a fallen foe was eaten in a final act of triumph and to absorb magically the strength of the defeated enemy. In other cultures, bones or other parts of a departed relative were eaten as a final act of devotion and also to gain back the strength of the dead person.

There are those who suggest that the human use of salt as seasoning (as opposed to preservation) only took off in the Neolithic, after the advent of agriculture, when our diet became more vegetarian. I don’t find this theory entirely plausible. Before hominins and bipedalism (c. 4 million years ago) our ancestors were primarily vegetarian. Meat eating became more prevalent once bipedalism led to a more actively predatory hunter-gatherer life-style. With more meat, the diet now included larger amounts of salt, and the detection of saltiness was needed less for survival and could be diverted to culinary aesthetics. The control of fire appears around 2 million years ago and coincides roughly with a shift to eating cooked meats and the rapid (in evolutionary terms) increase of hominin brain size. I can well imagine a hominin individual – perhaps even a Neanderthal – designated as the chef for the day and being berated for the lack of seasoning on the grilled mammoth steak.

In my story, the use of salt with cooked food as seasoning and to enhance flavour must go back – perhaps a million years – to our hunter-gatherer forebears who had shifted to a meat-rich diet. It is thus my contention that it is this shift to cooked meat which released our flavour receptors from survival tasks and enabled them to be diverted to culinary aesthetics. Even the use of herbs and spices comes well before the Neolithic agricultural revolution (around 12,000 years ago). Herbs and spices, being organic, do not survive long and are very rare in the archaeological record. However, pots from about 25,000 years ago containing residues of cumin and coriander have been found. The theory that hunter-gatherers packaged meats for travel in large leaves and added – by trial and error – other plant-based preservatives or flavourings is not implausible. The medicinal use of herbs and spices must also have been discovered around this time. In any event, even the first use of herbs and spices purely for flavouring must go back at least 50,000 years. Though diet must have included more vegetarian food after the advent of agriculture, the culinary arts of seasoning and flavouring had already been well established before the Neolithic. By the time we come to the ancient civilizations of 7,000 – 8,000 years ago, more than 100 herbs and spices were known and regularly used.

Whether it came first from food-wrapping, medicinal use, or preservation, the use of salt and of herbs and spices entirely and specifically to make food taste better marks the beginning of the culinary art. No doubt there were many cases of trial and error, accidents and failures. The failed attempts did not make it into the stories of spices, though some are now probably included in the history of poisons. There is a case to be made for the culinary profession to be considered the oldest in the world.

(image: University of Minnesota)

Why humans chose the 7-day week

November 17, 2020

It was another Sunday but, being retired and in these Corona-times, the days of the week are merging into each other and are difficult to tell apart. My thoughts turned, again, to when and how and why the seven-day week was invented. While the primary purpose of the “week” today is to define a recurring separation between days of rest and days of labour, the week is also used to organise many other recurring human activities. It occurs to me that the reasons for inventing an artificial, recurring period of a few days, shorter than a month, must be based on

  1. the occurrence of periodic and repetitive human activity within a society, and
  2. the need to organise and plan such activity

Both these requirements precede, I think, the choice of 5 or 10 or 6 or 7 days as the length of the period. The need to have a period shorter than a month must come before the choice of the length of the period. The most important function of the period is now to separate periodic days of “rest” from days of labour. It seems that even in prehistory, in predominantly agricultural communities, this separation of days of rest from days of labour was important. In the past, rest-days were often also days of regular and organised worship. Social traditions built up around these periods (meeting family and friends and congregations of society members). Working practices during industrialisation adapted to weekly cycles. Organised sport today depends existentially upon the regular, repeating days of “leisure”.

If the length of this period, as a sub-period of the lunar cycle, were to be chosen today, we would be faced with the same limited choices that humans faced perhaps 12,000 years ago. The practical choice lies only between 2, 3, 4, 5 or 6 sub-periods of the month, giving respectively weeks of 14/15, 10, 7, 6 or 5 days. (The Romans, for a while, used an 8-day week, but such a week is out of step with months, seasons and years. An 8-day week has nothing to recommend it). The days within each period needed to be identified separately so that tasks could be allocated to specific days. It is probably just the difficulty of remembering 14 or 15 specific weekdays which eliminated the choice of half-monthly weeks. It would have been entirely logical to choose 10 days in a week (with the added advantage that the days could very easily be named after numbers). A 5-day week would also have had a natural logic. In fact, 5-day and 10-day weeks have been attempted at various times but have not caught on. For some reason(s) the 7-day week has been the most resilient and its global domination is now unchallenged. But the compelling reasons for choosing seven-day periods are lost in the mists of history.
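The arithmetic behind those candidate week lengths is easy to check. Here is a minimal Python sketch; the only input is the mean synodic month of roughly 29.53 days quoted further down in this post, and everything else is just division:

# Candidate week lengths as sub-periods of the lunar (synodic) month
SYNODIC_MONTH = 29.5305882  # mean length of the lunar cycle in days

for parts in range(2, 7):  # dividing the month into 2, 3, 4, 5 or 6 sub-periods
    exact = SYNODIC_MONTH / parts
    print(f"{parts} sub-periods -> {exact:.2f} days, a 'week' of about {round(exact)} days")

# Prints 14.77, 9.84, 7.38, 5.91 and 4.92 days respectively,
# i.e. the 14/15, 10, 7, 6 and 5 day weeks listed above.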

A quick search revealed that I had written about this 7 years ago:

Another Sunday, another week — but why?

There are no discernible periodicities that we have been able to find outside ourselves which take 7 days. There are no periodicities within ourselves either that are 7 days or multiples of 7 days. There are no celestial or astronomical cycles in tune with 7 days. There are no movements of the sun or the moon or the stars that give rise to a 7-day period. There are no weather or climate phenomena that repeat with a 7-day period. There are no human behavioural patterns that dance to a 7-day tune. There are no living things that have a 7-day life cycle. (There is a branch of pseudoscience which claims that living cells may be associated with a weekly or a half-weekly cycle – a circaseptan or a circasemiseptan rhythm – but this is still in the realms of fantasy).

It would seem logical that our ancestors must have first noted the daily cycle long before they were even recognisable as human.  As humans they probably then noted the lunar cycle of about 29 days and the yearly cycle of about 365 days. Our distant ancestors would also have noted that the period of the yearly cycle was a little more than 12 lunar cycles. By about 35,000 years ago we have evidence that the lunar cycle was known and was being tracked. This evidence is in the form of a tally stick with 29 marks – the Lebombo bone.

The invention of the seven-day week can best be dated to at least 5,000 years ago, to the time of the Babylonians. It certainly came long before the Old Testament, which was written to fit a 7-day week that had already been invented and established. The story goes that

the seven-day week was actually invented by the Assyrians, or by Sargon I (King of Akkad at around 2350 B.C.), passed on to the Babylonians, who then passed it on to the Jews during their captivity in Babylon around 600 B.C.  The ancient Romans used the eight-day week, but after the adoption of the Julian calendar in the time of Augustus, the seven-day week came into use in the Roman world. For a while, both the seven and eight day weeks coexisted in the Roman world, but by the time Constantine decided to Christianize the Roman world (around A.D. 321) the eight-day weekly cycle had fallen out of use in favor of the more popular seven-day week.

The idea that the 7 days originate from a division of the lunar cycle into 4 seems improbable. The lunar cycle (synodic period) is 29.5305882 days long. Three weeks of 10 days each, or five 6-day weeks, would fit better. That the annual cycle of 365.2425 days comes to dominate is not so surprising. Our calendar months are now attuned to the annual cycle and have no direct connection to the lunar cycle. But it is our 7-day weeks which remain fixed. We adjust the length of our months and have exactly 365 days in each of our normal years. We then add an extra day every 4 years, but omit 3 such extra days in every 400 years, to cover the error. We make our adjustments by adding a day to the month of February in the identified leap years, but we do not mess with the 7 days of the week.
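Both bits of arithmetic above can be verified in a few lines. Again a minimal Python sketch, using only the figures quoted in the text:

# Gregorian rule: add a leap day every 4 years, but omit 3 of them every 400 years
leap_days_per_400_years = 400 // 4 - 3              # 100 - 3 = 97
mean_calendar_year = 365 + leap_days_per_400_years / 400
print(mean_calendar_year)                           # 365.2425 days, as stated above

# How closely do whole weeks pack into the synodic month?
SYNODIC_MONTH = 29.5305882
for days in (10, 7, 6, 5):
    remainder = SYNODIC_MONTH % days
    offset = min(remainder, days - remainder)       # days out of step with the moon
    print(f"{days}-day week: {SYNODIC_MONTH / days:.2f} weeks per lunar month, off by {offset:.2f} days")

# Three 10-day weeks or five 6-day weeks (30 days) miss the lunar month by about
# half a day; four 7-day weeks (28 days) miss it by about a day and a half.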

It is far more likely that the 7 days come from the seven celestial objects visible to the naked eye from earth and probably known to man some 5,000 to 10,000 years ago: the Sun, the Moon, Mars, Mercury, Jupiter, Venus, and Saturn. Naturally each was a god in his own heaven and had to have a day dedicated just to him/her/it. The same 7 celestial objects are used for the days of the week not only in the Greek/Roman Western tradition, but also in Indian astrology. The Chinese/East Asian tradition uses the Sun, Moon, Fire, Water, Wood, Gold and Earth to name the seven days of the week. But this must have come after the 7-day week had already been established elsewhere. (For example, to name up to 10 days they could just have chosen to add days named for the Air, Beasts, Birds ….). Some languages use a numbering system and some use a mixture of all of the above. Rationalists and philosophers and dreamers have tried to shift to 5-, 6-, 8- and 10-day weeks, but none of these efforts has managed to challenge the practicality or to dislodge the dominance of the seven-day week.

And now the whole world lives and marches – socially, culturally, politically – to the inexorable beat of the 7-day week.

The seven-day week must have started earlier than 5,000 years ago. We must distinguish, I think, between the need for first having such a period and then the selection of the number of days in such a period. The invention of names must have come after the selection of the number of days. The 7-day period must already have been in use before the Sumerians and the Babylonians got around to naming the days.

The need for such a period must have come in the Neolithic (c. 12,000 years ago), after the advent of settlements with substantial populations (cities). Human hunter-gatherers (and even their forebears) would have followed the annual cycles and the seasons and would have been subject to the vagaries of weather. The availability of moonlight would have been important, and the lunar cycle was well observed and well known by 50,000 years ago. But hunter-gatherers, with their semi-nomadic life-style, lived in small groups of perhaps 30 or 40, and conceivably up to a hundred, people. There would have been no great need for such groups to invent a sub-period of a season or a lunar cycle. The numbers would not have been large enough to warrant the invention of a “week” to help organise repetitive tasks.

The Neolithic brought population density and specialisation. Carpenters and masons and spinners and weavers performed their specialities for many different projects simultaneously. The need to combine different specialist functions towards a goal was the new model of cooperation. Houses had to be built using a variety of specialists. Their labour needed to be planned and coordinated. I can imagine that the need to be able to plan work from different sources for the same day became critical. To be able to tell everyone to do something on a Thursday needed the Thursday to be invented. 

It seems obvious that increasing population density and specialisation generates the need for defining a “week”. But that does not answer the question: why 7 days? I can only speculate that human physiology comes into play. The physiology and nutrition of the time must have determined that labouring for 9 days in 10 was too much for the human frame, while resting one day in 5 was considered too idle. (Of course, nowadays, with 2 rest days in 7, there is far more leisure time than with 1 in 5). I speculate also that the choice of an odd number of days (7) rather than six comes from the need to define a mid-week day. I suspect the priests of that time had to have their say, and therefore the day of rest was hijacked for worship and the support of their temples. They probably could not overrule the economic necessities of the time and take over any of the other six days of labour. They still had a go, though, by naming some of the other days after their gods.

Perhaps the choice of 7 days was the first example of implementation of workers’ demands.


Histories are always about justifying something in the present

November 8, 2020

Hardly a day goes by when I do not consider the origin of something. On some days I may ponder the origins of hundreds of things. It could be just curiosity, or it could be to justify some current action or to decide upon some future action. Sometimes it is the etymology of a word, or it could be the origins of an idea. It could be the story of what happened yesterday, or something about my father, or a thought about the origins of time. I know that when I seek the history of a place or a thing or a person, what I get is just a story. In every case the story inevitably carries the biases of the story-teller. However, most stories are constrained by “evidence”, though the point of the story may well lie in the narrative (inevitably biased) connecting the points of evidence.

The same evidence can generate as many stories as there are story-tellers. Often the narrative between sparse evidence forms the bulk of the story. When the past is being called upon to justify current or future actions, histories are invented and reinvented by playing with the narrative which lies between the evidence. Of course, the narrative cannot contain what is contradicted by the evidence. Human memory is always perception and perception itself is imperfect and varies. I know that the story I tell of some event in my own history changes with time. Thus the “history” I tell of all that lies between the “recorded facts” of my own existence is a variable and is a function of the “now”. It is experience and knowledge of the world and the people around us which provides the credibility for the stories which lie between the evidence. But bias plays its role here as well. A desired story-line is always more credible than one which is not.

A historian looks for the events which, incontrovertibly, took place. The further back events lie in the past, the less evidence survives. But histories are never merely a tabulation of events with evidence (though even what constitutes incontrovertible evidence is not without controversy). The more evidence there is, the more constrained the inter-connecting narrative. But historians make their reputations on the stories they tell. Their histories are always a combination of evidenced events and the narrative connecting them. Historian bias is inevitable.

I have been writing a story – hardly a history – about my father’s early life and through the Second World War. For a period covering some 20 years I have documented evidence for about 30 separate events – dates when certain things occurred. The date he graduated, the date he joined up, the date he was promoted, or the date he arrived somewhere. The documented events are, like all documented events, just events. If not this set of events then it would have been some other similar set of events. If not these particular dates then some other set of dates. The events are always silent about what his mood was or what he had for breakfast on the day of the event. They provide a fixed frame, but the overwhelming bulk of my story is speculation about why and how he went from one event to the next. The story fits my understanding of how he was much later in his life. My story about his motivations and his behaviour is entirely speculation but always fits my central story-line. The documented events are just the bones on which to hang the flesh of my story. My story is not determined by the events. It is determined elsewhere but has to conform to the events.

Go back a little under 1,000 years and consider Genghis Khan. We have documentary evidence of the date he died, but even his date of birth is speculation. Current histories vary according to whether the historian desires to describe a hero or a villain. Both can be hung upon the framework provided by the few documented events available. Go back another 1,000 years and, even with the relatively large amount of evidence available about the Roman Empire, the range of speculation possible can justify the politics of any contemporary viewpoint.

And so it is with all histories. We claim that histories help us to understand the past and that this, in turn, helps us to choose our future actions. I am not so sure. The power of a history lies in the credibility of the narrative connecting the events that are certain. As with the story about my father, a history is not a narrative determined by the events. The narrative is determined by other imperatives but must conform to the events. I begin to think that we write (and rewrite) our histories, always in the present and with our present understandings, to justify where we are or the choices we want to make. They are always a justification of something in the present.


History is a variable

September 30, 2020

Why do so many spend so much time in rewriting history?

Because, of course, history is not the immutable past but only ever a story. And rewriting and retelling stories to suit our current purposes is what we do.

Present misery is compensated for by wallowing in stories of past glories. Present failures are blamed on stories of past oppression. Present incompetence is attributed to stories of past suffering. Present stupidity is excused by stories of past undernourishment. Present duplicity is defended by stories of past exploitation. Present criminality is justified by stories of past deprivation. Present depravity is condoned by stories of past repression.


 

