The Genocide of the Neanderthals by the Even Newer Africans

May 13, 2021

In the politically correct and virtue-signalling world, where pseudo-morality reigns, colonisation has become a dirty word. Colonists are considered evil. Statues of colonists are considered even more depraved. The colonised of the past are always considered victims by the present. Needless to say, a place in the kingdom of heaven is reserved for the colonised. The reality is that any living species which does not colonise is doomed either to stagnation in a niche habitat or to failure and extinction. Colonisation is the stuff of life. Geographical spaces are colonised when expanding communities invade and bring cultures or technologies more competitive than those already existing in that space. Populations are colonised when their culture and technology cannot compete with incoming ones. Strangely, it is only the European colonisations between about 1400 and 1900 CE which have become politically incorrect. But what is conveniently forgotten is that the colonised populations in Australia and N America and even S America at that time were so backward in technology that they were ripe for colonisation by any invading community with superior technology. If not the Europeans, it would have been someone else. It is, of course, politically incorrect to point out that the colonised were once colonists too, and have themselves primarily to blame. Colonisations in antiquity by the Phoenicians, Carthaginians, Greeks, Romans, and Han Chinese are too far in the past for moral judgements in the present. The Mongols, the Normans, and the Vikings generally escape censure today.

But it is worth remembering that human colonisation was started by the Africans.

Colonisation is primarily about the expansion of the physical space being occupied by a biological community. The community may be a whole species or just a particular strain within a species. It is a phenomenon exhibited by every successful biological community from viruses and bacteria and fungi to plants and animals (including humans). A new territory or habitat may already be occupied by other species or by strains of the same species, or it may be unoccupied. The incoming community are the colonists. Any communities already existing in the space are the colonised. Many attempts at colonisation fail, either because the colonists cannot adapt to the new habitat or because they cannot compete (biologically, culturally, or technologically) with the existing inhabitants.

That living things exist in every conceivable corner of the earth’s surface is a consequence of colonisation. That living things find it necessary to search for new habitats is a consequence of surviving changing environments, of growth, and of the genetic diversity inherent in every species. There are a few species which have stagnated in tiny niche habitats, exhibit unusually little genetic diversity and are unable to change. They have become so specialised to fit their habitat that they are incapable of adapting to any other and have reached evolutionary “dead-ends”. Panda bears and theridiid spiders are examples. They have become incapable of growth or of colonisation and are probably on their slow path to extinction.

When it comes to the origins of human colonisation we need to go back to before we were ever human. (I take humans to mean Anatomically Modern Humans (AMH), who appeared around 300,000 years ago). Some little time after we had evolved from hominids to hominins, and perhaps around 800,000 years ago, a common hominin ancestor of Neanderthals, Denisovans, a couple of unknown hominin species and of AMH, emigrated from Africa and colonised most of Europe, Central Asia, and South East Asia. Most likely the movement of whole populations was driven, not by a shortage of space, but by changes of climate and a shortage of food. (Note that immigration is not necessarily colonisation, but colonisation always involves emigration). These Old Africans were emigrants and the first ever colonists. They were not initially immigrants since the territories they moved into had no other hominin inhabitants. There were probably many waves of Old Africans and later emigrants may well have been immigrants. Many of the areas they moved into did have indigenous hominid populations. However, the indigenous culture and technology were not sufficiently competitive to prevent the wave of hominin colonisation. Hominins had fire while hominids and other species did not. The colonisation of the world by the Old Africans led to the demise of many species which could not compete against the advanced culture and technology they exhibited. Some were hunted to extinction as prey, while others were unable to adapt quickly enough, and still others were just crowded out by the newcomers.

In due course (a small matter of a few hundred thousand years) the Old Africans in Central Asia and Europe evolved to become the Neanderthals. From about 500,000 years ago they were the dominant species for about 300,000 years. In South East Asia, the Old Africans evolved to become the Denisovans. In the rest of Asia (S China, India, and the Middle East), the Old Africans were still around but had evolved into some as yet unknown hominin species. In Africa, the Old Africans gave way eventually to AMH by about 300,000 years ago. Let us call them the New Africans.

Then from about 200,000 years ago there were a number of waves of New African emigration/colonisation into Europe and Asia. These emigrant waves continued sporadically for 100,000 years, culminating in the Even Newer Africans coming “Out of Africa” around 60,000–70,000 years ago. The New Africans and the Even Newer Africans found indigenous hominin populations all across the new territories they expanded into. Sometimes these were just other New Africans, and sometimes they were blended populations of Old Africans (Neanderthals, Denisovans, …) and New Africans. In India, for example, the Even Newer Africans arrived after the Toba eruption and mingled genetically with surviving populations of Old Africans already mingled with New Africans.

Whether there was conflict between indigenous and arriving populations, or whether one culture was gradually submerged into the more dominant one, is unknown. What is known is that the arrival of the Even Newer Africans caused the Neanderthals, the Denisovans, and some other hominin species still around to disappear. By around 50,000 years ago the Denisovans were extinct, and by 40,000 years ago there were no Neanderthals left. However, their genes survive and live on in us.

In current-day politically correct terms and to signal great virtue in ourselves, it could be called the Genocide of the Neanderthals by the Even Newer Africans.


Dimensions: where and when we are

May 10, 2021

“In physics and mathematics, the dimension of a mathematical space (or object) is informally defined as the minimum number of coordinates needed to specify any point within it”. – Wikipedia

In the concept of spacetime one might think that (x,y,z,t) are the four-dimensional coordinates which are necessary and sufficient to specify the location of any object at any time within our universe. But that would be an oversimplification. It is true only for a relative location and not for any absolute location. In reality we have no idea – in absolute terms – of where we are or when we are.

The place where I was born on the surface of the Earth has – during my lifetime – drifted some 2.3 m north-east across the earth’s surface. The Sun (along with the Earth) has moved 6.9 billion km around the centre of the Milky Way Galaxy. Using referents outside the Milky Way Galaxy, it has, during the same time, moved some 55 billion km in space. So, I was born some 60 billion km away from wherever in space we actually are now. In the context of the Universe this is still local space, and I do not need to account for the expansion of space itself. The looming collision of the Andromeda Galaxy speeding towards us is still 4.5 billion years away and irrelevant on the scale of my lifetime. Taking my present location as (0,0,0,0) and the X-axis as the straight line from where I was then to where I am now, the coordinates of my birth location become (-60, 0, 0, -73 years), where x, y and z are measured in billions of km.
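As a rough sketch (and only that), the scale of such displacements can be reconstructed from assumed average speeds. The speeds below are round textbook values, not the exact figures used above, and the totals depend entirely on which motions and which reference frame one chooses to include:

    # Rough sketch: how far "here" moves over one lifetime for a few assumed
    # average speeds. The speeds are illustrative round values, and the answer
    # changes completely with the chosen reference frame.
    SECONDS_PER_YEAR = 3.156e7
    LIFETIME_YEARS = 73

    assumed_speeds_km_per_s = {
        "Earth around the Sun": 29.8,
        "Sun around the galactic centre": 230.0,
        "Sun relative to the cosmic microwave background": 370.0,
    }

    seconds = LIFETIME_YEARS * SECONDS_PER_YEAR
    for motion, speed in assumed_speeds_km_per_s.items():
        distance_billion_km = speed * seconds / 1e9
        print(f"{motion}: ~{distance_billion_km:,.0f} billion km in {LIFETIME_YEARS} years")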

Everything is relative to here and now.

Considering time to be a dimension is no more than a convention or, at best, an analogy. It does not help either that

  • we have no clear definition of what a dimension is, or that
  • by some definitions a dimension is anything that can be counted.

We can measure the oscillation of apparent motions and assume that such motion is regular and then infer the passage of time. But what time is other than a magical, necessary backdrop for everything is beyond our comprehension. We cannot be certain that a second now is the same, or longer, or shorter, than a second at some other time. (A second now must be longer than a second was then).

The world is what our perception tells us it is. But our perception is limited, and it limits the boundaries of our reality. We perceive space and everything around us as having 3 dimensions, and we cannot truly conceive of any real thing having other than three spatial dimensions. In our 3-dimensional world we can define one- and two-dimensional things only as concepts (lines and surfaces), but we cannot identify any real-world objects which have only one or two dimensions. Moreover, real things having more than 3 dimensions are beyond our comprehension. How a fourth spatial dimension could be manifested lies outside of human reason. We have the language to describe – but only conceptually – any number of dimensions. Scientists and mathematicians speculate about 3 or 7 or 9 or infinite dimensions and claim either that 3 is the most probable or theorise that the others are hidden in the strings that make up the world, but the human brain can only perceive 3. (I note in passing that invoking the infinite is itself an admission of incomprehensibility). It is a fruitless and inevitably circular discussion to question whether it is our perception which is limited to 3 dimensions or whether the universe has only three to be perceived. Our universe is enabled, and strictly constrained, by what our cognition allows us to perceive. Every real thing in our universe has three spatial dimensions; no less, no more. Our universe has 3 spatial dimensions because that is all, and only what, we can perceive.

I probably read “Flatland” as a teenager – the book in which a sphere visiting Flatland can only be perceived as a circle.

Flatland: A Romance of Many Dimensions is a satirical novella by the English schoolmaster Edwin Abbott Abbott, first published in 1884 by Seeley & Co. of London. Written pseudonymously by “A Square”, the book used the fictional two-dimensional world of Flatland to comment on the hierarchy of Victorian culture, but the novella’s more enduring contribution is its examination of dimensions. – Wikipedia

No matter how many dimensions the universe may have, three dimensions is all human cognition can ever perceive. It is that reality which constrains all our thought. It becomes a fundamental assumption for science, one which the scientific method cannot penetrate. If other dimensions exist, then what we perceive in three are projections, just as a shadow is perceived to be a two-dimensional projection of a three-dimensional thing. But to have a projection or a shadow in our 3-dimensional world we would need some kind of cognitive light from the other, higher dimensions to create what we perceive.
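As a toy illustration of the shadow idea (a sketch only, with an arbitrary viewer distance), one can project the sixteen corners of a four-dimensional cube into three dimensions, much as a lamp casts a three-dimensional object onto a two-dimensional wall:

    # Toy sketch: a 3-D "shadow" of a 4-D object. The 16 corners of a tesseract
    # (4-cube) are perspective-projected along the fourth axis, w. The viewer
    # distance is an arbitrary assumption; only the idea of projection matters.
    from itertools import product

    def project_4d_to_3d(point, viewer_w=3.0):
        x, y, z, w = point
        scale = viewer_w / (viewer_w - w)  # corners nearer in w loom larger
        return (x * scale, y * scale, z * scale)

    tesseract_corners = list(product((-1.0, 1.0), repeat=4))  # 16 corners
    for corner in tesseract_corners:
        shadow = tuple(round(c, 2) for c in project_4d_to_3d(corner))
        print(corner, "->", shadow)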

But human cognition is limited. We cannot perceive what we cannot perceive. And we have no clue as to where and when we are.


Covid 19 : A Chinese biological weapons test gone wrong?

May 9, 2021

Just a naturally occurring mutation of a coronavirus? Unlikely.

An accidental virus crossover to humans from a Chinese wet market? Perhaps.

An accidental escape of the virus from a Wuhan laboratory? Possible.

Were Chinese scientists considering the coronavirus as a biological weapon? Certainly.

An accidental escape from a Chinese biological weapons program? Possible.

An intentional release of the virus as a biological weapons test? Unlikely.

Just another conspiracy theory? Hardly.

Chinese Scientists Discussed Weaponising Coronavirus In 2015

A Chinese scientific paper titled “The Unnatural Origin of SARS and New Species of Man-Made Viruses as Genetic Bioweapons” suggested that World War Three would be fought with biological weapons.

Beijing: A document written by Chinese scientists and health officials in 2015, before the pandemic, states that SARS coronaviruses were a “new era of genetic weapons” that could be “artificially manipulated into an emerging human disease virus, then weaponised and unleashed”, reported Weekend Australian.
The paper titled The Unnatural Origin of SARS and New Species of Man-Made Viruses as Genetic Bioweapons suggested that World War Three would be fought with biological weapons. The document revealed that Chinese military scientists were discussing the weaponisation of SARS coronaviruses five years before the COVID-19 pandemic. The report by Weekend Australian was published in news.com.au.

Peter Jennings, the executive director of the Australian Strategic Policy Institute (ASPI), told news.com.au that the document is as close to a “smoking gun” as we’ve got. “I think this is significant because it clearly shows that Chinese scientists were thinking about military application for different strains of the coronavirus and thinking about how it could be deployed,” Jennings said. “It begins to firm up the possibility that what we have here is the accidental release of a pathogen for military use,” Jennings added.

He also said that the document may explain why China has been so reluctant to allow outside investigations into the origins of COVID-19.

…….. 

This is not a new theory. By the criteria used for determining what makes a good biological weapon, Covid-19 is not the best possible choice.

Forbes:

……. Overall, the SARS-CoV-2 virus has some “desirable” properties as a bioweapon, but probably not enough to make it a good choice for military purposes. Regardless, it has certainly reminded us of our vulnerabilities as a society to a new pathogen, and how crippling a pandemic can be, as we continue to watch the entire world grappling with how to contain it.  ………

Will China ever be held accountable? Hardly.


700 years of epidemiology: “Avoid contact, wear a mask, wash your hands, burn your dead”

May 3, 2021

It was practised in 1350 during the Black Death. It was practised during the Great Plague in 1666. And it was still the best advice during the Spanish Flu in 1919. And it is no different today.

In 700 years the advice for the prevention of infection has not changed.

Like any other social “science”, epidemiology is a discipline and a field of study but it is no science.

“COVID-19 is a major acute crisis with unpredictable consequences. Many scientists have struggled to make forecasts about its impact. However, despite involving many excellent modelers, best intentions, and highly sophisticated tools, forecasting efforts have largely failed”.

1997: The Failure of Academic Epidemiology: Witness for the Prosecution, Carl Shy, American Journal of Epidemiology, Volume 145, Issue 6, 15 March 1997, Pages 479–484.

Academic epidemiology has failed to develop the scientific methods and the knowledge base to support the fundamental public health mission of preventing disease and promoting health through organized community efforts. As a basic science of public health, epidemiology should attempt to understand health and disease from a community and ecologic perspective as a consequence of how society is organized and behaves, what impact social and economic forces have on disease incidence rates, and what community actions will be effective in altering incidence rates. However, as taught in most textbooks and as widely practiced by academicians, epidemiology has become a biomedical discipline focused on the distribution and determinants of disease in groups of individuals who happen to have some common characteristics, exposures, or diseases. The ecology of human health has not been addressed, and the societal context in which disease occurs has been either disregarded or deliberately abstracted from consideration.

And more recently:

2020: Forecasting for COVID-19 has failed, Ioannidis, Cripps and Tanner

Epidemic forecasting has a dubious track-record, and its failures became more prominent with COVID-19. Poor data input, wrong modeling assumptions, high sensitivity of estimates, lack of incorporation of epidemiological features, poor past evidence on effects of available interventions, lack of transparency, errors, lack of determinacy, looking at only one or a few dimensions of the problem at hand, lack of expertise in crucial disciplines, groupthink and bandwagon effects and selective reporting are some of the causes of these failures. Nevertheless, epidemic forecasting is unlikely to be abandoned.

Of course, actual health care and the medications available have advanced immeasurably during this time. Medicine, and the development of drugs and vaccines, have come a very long way since the Spanish Flu. But the prediction of human behaviour – which is what epidemiology is – is as uncertain now as it was in the Middle Ages. Mathematical forecasts – whether for pandemics or for climate – are only as good as the most inaccurate assumption made. Very often, assumptions are made to comply with some other agenda. Sometimes, the assumptions made are just downright stupid.
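As a minimal sketch (not any particular published model) of why a single assumption dominates: a naive exponential projection of case numbers, in which changing the assumed doubling time by a day or two swings a 90-day forecast by several orders of magnitude:

    # Naive exponential projection of cases; illustrative only, not a real
    # epidemiological model. A small change in one assumption (the doubling
    # time) dominates the 90-day result.
    def projected_cases(initial_cases, doubling_time_days, horizon_days):
        return initial_cases * 2 ** (horizon_days / doubling_time_days)

    for doubling_time in (3.0, 4.0, 5.0):  # assumed doubling times, in days
        forecast = projected_cases(initial_cases=100, doubling_time_days=doubling_time, horizon_days=90)
        print(f"doubling every {doubling_time:.0f} days -> ~{forecast:,.0f} cases after 90 days")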

And so, for over 700 years the advice for the prevention of infection has been and remains “Avoid contact, wear a mask, wash your hands and burn your dead”.


Should Adolf Hitler be cancelled?

April 20, 2021

It is 132 years since Adolf Hitler was born on 20th April 1889 in Braunau am Inn, Austria.

Of course there are no statues of Hitler around, or so one might think.

But you can buy a bust of Hitler on the net for $59.99.

And there are plenty of paintings of him, physical copies of books by him and about him, and any number of photographs. And then we come to all the digital statues of Hitler that exist. Of course he has an entry in Wikipedia. A Google search taking 0.57 seconds generates 67 million results.

It would be politically correct to ensure that anything even faintly positive about Hitler be destroyed. All historical records should be thoroughly cleansed so that only derogatory material about him remains. 

Or maybe erect a statue every year, only for it then to be pulled down. Erect it on 20th April and pull it down with great ceremony on 30th April every year?


Social media: I don’t know what I have missed – but I don’t miss it

April 18, 2021

Four weeks ago my irritation at Facebook’s intrusive ads and suggestions and WhatsApp’s insistence on new terms and conditions made my mind up for me.

  1. I stopped using Facebook.
  2. I stopped using Twitter (which I didn’t much use anyway).
  3. I deleted Instagram and Tik Tok (which I had installed but never used).
  4. I have not used Messenger in 6 months.
  5. I am weaning myself away from WhatsApp and shifting to Signal (95% done).
  6. I am ignoring LinkedIn and Pinterest.

It has only been a month and is still an experiment. But I have no immediate plans to return and it is my hope that I will not.

My use of text messages has increased somewhat.

I have not deleted FB and WhatsApp and LinkedIn yet, and can still see that I have many hundreds of unread notifications. But I choose not to open them.

I see that Facebook is working to “integrate” WhatsApp and Messenger. I see that the users of these platforms are merely suppliers of data and that the customers are the advertisers. I see that the platforms are increasingly trying not only to predict the behaviour of their users but also to actively manipulate them into “preferred” behaviour. I choose to abstain.

So I don’t know what I am missing. But, more importantly, I don’t miss what I must have missed. I am pretty sure that more than 99% of what I have missed was not directed specifically at me.

If it is important enough they can call me or send me a text or send me an email.


Eleven Years, 4142 posts

April 16, 2021

Time flies.

Activity has been muted of late, but 4142 posts in 11 years still keeps to a target I once set – on average – of a post per day.


Lawyers are to humans as fungi are to trees

April 16, 2021

I have suggested in the past that cooking may be the oldest art form and that chefs may have been members of the oldest profession. However, it may be that lawyers came first.


Reblogged from A short history of lawyers (upcounsel.com)

Image: Imbricate fruiting of Phaeolus schweinitzii (forestpathology.org)

Like the symbiotic relationship between trees and fungus, lawyers and humans have an important, interlocking relationship going back to the dawn of man.

The following is excerpted from “Some Lawyers Are People Too!” by Hugh L. Dewey, Esq. (2009). 

Legal anthropologists have not yet discovered the proverbial first lawyer. No briefs or pleadings remain from the proto-lawyer that is thought to have been in existence more than 5 million years ago.

Chimpanzees, man’s and lawyer’s closest relative, share 99% of the same genes. New research has definitely proven that chimpanzees do not have the special L1a gene that distinguishes lawyers from everyone else. (See Johnson, Dr. Mark. “Lawyers in the Mist?” Science Digest, May 1990: pp. 43-52.) This disproved the famous outcome of the Scopes Monkey Trial in which Clarence Darrow proved that monkeys were also lawyers.

Charles Darwin, Esquire, theorized in the mid-1800s that tribes of lawyers existed as early as 2.5 million years ago. However, in his travels, he found little evidence to support this theory.

Legal anthropology suffered a setback at the turn of the century in the famous Piltdown Lawyer scandal. In order to prove the existence of the missing legal link, a scientist claimed he had found the skull of an ancient lawyer. The skull later turned out to be homemade, combining the large jaw of a modern lawyer with the skull cap of a gorilla. When the hoax was discovered, the science of legal anthropology was set back 50 years.

The first hard scientific proof of the existence of lawyers was discovered by Dr. Margaret Leakey at the Olduvai Gorge in Tanzania. Her find consisted of several legal fragments, but no full case was found intact at the site. Carbon dating has estimated the find at between 1 million and 1.5 million years ago. However, through legal anthropology methods, it has been theorized that the site contains the remains of a fraud trial in which the defendant sought to disprove liability on the basis of his inability to stand erect. The case outcome is unknown, but it coincides with the decline of the Australopithecus and the rise of Homo Erectus in the world. (See Leakey, Margaret A. “The case of erectus hominid.” Legal Anthropology, March 1947: pp. 153.)

In many sites dating from 250,000 to 1,000,000 years ago, legal tools have been uncovered. Unfortunately, the tools are often in fragments, making it difficult to gain much knowledge.

The first complete site discovered has been dated to 150,000 years ago. Stone pictograph briefs were found concerning a land boundary dispute between a tribe of Neanderthals and a tribe of Cro-Magnons. This decision in favor of the Cro-Magnon tribe led to a successive set of cases, spelling the end for the Neanderthal tribe. (See Widget, Dr. John B. “Did Cro-Magnon have better lawyers?” Natural History, June 1926: p. 135. See also Cook, Benjamin. Very Very Early Land Use Cases. Legal Press, 1953.)

Until 10,000 years ago, lawyers wandered around in small tribes, seeking out clients. Finally, small settlements of lawyers began to spring up in the Ur Valley, the birthplace of modern civilization. With settlement came the invention of writing. Previously, lawyers had relied on oral bills for collection of payment, which made collection difficult and meant that if a client died before payment (with life expectancy between 25 and 30 and the death penalty for all cases, most clients died shortly after their case was resolved), the bill would remain uncollected. With written bills, lawyers could continue collection indefinitely.

In the late 1880s, legal anthropologists cracked the legal hieroglyphic language when they were able to determine the meaning of the now famous Rosetta Stone Contract. (See Harrison, Franklin D. The Rosetta Bill. Doubleday, 1989.) The famous first paragraph can be recited verbatim by almost every lawyer:

“In consideration of 20,000 Assyrians workers, 3,512 live goats, and 400,000 hectares of dates, the undersigned hereby conveys all of the undersigned’s right, title, and interest in and to the property commonly known as the Sphinx, more particularly described on Stone A attached hereto and made a part hereof.”

The attempted sale of the Sphinx resulted in the Pharaoh issuing a country-wide purge of all lawyers. Many were slaughtered, and the rest wandered in the desert for years looking for a place to practice.

Greece and Rome saw the revival of the lawyer in society. Lawyers were again allowed to freely practice, and they took full advantage of this opportunity. Many records exist from this classic period. Legal cases ranged from run-of-the-mill goat contract cases to the well-known product liability case documented in the Estate of Socrates vs. Hemlock Wine Company. (See Wilson, Phillips ed. Famous Roman Cases. Houghton, Mifflin publishers, 1949.)

The most famous lawyer of this period was Hammurabi the Lawyer. His code of law gave lawyers hundreds of new business opportunities. By creating a massive legal system, it increased the demand for lawyers ten-fold. In those days, almost any thief or crook could kill a sheep, hang up a sheepskin, and practice law, unlike the highly regulated system today which limits law degrees to only those thieves and crooks who haven’t been convicted of a major felony.

The explosion in the number of lawyers coincided with the development of algebra, the mathematics of legal billing. Pythagoras, a famous Greek lawyer, is revered for his Pythagorean Theorem, which proved the mathematical quandary of double billing. This new development allowed lawyers to become wealthy members of their community, as well as to enter politics, an area previously off-limits to lawyers. Despite the mathematical soundness of double billing, some lawyers went to extremes. Julius Caesar, a Roman lawyer and politician, was murdered by several clients for his record hours billed in late February and early March of 44 B.C. (His murder was the subject of a play by lawyer William Shakespeare. When Caesar discovered that one of his murderers was his law partner Brutus, he murmured the immortal lines, “Et tu Brute,” which can be loosely translated from Latin as “my estate keeps twice the billings.”)

Before the Roman Era, lawyers did not have specific areas of practice. During the period, legal specialists arose to meet the demands of the burgeoning Roman population. Sports lawyers counseled gladiators, admiralty lawyers drafted contracts for the great battles in the Coliseum, international lawyers traveled with the great Roman armies to force native lawyers to sign treaties of adhesion — many of which lasted hundreds of years until they were broken by the barbarian lawyers who descended on Rome from the North and East — and the ever-popular Pro Bono lawyers (Latin for “can’t get a real job”) who represented Christians and lost all their cases for 300 years.

As time went on, the population of lawyers continued to grow until 1 out of every 2 Romans was a lawyer. Soon lawyers were intermarrying. This produced children who were legally entitled to practice Roman law, but with the many defects that such a match produced, the quality of lawyers degenerated, resulting in an ever-increasing defective legal society and the introduction of accountants. Pressured by the legal barbarians from the North with their sign-or-die negotiating skills, Rome fell, and the world entered the Dark Ages.

During the Dark Ages, many of the legal theories and practice developed during the golden age were forgotten. Lawyers lost the art of double billing, the thirty-hour day, the 15-minute phone call, and the conference stone. Instead, lawyers became virtually manual laborers, sharing space with primitive doctor-barbers. Many people sought out magicians and witches instead of lawyers since they were cheaper and easier to understand.

The Dark Ages for lawyers ended in England in 1078. Norman lawyers discovered a loophole in Welsh law that allowed William the Conqueror to foreclose an old French loan and take most of England, Scotland, and Wales. William rewarded the lawyers for their work, and soon lawyers were again accepted in society.

Lawyers became so popular during this period that they were able to heavily influence the kings of Britain, France, and Germany. After a Turkish corporation stiffed the largest and oldest English law firm, the partners of the firm convinced these kings to start a Bill Crusade, sending collection knights all the way to Jerusalem to seek payment.

A major breakthrough for lawyers occurred in the 17th century. Blackstone the Magician, on a trip through Rome, unearthed several dozen ancient Roman legal texts. This new knowledge spread through the legal community like the black plague. Up until that point, lawyers used the local language of the community for their work. Since many smart non-lawyers could then determine what work, if any, the lawyer had done, lawyers often lost clients, and sometimes their head.

Using Blackstone’s finds, lawyers could use Latin to hide what they did so that only other lawyers understood what was happening in any lawsuit. Blackstone was a hero to all lawyers until, of course, he was sued for copyright infringement by another lawyer. Despite his loss, Blackstone is still fondly remembered by most lawyers as the father of legal Latin. “Res ipsa loquitur” was Blackstone’s favorite saying (“my bill speaks for itself”), and it is still heard today.

Many lawyers made history during the Middle Ages. Genghis Kahn, Esq., from a family of Jewish lawyers, Hun & Kahn, pioneered the practice of merging with law offices around Asia Minor at any cost. At one time, the firm was the largest in Asia and Europe. Their success was their downfall. Originally a large personal injury firm (if you didn’t pay their bill, they personally injured you), they became conservative over time and were eventually overwhelmed by lawyers from the West. Vlad Dracul, Esq., a medical malpractice specialist, was renowned for his knowledge of anatomy, and few jurors would side against him for fear of his special bill (his bill was placed atop 20-foot wooden spears on which the non-paying client was placed).

Leonardo di ser Piero da Vinci, Esq., was multi-talented. Besides having a busy law practice, he was an artist and inventor. His most famous case was in defense of himself. M. Lisa vs. da Vinci (Italian Superior Court 1513) involved a product liability suit over a painting da Vinci delivered to the plaintiff. The court, in ruling that the painting was not defective despite the missing eyebrows, issued the famous line, “This court may not know art, but it knows what it likes, and it likes the painting.” This was not surprising since the plaintiff was known for her huge, caterpillar-like eyebrows. Da Vinci was able to convince the court that he was entitled not only to damages but to attorneys’ fees, costs, and punitive damages as well. The court, taking one last look at the plaintiff, granted the request.

A land dispute case in the late 15th century is still studied today for the clever work of Christopher Columbus, Esq. He successfully convinced an Aztec court, in Columbus vs. 1,000,000 Acres that since the Indians did not believe in possession, they could not claim the land in question. Therefore, his claim had to be given priority. Despite the fact that the entire court was sacrificed to the gods, the case held and Spain took an early legal lead in the New World.

As the New World was colonized, England eventually surpassed Spain as the leading colonizer. England began sending all of its criminals and thieves to the New World. This mass dumping of lawyers to the states would come back to haunt England. Eventually, the grandchildren of these pioneer lawyers would successfully defeat King George III in the now famous King George III v. 100 Bags of Tea. England by this time was now dreadfully short of lawyers. The new American lawyers exploited this shortfall and, after a seven-year legal war, defeated the British and created the United States, under the famous motto, “All lawyers are created equal.”

England never forgot this lesson and immediately stopped its practice of sending lawyers to the colonies. This policy left Australia woefully deficient in lawyers.

With stories of legal success common in the late 1700s, more and more people attempted to become lawyers. This process of stealing a shingle worried the more successful lawyers. To stem this tide as well as to create a new profit center, these lawyers passed laws requiring all future lawyers to be restricted from practice unless they went to an approved law school. The model school from which all legal education rules developed was Harvard Law School.

Harvard, established in 1812, set the standard for legal education when, in 1816, it created the standardized system for legal education. This system was based on the Socratic method. At most universities, the students questioned the teacher/professor to gain knowledge. These students would bill their professors, and if the bill went unfulfilled, the students usually hung up their law professor for failure of payment. At Harvard, the tables were turned, with the professors billing the students. This method enriched the professors and remains the standard in use in most law schools in America and England.

As developed by Harvard, law students took a standard set of courses as follows:

  1. Jurisprudence: The history of legal billing, from early Greek and Roman billing methods to modern collection techniques.
  2. Torts: French law term for “you get injury, we keep 40%.” Teaches students ambulance-chasing techniques.
  3. Contracts: Teaches that despite an agreement between two parties (the contract), a lawsuit can still be brought.
  4. Civil Procedure: Teaches the tricky arcane rules of court, which were modernized only 150 years ago in New York.
  5. Criminal Law: Speaks for itself.

These courses continue to be used in most law schools throughout the United States.

Despite the restrictions imposed on the practice of law (a four-year college degree, three years of graduate school, and a state-sponsored examination), the quantity of lawyers continues to increase to the point that three out of every five Americans are lawyers. (In fact, there are over 750,000 lawyers in this country.) Every facet of life today is controlled by lawyers. Even Dan Quayle (a lawyer) claims, surprise, that there are too many lawyers. Yet until limits are imposed on legal birth control, the number of lawyers will continue to increase. Is there any hope? We don’t know and frankly don’t care since the author of this book is a successful, wealthy lawyer, the publishers of this book are lawyers, the cashier at the bookstore is a law student, and your mailman is a lawyer. So instead of complaining, join us and remember, there is no such thing as a one-lawyer town.


Lawyers are members of a parasitic life-form which emerges in the cracks of human society.


Science needs its Gods and religion is just politics

April 11, 2021

This essay has grown from the notes of an after-dinner talk I gave last year. As I recall it was just a 20-minute talk, but making sense of my old notes led to this somewhat expanded essay. The theme, however, is true to the talk. The surrounding world is one of magic and mystery. And no amount of Science can deny the magic.

Anybody’s true belief or non-belief is a personal peculiarity, an exercise of mind, and unobjectionable. I do not believe that true beliefs can be imposed from without. Imposition requires some level of coercion, and what is produced can never be true belief. My disbelief can never disprove somebody else’s belief.

Disbelieving a belief brings us to zero – a null state. A belief, by definition, is the acceptance of a proposition which can be neither proved nor disproved, and disbelieving it merely returns us to the null state of having no belief. It does not prove the negation of the belief.

[ (+G) – (+G) = 0, not (~G) ]
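
In slightly more formal notation (a sketch only, writing B(G) for “believes that G is true”), the same point is that withholding or retracting belief is not belief in the negation:

    % Retracting a belief nets out to no commitment at all, not to belief in the negation:
    \neg B(G) \;\not\Rightarrow\; B(\neg G)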

Of course Pooh puts it much better.




Navigating the Suez Canal

March 28, 2021

