Archive for the ‘Language’ Category

“Random” is indistinguishable from Divine

November 2, 2020

“Why is there something rather than nothing?” is considered by some to be the most fundamental question in metaphysics, and by others to be an invalid question. The Big Bang, quantum mechanics, time, consciousness, and God are all attempts to answer this question. They all invoke randomness or chance or probabilistic universes to escape the First Cause Problem. Physics and mathematics cannot address the question. An implied God of Randomness is the cop-out for all atheists.

Stanford Encyclopedia of Philosophy

The Commonplace Thesis, and the close connection between randomness and chance it proposes, appears also to be endorsed in the scientific literature, as in this example from a popular textbook on evolution (which also throws in the notion of unpredictability for good measure):

scientists use chance, or randomness, to mean that when physical causes can result in any of several outcomes, we cannot predict what the outcome will be in any particular case. (Futuyma 2005: 225)

Some philosophers are, no doubt, equally subject to this unthinking elision, but others connect chance and randomness deliberately. Suppes approvingly introduces

the view that the universe is essentially probabilistic in character, or, to put it in more colloquial language, that the world is full of random happenings. (Suppes 1984: 27)

The scientific method is forced to introduce random into stories about the origin of time and causality and the universe and life and everything. Often the invocation of random is used to avoid any questions of Divine Origins. But random and chance and probability are all just commentaries about a state of knowledge. They are silent about causality and about Divinity. Random ought to mean causeless. But that is pretence, for such a random lies outside our experience. The flip of a coin produces only one outcome. Multiple outcomes are not possible. The probability of one of several possible outcomes is only a measure of a lack of knowledge. Particles with a probability of being in one place or another are also an expression of ignorance. However, when it is claimed that a particle may be in two places simultaneously, we encounter a challenge to our notions of identity for particle and for place. Is that just ignorance about location in time, or do we have two particles or two places? Random collisions at the start of time are merely labels for ignorance. Invoking singularities which appear randomly and cause Big Bangs is also just an expression of ignorance.
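The point that probability is a commentary on a state of knowledge can be illustrated with a pseudo-random number generator (a sketch of mine, not from the post): to an observer who knows the seed, the "random" coin flips are fully determined; to one who does not, they are indistinguishable from chance.

```python
import random

def flips(seed, n):
    """Generate n 'coin flips' from a deterministic generator."""
    rng = random.Random(seed)
    return [rng.choice("HT") for _ in range(n)]

# Two observers who know the seed get identical sequences:
a = flips(42, 10)
b = flips(42, 10)
assert a == b  # determinism in disguise; only ignorance of the seed makes it "random"
print(a)
```

The generator produces exactly one outcome for each seed; the appearance of multiple possible outcomes exists only in the mind of the observer who lacks information.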

Whenever science or the scientific method requires, or invokes, randomness or probability, it is about what we do not know. It says nothing about why existence must be. The fundamental question remains unaddressed: “Why is there something rather than nothing?”

And every determinist or atheist, whether admitted or not, believes in the God of Randomness. Everything Random is Unknown (which includes the Divine).


Why did we start to count?

October 12, 2020

Counting and the invention of numbers and the abstractions enabling mathematics are surely cognitive abilities. Counting itself involves an abstract ability. The simple act of raising two fingers to denote the number of stones or lions or stars implies first, the abstract ability to describe an observed quality and second, the desire to communicate that observation.

What led humans to counting and when?

Before an intelligence can turn to counting it must first have some concept of numbers. When and how did our ancient ancestors first develop a concept of numbers and then start counting? …….. 

It seems clear that many animals do distinguish – in a primitive and elementary way – between “more” and “less”, and “few” and “many”, and “bigger” and “smaller”, and even manage to distinguish between simple number counts. They show a sophisticated use of hierarchy and precedence.

Some primates show some primitive abilities when tested by humans

…..  Rhesus monkeys appear to understand that 1 + 1 = 2. They also seem to understand that 2 + 1 = 3, 2 – 1 = 1, and 3 – 1 = 2—but fail, however, to understand that 2 + 2 = 4. ……

But even chimpanzees and monkeys rarely, if ever, use counts or counting in interactions among themselves. The abilities for language and counting are not necessarily connected genetically (though it is probable), but they are both certainly abilities which appear gradually as cognition increases. Mathematics is, of course, just another language for describing the world around us. Number systems, like all invented languages, require that the system and its rules be shared before any communication is feasible. It is very likely that the expressions of the abilities to count and to have language follow much the same timeline. The invention of specific sounds or gestures to signify words surely coincided with the invention of gestures or sounds to signify numbers. The step change in the size of brains along the evolutionary path of humans is very likely closely connected with the expressions of the language and the counting abilities.

The ability to have language surely preceded the invention of languages just as the ability to count preceded the expressions of counting and numbering. It is not implausible that the first member of a homo erectus descendant who used his fingers to indicate one of something, or four of something else, to one of his peers, made a far, far greater discovery – relatively – than Newton or Einstein ever did.

We must have started counting and using counts (using gestures) long before we invented words to represent counts. Of course, it is the desire to communicate which is the driving force which takes us from having abilities to expressions of those abilities. The “cooperation gene” goes back to before the development of bipedalism and before the split with chimpanzees or even gorillas (at least 9 million years ago).

The simple answer to the question “Why did we start to count?” is because we could conceive of a count, observed it and wished to communicate it. But this presupposes the ability to count. Just as with language, the ability, and the expression of that ability, are a consequence of the rapid increase in brain size which happened between 3 million and 1 million years ago.

I am persuaded that that rapid change was due to the control of fire and the change to eating cooked food and especially cooked meat. The digestion of many nutrients becomes possible only with cooked food and is the most plausible driver for the rapid increase in brain size.

Raw Food not enough to feed big brains

………. our brains would still be the size of an ape’s if H. erectus hadn’t played with fire: “Gorillas are stuck with this limitation of how much they can eat in a day; orangutans are stuck there; H. erectus would be stuck there if they had not invented cooking,” she says. “The more I think about it, the more I bow to my kitchen. It’s the reason we are here.”


Part 2 – The brain and our senses enable language but physiology limits languages

September 22, 2020

Part 1 –  “Language” is discovered but “languages” are invented

Part 2 – The brain and our senses enable language but physiology limits languages

The capability for language is an evolved ability and clearly a species-specific, cognitive attribute. This capability is not digital (On/Off) but varies first along the axis of cognition and second, the ability (both cognitive and physiological) to generate and receive signals. The capability for language, discovered within ourselves, together with the need and desire to communicate meanings, has led to humans inventing specific systems of language (Khoisan clicks, proto-Indo-European, Egyptian, Sanskrit, English, Braille, mathematics, ….). There are those who claim that humans are the only species having language and while it is true that only humans have all the characteristics of language (as defined by humans), the claim reduces to saying that “only humans have human language”.

Language exists not because humans exist, but because entities with brains, having the cognitive capability for language and desirous of communicating, exist.


(I take communication to be the intentional transfer of information, where information consists of facts or knowledge. Defining meaning leads either to circular logic (a meaning is what is conveyed by language and language communicates meanings) or to metaphysics. For this post I take meaning to simply be any coherent thought).

Brain 1 >> meaning 1 >> encoding >> output signal >> detection >> decoding >> meaning 2 >> Brain 2


There is no doubt that most animal species have communication. Whether dogs or tigers or horses or even bees or ants, individuals of many species do communicate with each other. Individuals of some few species communicate in ways which suggest they may have a rudimentary capacity for language. Within some communities of monkeys and elephants and dolphins, for example, specific, repeatable sounds are used, voluntarily and with intent, to communicate specific meanings. The sounds and their meanings are learned and shared within particular communities. Monkeys within a troop are known to use different sounds to distinguish between snakes and lions, and then to communicate warnings about their approach. Even prairie dogs make different warning sounds for different kinds of predator. They even have a specific call to signal an “All Clear”. However monkeys are not capable of forming or communicating more complex meanings such as “The lion is closer than the snake”. Only humans, it seems, even attempt to communicate abstract meanings, including any related to time or numbers. Animals may deceive but cannot, it seems, create false meanings (lies).

In the main, animals use sound and gestures for communication. Ants may communicate by the pheromones they emit, by sounds and even by touch. However, much of this is probably involuntary. Animals generally use their olfactory sense to garner information about the world around them. They even produce smells to mark territory and generate mating information, but it does not seem that they can produce different smells, at will, for communication purposes. Elephants use infra-sound to communicate over long distances. Bats use ultra-sound not only for echo-location but also, it seems, for communication. Even tigers, it is thought, produce infra-sounds at mating time. No animal system of communication remotely approaches the sophistication of human language, but that is not to say that their capability for language is zero. The capability for language exists when any entity having a brain

  1. desires to communicate with another similar entity, and
  2. shares a code with that entity wherein a meaning (including information) is represented by a particular signal, and
  3. can generate such a signal at will, and
  4. which signal can then be detected and interpreted as the intended meaning by the other entity.

Language: A shared system whereby two or more brains can communicate by the encoding of meanings into signals, which signals can then be transmitted and received and decoded back into their meanings.
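The definition above can be sketched as a toy shared code (the codebook and its entries are mine, purely illustrative): communication succeeds only because both "brains" hold the same mapping between meanings and signals.

```python
# A shared code: each meaning is represented by a particular signal.
codebook = {
    "snake!": "chutter-call",
    "lion!": "bark-call",
    "all clear": "chirp-call",
}
# The receiving brain needs the inverse mapping to decode.
reverse = {signal: meaning for meaning, signal in codebook.items()}

def encode(meaning):
    """Brain 1: meaning >> signal."""
    return codebook[meaning]

def decode(signal):
    """Brain 2: signal >> meaning."""
    return reverse[signal]

# A meaning survives the round trip only when the code is shared:
assert decode(encode("lion!")) == "lion!"
```

Without the shared `reverse` table, the signal would be detected but the meaning lost, which is exactly the distinction the definition draws between mere noise and language.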

All human attempts to communicate with animals are, in fact, a tacit acknowledgement that dogs and cats and dolphins and elephants and horses do have a rudimentary capability for language. They all seem to be able to generate specific signals to communicate specific meanings to others of their species. None have speech, but they can all make the cognitive leap that a particular human signal represents a particular meaning. Sometimes they generate their own particular signals (a certain bark or a rumble or a gesture) which humans are able to interpret as representing a particular meaning. It is apparent that the capability for language of a pet dog is greater than that of sheep, but it is also clear that neither is zero. The capability for language is often conflated with the ability for speech, but it is more likely that while speech enabled and allowed for an unprecedented sophistication in the human invention and use of languages, the capability for language had already appeared long before humans came down from the trees.

When our human ancestors achieved bipedalism they had brains about the size of current day chimpanzees. Australopithecus lived in Africa between 4 and 2 million years ago and had an average cranial capacity of about 450 cc, which is comparable to that of chimpanzees. By 1.5 million years ago the homo habilis brain had grown to a size of about 600 cc. Between 1.5 million and 300,000 years ago, homo erectus had a brain volume of between 800 and 1000 cc. Modern humans have a cranial volume of about 1350 cc but this can vary in individuals from as little as 900 and up to as much as 2,000 cc. (Neanderthals, Denisovans and even homo sapiens of their time are thought to have had slightly larger cranial capacities averaging about 1400 cc). The combination of physiological wherewithal and the associated brain control needed for speech as we know it today, was probably in place during the latter stages of homo erectus. Some form of speech was then probably available for Neanderthals, Denisovans and the earliest homo sapiens. Mammals first appeared some 200 million years ago. A plausible evolutionary time-line is that the capacity for language first began to appear with creatures at least several tens of million years ago. However, the invention of well codified languages, coincident with the arrival of speech, only came within the last million years, and perhaps only within the last 100,000 years.

In any species the emergence of the capability for language must precede the invention of communication codes. The inputs to, and the outputs from, a brain are limited by the physiology available to the brain through the body it controls. Strictly, cognition, as the ability to comprehend, is not just of the brain but of a brain together with the sensory abilities it has access to for getting inputs. An entity with a brain, even in isolation, may develop cognition as long as it has access to sensory inputs. The capability for language is thus dependent on, and constrained by, the cognition available which in turn is a composite of the brain and its associated senses. This capability must then be different for brains having access to different senses.

Communication is undefined without there existing more than one brain. Communication becomes possible only when one brain can generate output signals which can be detected as input by a different brain. (In theory an entity could generate signals that it could not, itself, detect). For all living creatures the band of available sensory inputs is much broader than the range of output signals that can be intentionally generated. For example, all mammals can hear a much wider range of frequencies than they can generate. Our vision can differentiate shapes to a greater precision than our hands can draw. The bottleneck for the invention of languages is thus the ability to generate coded signals which can be detected and interpreted. We do not use smell or taste or even touch (except for Braille as a proxy for sight) for language because we cannot generate unique signals at will. Touch was probably discarded as a primary means for signals because communication at a distance – but within hearing distance – was preferred.

Of all the senses available to us, human languages use only sight and hearing for inputs (again excepting Braille where touch is a proxy for sight). The underlying reason is that we are unable to generate unique, coded, repeatable signals detectable by our other senses. The predominance of speech in the languages we invent is of necessity. The languages we invent are constrained primarily by the signals we can generate. An entity with a brain capable of language but a different physiology would inevitably invent languages constrained by the signals it can generate.



 

An Eternal Second is 6.5 Zettayears (Zy) in an eternity of eternities

September 11, 2020

Alexander Atkins has an interesting post up at his blog about the literary treatment of “eternity”.

How long is eternity?

……. The first to address this question, were two brothers, Jacob and Wilhelm Grimm (better known as the Brothers Grimm), German cultural researchers, philologists, and lexicographers that wrote a seminal collection of folktales titled Children’s and Household Tales (Kinder- und Hausmärchen), first published in 1812; a second volume was published in 1815. ……. But what interests us today, in discerning the length of eternity, is a lesser known story — the insightful, charming and timeless tale of the Shepherd Boy. A king summons a shepherd, who is famous for his tremendous wisdom, and challenges him to answer three questions.

The third question is: “how many seconds of time are there in eternity?” He answers: “In Lower Pomerania [northern Poland, at the southern tip of the Baltic Sea] is the Diamond Mountain, which is two miles and a half high, two miles and a half wide, and two miles and a half in depth; every hundred years a little bird comes and sharpens its beak on it, and when the whole mountain is worn away by this, then the first second of eternity will be over.”

Even shepherds are subject to arithmetic.

In Grimms’ story the mountain would have a volume of 15.625 miles³ or about 65 x 10⁹ cubic meters. Assuming a density of 2000 kg/m³ (or 2 x 10⁶ g/m³), the mountain would weigh about 130 x 10¹⁵ g. Assuming further that the bird pecked 2 mg each time, the mountain would disappear after 65 x 10¹⁸ visits. Since each visit would occur every 100 years, the mountain would last 65 x 10²⁰ years. And since 1 Zettayear is 1 x 10²¹ years the mountain would last 6.5 Zettayears (Zy). And that would then be one second of Eternal Time. The curious thing is that one Eternal Year would have to consist of 31.55 million Eternal Seconds and thus

1 Eternal Year = 205,124 Yottayears where a Yottayear (Yy) is 1 x 10²⁴ years
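The arithmetic above can be checked in a few lines, using the post’s own rounded figures (65 x 10⁹ m³ volume, 2 mg per peck, one visit per century) and 31,557,600 seconds per Julian year:

```python
volume_m3 = 65e9                      # 15.625 cubic miles, rounded as in the text
mass_g = volume_m3 * 2e6              # density 2000 kg/m^3 = 2e6 g/m^3 -> 130e15 g
visits = mass_g / 2e-3                # 2 mg pecked away per visit -> 65e18 visits
eternal_second_years = visits * 100   # one visit every 100 years -> 65e20 years

eternal_year_Yy = eternal_second_years * 31_557_600 / 1e24

print(f"1 Eternal Second = {eternal_second_years:.1e} years")  # 6.5e+21, i.e. 6.5 Zy
print(f"1 Eternal Year   = {eternal_year_Yy:,.0f} Yy")         # 205,124 Yy
```

The 205,124 figure drops out of using 31,557,600 Eternal Seconds per Eternal Year; the text’s “31.55 million” is the same number rounded.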

Eternity itself is, of course, endless even if an Eternal Year is bounded by the Diamond Mountain.

But we must not forget that for the word “eternity” plurals are allowed, and the deeper truth is that many eternities, each endless, are possible. In fact, there may well be an eternity of eternities.


 

What’s so bad about bias?

August 13, 2020

The dictionary definition usually goes like this:

bias n. inclination or prejudice for or against someone or something (a person or group, or an opinion, or a theory, ……). v. incline, or cause to be inclined, in favour of someone or something


A preference for anything is a bias. A preference for a particular food, or person, or pet, or idea is a bias.  Animals too exhibit behaviour which humans would interpret as bias. A bias is clearly a cognitive state based on the knowledge, memories and values stored in that brain at that time. It is also clearly a dynamic state and changes as the state of the brain changes. Consider a brain fully capable of thinking but empty of all memory, all values, and all knowledge. A brain with no preferences for anything! Such a brain would begin as truly unbiased when required to form an opinion. An impossible state, of course, but useful as a thought experiment for defining a zero bias condition.

A preference for anything is, in fact, the result of a cognitive judgement made by a brain based on the knowledge and memories it has and on the internal value scales it uses. Our empty, thinking brain could not form an opinion about anything without first having some knowledge and some value scale to apply. Forming an opinion, is itself, the creation of a preference and a bias. Expressing an opinion is an expression of bias. All knowledge is bias. The greater the knowledge held by a brain, the greater the bias it has. The clearer the set of values held by a brain, the greater its bias. An unbiased mind is an empty mind.

Bias, itself, is not a value. It is, I think, a description of a cognitive state. A knowledgeable person, a person with opinions, is a biased person.

A learned judge is a biased judge. An unbiased music critic with no prior opinions is a useless critic. A food critic without taste preferences would be unbiased but would also be worthless as a critic. Unbiased parents would show no preference for their own children. Without bias, “good” and “bad” start with equal value. I am incurably biased against what I consider “bad” and against people I don’t like. Bias is merely the current state of a functioning brain.

Yet, bias is considered “bad” and to be unbiased is considered “good”. I suspect it is because we conflate the state of bias with the value scale of fairness.

But bias and fairness are entirely different things.


 

Depraved, decadent and damned

August 5, 2020

Playing with words.

(For some unknown reason “depraved, decadent and damned” has the same rhythm in my mind as “bewitched, bothered and bewildered”).

Morality is entirely subjective and always relative. It varies with time and place and individual. The moral standards of a group are a composite of the individual standards of those making up the group. Yet we are obsessed with judging others for their immorality – and always by our standards. Why else would we have so many words to describe the nuances or gradations of immorality? I suspect that no age is more depraved or decadent than any other. The measuring stick is always the variable morality of the day and place.

One can always find an antonym for any of the plethora of words describing immorality, but I suspect that the words were coined first to describe the level of immorality rather than the level of morality. In English there seems to be an over-representation of words beginning with “d”. I just take a few of these though there are many, many others (corrupted, perverted, lewd, licentious, prurient, wanton, profligate, hedonistic, ……).

These ought to be put as questions but, since morality is subjective, I take the liberty to frame them as statements.

  • Hollywood is more debauched than Bollywood.
  • Los Angeles is more depraved today than ancient Rome ever was.
  • Tallulah Bankhead was more dissolute than Harvey Weinstein is.
  • JFK was more dissipated than LBJ.
  • Catholicism has degenerated more than Islam.
  • The West Coast of any country is always more decadent than the East (as evidenced by the US, Australia, India and Sweden).
  • California has more deviants now than Babylon had in its heyday.
  • China defiles the Uighurs as Genghis Khan defiled the Han.
  • Europeans despoiled the pyramids as ISIS despoiled Palmyra.

The nuances are fascinating. Decadence is not for the indigent but depravity is universal and indifferent to wealth. Decadence requires both wealth and indulgence to excess. (Clearly LA is more decadent than New York, Perth more decadent than Sydney, Bombay more decadent than Madras and Gothenburg more decadent than Stockholm). Depravity is simpler and just needs to be grossly immoral. On my very subjective scale of morality, I find depravity more immoral than decadence. Dissipation and dissolution include both moral and physical decay, though I tend to ascribe greater physical rottenness to dissipation. To be degenerate requires having had a high moral position to descend from. Debauchery always has innocence as a victim and is wasted on the already depraved. However, a debaucher would nearly always be depraved. Despoiling needs some artistic merit to begin with. What is foul and rotten cannot be despoiled. It does not take much to deviate from some norm to be considered a deviant. Every minority is necessarily deviant in fact, if not always in the popular discourse. In today’s politically correct world people who are fat, or old, or not pretty are the new deviants.

We are always morally superior to them. (This is inherent in the definition of we and them).

Naturally, all those others who are decadent and depraved are utterly damned.

Depraved, decadent and damned.


 

 

It is time for “Human Resources” to be retired and to return to basics

July 30, 2020

I was pleased to see that in India’s New Education Policy the “Ministry of Human Resource and Development” was to return to its pre-1985 name of the “Ministry of Education”.  This is not a comment about the new policy but about the use of the term “Human Resource”. The Ministry of Education became the HRD Ministry in 1985 during Rajiv Gandhi’s time as Prime Minister. But this was, in hindsight, both misguided and counter-productive. The intention was to show how “modern” and up-to-date India was. In practice it shifted the focus from the core needs of Education to the cosmetics of being seen to be modern.

News18: The Ministry of Human Resource and Development (HRD) has been renamed as the Ministry of Education following an approval from the Union Cabinet. The name change was a key recommendation of the draft New Education Policy, which has also been cleared in Wednesday’s Cabinet meeting. The HRD ministry name was adopted in 1985, during the tenure of former Prime Minister Rajiv Gandhi, as it was changed from ministry of education.

The term “human resource” was first used in 1893 though entirely in a descriptive way. The concept of mobilizing, training and managing personnel and employees in industry grew in the first half of the 20th century. Later it spread into the Military and all Defense Industries as the Second World War demonstrated clearly the need for training, educating and managing large groups of personnel. After the war the concept of managing personnel relationships spread into every branch of commerce and even into government and bureaucracies. It used to be the Personnel Department until it became trendy and fashionable in the late 1970s for corporations to use the term “Human resources” to show how caring they were.

Human Resource: Pioneering economist John R. Commons mentioned “human resource” in his 1893 book The Distribution of Wealth but did not elaborate. The expression was used during the 1910s to 1930s to promote the idea that human beings are of worth (as in human dignity); by the early 1950s it meant people as a means to an end (for employers). Among scholars the first use of the phrase in that sense was in a 1958 report by economist E. Wight Bakke.

It is my contention that the use of the term “human resource” has been misleading and, on balance, more bad than good. It has enshrined the notion of people being just another commodity in the economic cycle. The use of the term “human resource” has helped to apply the same principles to people as those applying to raw materials (cost, security of supply, alternative suppliers, competition between suppliers). Seeing humans as resources rather than “personnel” has encouraged – and enabled – the corporate world to dehumanize people and shift and change to the cheapest resource available. The entire notion of outsourcing, which has become a major area of HR, is based on the same principles of shifting risks of fluctuating production volumes to sub-suppliers.

Personnel and employers once exhibited loyalty, trust, a sharing of goals and commitment. In both directions. Values evolve. Employers have become faceless and so have the resources they employ. Resources, after all, are consumable. They are to be fully utilized and then discarded and replaced. Brand loyalty from customers is highly valued and to be pursued. Employer/employee loyalty is of no relevance if it is not specified in the employment contract. The goals of a large corporation are rarely anything shared by all the cogs in the large wheel. Corporations, instead, have HR Departments to produce Vision Statements which are meaningless and shared by no one. Human resources, for their part, are required to perform to specification, be judged by Key Performance Indicators, are trained (not educated) and are discarded and written-off when non-performing or obsolete.

So I am very pleased to see Human Resource Development in India return to Education. And it is about time that Human Resources returned to being about People.


 

Social distance versus social distancing

July 28, 2020

Social distancing in public health is about physical distancing but social distance in sociology is about race and attitudes to ethnic difference.

Social Distancing

Although the term was introduced only in the 21st century, social-distancing measures date back to at least the 5th century BC. The Bible contains one of the earliest known references to the practice in the Book of Leviticus 13:46: “And the leper in whom the plague is… he shall dwell alone; [outside] the camp shall his habitation be.” During the Plague of Justinian of 541 to 542, Emperor Justinian enforced an ineffective quarantine on the Byzantine Empire, including dumping bodies into the sea; he predominantly blamed the widespread outbreak on “Jews, Samaritans, pagans, heretics, Arians, Montanists and homosexuals”. In modern times, social distancing measures have been successfully implemented in several epidemics. In St. Louis, shortly after the first cases of influenza were detected in the city during the 1918 flu pandemic, authorities implemented school closures, bans on public gatherings and other social-distancing interventions. The influenza fatality rates in St. Louis were much less than in Philadelphia, which had fewer cases of influenza but allowed a mass parade to continue and did not introduce social distancing until more than two weeks after its first cases. Authorities have encouraged or mandated social distancing during the COVID-19 pandemic.

However in sociology, social distance is all about race.

In sociology, social distance describes the distance between different groups in society, such as social class, race/ethnicity, gender or sexuality. Members of different groups mix less than members of the same group. It is the measure of nearness or intimacy that an individual or group feels towards another individual or group in a social network, or the level of trust one group has for another, and the extent of perceived likeness of beliefs.

Bogardus Social Distance Scale (1925)

This scale was developed by Emory Bogardus in 1924 and named after him. It is one of the oldest psychological attitude scales still in use. Due to its unidimensional nature, prejudice or the lack of it towards only one community or group can be measured at one point in time. The Bogardus social distance scale is also known as a cumulative scale because an agreement with one item shows agreement with any number of preceding items ……… 

For example, the Bogardus social distance scale is set up as a series of questions that ask a respondent the closest degree of intimacy he or she would be willing to admit towards a member of the group in question. A score (from 1.0 to 7.0) is assigned to each option. The following is asked:

  • Would you be willing to marry a member of this group? (1.0)
  • Would you be willing to have a member of this group as your close personal friend? (2.0)
  • Would you be willing to have a member of this group as your neighbor? (3.0)
  • Would you be willing to have a member of this group as your colleague at work? (4.0)
  • Would you be willing to have a member of this group as a citizen of your country? (5.0)
  • Would you be willing to have a member of this group visit your country as a non-citizen? (6.0)
  • Would you be willing to have a member of this group be excluded from associating with your country in any way? (7.0)

The ratings of multiple people from one community are collected, and their average represents the value on the social distance scale.
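The scoring just described can be sketched in a few lines. Each respondent’s score is the closest degree of intimacy they would accept (1.0 = marry, 7.0 = exclude), and the group value is the mean; the sample responses below are hypothetical.

```python
def social_distance(responses):
    """Mean of each respondent's closest accepted degree of intimacy (1.0-7.0)."""
    if not responses:
        raise ValueError("at least one response is required")
    return sum(responses) / len(responses)

# Five hypothetical respondents rating one target group:
responses = [1.0, 2.0, 2.0, 3.0, 5.0]
print(social_distance(responses))  # 2.6 -> relatively low social distance
```

A lower mean indicates greater closeness felt towards the target group; note, as the post observes, that the scale says nothing about distance within the respondents’ own community.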

The Bogardus scale tries to measure social differences between attitudes of members of different ethnic communities as perceived by members of one community. It does not address social distance within a community.

“Social media” can thus promote social distancing (public health) while reducing social distance (sociology).


 

There can be no intrinsic value to a human life (or to anything)

July 14, 2020
  1. If every human life has a fixed value, and a higher value is a good thing for humankind, then the greater the population of humans the better.
  2. If human life has a variable value, always positive but varying over time and varying by individual, then humankind is still best served by increasing population.
  3. If a human life has a variable value which can even be negative, then its value to humankind must be assessed by summing value over the entire life of the individual.
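The third case can be written as a simple summation, purely to illustrate the point; the value function v is hypothetical and not something the argument defines:

```latex
V_{\text{individual}} = \sum_{t = t_{\text{birth}}}^{t_{\text{death}}} v(t)
\qquad
V_{\text{humankind}} = \sum_{i \in \text{individuals}} V_i
```

Only if v(t) can be negative can the lifetime sum for an individual be negative, which is what distinguishes the third case from the first two.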

I question whether value (of anything) can ever be intrinsic. Nothing has value unless it is

  1. judged by a mind (or a consensus of minds),
  2. judged against some value scale.

I read an article recently which argued that life had intrinsic value and the intrinsic value of a human life was greater than that of a cockroach. To whom, I wondered? By what value scale? Qualifying the word value with the word intrinsic is meaningless.

Intrinsic value is often used to define the financial worth of an asset but I am not concerned with that particular use of the words. Philosophy distinguishes between intrinsic and extrinsic value and takes intrinsic value to be a necessary precursor for judgements of morality.

Intrinsic value has traditionally been thought to lie at the heart of ethics. Philosophers use a number of terms to refer to such value. The intrinsic value of something is said to be the value that that thing has “in itself,” or “for its own sake,” or “as such,” or “in its own right.” Extrinsic value is value that is not intrinsic. ….. Many philosophers take intrinsic value to be crucial to a variety of moral judgments. For example, according to a fundamental form of consequentialism, whether an action is morally right or wrong has exclusively to do with whether its consequences are intrinsically better than those of any other action one can perform under the circumstances. ……

The question “What is intrinsic value?” is more fundamental than the question “What has intrinsic value?,” but historically these have been treated in reverse order. For a long time, philosophers appear to have thought that the notion of intrinsic value is itself sufficiently clear to allow them to go straight to the question of what should be said to have intrinsic value. ….. 

Suppose that someone were to ask you whether it is good to help others in time of need. Unless you suspected some sort of trick, you would answer, “Yes, of course.” If this person were to go on to ask you why acting in this way is good, you might say that it is good to help others in time of need simply because it is good that their needs be satisfied. If you were then asked why it is good that people’s needs be satisfied, you might be puzzled. You might be inclined to say, “It just is.” Or you might accept the legitimacy of the question and say that it is good that people’s needs be satisfied because this brings them pleasure. But then, of course, your interlocutor could ask once again, “What’s good about that?”  …….  At some point, though, you would have to put an end to the questions, not because you would have grown tired of them (though that is a distinct possibility), but because you would be forced to recognize that, if one thing derives its goodness from some other thing, which derives its goodness from yet a third thing, and so on, there must come a point at which you reach something whose goodness is not derivative in this way, something that “just is” good in its own right, something whose goodness is the source of, and thus explains, the goodness to be found in all the other things that precede it on the list. It is at this point that you will have arrived at intrinsic goodness. ….  That which is intrinsically good is nonderivatively good; it is good for its own sake. 

But intrinsic is as subjective as value is, or as morality is. Rather than intrinsic value leading to morality, it is the subjective value scale of morality in a mind which leads to an assessment of something being intrinsic. And the most fundamental value in any mind is its own perception of what is good and what is bad. And that is subjective.

The words “intrinsic” and “value”, together or by themselves, are meaningless. In fact, the word “value” alone has meaning only when something is assessed by someone as being “of value to someone or to something”, using some subjective value scale. The net intrinsic value of the known universe is zero. But even that assessment is subjective.


 

As sanctity declines, the sanctimonious proliferate

July 6, 2020

Sacred and sanctity originated with gods and religions but are nowadays applied regularly in non-religious contexts. Sanctity – in the meanings of inviolability, or of deserving respect – is claimed for many things, but no claim of sanctity (religious or otherwise) is ever anything more than wishful thinking for a desired state. From sacred also come sanctimony and the sanctimonious. Once upon a time sanctimony was a quality displayed by saints, but it is now always a claim, or a display, of pretended, self-proclaimed moral superiority. I observe that sanctimony is invariably called upon by the sanctimonious when rational argument fails.


Sacred: dedicated or set apart for the service or worship of a deity; devoted exclusively to one service or use; worthy of religious veneration; entitled to reverence and respect; of or relating to religion; not secular or profane; unassailable; inviolable; highly valued and important

Sanctity: godliness; holiness of life and character; the quality or state of being holy or sacred; inviolability; deserving of veneration or respect

Sanctimony: pretended or hypocritical moral superiority; (archaic) the quality of holiness or godliness

Sanctimonious: hypocritically pious or devout; falsely claiming moral superiority


Sacrosanct: having extreme sanctity (extreme inviolability, sort of like the most best)


A search for sanctity reveals that over 90% of secular usage is in the context of human life. The next most common occurrences are with reference to the sanctity of marriage or of law. In the context of religious associations it is still used, though less dogmatically, for, among other things, the sanctity of the Church; of priests; of temples; of holy places. Whereas the original religious usage implied something inherently extraordinary, out of this world, the word has been debased by its use to try and impart a sense of importance to concepts or situations, where there is, in fact, nothing very special. In a secular context, the word is now used widely to imply that something should be inviolable and deserving of extraordinary veneration or respect (for example with the sanctity of nature, or of the scientific method, or of natural forces, or of government, or of institutions).

As a philosophical concept the sanctity of life derives from religious or ethical schools of thought.

Stanford Encyclopedia of Philosophy: According to this ‘sanctity of life’ view, human life is inherently valuable and precious, demanding respect from others and reverence for oneself. 

Wikipedia: In religion and ethics, the inviolability or sanctity of life is a principle of implied protection regarding aspects of sentient life that are said to be holy, sacred, or otherwise of such value that they are not to be violated. This can be applied to both animals and humans or micro-organisms, ….

But even in philosophy and logic the sanctity of life is just an assertion. It does not flow logically from, and is not inherent in, existence or in life. References to the sanctity of life – which overwhelmingly dominate usage of the word – are so far from reality that sanctity has become just a parody of meaning inviolable. Using the phrase has become little more than virtue signalling. The association of inviolability with sanctity has been fatally diluted by the indiscriminate use of the word. Sanctity of life has even become a politically charged term in the abortion debate (with abortion supporters denying the sanctity of life, while abortion opponents are in favour of such sanctity). But both sides miss the point and lose track of the real issue of when life can be said to begin. The word is further debased in its meaning of inviolability when those supporting abortion oppose capital punishment and vice versa. Where sanctity was once used to denote the fact of inviolability, it has now come to mean an invocation of, or a desire for, inviolability.

The sanctity of the law is another phrase which has little to do with any inherent quality of law. Laws are merely man-made rules and regulations; they vary across space and change all the time. There is nothing sacred about law – only pragmatism for the functioning of societies. However, those charged with maintaining compliance (the prevailing power, governments, police, courts, judges, lawyers, …..) have a strong desire that The Law, and laws, be considered inviolable. When they extol the sanctity of the law, it is partly wishful thinking and partly a desire to protect themselves from criticism for failing to ensure compliance. Similarly, the sanctity of marriage stems from religious and social desires for stability, rather than from any inherent inviolability of the married state.

A claim to sanctity of the scientific process is used far too often to smother dissenting thought, even though the essence of the scientific process is to dissent and to question. Sanctity, as used, no longer means inviolability; it now means a presumption of, and a desire for, inviolability. Sanctity is on the decline, and it is difficult to find any use of the word where inviolability is any more than a desire (sometimes virtuous, sometimes not). The sanctity of religious institutions, places and people has been utterly debased by the all too many examples of inviolability being used to protect bad behaviour. Sanctuary derives from sanctity of place, and this notion has been so abused as to be anti-social in itself. The sanctity of life or law or marriage or the scientific method are empty claims and, again, usually invoked to protect errant behaviour. False claims of sanctity end up as sanctimony.

Sanctimony and the sanctimonious, though, are thriving. Sanctimony once described the quality of being holy or virtuous, but by the late 16th century (Shakespeare) it was also being used to mean a hypocritical piousness. By the 19th century, the word was almost exclusively used to mean a hypocritical and pretended claim of moral superiority. Through the 1800s, the use of sanctimonious grew as a derogatory term for hypocritical and righteous do-gooders. In the present day, a dearth of saints and the saintly has all but killed off the original meaning.

The variety of platforms now available for public “debate” (including for proselytizing, preaching, bullying and haranguing) is unprecedented. In these “debates”, when arguments fail, the final defense is to claim moral superiority. As a last resort, bringing in Hitler or the Nazis makes it easy to claim moral superiority (Godwin’s Law). The nice thing about moral superiority is that it is “righteous” and makes it “ethical” to ignore rational argument. Sanctimony is especially useful when there is no time for exercise of mind. Social media provide little space, and less time, for developing arguments, and they provide fertile ground for sanctimony to flourish. Debate is by way of competing assertions. The weight of an assertion is determined by the number of “likes” it attracts, which in turn is influenced by perceptions of righteousness, political correctness and virtue. The greater the level of sanctimony that an assertion can bring to bear, the greater its chance of winning more likes (and never mind the argument). The weaker the argument for a position, the greater the need for sanctimony. The sanctimonious are those with the greatest need, and some skill, to demonstrate sanctimony. (It can be quite amusing when the sanctimonious lose elections. As with an indignant Jeremy Corbyn who, after his resounding election defeat, claimed to have won the argument but lost the election.) A reference to a sanctimonious moron could be taken as tautology.

It used to be the plebeians. Then in the 1830s, they became the “Great Unwashed”. Their natural successors today are the sanctimonious.

There is probably a connection between the decline of sanctity and the rise of the sanctimonious. When there is no real sanctity, false claims of purported sanctity lead to sanctimony. I have no doubt that investigating the connection could soon provide a suitable subject for a PhD in Social Sanctimony.


 

