Political Dyslexia

November 7, 2017



Good laws are made to be broken

November 5, 2017

A bad law is one with which nobody complies.

A law which everybody complies with, without any coercion of pain or penalty, is superfluous.

(All natural laws are superfluous in that it is not possible not to comply with them).

Therefore good laws can only exist in the space between bad laws (zero compliance) and superfluous laws (complete compliance), in the space of partial compliance.

Good laws are therefore made to be broken.


Will recognition of “fake news” be followed by “fake science”?

November 3, 2017

Collins Dictionary has chosen “fake news” as its Word of the Year for 2017.

When a partisan publication exaggerates – even wildly – in favour of its own cause, it causes no great surprise. It is not even too astonishing when it fabricates or omits news to further its own agenda. So when Breitbart or the Daily Mail or the Huffington Post produce much of their nonsense, it hardly merits the sobriquet of “fake news”, even if much of the “news” is slanted, exaggerated, skewed or just plain lies. The insidious nature of “fake news” is at its worst when a supposedly objective publication misuses its reputation for objectivity to push a hidden agenda. That is when “fake news” takes on a life of its own.

It is not that any of this is new, but the US Presidential Election has certainly brought “fake news” to a head. “Fake news”, though, applies to much more than just US politics. Of course CNN heads the list of purveyors of “fake news”. CNN has never been objective, but they once generally checked their facts and used to separate straight reporting from opinion. I used to find them, at least, fairly reliable for factual reporting. But they have abandoned that approach, and I now find them not just unreliable but intentionally misleading. Their “journalists” have all become lobbyists, and “CNN” has become synonymous with “fake news”.

I was once a regular reader of the Washington Post. They were biased but were not unreliable as to the facts. It was quite easy to discount for the bias and get what I thought was a “true” picture. But they, too, have degenerated swiftly in the last two years. Stories are not just distorted, they are even fabricated. But the real disappointment for me in the last 24 months has been the New York Times – and not just in the space of US politics. The NYT has its own definitions of what is politically correct in politics, in science and even in the arts. Somewhere along the way they made a conscious decision to be “lobbyists” rather than reporters. They have decided that pushing whatever they have defined as “politically correct” justifies omission, exaggeration, “spinning” and even fabrication. Straight reporting has become extinct.

Lobby groups such as Huff Post and Daily Kos and Red State are full of blatant falsifications but have no news reputation of any significance at stake. They are not, therefore, included in my take on the top purveyors of fake news.

If 2017 has seen the recognition of the widespread use of fake news, I am looking to 2018 to recognise the proliferation of fake science. There is fake science being disseminated every day in big physics (CERN funding), pharmaceuticals, “climate science”, behavioural studies, sociology, psychology and economics. Much of fake science follows funding. Perhaps there will be greater recognition that “good science” is neither decided by nor subject to a poll.



The one or the many

October 31, 2017

It is the classic dilemma of our age which shows up everywhere.

I nearly always tend towards prioritising the one, primarily because without the one there cannot be the many, without the local you cannot get to the global, without the national there is no international, and without the excellence of the few you cannot get the good of the herd. I prefer the highest possible multiple to the lowest common denominator. It colours my politics. I prefer the search for excellence to the common mediocrity of socialism. I prefer the internationalism which comes as a consequence of strong nationalism to the bullying of the UN or the EU.

I side with “to each as he deserves” rather than “to each as he desires”. Freedoms flow bottom up from the individual to the group, rather than privileges and sanctions flowing from the group down to the oppressed individual.


Eugenics is already being practiced

October 31, 2017

In a survey published in 2008 of the 18 (now 20) countries in EUROCAT, 88% of neural tube defects were detected prenatally and 88% of these were aborted. Of Downs Syndrome cases, 68% were detected prenatally and 88% of these, too, were aborted.

Iceland (population of 330,000) has virtually eliminated Downs Syndrome births, though in such a small population there are only one or two cases every year. However, in Denmark around 98% of Downs Syndrome fetuses are aborted, in France around 77%, and even in the US around 67%.
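The detection rate and the termination rate compound. A quick sketch of the arithmetic, using the EUROCAT survey figures above:

```python
# Compounding of prenatal detection rate and termination rate
# (figures from the 2008 EUROCAT survey cited above).
def fraction_terminated(detection_rate, termination_rate):
    """Fraction of all affected pregnancies that end in termination."""
    return detection_rate * termination_rate

ntd = fraction_terminated(0.88, 0.88)    # neural tube defects
downs = fraction_terminated(0.68, 0.88)  # Downs Syndrome

print(f"Neural tube defects: {ntd:.1%} of cases end in termination")
print(f"Downs Syndrome:      {downs:.1%} of cases end in termination")
```

So even an 88% termination rate among detected cases translates to roughly three quarters of all neural tube defect pregnancies, and about 60% of all Downs Syndrome pregnancies, being terminated.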

In Sweden

The mother chose to terminate the pregnancy in 88 percent of cases when the child had anencephaly, when part of the child’s brain is missing. They also chose abortion in 75 percent of cases where doctors detected bilateral renal agenesis – when the child is missing both kidneys.

Artificial selection

As technology develops and more fetal abnormalities can be detected, and as abortion becomes less stigmatised, it is not only serious abnormalities which lead to abortion. With abortion on demand, even conditions which are eminently treatable (cleft palate, for example) are leading to the choice to abort. The availability and practice of abortion is itself becoming the main vehicle of artificial selection. Correction of fetal defects is usually not possible. Even if genetic “tailoring” for designer babies is not quite here yet, the incidence of babies born with even minor defects is inevitably declining.

Selective abortion is artificial selection, and that is, after all, just eugenics.


On birth rates, abortions and “eugenics by default”


The universe shouldn’t exist

October 28, 2017

Even if the Standard Model is right and an equal amount of matter and anti-matter was produced at the Big Bang, it still does not explain why the matter and anti-matter did not annihilate each other in a huge flash of energy. Why a huge amount of energy was first absorbed to create matter and anti-matter at the Big Bang is brushed aside as belonging to a singularity, before the laws of physics – and perhaps before time itself – existed.

CERN has a new press release showing that, apart from sign – as the Standard Model requires – no difference can be detected between a proton and an antiproton.

It is not the new measurements which say the universe should not exist. It is in fact the standard model which says that the universe should not exist. Maybe the standard model has to be modified.

Riddle of matter remains unsolved: Proton and antiproton share fundamental properties

Scientists are still in search of a difference between protons and antiprotons which would help to potentially explain the existence of matter in our universe. However, physicists in the BASE collaboration at the CERN research center have been able to measure the magnetic force of antiprotons with almost unbelievable precision. Nevertheless, the data do not provide any information about how matter formed in the early universe as particles and antiparticles would have had to completely destroy one another. The most recent BASE measurements revealed instead a large overlap between protons and antiprotons, thus confirming the Standard Model of particle physics. Around the world, scientists are using a variety of methods to find some difference, regardless of how small. The matter-antimatter imbalance in the universe is one of the hot topics of modern physics. …….. 

The BASE collaboration published high-precision measurements of the antiproton g-factor back in January 2017, but the current ones are far more precise. The current high-precision measurement determined the g-factor down to nine significant digits. This is the equivalent of measuring the circumference of the earth to a precision of four centimeters. The value of 2.7928473441(42) is 350 times more precise than the results published in January. “This tremendous increase in such a short period of time was only possible thanks to completely new methods,” said Ulmer. The process involved scientists using two antiprotons for the first time and analyzing them with two Penning traps.
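The earth-circumference analogy is simple arithmetic: nine significant digits means a relative resolution of about one part in 10⁹. A sketch (the equatorial circumference figure is my assumption, not from the press release):

```python
# "Nine significant digits" corresponds to a relative resolution of
# roughly one part in 10**9. Applied to the earth's equatorial
# circumference (~40,075 km, an assumed round figure), that gives the
# press release's four-centimetre analogy.
relative_resolution = 1e-9
earth_circumference_m = 40_075_000  # metres, equatorial

error_m = earth_circumference_m * relative_resolution
print(f"equivalent error: {error_m * 100:.0f} cm")  # ~4 cm
```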

We have to bear in mind that CERN has a massive confirmation bias. Their primary reason for existence is to confirm the Standard Model.


The Liar Paradox can be resolved by the unknowable

October 17, 2017

A paradox appears when reasonable assumptions, together with apparently valid logic, lead to a seeming contradiction. When that happens, applying the same rules of logic leads to the further conclusion that either

  1. the assumptions were wrong or
  2. that the logic applied was not valid or
  3. that the seeming contradiction was not a contradiction.

Where a paradox lies in wrongly identifying a contradiction, it is just an error. If the paradox is due to an error in applying the rules of logic, it is also just a mistake. Sometimes, however, the paradox shows that the logic itself is flawed or inconsistent. That can lead to a fundamental revision of the assumptions or of the rules of the logic itself. Paradoxes have contributed to the whole realm of the philosophy of knowledge (and of the unknowable). Many paradoxes can only be resolved if perfectly reasonable assumptions are shown to be wrong. Many scientific advances can be traced back to such confrontations with starting assumptions.

Confronting paradoxes has led to many advances in knowledge. Niels Bohr once wrote, “How wonderful that we have met with a paradox. Now we have some hope of making progress.” A paradox is just an invitation to think again. 

Classical Logic is digital. It allows no grey zone between true and false. What is not true is false, and what is not false must be true. This requirement of logic is built into the fabric of language itself. Classical Logic does not allow a statement to be true and false simultaneously, or to be neither false nor true. Yet this logic gives rise to the Liar Paradox in its many formulations. The paradox dates back to antiquity, and its simplest forms are the statements

“I am lying”, or

“This statement is false”

There are many proposed ways out of this paradox, but they all require some change to the rules of Classical Logic. Some introduce a term, “meaningless”, which allows a proposition to be neither true nor false. Others merely evade the paradox by claiming that the statement is not a proposition. Bertrand Russell took the radical step of excluding all self-reference from the playing field.

Experts in the field of philosophical logic have never agreed on the way out of the trouble despite 2,300 years of attention. Here is the trouble—a sketch of the paradox, the argument that reveals the contradiction:

Let L be the Classical Liar Sentence. If L is true, then L is false. But we can also establish the converse, as follows. Assume L is false. Because the Liar Sentence is just the sentence “L is false,” the Liar Sentence is therefore true, so L is true. We have now shown that L is true if, and only if, it is false. Since L must be one or the other, it is both.

That contradictory result apparently throws us into the lion’s den of semantic incoherence. The incoherence is due to the fact that, according to the rules of classical logic, anything follows from a contradiction, even “1 + 1 = 3.” 

It seems to me that the paradox vanishes if we allow that what is not true is not necessarily false, and what is not false is not necessarily true. The fault lies in our self-imposed assumption that there is no continuum between true and false. We can just as well create a grey zone – some undefined state between and outside the realms of true and false – an unknown, perhaps even unknowable, state. It is a strong indication – even if not a proof – that the unknowable exists and that logic needs to be fuzzy rather than digital.
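The grey zone can be made concrete with a three-valued logic in the style of Kleene (a sketch; the value names are illustrative). If negation swaps true and false but leaves the grey zone fixed, then the Liar Sentence – which must equal its own negation – has a perfectly consistent value:

```python
# A minimal three-valued (Kleene-style) logic in which the Liar Sentence
# is consistent. Value names are illustrative, not from any formal system.
TRUE, FALSE, UNKNOWN = "true", "false", "unknown"

def neg(v):
    """Negation: swaps true and false, leaves the grey zone fixed."""
    return {TRUE: FALSE, FALSE: TRUE, UNKNOWN: UNKNOWN}[v]

# The Liar Sentence asserts its own falsity, so its value L must
# satisfy L == neg(L). In two-valued logic no value satisfies this.
fixed_points = [v for v in (TRUE, FALSE, UNKNOWN) if v == neg(v)]
print(fixed_points)  # only the grey-zone value survives
```

In Classical Logic the list of fixed points is empty – that emptiness is the paradox. Admitting the third value dissolves it.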

The Liar Paradox is connected to the Fitch Paradox of Knowability

The paradox of knowability is a logical result suggesting that, necessarily, if all truths are knowable in principle then all truths are in fact known. The contrapositive of the result says, necessarily, if in fact there is an unknown truth, then there is a truth that couldn’t possibly be known.

More specifically, if p is a truth that is never known then it is unknowable that p is a truth that is never known. The proof has been used to argue against versions of anti-realism committed to the thesis that all truths are knowable. For clearly there are unknown truths; individually and collectively we are non-omniscient. So, by the main result, it is false that all truths are knowable. The result has also been used to draw more general lessons about the limits of human knowledge. 

Language is invented. Both words and the rules of grammar are invented – not discovered. We are forced to define words such as “infinite” and “endless” and “timeless” in attempting to describe concepts which we cannot encompass with our finite brains and our limited physical senses. It is not possible to measure an endless line with a finite ruler.

The unknowable exists, and that is why we need the word “unknowable”.


Known, unknown and unknowable

The unknowable is neither true nor false

Gödel’s Incompleteness Theorems


Could Mount Agung eruption be the VEI 5+ volcano that is overdue?

October 16, 2017

We have not had a VEI 5+ volcanic eruption for 26 years, since the Mount Pinatubo eruption in 1991. In 2015, I pointed out that the probability of a VEI 5+ eruption within 5 years was over 95%. The probability of a VEI 5+ eruption in the next two years must now be approaching 99%.

(see also The next VEI 5+ volcanic eruption is overdue).


It has been 26 years since the last VEI 5+ eruption (Mount Pinatubo, 1991, VEI 6) occurred, and the probability that a VEI 5+ volcanic eruption will occur within the next 5 years is now over 95%. There are around 10 – 14 VEI 5+ eruptions every hundred years, and for the last 300 years the time between eruptions has been as short as 1 year and as long as 23 years. The current gap could be the longest recorded in three centuries. There are, on average, 2 eruptions of intensity 6 every hundred years, and so the probability that an eruption of VEI 6 could occur within 5 years is about 50% (current gap 26 years, average gap 50 years). The probability that a supervolcanic eruption of VEI 7 or greater could occur within the next 5 years is less than 1%.
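One way to put rough numbers on these probabilities is to treat VEI 5+ eruptions as a Poisson process at about 12 per century (the midpoint of the 10 – 14 range). This is a simplified sketch, not necessarily how the quoted figures were derived:

```python
import math

# VEI 5+ eruptions modelled as a Poisson process at ~12 per century
# (midpoint of the 10-14 range). A simplification for illustration.
rate_per_year = 12 / 100

def p_at_least_one(years, rate=rate_per_year):
    """Probability of at least one eruption within `years` years."""
    return 1 - math.exp(-rate * years)

# A 26-year gap with no eruption is already improbable under this model:
print(f"P(gap of 26+ years): {math.exp(-rate_per_year * 26):.1%}")
# Probability of at least one eruption within 31 years of the last
# (i.e. within 5 years from now, 26 years into the gap):
print(f"P(eruption within 31 years of the last): {p_at_least_one(31):.1%}")
```

Under this model the chance of at least one VEI 5+ eruption within 31 years of the last is about 97 – 98%, consistent with the “over 95%” figure; note that a strictly memoryless model would put the forward-looking 5-year probability lower.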

Mount Agung’s volcanic activity is now reaching levels which suggest that an eruption is “imminent”.

Volcano Agung in Bali is showing worrisome signs of a major eruption, writes German climate blogger Schneefan here. The highest level of activity with multiple tremor episodes were just recorded. You can monitor Agung via live cam and live seismogram.

The 3,000-meter tall Agung has been at the highest warning level (4) since September 21.

Schneefan writes that the lava rise has started and that “an eruption can be expected at any time“.

So far some 140,000 people have been evacuated from the area of hazard, which extends up to 12 km from the volcano. Schneefan writes:

Yesterday ground activity by far exceeded the previous high level. Quakes have become more frequent and stronger, which indicates a stronger magma flow (see green in the histogram). Since October 13 there has been for the first time a “nonharmonic trembling (tremor), which can be seen in red at the top of the last two bars of the histogram.”

The colors of the columns in the bar chart, from bottom to top, stand for perceptible earthquakes (blue), low earthquakes (green) and surface quakes (orange). Just recently red appeared, signifying non-harmonic tremors. The seismogram below shows what are at times longer-period quakes, meaning magma is flowing violently in the volcano. Source: https://magma.vsi.esdm.go.id/.

Since yesterday the seismogram for AGUNG has been showing powerful rumbling (red).

The seismogram of AGUNG shows powerful tremors (level RED). The seismogram is updated every 3 minutes: Source: Seismogramm

Because Agung is located near the equator, a major eruption with ash flying up into the stratosphere would have short-term climatic impacts that could last a few years.

Agung last erupted in 1963 with an explosivity index of VEI 5, sending a plume of ash some 25 km into the atmosphere and leading to a cooling of 0.5°C. The eruption of Pinatubo in the Philippines in 1991 led to a global cooling of 0.5°C.



October 8, 2017

A little irritated today by some stupid behaviour.


Fresh water is globally abundant but often locally mismanaged

October 7, 2017


There is no global water shortage. In many places, the scarcity that exists is caused by bad local planning and inadequate storage and distribution, but there is no global shortage. Variable climate and weather and “natural” catastrophes are often blamed, but wherever scarcity is experienced, it is a lack of planning and/or unbridled expansion and/or a profligate use of available resources which is the real cause.

Abundance or scarcity

Alarmist organisations tend to paint a bleak future:

WWF: Only 3% of the world’s water is fresh water, and two-thirds of that is tucked away in frozen glaciers or otherwise unavailable for our use. As a result, some 1.1 billion people worldwide lack access to water, and a total of 2.7 billion find water scarce for at least one month of the year.

This is a false picture. Even just one third of the “3% of the world’s water” is equivalent to over one million years of supply at current utilisation levels. The amount of global fresh water is not a cause of any scarcity.
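A sketch of the arithmetic behind that claim, using the reservoir and consumption figures quoted later in this post:

```python
# Check of the "over one million years" claim, using totals quoted
# elsewhere in this post (assumed figures, all masses in kg).
total_surface_water_kg = 1.386e21   # all water at or near the surface
fresh_fraction = 0.03               # WWF's "3% of the world's water"
available_fraction = 1 / 3          # the third not locked in ice etc.
human_use_kg_per_year = 10e12       # total annual human utilisation

years = (total_surface_water_kg * fresh_fraction * available_fraction
         / human_use_kg_per_year)
print(f"{years:,.0f} years of supply at current utilisation")
```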

More importantly, what they fail to mention is that all human activities use just 0.01% (one ten-thousandth) of the annual precipitation over land. It is not that scarcities don’t exist, but they exist mainly because of inadequate planning, mismanagement, incompetence and plain stupidity among those affected. Local resources are grossly overstretched, or adequate storage is not planned for by the societies affected. Urban areas expand without planning and exhaust the easily accessible ground water resources available. Rainwater is not harnessed locally. Distribution is ignored. Rivers are polluted upstream. Ignoring common sense has consequences. Often this rank stupidity is at the local community or national government level. The problems are local and the solutions are local. It is easier for local bodies to demand all-encompassing global solutions to their problems, but this is a red herring to divert responsibility away from local incompetence.

Water scarcities are much more due to the local non-application of intelligence than any global or climate or weather events.

Water on Earth

The quantity of water on earth has been pretty constant for about 4 billion years. All the water on earth can be accounted for as 3 main categories

  1. that trapped in molten rocks,
  2. saline water and
  3. “fresh” water.

For the most part, water on earth is neither created nor destroyed. Fresh water is essential for human activity, but human activity does not destroy water. A relatively minuscule amount of water is created by combustion and bio-degradation processes. Human activity does, however, contaminate “fresh” water, and makes for an extra stop for the precipitation over land as it finds its way back to the oceans. It therefore adds some time delays to a small quantity of water within the “normal” water cycle. Human activity requires “fresh” rather than saline water, but all global activity uses less than 10×10¹² kg/year. Almost 92% of this is for agricultural use (crops, pasture and animals), around 4% for industrial use and less than 4% for domestic use.

The primary – and overwhelmingly dominant – source of fresh water is precipitation (rain, snow, sleet, hail, fog, dew …) from the atmosphere. Global precipitation gives about 506×10¹⁵ kg/year, but most of this is over the oceans. Just over the land mass of the earth, precipitation gives about 108×10¹⁵ kg/year of fresh water. All human activity thus uses only about 0.01% (one ten-thousandth) of the precipitation that occurs just over land. Fresh water scarcity is clearly non-existent at a global level and when averaged over time. Scarcities – and some severe scarcities – occur locally, at certain times.

At any instant in time the atmosphere holds, on average, about 12.6×10¹⁵ kg (12.6 P kg), which itself represents about 1,300 years of total human consumption. (With a total precipitation rate of 506×10¹⁵ kg/year, water has a residence time in the atmosphere of only about 9 days.) It is not, therefore, that the atmosphere, acting as a global reservoir, holds too little which is the cause of the local scarcities.

Global reservoirs of fresh ground water (under the surface in liquid form) amount to 10×10¹⁸ kg, which is about 1 million years of human utilisation. But these reservoirs are not evenly spread around the world, and some are much deeper and less accessible than others. Lakes and swamps and rivers hold some 100×10¹⁵ kg of fresh water, which represents about 10,000 years of human utilisation. The global capacity of fresh water reservoirs on the surface (10,000 years) and of ground water (1 million years) poses no limitations on human utilisation. The geographical location of these and the seasonal nature of their flows can and do place limits, and that, in turn, when combined with a dearth of common sense, leads to local scarcities.
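The ratios above can be checked directly. A sketch using the quoted figures:

```python
# Ratios behind the figures above (masses in kg, flows in kg/year).
precip_global = 506e15   # total annual precipitation
precip_land = 108e15     # annual precipitation over land
human_use = 10e12        # total annual human utilisation
atmosphere = 12.6e15     # water held in the atmosphere at any instant
ground_water = 10e18     # global fresh ground water reservoirs
surface_fresh = 100e15   # lakes, swamps and rivers

print(f"human use / land precipitation: {human_use / precip_land:.2%}")
print(f"atmospheric residence: {atmosphere / precip_global * 365:.0f} days")
print(f"atmosphere as reservoir: {atmosphere / human_use:.0f} years of use")
print(f"ground water: {ground_water / human_use:,.0f} years of use")
print(f"surface fresh water: {surface_fresh / human_use:,.0f} years of use")
```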

When we include all the water thought to be trapped in the earth’s mantle, we don’t really know how much water there is on earth. We don’t really know how the water got to earth in the first place either. Unlike hydrogen, water vapour in the earth’s atmosphere does not leak out into space. Some surface water is continuously lost into the earth’s interior as tectonic plate movement leads to subduction of land mass. Equally, some water from the earth’s crust reaches the surface through volcanic eruptions, as hydroxyl radicals (OH) trapped within molten rocks are released by degassing. It is thought that the mass of water on earth has remained virtually constant for billions of years. Very little water is “destroyed” by decomposition into hydrogen and oxygen, and very little is “created” (primarily by combustion of hydrocarbons).

How the water came to be on earth is a matter of speculation. It could have been there in the original mix as elemental hydrogen and oxygen which later combined and was then trapped in molten rock. It may have been brought to the earth’s surface as the earth cooled and violent volcanic or tectonic activity brought degassing molten rocks to the surface. Or it could have come from some icy comets or asteroids or meteors which collided with the earth in its infancy. Studies of some zircons indicate that water was present on earth as far back as 4 billion years ago (with the earth’s formation put at 4.5 billion years ago).

The mass of the Earth is estimated to be 5.98×10²⁴ kg (5.98 Yotta kg). Hydrogen needs eight times its own mass of oxygen to make water. On earth the amount of oxygen available is far in excess of that needed to convert all the available hydrogen to water; water on earth is thus limited by hydrogen rather than oxygen. Just 0.8% of the oxygen present on earth would be sufficient to convert all the available hydrogen (260 ppm of the earth’s mass) to water. If all the available hydrogen could be converted into water, we could theoretically have 14×10²¹ kg of water. We can account for about 8.4×10²¹ kg, which is 60% of the theoretical maximum. Around 85% (as a crude estimate) lies in the deep mantle and 15% at or near the earth’s surface.
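The hydrogen-limited maximum follows from the mass ratios in H₂O: every 2 mass units of hydrogen bind 16 of oxygen, so water weighs nine times its hydrogen content. A sketch of the mass balance:

```python
# Mass balance for the hydrogen-limited upper bound on earth's water.
earth_mass_kg = 5.98e24
hydrogen_ppm = 260                  # hydrogen as ppm of earth's mass
hydrogen_kg = earth_mass_kg * hydrogen_ppm * 1e-6

# In H2O, 2 g of hydrogen binds 16 g of oxygen, so water = 9 x hydrogen
# and the oxygen needed = 8 x hydrogen.
max_water_kg = 9 * hydrogen_kg
oxygen_needed_kg = 8 * hydrogen_kg

accounted_water_kg = 8.4e21         # water actually accounted for
print(f"theoretical maximum water: {max_water_kg:.1e} kg")
print(f"accounted for: {accounted_water_kg / max_water_kg:.0%} of maximum")
```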

Water at or near the surface, in all its forms (including atmospheric water but excluding water in the mantle), is estimated to have a mass of 1.386×10²¹ kg (1.386×10⁹ cubic kilometers). However, about five times that (the estimated range is from 2 to 10 times) is possibly held in the mantle within molten rocks. Of the total surface water, about 97.5% is salt water and only 2.5% is fresh water (35×10¹⁸ kg). The fresh water “storage capacity” is mainly in solid form (ice sheets, glaciers), which accounts for about 68.7% of the fresh water (24×10¹⁸ kg). The remainder is mainly ground water (10.5×10¹⁸ kg), with about 0.4×10¹⁸ kg as fresh water on the earth’s surface. Around 69% of this surface fresh water is in the form of ice and permafrost and not readily accessible. Lakes and swamps and rivers hold some 100×10¹⁵ kg. The water held in all living matter accounts for only about 1.1×10¹⁵ kg.

Water reservoirs on earth

1 Peta kg (P kg) = 10¹⁵ kg

The Water cycle

It is not just the size of the global reservoirs that matters. More important for human utilisation are the flows making up the water cycle. The dominant loop is the manner in which fresh water enters the atmosphere and then returns to earth. The saline water of the oceans covering some 70% of the earth’s surface is converted to fresh water mainly by evaporation but also by freezing (mainly near the poles). Only the evaporation is of significance for water entering the atmosphere. Ice does sublime and contribute to the water input to the atmosphere but this amount is minuscule compared to the evaporation from the ocean surfaces. Over the earth’s land mass, water enters the atmosphere by evaporation from the soil and open water surfaces. A very small amount is also transferred by transpiration from living things.

Human consumption hardly shows up in the big picture of the water cycle. Of the precipitation which falls on land 70% is re-evaporated. About 2% goes into ground water and thence to the ocean. The remaining 28% returns to the ocean as surface water run-off. The human part which is only about 0.01% of the precipitation on land is just a small side loop in the surface water run-off section and an insignificant loop in the ground water part of the cycle.

Consumption and scarcity

The water footprint of humanity

The WF of the global average consumer was 1,385 m³/y. The average consumer in the United States has a WF of 2,842 m³/y, whereas the average citizens in China and India have WFs of 1,071 and 1,089 m³/y, respectively. Consumption of cereal products gives the largest contribution to the WF of the average consumer (27%), followed by meat (22%) and milk products (7%).

Human “consumption” today, with a population of 7.1 billion, comprises about 9.2×10¹² kg/year for agriculture and domestic animals, 0.45×10¹² kg/year for industrial use and about 0.35×10¹² kg/year for domestic use (10×10¹² kg/year in total). Around 1.5 billion humans are subject to scarcities. If we add an additional 30% utilisation (to give a total of 13×10¹² kg/year), at the right times and in the right places, it would suffice to eliminate all scarcities. Suppose, further, that global population increases to around 10 billion by 2100. To avoid scarcities, utilisation would then need to be ramped up to 18.5×10¹² kg/year. Precipitation over land would still be about 6,000 times greater than this need. The run-off through lakes and rivers into the oceans would represent about 2,000 times the desired human consumption. Even with the increased utilisation, the global capacity of fresh water reservoirs on the surface (6,000 years) and of ground water (600,000 years) would pose no limitations on human utilisation.
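The scaling in that projection can be sketched from the post’s own figures (the 30% run-off share assumed here combines the 28% surface run-off and 2% ground water flows mentioned earlier):

```python
# Scaling the scarcity-free consumption figure to 10 billion people.
current_population = 7.1e9
current_use = 13e12          # kg/year, after adding 30% to end scarcities
future_population = 10e9
future_use = current_use * future_population / current_population

precip_land = 108e15         # annual precipitation over land, kg/year
runoff = 0.30 * precip_land  # assumed: 28% surface run-off + 2% ground water

print(f"future use: {future_use:.1e} kg/year")  # ~1.8e13, i.e. ~18.5e12
print(f"precipitation margin: {precip_land / future_use:,.0f}x")
print(f"run-off margin: {runoff / future_use:,.0f}x")
```

The margins come out at roughly 6,000 times for precipitation over land and roughly 2,000 times for run-off, matching the figures above.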


There is no shortage of water. Fresh water scarcities are not a problem of global or temporal availability but essentially of local acquisition, storage and distribution. The solution lies locally – not in grandiose global campaigns. The solutions are relatively simple – though not necessarily inexpensive – but they require intelligent planning and implementation of local acquisition, distribution and storage.













