Archive for the ‘Alarmism’ Category

Birth and the 116 other things which increase cancer risk

October 29, 2015

The good old WHO.

I suppose they do do some good, but they also make some horrible blunders, as with the UN-introduced cholera epidemic in Haiti, the initial downplaying of the Ebola outbreak in some African countries, or when their panel members take money from vaccine manufacturers to recommend mass flu vaccination programs. As with all UN organisations, the staff are a mixture of professionals, surrounded by bureaucrats with political agendas from their home countries, and some members from partisan lobby groups who promote their own causes and self-interests. WHO panels which recommend certain drugs or mass vaccination programs always seem to contain members with commercial ties to the pharmaceutical industry. Many in the WHO justify their alarmist tactics as a means to stimulate or trigger action, and – inevitably – many of these actions are totally unnecessary (but they are often very lucrative for some members of the WHO and their sponsors).

Now the WHO are going after processed and even red meat as causing cancer. But they have had to torture their data to calculate the risk. They forget that living is risk. Not being born, however, carries no risk of dying of anything. Therefore, the risk of cancer due to being born is far, far greater than that introduced by any other parameter or substance.  I won’t be changing my meat eating habits just yet.

Their list of 116 other things – besides birth – that increase the risk of cancer is taken from the Daily Mail.

1. Tobacco smoking

2. Sunlamps and sunbeds

3. Aluminium production

4. Arsenic in drinking water

5. Auramine production

6. Boot and shoe manufacture and repair

7. Chimney sweeping

8. Coal gasification

9. Coal tar distillation

10. Coke (fuel) production

11. Furniture and cabinet making

12. Haematite mining (underground) with exposure to radon

13. Secondhand smoke

14. Iron and steel founding

15. Isopropanol manufacture (strong-acid process)

16. Magenta dye manufacturing

17. Occupational exposure as a painter

18. Paving and roofing with coal-tar pitch

19. Rubber industry

20. Occupational exposure to strong inorganic acid mists containing sulphuric acid

21. Naturally occurring mixtures of aflatoxins (produced by fungi)

22. Alcoholic beverages

23. Areca nut – often chewed with betel leaf

24. Betel quid without tobacco

25. Betel quid with tobacco

26. Coal tar pitches

27. Coal tars

28. Indoor emissions from household combustion of coal

29. Diesel exhaust

30. Mineral oils, untreated and mildly treated

31. Phenacetin, a pain and fever reducing drug

32. Plants containing aristolochic acid (used in Chinese herbal medicine)

33. Polychlorinated biphenyls (PCBs) – widely used in electrical equipment in the past, banned in many countries in the 1970s

34. Chinese-style salted fish

35. Shale oils

36. Soots

37. Smokeless tobacco products

38. Wood dust

39. Processed meat

40. Acetaldehyde

41. 4-Aminobiphenyl

42. Aristolochic acids and plants containing them

43. Asbestos

44. Arsenic and arsenic compounds

45. Azathioprine

46. Benzene

47. Benzidine

48. Benzo[a]pyrene

49. Beryllium and beryllium compounds

50. Chlornapazine (N,N-Bis(2-chloroethyl)-2-naphthylamine)

51. Bis(chloromethyl)ether

52. Chloromethyl methyl ether

53. 1,3-Butadiene

54. 1,4-Butanediol dimethanesulfonate (Busulphan, Myleran)

55. Cadmium and cadmium compounds

56. Chlorambucil

57. Methyl-CCNU (1-(2-Chloroethyl)-3-(4-methylcyclohexyl)-1-nitrosourea; Semustine)

58. Chromium(VI) compounds

59. Ciclosporin

60. Contraceptives, hormonal, combined forms (those containing both oestrogen and a progestogen)

61. Contraceptives, oral, sequential forms of hormonal contraception (a period of oestrogen-only followed by a period of both oestrogen and a progestogen)

62. Cyclophosphamide

63. Diethylstilboestrol

64. Dyes metabolized to benzidine

65. Epstein-Barr virus

66. Oestrogens, nonsteroidal

67. Oestrogens, steroidal

68. Oestrogen therapy, postmenopausal

69. Ethanol in alcoholic beverages

70. Erionite

71. Ethylene oxide

72. Etoposide alone and in combination with cisplatin and bleomycin

73. Formaldehyde

74. Gallium arsenide

75. Helicobacter pylori (infection with)

76. Hepatitis B virus (chronic infection with)

77. Hepatitis C virus (chronic infection with)

78. Herbal remedies containing plant species of the genus Aristolochia

79. Human immunodeficiency virus type 1 (infection with)

80. Human papillomavirus type 16, 18, 31, 33, 35, 39, 45, 51, 52, 56, 58, 59 and 66

81. Human T-cell lymphotropic virus type-I

82. Melphalan

83. Methoxsalen (8-Methoxypsoralen) plus ultraviolet A-radiation

84. 4,4′-methylene-bis(2-chloroaniline) (MOCA)

85. MOPP and other combined chemotherapy including alkylating agents

86. Mustard gas (sulphur mustard)

87. 2-Naphthylamine

88. Neutron radiation

89. Nickel compounds

90. 4-(N-Nitrosomethylamino)-1-(3-pyridyl)-1-butanone (NNK)

91. N-Nitrosonornicotine (NNN)

92. Opisthorchis viverrini (infection with)

93. Outdoor air pollution

94. Particulate matter in outdoor air pollution

95. Phosphorus-32, as phosphate

96. Plutonium-239 and its decay products (may contain plutonium-240 and other isotopes), as aerosols

97. Radioiodines, short-lived isotopes, including iodine-131, from atomic reactor accidents and nuclear weapons detonation (exposure during childhood)

98. Radionuclides, α-particle-emitting, internally deposited

99. Radionuclides, β-particle-emitting, internally deposited

100. Radium-224 and its decay products

101. Radium-226 and its decay products

102. Radium-228 and its decay products

103. Radon-222 and its decay products

104. Schistosoma haematobium (infection with)

105. Silica, crystalline (inhaled in the form of quartz or cristobalite from occupational sources)

106. Solar radiation

107. Talc containing asbestiform fibres

108. Tamoxifen

109. 2,3,7,8-tetrachlorodibenzo-para-dioxin

110. Thiotepa (1,1′,1′-phosphinothioylidynetrisaziridine)

111. Thorium-232 and its decay products, administered intravenously as a colloidal dispersion of thorium-232 dioxide

112. Treosulfan

113. Ortho-toluidine

114. Vinyl chloride

115. Ultraviolet radiation

116. X-radiation and gamma radiation

 

French Mathematical Society: “Battle against global warming an absurd, costly and pointless crusade”

October 27, 2015

The French Society of Mathematical Calculation (Société de Calcul Mathématique SA) has released a white paper entitled

The battle against global warming: an absurd, costly and pointless crusade

French Mathematical Calculation Society White Paper

The crusade is absurd

There is not a single fact, figure or observation that leads us to conclude that the world’s climate is in any way “disturbed”. It is variable, as it has always been, but rather less so now than during certain periods or geological eras. Modern methods are far from being able to accurately measure the planet’s global temperature even today, so measurements made 50 or 100 years ago are even less reliable. Concentrations of CO2 vary, as they always have done; the figures that are being released are biased and dishonest. Rising sea levels are a normal phenomenon linked to upthrust buoyancy; they are nothing to do with so-called global warming. As for extreme weather events – they are no more frequent now than they have been in the past. We ourselves have processed the raw data on hurricanes. We are being told that “a temperature increase of more than 2ºC by comparison with the beginning of the industrial age would have dramatic consequences, and absolutely has to be prevented”. When they hear this, people worry: hasn’t there already been an increase of 1.9ºC? Actually, no: the figures for the period 1995-2015 show an upward trend of about 1ºC every hundred years! Of course, these figures, which contradict public policies, are never brought to public attention.

…..

We are not in a position to question the composition of the IPCC, or its legitimacy and policy decisions, and we shall not do so. However, as mathematicians, we have every right to respond to the following question: if the IPCC’s work were to be submitted for publication in a reputable scientific journal, would it be accepted? This decision is the task of a referee, in a procedure that is common practice in the sciences. The answer is very simple: no sensible, high-quality journal would publish the IPCC’s work. The IPCC’s conclusions go against observed facts; the figures used are deliberately chosen to support its conclusions (with no regard for the most basic scientific honesty), and the natural variability of phenomena is passed over without comment. The IPCC’s report fails to respect the fundamental rules of scientific research and could not be published in any review with a reading panel.

Of course these mathematicians could not possibly be part of the great 97%.

The 97% reminds me of how we used to present “market share” when we were looking to increase the marketing budget. Whenever necessary we could always show that we had 100% market share for the machines we had sold, or that we had virtually zero market share of a market defined sufficiently wide. And any number in-between was a simple matter of defining the market.

Thus, 97% of those who believe in man-made global warming believe in man-made global warming.

Hurricane Patricia – 200 became 165, potentially catastrophic became no significant damage – why the hype?

October 24, 2015

It was the strongest storm ever recorded. It was the most dangerous storm in history. Such strong winds had never been recorded before. It was going to cause catastrophic damage. The banner headlines were spread in the mainstream media across the globe.

But Patricia has fizzled to a Category 2 storm just a few hours after landfall. No serious damage so far.

Was there an agenda to the world-wide hype? Remember the hype about Hurricane Joaquin just a few weeks ago: it didn’t even make landfall, and in the course of just a few days its forecast course changed by some 6,000 km. Remember Typhoon Haiyan: strongest-ever winds of 200 mph were predicted hours before landfall, yet by the time it made landfall it was a tropical storm. Is there a pattern to the hype?

Hurricane Joaquin post mortem -- tracks from RealScience

This morning I learn that Hurricane Patricia has made landfall and is weakening. The expected 200 mph winds had become less than 165 mph by landfall. From Category 5 it has been downgraded to Category 2. So far only minor landslides and fallen trees have been reported. How come there were “destructive winds and rain” but “heavy damage was avoided”? Destructive without damage? We used to call that non-destructive.

BBC: 

The storm touched down in western Mexico, bringing destructive winds and rain, but heavy damage appears to have been avoided.

The US National Hurricane Center said the hurricane hit as a Category Five storm – the highest classification.

It said “life-threatening flash floods and mudslides” were now likely.

The states of Nayarit, Jalisco, Colima, Michoacan, and Guerrero are in particular danger as the storm moves inland, the centre says.

Four hours after making landfall as the strongest recorded hurricane, Patricia weakened to a Category Four, and is likely to be downgraded to a tropical storm in the coming hours as it passes over mountainous regions.

“The first reports confirm that the damage has been smaller than that corresponding to a hurricane of this magnitude,” Mexico’s president, Enrique Pena Nieto, said in a televised address.

Mexican federal police said only “minor landslides and fallen trees” had so far been reported in Colima.

I don’t suppose that the hype has anything to do with the approaching Paris conference on wealth distribution (ostensibly about global warming)?

I note that Mexico is expecting to be a beneficiary from the Paris largesse.

NASA alarmists predict 99.9% probability of LA earthquake in 3 years, but US Geological Survey is sharply critical

October 22, 2015

A new NASA paper published in Earth and Space Science claims that “For a M ≥ 5 earthquake within a circle of radius 100 km, and over the 3 years following 1 April 2015, the probability is 99.9%”.

But the US Geological Survey was very quick to criticise the methods and the conclusion.

NASA was once an unimpeachable science source. No longer. That brand value has been badly impaired. There is far too much exaggeration and hype. There are peripheral sections of NASA which seem to revel in alarmism. This is especially visible when they pontificate about areas which are not their core business. Just because the radar, aerial or space-based images may originate with NASA, some think it lends them authority on subjects they are not expert in. Perhaps it is also the chase for publications and notoriety from some sections of the organisation who feel their work does not get enough publicity. NASA statements about potential natural disasters always seem to be highly exaggerated for effect. This includes storms, hurricanes, climate change and now earthquakes. Even when they do have something to say they tend to overdo the hype (as with the recent press conference about Martian water). The alarmist theme is encroaching even into the core areas. Have you noticed how many recent asteroids have been highlighted as “not being any danger”? Reverse psychology being applied by NASA perhaps, to inject some alarm into situations which hold not the slightest danger and which otherwise would have passed unremarked?

Naturally NASA issued a press release. However, even the NASA PR machine did not have the cheek to highlight the main conclusion, which is in the discussion section (Section 5) of the paper:

The calculated probability for a M ≥ 6 earthquake within a circle of radius 100 km, and over the 3 years following 1 April 2015, is 35%. For a M ≥ 5 earthquake within a circle of radius 100 km, and over the 3 years following 1 April 2015, the probability is 99.9%.

A 99.9% chance is as close to certainty as a prediction could ever get. But the US Geological Survey was not amused by these upstart alarmists. It took to Facebook and was quick to publish a sharply critical put-down.

USGS Statement on JPL La Habra Study in the news:

This paper claims a 99.9% probability of an earthquake of magnitude 5 or greater occurring in the next 3 years within a large area of Southern California without providing a clear description of how these numbers were derived. The area—a 100 km radius circle centered on the city of La Habra—is a known seismically active area. For this same area, the community developed and accepted model of earthquake occurrence, “UCERF3”, which is the basis of the USGS National Seismic Hazard Maps, gives a 3-year probability of 85%. In other words, the accepted random chance of a M5 or greater in this area in 3 years is 85%, independent of the analysis in this paper.

While the earthquake forecast presented in this paper has been published in the online journal Earth and Space Sciences, it has not yet been examined by the long-established committees that evaluate earthquake forecasts and predictions made by scientists. These committees, the California Earthquake Prediction Evaluation Council, which advises the California Office of Emergency Services, and the National Earthquake Prediction Evaluation Council, which advises the U.S. Geological Survey, were established to provide expert, independent assessment of earthquake predictions.

The earthquake rate implied by the 99.9% probability is significantly higher than observed at any time previously in Southern California, and the lack of details on the method of analysis makes a critical assessment of this approach very difficult. Therefore, the USGS does not consider the analysis presented in this paper a reason to change our assessment of the hazard.

http://pubs.usgs.gov/fs/2015/3009/

“Therefore, the USGS does not consider the analysis presented in this paper a reason to change our assessment of the hazard” effectively says that the USGS does not think this paper has any significance. 
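The gap between the two probabilities is easier to grasp as implied earthquake rates. Assuming a simple Poisson model (my assumption, not anything stated by JPL or the USGS), the probability of at least one event in T years at annual rate λ is 1 − exp(−λT), so each quoted probability implies a rate:

```python
import math

def implied_rate(p: float, years: float) -> float:
    """Annual event rate implied by probability p of at least one
    event in the given window, under a Poisson model."""
    return -math.log(1.0 - p) / years

# NASA/JPL paper: 99.9% chance of an M >= 5 quake within 3 years
nasa_rate = implied_rate(0.999, 3.0)

# USGS UCERF3 baseline for the same area and window: 85%
usgs_rate = implied_rate(0.85, 3.0)

print(f"NASA implied rate: {nasa_rate:.2f}/yr")   # ~2.30/yr
print(f"USGS implied rate: {usgs_rate:.2f}/yr")   # ~0.63/yr
print(f"ratio: {nasa_rate / usgs_rate:.1f}x")     # ~3.6x
```

Under that (admittedly crude) model, the 99.9% figure implies an M ≥ 5 rate more than three times the accepted background rate, which puts some numbers behind the USGS remark that the implied rate is “significantly higher than observed at any time previously in Southern California”.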

One wonders – from the USGS comments – how this paper got to be published. The peer review applied to this paper seems a little suspect. None of the “peers” came from the USGS, apparently. Was it just a “pal” review? In recent times, in my perception, many of the peripheral NASA sections publish papers with little substance just to say “look how good we are”. I suppose they are deemed necessary to maintain department budgets.

Outside of its own core areas, NASA is strongly in the alarmist camp. It probably thinks this helps funding. But perhaps NASA needs to take stock of the damage being done to its brand every time it chooses the alarmist route.

I think I will go with the US Geological Survey in this case and their more nuanced probabilities over 30 years.

US Geological Survey 30 year Uniform California Earthquake Rupture Forecast

India and China have already won and the Paris climate conference has become irrelevant

October 20, 2015
Paris conference

Paris conference

India and China have successfully managed to get the UN to focus on the intensity of emissions per unit of GDP and thus can make promises (not legally binding) about future emissions tied to GDP such that they will not be limited in their use of coal in any significant way.

The hype about the UN’s December climate meeting in Paris is gradually growing. Media, politically correct politicians and the global warming religion’s orthodoxy are winding up their rhetoric. Ostensibly the goal is to demonise carbon and to get nations to commit to reducing fossil fuel use such that the global temperature rise “will not exceed 2ºC”. This target of “allowable” temperature rise is not “2ºC caused by man” but just “2ºC”. Nobody actually knows what the rise by “natural causes” might be and what is caused by man. “Global temperature” itself is an artefact, a calculated quantity and calculated by those with a vested interest in showing that it is increasing. It seems that the calculation method is conveniently variable and is adjusted every year to show that the current year has demonstrated the highest ever temperature. Nevertheless the 5,000 participants and 190+ countries have effectively set themselves up to discuss commitments to stop climate change itself. The arrogance is astounding and worthy of King Cnut.

What effect man has actually had on climate is unknown. For almost 20 years now, man-made carbon dioxide emissions have been growing explosively, but “global temperature” has paused. Those countries which have increased their own costs of electricity by reducing fossil fuel use (mainly in Europe) have done so quite uselessly and unnecessarily. Other countries (China and India in the main) have increased their use of fossil fuels such that global emissions of carbon dioxide have continued to grow. And yet there has been no change in “global temperature” except by arithmetical tricks. The last three decades of reducing fossil fuel use in Europe have thus achieved nothing, and three decades of subsidising renewable energy have still not made it commercial in its own right.

Climate policies are all policies where the objectives are not measurable. Policies are being proposed where the effect of the policies on climate itself cannot be measured. All that can be measured are the actions themselves which is both trivial and meaningless. For example countries can measure amounts of money spent but have no clue as to what the resultant effect on climate may be. Emissions reductions can be measured, but not the actual climate effects such reductions may have caused or not caused. For many delegates the purpose is not climate but the redistribution of wealth among nations where climate policy is the vehicle.

Ask a politician what his country’s climate policies will achieve and the answer is that they will “contribute to the world’s efforts to stop climate change”. But by how much, and how success can be measured, are unknowns. It has become a matter of solidarity among nations, not of policies with objectives. Not a single country (nor any politician, nor any so-called climate scientist) has any inkling of what its climate policies will achieve for climate, or whether they will achieve anything at all.

Some of the more savvy politicians and countries have figured out ways to seem to support political correctness while ensuring that their continued – and increasing – use of fossil fuels is not constrained in practice. For India and China the continued use of fossil fuels is critical and necessary for their growth. For the next 20 – 30 years, their carbon dioxide emissions are going to increase regardless of what the Paris meeting decides. India has proposed policies which seem – at first sight – to be drastic reductions in the “intensity of carbon dioxide emissions per unit of GDP”, but defined in terms of growth such that coal consumption will treble in the 25 years from 2005. India has now said it will cut emissions intensity by up to 25% from 2005 levels by 2020. China has also said it will reduce the intensity of carbon dioxide emissions per unit of GDP in 2020 by 40 to 45 percent compared with the level of 2005.

India’s GDP has grown from $0.8 trillion in 2005 to be about $2.1 trillion in 2014. China’s GDP has already grown from $2.3 trillion in 2005 to $10.3 trillion in 2014. These “promises” based on GDP are not even going to be legally binding  and there is certainly no cap to the GDP which can be aimed for or achieved. The GDP targets for India and China inherently require a mix of fuels to be used for electricity generation; coal, gas, nuclear and hydro primarily. Solar and wind power may have a large installed capacity and may contribute something to the growth but are not necessary or critical. The Indian and Chinese plans for using more gas and nuclear in their mix automatically brings down the carbon intensity per GDP from the levels of 2005 when both countries were heavily dependent on coal. Their coal plans can therefore proceed unimpeded while still meeting their “promises”. Both countries are relying on GDP growth to effectively reduce their “intensities of carbon emission” without having to reduce the rate at which they increase planned fossil fuel use or carbon dioxide emissions. Both India and China have reached the stage of development where electricity consumption growth is now lower than GDP growth. Both are at low levels of energy utilisation efficiency such that significant demand side improvements can be made. With around 7% growth in India and even with China reducing to, say, 6% growth, the reductions of intensity of carbon dioxide emissions per unit of GDP are impossible to prevent.
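The arithmetic behind this is simple: emissions intensity is just emissions divided by GDP, so fast GDP growth alone drives intensity down even while absolute emissions rise. A minimal sketch, using the Chinese GDP figures above but purely illustrative emissions numbers of my own (an emissions index, not official data):

```python
def intensity_cut(gdp_start, gdp_end, em_start, em_end):
    """Percentage reduction in emissions intensity (emissions/GDP)
    between two years."""
    start = em_start / gdp_start
    end = em_end / gdp_end
    return 100.0 * (1.0 - end / start)

# China's GDP grew from $2.3tn (2005) to $10.3tn (2014).
# Suppose, purely for illustration, that emissions doubled over
# the same period (index 100 -> 200):
cut = intensity_cut(2.3, 10.3, 100.0, 200.0)
print(f"intensity reduction: {cut:.0f}%")  # ~55%
```

Even with emissions doubling, a roughly 4.5-fold GDP growth cuts intensity by about 55% – already past the promised 40 to 45 percent – which is precisely the point: an intensity “promise” constrains nothing.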

Any agreement in Paris will mean India trebling and China doubling its coal burn by 2030. And with “official” sanction to do so. So what “success” in Paris means is that global, man-made, carbon dioxide emissions are going to double (at least). And it also means that any carbon dioxide emission reductions promised by other countries are of no significance whatsoever. It is a very good thing that man-made, carbon dioxide emissions have no significant impact on global temperature.

And the Paris conference is both meaningless and irrelevant.

What Boston took away from the sea

October 19, 2015

Now they say that Boston may be threatened by rising sea levels – except that Boston is still expanding by reclaiming land from the sea. And sea levels are not actually rising any faster than they have over the millennia of recovery from the last ice age. Just as Arctic ice seems to be quite “normal” while Antarctic ice cover is on the high side.

The marshland that has been filled in to allow Boston to expand over the last 150 years is shown very nicely in this animation (from 2010) from Joost Bonsen at Maximising Progress.

 

Tomorrow, 7th October, the world will end in fire

October 6, 2015

Here we go again.

The world ends tomorrow in fire. This time it is the eBible Fellowship, who reckon that the gates of Heaven were closed on 21st May 2011. Then, by some strange and convoluted machinations, they arrive at the end of the world 1,600 days later – which brings us to tomorrow, 7th October 2015.

You have been warned. Prepare to meet thy doom.

I would if I could, but I have a cold.

eBible: In our previous Bible pamphlet entitled “Spiritual Judgment Began May 21, 2011” we demonstrated God’s propensity toward bringing spiritual judgments to pass; the judgment upon mankind in the garden of Eden; the judgment of God upon the Lord Jesus Christ in the garden of Gethsemane; and the judgment upon the corporate churches of the world. These were all spiritual judgments and in bringing about a spiritual judgment on the world (beginning May 21, 2011) the Lord is following this biblical pattern.

On May 21, 2011 the Bible indicates that God shut the door of heaven.

…. Therefore, 1600 furlongs may be viewed as 1600 days. If we go 1600 days from May 21, 2011 we arrive at October 7, 2015. We now have a time path that leads us to the date of October 7, 2015. ….

…. The Bible reveals to us that God has been punishing the unsaved people of the earth since May 21, 2011 and, simultaneously, through the number 1600, the Bible also reveals that God has been severely trying all those that are considered true believers. October 7, 2015 would be the 1600th day since May 21, 2011 and, therefore, the testing would be finished on that day.
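Whatever one makes of furlongs-as-days, the date arithmetic itself does check out: 1,600 days after 21 May 2011 lands exactly on 7 October 2015.

```python
from datetime import date, timedelta

start = date(2011, 5, 21)
end = start + timedelta(days=1600)
print(end)  # 2015-10-07
```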

Even the Guardian is impressed.

The Guardian: The eBible Fellowship, an online affiliation headquartered near Philadelphia, has based its prediction of an October obliteration on a previous claim that the world would end on 21 May 2011. While that claim proved to be false, the organization is confident it has the correct date this time.

“According to what the Bible is presenting it does appear that 7 October will be the day that God has spoken of: in which, the world will pass away,” said Chris McCann, the leader and founder of the fellowship, an online gathering of Christians headquartered in Philadelphia.

“It’ll be gone forever. Annihilated.”

McCann said that, according to his interpretation of the Bible, the world will be obliterated “with fire”.

When the population implosion threatens …..

October 6, 2015

By 2050, virtually all parts of the world, except some parts of Africa, will be witnessing a decline in population. Until then, migrations of peoples will serve to maintain the ratio of productive to “non-productive” people. (By then, the non-productive will probably be defined as those under 20 and those over about 70). But going forward, migration from declining source populations will no longer be able to provide even a temporary solution.

The fundamental decline in fertility rates will be a consequence of the widespread acceptance of women’s rights, the increasing liberation of women in Asia and Africa, and the ready availability of contraception and abortion. Increasing longevity will mitigate population decline to some extent but will exacerbate the declining ratio of productive to unproductive population. The threat will be of an accelerating decline. Alarmism will no longer be about “population explosions” but about the coming “population implosion”. The decline of rural populations will threaten food supplies, though mitigated by increasing automation and genetically modified crops. Growth will be limited, not so much by capital or raw materials, but by the availability of personnel. In developed countries, tax revenues will stagnate or begin to decline.

Sustainable communities, somewhat smaller than the current nation states, will start husbanding their people resources and their (local) tax revenues. “National” programs – health, education and even infrastructure – will increasingly shift to be “local”. The immigration issue of the day will be about preventing any influx of non-productive peoples. Incentives will be offered to attract productive businesses and people. Some isolated areas already below critical populations or population-mixes, which survive only on subsidies, will be “abandoned” financially. That will, in turn, shift people to areas which exceed the critical mass for the provision of welfare and other services. Successful communities will be those which attract productive people and provision of local jobs, education and health services will be the competitive factors. Education services provided will be linked to performance. Health services to be provided for any individual will be judged by the cost of the service against the benefit of the individual’s remaining productive life. Health services for the elderly will gradually be removed from welfare services and will all have to be purchased. Assisted deaths for the elderly will be as readily available as abortions.

The globalisation paradigm which would have been in effect for a century will shift to a new “localisation” meme.

As power to raise revenues is devolved increasingly to smaller, sustainable communities, “national” defense budgets will be slashed. Expansionism will no longer make any sense. Conflicts may still occur over resources (water, rare metals, rare earths ….) but will decline as population declines. Virtually every local government will then be engaged in trying to increase fertility rates. Tax breaks and extra payments will be available for every child. “Political correctness” will shift to the having of many children. But all these measures will not have much effect in increasing fertility rates.

Surrogacy will pay very well until the artificial womb is developed. That will be the game changer. Then community governments will move to control artificial fertilisation from donor sperm and eggs. The birth of children will move into the “public” sphere. Genetic scanning will be increasingly used prior to allowing a foetus to develop in an artificial womb. Humans will then only be required to supply their sperm or their eggs. They will no longer be required to perform as parents. Mating will no longer be an activity connected to the production of children. The children will be brought up in community creches. The fertility rate will become a completely controllable parameter. Eventually, so will the genetic make up of the children being produced. Some will have their genes tailored to meet some specific community need. Others will be mass-produced when “drones” are required in large numbers. The most powerful committee in any community will then be that which chooses which egg will be fertilised by which sperm.

It is population decline which will lead inevitably and remorselessly to the Brave New World.

Hurricane Joaquin moves away

October 3, 2015

US hurricane strikes are at an all-time low and there has been much alarmist excitement this week about the possibility of hurricane Joaquin striking the US mainland. First, it was going to strike N Carolina. Emergencies were declared. Then it was going to hit New York. Emergencies were extended. The hurricane declined to follow the computer models. Now Boston and New England were the targets claimed the modellers. Hurricane Joaquin moved even slower and moved further East.

The latest is that it is now headed for Bermuda. Some of the UK media are now concerned it may hit the UK. This is not to underestimate the power of the hurricane: a cargo ship with 33 on board which was near the eye has gone missing. But so much for the infallibility of the computer models of those who claim that climate is a settled “science”. “It is weather that is variable and unpredictable”, I hear them cry. “The climate is perfectly predictable and the science is settled”. Ah yes. But climate is nothing more than weather integrated over space and time.

Modellers who change the data, rather than changing the mathematical models which don’t fit reality, ought to be beyond the pale. But apparently it is acceptable to fudge data if it preserves political correctness.

Joaquin 1

Joaquin 2

This image from RealScience shows how the predicted track has moved.

In spite of all the alarm about extreme weather, the reality is that US hurricane strikes are at an all-time low.

Data from: HURDAT Re-analysis Chronological List of All Hurricanes

There was no biodiversity to begin with

September 15, 2015

I was listening to some conservationists on the radio discussing the rate of loss of species and how this was a catastrophe in the making for biodiversity. It was an unsatisfactory discussion, mainly because they all made what I thought were quite unjustified assumptions. It was more political advocacy than any attempt to argue from evidence.

The “politically correct” view is that biodiversity (measured as the number of species in existence) is a “good thing”, that more species is “good” and fewer species is “bad”. Saving endangered species is also a “good” thing. That species are becoming extinct at an alarming rate is taken to mean, catastrophically, that a 6th mass extinction is nigh. But I find this viewpoint lacking in substance. We have more species existing today than ever before – probably too many. Mass extinctions have helped “clean out” the rubbish that evolution throws up. Extinction rates may be high, but that is hardly surprising when the number of species is so high. A 6th mass extinction may, in fact, be necessary. More species and more biodiversity is not always a good thing.

The fossil record shows that biodiversity in the world has been increasing dramatically for 200 million years and is likely to continue. The two mass extinctions in that period (at 201 million and 66 million years ago) slowed the trend only temporarily. Genera are the next taxonomic level up from species and are easier to detect in fossils. The Phanerozoic is the 540-million-year period in which animal life has proliferated. Chart created by and courtesy of University of Chicago paleontologists J. John Sepkoski, Jr. and David M. Raup.

An endangered species is one whose population is low and dangerously in decline. If numbers of individuals of a species are that low, then that species has already become irrelevant in its contribution to the functioning of the biosphere. It may well be a matter of regret, just as there is always regret when a language becomes extinct from disuse. But apart from providing entertainment value for humans, the saving of a few members of a doomed species provides no real benefit for the functioning of the biosphere. I would be very sorry to see tigers becoming extinct, but the reality is that their numbers are so low that they play no significant part in the sustenance of the biosphere. The role of a predator species is primarily to control the population of its prey. From a biodiversity point of view they are already irrelevant. Saving the tiger has nothing to do with maintaining a healthy biodiversity and everything to do with human entertainment (including that of the conservationists) and “feeling good”.

(I am of the opinion that helping an endangered species to survive can be desirable but then “conservation” should be based on helping that species to adapt genetically rather than to freeze it into an artificial habitat – zoos and reserves – to which it is not suited).

At one time there was just a single species from which all life derives – perhaps even just one living cell. (And even for creationists, all the diversity of humankind derives from a single mating pair – and the raging incest that this implies.) There was no biodiversity to begin with. Genetic variation with each generation, and genetic mutations, then caused new species to come into being: first to fill up the spaces that the prevailing environment allowed, and then to adapt to changing environments. If each generation of the first species had bred true there would, of course, be no biodiversity. Genetic variation and empty space in the environment lead to a growth in the number of species; overcrowding of a given space or drastic environmental change causes the decline and extinction of species. The prevailing level of “biodiversity” at any time is not, then, some target to be achieved, but just the current balance between the birth and death of species.

It seems almost self-evident to me that, for any given environment, there must be an optimum number of species, with particular combinations of characteristics, which allows the ecosystem or biosphere to be in a self-sustaining equilibrium (not growing or declining, but self-sustaining). This optimum will vary depending upon the characteristics of, and interactions between, the particular species existing and the available space in the prevailing environment. Having fewer than the optimum number of species in that environment would then mean that all the complex, interdependent interactions between species which seem to be necessary for sustaining each participating species would not be fully developed. I say “seem” because it is not certain that all interdependencies are necessarily of benefit to individual species. “It is the entire ecosystem which benefits”, I hear some say, but even that is more an assumption than a conclusion.

But what would happen in such a situation? If the interactions are truly necessary, then some of these species should logically be on the way to stagnation or to extinction. But it is not certain that some new equilibrium will not be reached. One species too few for a given environmental space will only lead to the space being occupied by an existing or a new species. One species too many for a given space will lead to the extinction of a redundant species, or to a number of species existing under genetic stress until genetic variation reduces that stress. The interactions between species in any environment are not planned in advance. They are just those that happen to prevail and survive because they succeed in the environmental space available. Too few species will give an increase in the number of species, until overcrowding reduces it again. A rapid change of environment and a reduction of the space available must give a decrease in the number of species making up the optimum for a self-sustaining biosphere.
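The birth/death balance described above can be sketched as a toy simulation. Everything in it is an illustrative assumption – the speciation and extinction probabilities, the “capacity” of the environmental space – not a biological model. It only shows the mechanism: a species count drifts towards the same balance point whichever side of it the count starts from.

```python
import random

def simulate_species(n0, capacity, steps, seed=0):
    """Toy model: a species count under space-dependent birth and death.

    Assumed (illustrative) rates:
      - speciation is more likely when environmental space is empty
        (count well below capacity);
      - extinction is more likely under overcrowding (count near or
        above capacity).
    """
    rng = random.Random(seed)
    n = n0
    for _ in range(steps):
        if rng.random() < max(0.0, 1.0 - n / capacity):
            n += 1  # a new species fills empty environmental space
        if rng.random() < min(1.0, 0.5 * n / capacity):
            n -= 1  # a crowded-out species goes extinct
        n = max(n, 1)
    return n

# Starting far below or far above "capacity", the count drifts towards
# the balance point where the two probabilities are equal (about 2/3 of
# capacity under these made-up rates), then fluctuates around it.
sparse = simulate_species(5, 100, 3000, seed=1)
crowded = simulate_species(250, 100, 3000, seed=2)
```

Under these invented rates the “prevailing level of biodiversity” is exactly what the post describes: not a target, just where births and deaths of species happen to cancel out for the space available.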

Generally, plant species have increased in the wake of human habitation.

For example, more than 4,000 plant species introduced into North America during the past 400 years grow naturally here and now constitute nearly 20 percent of the continent’s vascular plant biodiversity.

But then we try to eradicate “invasive” species even though that represents a decrease in biodiversity. Clearly some biodiversity “is not good”. We hunt down successful species as pests when they reach and thrive in new or empty environmental spaces. We protect and support unsuccessful (failed) species in the name of conservation and biodiversity. We have no qualms in trying to eradicate insects, microbes and bacteria which cause human disease even if biodiversity is consequently reduced. From the perspective of the biodiversity of the genetic pool, losing a species of some unknown bacteria may be just as significant as the extinction of the elephant.

The rate of growth of the human species has meant that other species have not been able to adapt fast enough – genetically – to their loss of habitat or to increased competition. The environmental space available to them has drastically reduced. But that is reality. Creating artificially unsustainable habitats will not change that. The optimum level of biodiversity for the environmental space of today is different to that of 100 years ago. Biodiversity cannot be considered independently of the environmental space available. Conservationism which seeks to maintain the wrong level of biodiversity for the available space seems to me both futile and stupid – especially when conservationism has no idea what the “optimum” level of biodiversity is, or whether the current level lies above or below it.