Archive for the ‘Alarmism’ Category

Global population will likely start declining by 2070

July 19, 2017

The alarmist meme of a global population explosion leading to catastrophic depletion of resources and mass famines was already obsolete 20 years ago. The alarmism peaked in the 1970s and 1980s (peak oil, peak food, peak water, peak resources …). At least the panicky stridency about a population explosion has long gone (though it has now shifted to become panicky stridency about catastrophic global warming).

The 2017 Revision of the UN World Population Prospects is now available.

  • The world’s population reached nearly 7.6 billion in mid-2017. The world has added one billion people since 2005 and two billion since 1993. In 2017, an estimated 50.4 per cent of the world’s population was male and 49.6 per cent female.
  • The global population is expected to reach 8.6 billion in 2030, 9.8 billion in 2050 and 11.2 billion in 2100, according to the medium-variant projection.

However, the projections are very sensitive to fertility rates and how they develop – especially in Africa. Actual fertility rates have consistently tended to fall below the UN’s fertility projections.

In the 2017 Revision, the sensitivity to the fertility rate is highlighted:

  • Future population growth is highly dependent on the path that future fertility will take, as relatively small changes in the frequency of childbearing, when projected over several decades, can generate large differences in total population.
  • In the medium-variant projection, it is assumed that the global fertility level will decline from 2.5 births per woman in 2010-2015 to 2.2 in 2045-2050, and then fall to 2.0 by 2095-2100.
  • Fertility levels consistently half a child below the assumption used for the medium variant would lead to a global population of 8.8 billion at mid-century, declining to 7.3 billion in 2100.
  • Fertility has declined in virtually all regions of the world. In Africa, where fertility levels are the highest of any region, total fertility has fallen from 5.1 births per woman in 2000-2005 to 4.7 in 2010-2015. Over the same period, fertility levels also fell in Asia (from 2.4 to 2.2), Latin America and the Caribbean (from 2.5 to 2.1), and Northern America (from 2.0 to 1.85). Europe has been an exception to this trend in recent years, with total fertility increasing from 1.4 births per woman in 2000-2005 to 1.6 in 2010-2015. Total fertility in Oceania has changed little since 2000, at roughly 2.4 births per woman in both 2000-2005 and 2010-2015.

As with Asia, I expect that the decline of fertility in Africa will accelerate with development and GDP growth. If global fertility turns out to be as much as 0.5 children per woman below the medium assumption, global population will start declining as early as 2050. It may not happen quite that fast, but it is now very likely that the decline will have begun by 2070. By the end of this century global population may not be much more than it is today.
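To see just how sensitive such long-range projections are to the assumed fertility rate, here is a deliberately crude back-of-the-envelope sketch (my own illustration, not the UN’s cohort-component method). The replacement level of about 2.1, the 27-year generation length and the constant-fertility assumption are all simplifications, and the absolute numbers will not match the UN variants because age-structure momentum and gradually changing fertility are ignored; only the spread between scenarios matters.

```python
# Crude illustration of why long-range population projections are so
# sensitive to the assumed total fertility rate (TFR).  NOT the UN
# cohort-component method: age structure, mortality change and migration
# are ignored, and TFR/replacement is simply treated as a per-generation
# growth factor.  All figures below are illustrative assumptions.

REPLACEMENT_TFR = 2.1    # births per woman needed for long-run stability (approx.)
GENERATION_YEARS = 27    # assumed mean generation length
START_POP_BN = 7.6       # world population, mid-2017 (billions)
HORIZON_YEARS = 2100 - 2017

def crude_projection(tfr: float) -> float:
    """Population (billions) at 2100 for a constant total fertility rate."""
    generations = HORIZON_YEARS / GENERATION_YEARS
    growth_per_generation = tfr / REPLACEMENT_TFR
    return START_POP_BN * growth_per_generation ** generations

for tfr in (2.2, 2.0, 1.7):   # medium-variant-like vs. "half a child lower"
    print(f"TFR {tfr:.1f} -> roughly {crude_projection(tfr):.1f} bn by 2100")
```

Even in this toy calculation, a difference of half a child per woman changes the end-of-century outcome by several billion people, which is the point the UN text is making.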

It is only a matter of time before the alarmists start getting panicky and strident about the impending population implosion.


 

Humanity’s existential threats (sans alarmism)

July 13, 2017

Many so-called “scientists” indulge in phony science which is geared entirely to winning funding, and alarmism is routinely misused to that end. Such “scientists” and insurance companies cannot be trusted with risk assessments. They have a vested interest in exaggerating the risk, either to get more funding or to increase the sale of highly profitable insurance products.

(Insurance companies are expert at assessing risk. However, their business is built on perceived risk being higher than actual risk. Their profits are the evidence of how well they have succeeded both in raising perceptions of risk and in accurately predicting actual risk. “Scientific” papers and reports authored or sponsored by insurance companies must always be taken with a large bushel of salt. Their publications are all about building up risk perceptions. They are probably very good at assessing the real risk, but that material will never get published. The risk of alarmism is multiplied when scientists are sponsored by insurance companies.)

Politicians like exaggerated risk assessments where the causes of the risk are outside their control. Such assessments provide a bottomless pit of “allowable taxes”, ostensibly to “fight” causes which are not understood and are outside their control, but which conveniently provide increased budget revenues.

Scientists, politicians and insurance companies inevitably try to arouse emotions and raise the “fear level” for their own purposes.

The actual existential risks for humanity are a long way below what the alarmists would have us believe, and they need to be considered unemotionally. Risk assessment is an exercise which primarily needs common sense and which must not get mired in “political correctness”.

The most serious existential threats facing humanity in the next 1,000 years come from the almost inevitable eruption of a super-volcano (which is overdue) and from the continuing decline in global fertility, which could lead to serious depopulation by 2100.

(I am quite happy to offer odds of 10 million to one to anybody who wishes to place money on a large asteroid impacting earth – say – within the next 5 years. The chance that the punter and I would both survive such an event, so that I would have to pay out, is as close to zero as you can get.)


 

 

Conservation denies tigers a future as a species

June 13, 2017

There are, it is thought, around 4,000 tigers still living in the “wild”. There may be as many as 8,000 – 9,000 in captivity (3,000 in China and perhaps 5,000 in the US). The tigers in captivity are in zoos and parks and are, in the US, often bred for “hunting”. Very few (fewer than 100, perhaps) of those in captivity are returned to the “wild” each year. Breeding hybrid tigons and ligers was once very popular in zoos; it is less so now, though it persists for entertainment purposes. The numbers are not very significant.

Tigers are magnificent animals and a cultural icon for humans. No doubt the sabre-toothed tiger was an even more magnificent creature. It is surely a matter of regret that it became extinct long ago. As a species it was replaced by others more suited to the changing world. If present-day tigers (considered endangered) were to become extinct, it would also be a matter of much regret. But I find the rationale for “conservation” efforts flawed and illogical. The WWF (which is close to being one of my least favourite organisations) writes in a typically woolly-headed, gushing style:

Yet they are more than just a magnificent animal – they are also crucial for the ecosystems in which they live. As top predators of the food chain, tigers keep populations of prey species in check, which in turn maintains the balance between herbivores and the vegetation upon which they feed. Balanced ecosystems are not only important for wildlife, but for people too – both locally, nationally and globally. People rely on forests, whether it is directly for their livelihoods or indirectly for food and products used in our daily lives. ……… Tigers not only protect the forest by maintaining ecological integrity, but also by bringing the highest levels of protection and investment to an area. Tigers are an “umbrella species” – meaning their conservation also conserves many other species in the same area. They are long-ranging and require vast amounts of habitat to survive; an adult male’s home range varies from 150 km2 – 1000 km2.

Tigers are endangered because their habitats are disappearing. That habitat loss is fundamentally irreversible. As a species they already have no significant role to play in the prevailing ecosystem. They have already become a biologically redundant species, even if the concept of majestic tigers roaming wild forests still has a massive emotional impact on the selfish human psyche. Creating new tiger reserves – constrained in area by various means – is little more than creating glorified zoos. They are just parks where the cages are a little bigger. The tigers themselves are “frozen” into their current, unsuccessful, unsuitable, failed genetic state. They are doomed to continue unchanged and unchanging in a shrinking and ever more unsuitable habitat. There are no natural selection pressures (or artificial selection measures) in play which would make their descendants more capable of surviving in the new habitats created by the changes that have already happened and those yet to come. This “conservation” is not about helping the tiger to survive by evolving; it is only about freezing tigers into an increasingly untenable form. It is backwards-looking and all about preserving failure.

I am even more convinced that traditional “conservation” is misguided and is done just to satisfy the emotional needs of humans, and not, in any way, forward-looking to help endangered species to adapt and survive into the future.

Fighting against species extinction is to deny evolution – (ktwop – 2013)

So what, then, is the objection to – say – tigers becoming extinct which is not just an emotional reaction to the disappearance of a magnificent but anachronistic creature? The biodiversity argument is not very convincing and is of little relevance. To artificially keep an unsuccessful species alive in a specially protected environment has no genetic value. It increases the mismatch between the existing environment and the genetic profile needed to survive in that environment. In fact the biodiversity argument is only relevant for “life” in general and never for any particular species or group of species. It can serve to maintain a very wide range of genetic material in the event of a catastrophe, such that some form of life has a chance of continuing. But given a particular environment, biodiversity in itself is of little value. …

…. All those species which succeed into the future will be those which continue to “evolve” and have the characteristics necessary to thrive within the world as it is being shaped and changed by the most successful species that ever lived (though we cannot be sure how far some particular species of dinosaur may have advanced). Putting a tiger into a zoo or a “protected” environment actually only preserves the tiger in an “unsuccessful” form in an artificial environment. Does this really count as “saving the species”? We might be of more use to the future of the tiger species if we intentionally bred them to find a new space in a changed world – perhaps as urban tigers which can co-exist with man.

(Smilodon image: DinoAnimals.com)

I’ll still make a donation to Project Tiger but that is about helping individuals to survive and has nothing to do with saving the species.


California normal

April 17, 2017

So much for the global-warming-induced, permanent drought that had supposedly afflicted California. California’s population has doubled since 1979, but its water-resources infrastructure has not changed much. Ground water is being consumed and depleted at an increasing pace. Yet normal weather variations are taken as evidence of man-made global warming.

California remains normal, but Californians …….

USNews: 

An index of precipitation at eight sensors showed that just under 90 inches of rain and snow have fallen this winter in the northern Sierra Nevada. The previous record of 88.5 inches was set in the winter of 1982-1983. The average for the region is 50 inches a year, according to the state Department of Water Resources.

The record was surpassed less than a week after Gov. Jerry Brown officially declared an end to California’s drought emergency — a largely symbolic pronouncement that left in place some water-conservation rules for the 40 million residents of the nation’s most populous state.

More snow and rain is likely to pad the record before the wet season ends.


 

The power of the global warming religion

March 14, 2017

Roy Spencer says it:

Global warming theory is in fact so malleable that it predicts anything. More cold, less cold. More snow, less snow.

What a powerful theory.

And what’s even more amazing is that climate change can be averted by just increasing your taxes.

But what nobody ever reports on — because it would be boring — are the storms and severe weather events that haven’t happened. For example, U.S. tornado counts have been running below average, or even at record lows, in recent years.

Amazingly, the low tornado activity has been blamed on climate change. So, too, have actual tornado occurrences!

What a grand and gloriously useful theory global warming provides us.

……. 

Winters in the U.S. are notoriously variable. Typically, if it’s warm in the East, it’s cold in the West. This is exactly what has happened this winter, except for this brief reversal before winter’s end.

Normal people call it weather. More enlightened people, in contrast, call it climate change. Next winter it could be the opposite. No one knows.

Like death and taxes, though, what is certain is that anything “unusual” that happens will somehow be blamed on your SUV.

Too much rain, too little rain, drought, flood, storms, cyclones, hurricanes and – for the true fanatics – even earthquakes and volcanic eruptions are somehow due to global warming (which of course means carbon dioxide emissions).

I find the European statements (and especially the Swedish government’s) about “climate goals” particularly arrogant and idiotic.

“The government’s goal is for Sweden to be a global role model in climate conversion”.

They really think that they can change the climate? From what to what? The arrogance is boundless.


 

Man-made contribution to carbon dioxide in the atmosphere is just 4.3%

February 26, 2017

This new paper finds that CO2 concentration in the atmosphere has risen by 110 ppm since 1750, but of this the human contribution is just 17 ppm. With the concentration now at 400 ppm, the human contribution is just 4.3%. The results indicate that almost all of the observed change of CO2 during the Industrial Era comes, not from anthropogenic emissions, but from changes of natural emission.

The general assumption by the IPCC and the global warming fraternity – that natural carbon dioxide absorption and emissions are miraculously in balance, and that man-made emissions are therefore solely responsible for the increase in carbon dioxide concentration – is deeply flawed (if not plain stupid).

Clearly this paper is not at all to the liking of the religious zealots of the “global warming brigade” and is causing much heartburn among the faithful.

Hermann Harde, Scrutinizing the carbon cycle and CO2 residence time in the atmosphere, Global and Planetary Change, http://dx.doi.org/10.1016/j.gloplacha.2017.02.009

Highlights

  • An alternative carbon cycle is presented in agreement with the carbon-14 decay.
  • The CO2 uptake rate scales proportional to the CO2 concentration.
  • Temperature dependent natural emission and absorption rates are considered.
  • The average residence time of CO2 in the atmosphere is found to be 4 years.
  • Paleoclimatic CO2 variations and the actual CO2 growth rate are well-reproduced.
  • The anthropogenic fraction of CO2 in the atmosphere is only 4.3%.
  • Human emissions only contribute 15% to the CO2 increase over the Industrial Era.

Abstract: Climate scientists presume that the carbon cycle has come out of balance due to the increasing anthropogenic emissions from fossil fuel combustion and land use change. This is made responsible for the rapidly increasing atmospheric CO2 concentrations over recent years, and it is estimated that the removal of the additional emissions from the atmosphere will take a few hundred thousand years. Since this goes along with an increasing greenhouse effect and a further global warming, a better understanding of the carbon cycle is of great importance for all future climate change predictions. We have critically scrutinized this cycle and present an alternative concept, for which the uptake of CO2 by natural sinks scales proportional with the CO2 concentration. In addition, we consider temperature dependent natural emission and absorption rates, by which the paleoclimatic CO2 variations and the actual CO2 growth rate can well be explained. The anthropogenic contribution to the actual CO2 concentration is found to be 4.3%, its fraction to the CO2 increase over the Industrial Era is 15% and the average residence time 4 years.

Conclusions.

Climate scientists assume that a disturbed carbon cycle, which has come out of balance by the increasing anthropogenic emissions from fossil fuel combustion and land use change, is responsible for the rapidly increasing atmospheric CO2 concentrations over recent years. While over the whole Holocene up to the entrance of the Industrial Era (1750) natural emissions by heterotrophic processes and fire were supposed to be in equilibrium with the uptake by photosynthesis and the net ocean-atmosphere gas exchange, with the onset of the Industrial Era the IPCC estimates that about 15 – 40 % of the additional emissions cannot further be absorbed by the natural sinks and are accumulating in the atmosphere.

The IPCC further argues that CO2 emitted until 2100 will remain in the atmosphere longer than 1000 years, and in the same context it is even mentioned that the removal of human-emitted CO2 from the atmosphere by natural processes will take a few hundred thousand years (high confidence) (see AR5-Chap.6-Executive-Summary).

Since the rising CO2 concentrations go along with an increasing greenhouse effect and, thus, a further global warming, a better understanding of the carbon cycle is a necessary prerequisite for all future climate change predictions. In their accounting schemes and models of the carbon cycle the IPCC uses many new and detailed data which are primarily focussing on fossil fuel emission, cement fabrication or net land use change (see AR5-WG1-Chap.6.3.2), but it largely neglects any changes of the natural emissions, which contribute to more than 95 % to the total emissions and by far cannot be assumed to be constant over longer periods (see, e.g.: variations over the last 800,000 years (Jouzel et al., 2007); the last glacial termination (Monnin et al., 2001); or the younger Holocene (Monnin et al., 2004; Wagner et al., 2004)).

Since our own estimates of the average CO2 residence time in the atmosphere differ by several orders of magnitude from the announced IPCC values, and on the other hand actual investigations of Humlum et al. (2013) or Salby (2013, 2016) show a strong relation between the natural CO2 emission rate and the surface temperature, this was motivation enough to scrutinize the IPCC accounting scheme in more detail and to contrast this to our own calculations.

Different to the IPCC we start with a rate equation for the emission and absorption processes, where the uptake is not assumed to be saturated but scales proportional with the actual CO2 concentration in the atmosphere (see also Essenhigh, 2009; Salby, 2016). This is justified by the observation of an exponential decay of 14C. A fractional saturation, as assumed by the IPCC, can directly be expressed by a larger residence time of CO2 in the atmosphere and makes a distinction between a turnover time and adjustment time needless. Based on this approach and as solution of the rate equation we derive a concentration at steady state, which is only determined by the product of the total emission rate and the residence time. Under present conditions the natural emissions contribute 373 ppm and anthropogenic emissions 17 ppm to the total concentration of 390 ppm (2012). For the average residence time we only find 4 years.

The stronger increase of the concentration over the Industrial Era up to present times can be explained by introducing a temperature dependent natural emission rate as well as a temperature affected residence time. With this approach not only the exponential increase with the onset of the Industrial Era but also the concentrations at glacial and cooler interglacial times can well be reproduced in full agreement with all observations. So, different to the IPCC’s interpretation the steep increase of the concentration since 1850 finds its natural explanation in the self accelerating processes on the one hand by stronger degassing of the oceans as well as a faster plant growth and decomposition, on the other hand by an increasing residence time at reduced solubility of CO2 in oceans.

Together this results in a dominating temperature controlled natural gain, which contributes about 85 % to the 110 ppm CO2 increase over the Industrial Era, whereas the actual anthropogenic emissions of 4.3 % only donate 15 %. These results indicate that almost all of the observed change of CO2 during the Industrial Era followed, not from anthropogenic emission, but from changes of natural emission.

The results are consistent with the observed lag of CO2 changes behind temperature changes (Humlum et al., 2013; Salby, 2013), a signature of cause and effect. Our analysis of the carbon cycle, which exclusively uses data for the CO2 concentrations and fluxes as published in AR5, shows that also a completely different interpretation of these data is possible, this in complete conformity with all observations and natural causalities. 
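Purely to make the single-box, rate-equation picture described in those conclusions concrete, here is a minimal numerical sketch of my own (not code from the paper). The emission rates are back-calculated from the 373 ppm natural, 17 ppm anthropogenic and 4-year residence-time figures quoted above, so they are illustrative values rather than measurements.

```python
# Single-box sketch of a rate equation in which uptake scales with the
# atmospheric concentration C:  dC/dt = e_nat + e_ant - C / tau.
# At steady state, C = (e_nat + e_ant) * tau.
# Parameter values are back-calculated from the figures quoted in the paper
# (373 ppm natural + 17 ppm anthropogenic = 390 ppm at tau = 4 years) and
# are used here purely for illustration.

TAU = 4.0                 # assumed mean residence time of CO2 (years)
E_NATURAL = 373.0 / TAU   # natural emission rate, ppm per year (~93 ppm/yr)
E_HUMAN = 17.0 / TAU      # anthropogenic emission rate, ppm per year (~4 ppm/yr)

def integrate(c0: float, years: float, dt: float = 0.1) -> float:
    """Euler-integrate dC/dt = E_total - C/TAU and return the final concentration."""
    c = c0
    for _ in range(int(years / dt)):
        c += (E_NATURAL + E_HUMAN - c / TAU) * dt
    return c

steady_state = (E_NATURAL + E_HUMAN) * TAU
print(f"After 50 years starting from 280 ppm: {integrate(280.0, 50):.1f} ppm")
print(f"Steady-state concentration: {steady_state:.1f} ppm")
print(f"Anthropogenic share at steady state: {E_HUMAN * TAU / steady_state:.1%}")
```

With these inputs the box relaxes to about 390 ppm, of which the anthropogenic part is roughly 4.4% – which is simply the 17/390 arithmetic restated, not independent evidence for the paper’s parameter choices.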

I expect there will be a concerted effort by the faithful to try and debunk this (and it has already started).

But I am inclined to give credence to this work – and not merely because it is in general agreement with my own conclusions about the carbon cycle. Back in 2013 I posted:

Even though the combustion of fossil fuels only contributes less than 4% of total carbon dioxide production (about 26 Gt/year of 800+ Gt/year), it is usually assumed that the sinks available balance the natural sources and that the carbon dioxide concentration – without the effects of man – would be largely in equilibrium.  (Why carbon dioxide concentration should not vary naturally escapes me!). It seems rather illogical to me to claim that sinks can somehow distinguish the source of carbon dioxide in the atmosphere and preferentially choose to absorb natural emissions and reject anthropogenic emissions! Also, there is no sink where the absorption rate would not increase with concentration.

Carbon dioxide emission sources (Gt CO2/year)

  • Transpiration 440
  • Release from oceans 330
  • Fossil fuel combustion 26
  • Changing land use 6
  • Volcanoes and weathering 1

Carbon dioxide is accumulating in the atmosphere at about 15 Gt CO2/year. The accuracy of the amounts of carbon dioxide emitted by transpiration and by the oceans is no better than about 2 – 3%, and that error band (± 20 Gt/year) is itself almost as large as the total emissions from fossil fuels.
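The arithmetic behind that claim can be checked in a few lines. This is only a back-of-the-envelope sketch using the rounded figures listed above; the 2.5% uncertainty applied to the two large natural fluxes is an assumption taken from the post’s own 2 – 3% estimate.

```python
# Back-of-the-envelope check of the budget argument above, using the
# rounded source figures listed in the post (Gt CO2 per year).

sources = {
    "Transpiration": 440,
    "Release from oceans": 330,
    "Fossil fuel combustion": 26,
    "Changing land use": 6,
    "Volcanoes and weathering": 1,
}

total = sum(sources.values())                                   # ~803 Gt/yr
fossil_share = sources["Fossil fuel combustion"] / total        # ~3.2%
# 2.5% uncertainty on the two big natural fluxes alone (assumed figure).
natural_uncertainty = 0.025 * (sources["Transpiration"] + sources["Release from oceans"])

print(f"Total emissions: {total} Gt/yr")
print(f"Fossil-fuel share: {fossil_share:.1%}")
print(f"Uncertainty on the natural fluxes alone: +/- {natural_uncertainty:.0f} Gt/yr")
print("Atmospheric accumulation quoted in the post: ~15 Gt/yr")
```

On these numbers the fossil-fuel share is about 3.2% of gross emissions, and the assumed uncertainty band on the natural fluxes (roughly ±19 Gt/year) is indeed of the same order as both the fossil-fuel flux and the quoted atmospheric accumulation.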


 

Risk of rapid North Atlantic cooling in 21st century greater than previously estimated

February 25, 2017

This paper in Nature Communications would not have had any chance of being published a few years ago. But times are changing.

CNRS: “The possibility of major climate change in the Atlantic region has long been recognized and has even been the subject of a Hollywood movie: The Day After Tomorrow. To evaluate the risk of such climate change, researchers from the Environnements et Paléoenvironnements Océaniques et Continentaux laboratory (CNRS/University of Bordeaux) and the University of Southampton developed a new algorithm to analyze the 40 climate models considered by the latest report from the Intergovernmental Panel on Climate Change (IPCC). Their findings raise the probability of rapid North Atlantic cooling during this century to nearly 50%. Nature Communications publishes their work on February 15, 2017”.

My own view is that man-made global warming is insignificant and virtually impossible to measure. The apparent climate turbulence we may currently be experiencing is probably the exhibition of instabilities as climate shifts from an interglacial paradigm to the more normal glacial conditions. The transition will probably be “rapid” in geologic terms, which probably means a thousand years or so. Major volcanic eruptions (VEI>6) are overdue. This interglacial has lasted some 13,000 years and is also relatively long. I think it feasible that 2 or 3 major volcanic eruptions in relatively quick succession could provide the conditions to trigger a full transition. Once glacial conditions are established they will last for about 100,000 years. And we will then be very thankful for all the fossil or nuclear energy we can have available to us.

Giovanni Sgubin, Didier Swingedouw, Sybren Drijfhout, Yannick Mary, Amine Bennabi. Abrupt cooling over the North Atlantic in modern climate models. Nature Communications, 2017; 8 DOI: 10.1038/ncomms14375

Abstract: Observations over the 20th century evidence no long-term warming in the subpolar North Atlantic (SPG). This region even experienced a rapid cooling around 1970, raising a debate over its potential reoccurrence. Here we assess the risk of future abrupt SPG cooling in 40 climate models from the fifth Coupled Model Intercomparison Project (CMIP5). Contrary to the long-term SPG warming trend evidenced by most of the models, 17.5% of the models (7/40) project a rapid SPG cooling, consistent with a collapse of the local deep-ocean convection. Uncertainty in projections is associated with the models’ varying capability in simulating the present-day SPG stratification, whose realistic reproduction appears a necessary condition for the onset of a convection collapse. This event occurs in 45.5% of the 11 models best able to simulate the observed SPG stratification. Thus, due to systematic model biases, the CMIP5 ensemble as a whole underestimates the chance of future abrupt SPG cooling, entailing crucial implications for observation and adaptation policy.

Even The Guardian (a high priest of the man-made global warming religious fantasy) is compelled to report!!

[Image: Guardian report on global cooling]


CNRS Press Release:

Current climate models all foresee a slowing of the meridional overturning circulation (MOC)—the phenomenon behind the familiar Gulf Stream, which carries warmth from Florida to European shores—that could lead to a dramatic, unprecedented disruption of the climate system. In 2013, drawing on 40 climate change projections, the IPCC judged that this slowdown would occur gradually over a long period of time. The panel’s findings suggested that fast cooling of the North Atlantic during this century was unlikely.

Oceanographers from the EU EMBRACE project team reexamined the 40 projections by focusing on a critical spot in the northwest North Atlantic: the Labrador Sea. The Labrador Sea is host to a convection system ultimately feeding into the ocean-wide MOC. The temperatures of its surface waters plummet in the winter, increasing their density and causing them to sink. This displaces deep waters, which bring their heat with them as they rise to the surface, preventing the formation of ice caps. To investigate this phenomenon in greater detail, the researchers developed an algorithm able to detect quick sea surface temperature variations. Their number crunching revealed that 7 of the 40 climate models they were studying predicted total shutdown of convection, leading to abrupt cooling of the Labrador Sea: by 2–3 °C over less than 10 years. This in turn would drastically lower North Atlantic coastal temperatures.

But is such rapid cooling a real possibility? (After all, only a handful of the models supported this projection.) To answer this question, the researchers honed in on the critical parameter triggering winter convection: ocean stratification. Indeed, 11 of the 40 models incorporated vertical variation in the density of oceanic water masses. And of these 11 models, which we may furthermore consider to be the most reliable, 5 (i.e., 45% of the models) predicted a rapid drop in North Atlantic temperatures.  
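The press release does not describe the detection algorithm itself, beyond saying that it looks for quick sea-surface-temperature variations. As a rough, hypothetical illustration of the kind of test involved (not the method actually used by Sgubin et al.), a sliding-window check for a rapid drop might look like the sketch below; the 10-year window and 2 °C threshold are my own choices, echoing the “2–3 °C over less than 10 years” figure quoted above.

```python
import numpy as np

# Illustrative sketch only: flag an "abrupt cooling" if the sea-surface
# temperature falls by more than `threshold_deg` within any window of
# `window_years` years.  This is NOT the published algorithm of
# Sgubin et al. (2017); it only illustrates the kind of test involved.

def abrupt_cooling(annual_sst: np.ndarray,
                   window_years: int = 10,
                   threshold_deg: float = 2.0) -> bool:
    """Return True if SST drops by more than threshold_deg within any window."""
    n = len(annual_sst)
    for start in range(n - window_years):
        window = annual_sst[start:start + window_years + 1]
        if annual_sst[start] - window.min() > threshold_deg:
            return True
    return False

# Toy example: a slow warming trend with a sudden 2.5 degC drop around "year 60".
years = np.arange(100)
sst = 8.0 + 0.01 * years
sst[60:] -= 2.5
print(abrupt_cooling(sst))   # True
```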


 

South Australian blackouts due to over-reliance on wind and solar were predicted 2 years ago

February 13, 2017

I see that in South Australia some people have been complaining about the “record” heat, with temperatures of 44°C. Of course they take this as “evidence” of global warming. Never mind that some 120 years ago, without any urban heat effects and without any industrialisation, the temperature reached 48–49°C. It wasn’t global warming then.

In any event, South Australians and their elected representatives must get used to the fact that they have only themselves – and their political correctness – to blame. Winning greenie points seems to take precedence over common sense.

The SA blackouts caused by unreliable solar and wind were predicted two years ago in the journal Transactions of the Royal Society of South Australia, and every MP in the Parliament was told.

The Telegraph: 

100,000 SA customers blacked out because of reliance on unreliable wind and solar power in our network – more than a third of SA’s generation capacity.

It is hard to disagree with the blunt assessment of Business SA that South Australia has been caught on electricity planning like a frog in boiling water. The story goes, with mixed results in scientific experiments, that a frog suddenly put into hot water will jump out but if heated slowly it will not figure out the danger.

The state was warned of the electricity-shortage crisis – and consequent blackouts – yet ignored the warnings, according to Business SA executive Anthony Penney.

“The most frustrating aspect of this most recent event is that it was anticipated by many businesses and other energy industry experts well in advance but, like the frog in boiling water, nothing happened in time,” he says.

This week the SA frog boiled. About 100,000 customers were blacked out because of the reliance on unreliable wind and solar power in our network – more than a third of SA’s generation capacity. ……….

Ben Heard, a doctoral researcher at the University of Adelaide who also runs the environmental non-government organisation Bright New World – which supports the use of nuclear power – explains the problem. He says the SA blackouts caused by unreliable solar and wind were predicted two years ago in the journal Transactions of the Royal Society of South Australia, and every MP in the Parliament was told.

“Back when wind generation was providing only 28 per cent of SA’s electricity supply, we flagged the risk presented by low supply in extreme heat conditions,’’ he says. Mr Heard said it was well known that extreme heat conditions in SA were accompanied by very little wind. “Our expectation at the time was that this would make it impossible to retire other generators from the market because of the security risk. Instead, the generators were allowed to retire, we took the risk, and we have started paying the price.”

Trans. Royal Society of South Australia



“Dilbert” withdraws his support for Berkeley

February 7, 2017

Scott Adams (creator of Dilbert) got his MBA from UC Berkeley, but he is not amused by the shenanigans there. He suggests Berkeley is closer to Hitler than the right-wing Milo Yiannopoulos whose talk they stopped (by rioting).


Berkeley and Hitler

Here’s the best article you are likely to read about the absurdity of calling ANY American president Hitler. This is the sort of persuasion (sprinkled with facts) that can dissolve some of the post-election cognitive dissonance that hangs like a dark cloud over the country. Share it liberally, so to speak. You might save lives.

Speaking of Hitler, I’m ending my support of UC Berkeley, where I got my MBA years ago. I have been a big supporter lately, with both my time and money, but that ends today. I wish them well, but I wouldn’t feel safe or welcome on the campus. A Berkeley professor made that clear to me recently. He seems smart, so I’ll take his word for it.

I’ve decided to side with the Jewish gay immigrant who has an African-American boyfriend, not the hypnotized zombie-boys in black masks who were clubbing people who hold different points of view. I feel that’s reasonable, but I know many will disagree, and possibly try to club me to death if I walk on campus. 

Yesterday I asked my most liberal, Trump-hating friend if he ever figured out why Republicans have most of the Governorships, a majority in Congress, the White House, and soon the Supreme Court. He said, “There are no easy answers.”

I submit that there are easy answers. But for many Americans, cognitive dissonance and confirmation bias hide those easy answers behind Hitler hallucinations. 

I’ll keep working on clearing the fog. Estimated completion date, December 2017. It’s a big job.

As he says, the privileged, elite Trump-haters need to come to terms with the reality that most of the US governorships, both houses of Congress, the White House and soon most of the Supreme Court are Republican.


 

The faking of climate data before the Paris conference

February 5, 2017

The “global temperature” is calculated by dividing the world into a grid, determining the temperature applying to each grid element and then “calculating” (not a simple average) a “global temperature” to apply to the world. The problem is that there are actual measurements (raw data) for only about 20% of the grid elements. These 20% are then used to “fill in” temperatures for all the other grid elements. There are algorithms devised first for “correcting” the raw data, then there are those governing the manner in which the corrected data are to be combined to fill in empty grid elements, and further algorithms to be used when combining all the elements of the grid to give a single “global temperature”. The accuracy of the raw data is only about 0.1°C, while the “global temperature” is presented to 0.001°C, and differences of the order of 0.001°C are used to draw conclusions for “policy” decisions. Climategate 1 revealed for the first time how data had been cherry-picked and fudged. The deception continues.
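As a rough illustration of the kind of processing chain being described – grid the observations, infill the empty cells, then form an area-weighted mean – here is a minimal sketch. It is a hypothetical toy, not NOAA’s or anyone else’s actual algorithm: the 36 × 72 grid, the nearest-neighbour infill rule and the random toy data are my own assumptions, while the roughly 20% coverage figure comes from the paragraph above.

```python
import numpy as np

# Minimal sketch of a gridded "global temperature" calculation: put anomalies
# on a lat/lon grid, fill empty cells from neighbouring cells, then take a
# cosine(latitude)-weighted mean.  An illustration of the kind of pipeline
# described above, not any agency's actual algorithm.

def global_mean(grid: np.ndarray, lats_deg: np.ndarray) -> float:
    """Area-weighted mean of a lat x lon anomaly grid with NaNs for empty cells."""
    filled = grid.copy()
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            if np.isnan(filled[i, j]):
                # Crude infill: mean of observed neighbours, else zero anomaly.
                neigh = grid[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
                filled[i, j] = np.nanmean(neigh) if np.isfinite(neigh).any() else 0.0
    weights = np.cos(np.radians(lats_deg))[:, None] * np.ones_like(filled)
    return float(np.sum(filled * weights) / np.sum(weights))

# Toy grid: 36 latitude bands x 72 longitude cells, with only ~20% of cells observed.
rng = np.random.default_rng(0)
lats = np.linspace(-87.5, 87.5, 36)
grid = np.where(rng.random((36, 72)) < 0.2, rng.normal(0.5, 0.3, (36, 72)), np.nan)
print(f"Global mean anomaly: {global_mean(grid, lats):+.3f} degC")
```

Every step here involves choices (how to infill, how to weight, how to correct the raw data), which is exactly why the adjustment algorithms attract so much argument.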

Dr John Bates (formerly of NOAA) is now blowing the whistle on how the NOAA has manipulated climate data:

John Bates received his Ph.D. in Meteorology from the University of Wisconsin-Madison in 1986. Post Ph.D., he spent his entire career at NOAA, until his retirement in 2016.  He spent the last 14 years of his career at NOAA’s National Climatic Data Center (now NCEI) as a Principal Scientist, where he served as a Supervisory Meteorologist until 2012.

…….. NOAA Administrator’s Award 2004 for “outstanding administration and leadership in developing a new division to meet the challenges to NOAA in the area of climate applications related to remotely sensed data”. He was awarded a U.S. Department of Commerce Gold Medal in 2014 for visionary work in the acquisition, production, and preservation of climate data records (CDRs). He has held elected positions at the American Geophysical Union (AGU), including Member of the AGU Council and Member of the AGU Board. He has played a leadership role in data management for the AGU.

He has a guest post at Judith Curry’s blog.

Climate scientists versus climate data

by John Bates

A look behind the curtain at NOAA’s climate data center.

I read with great irony recently that scientists are “frantically copying U.S. Climate data, fearing it might vanish under Trump” (e.g., Washington Post 13 December 2016). As a climate scientist formerly responsible for NOAA’s climate archive, the most critical issue in archival of climate data is actually scientists who are unwilling to formally archive and document their data. I spent the last decade cajoling climate scientists to archive their data and fully document the datasets. I established a climate data records program that was awarded a U.S. Department of Commerce Gold Medal in 2014 for visionary work in the acquisition, production, and preservation of climate data records (CDRs), which accurately describe the Earth’s changing environment.

The most serious example of a climate scientist not archiving or documenting a critical climate dataset was the study of Tom Karl et al. 2015 (hereafter referred to as the Karl study or K15), purporting to show no ‘hiatus’ in global warming in the 2000s (Federal scientists say there never was any global warming “pause”). The study drew criticism from other climate scientists, who disagreed with K15’s conclusion about the ‘hiatus.’ (Making sense of the early-2000s warming slowdown). The paper also drew the attention of the Chairman of the House Science Committee, Representative Lamar Smith, who questioned the timing of the report, which was issued just prior to the Obama Administration’s Clean Power Plan submission to the Paris Climate Conference in 2015.

In the following sections, I provide the details of how Mr. Karl failed to disclose critical information to NOAA, Science Magazine, and Chairman Smith regarding the datasets used in K15. I have extensive documentation that provides independent verification of the story below. I also provide my suggestions for how we might keep such a flagrant manipulation of scientific integrity guidelines and scientific publication standards from happening in the future. Finally, I provide some links to examples of what well documented CDRs look like that readers might contrast and compare with what Mr. Karl has provided.

Background …..

Read the whole post here.

Of course the mainstream, politically correct media have no time for this. However, David Rose of the Mail on Sunday is one of the few reporters who still have the nerve to question the fanatical, religious orthodoxy on this subject.

Exposed: How world leaders were duped into investing billions over manipulated global warming data 

  • The Mail on Sunday can reveal a landmark paper exaggerated global warming
  • It was rushed through and timed to influence the Paris agreement on climate change
  • America’s National Oceanic and Atmospheric Administration broke its own rules
  • The report claimed the pause in global warming never existed, but it was based on misleading, ‘unverified’ data

The Mail on Sunday today reveals astonishing evidence that the organisation that is the world’s leading source of climate data rushed to publish a landmark paper that exaggerated global warming and was timed to influence the historic Paris Agreement on climate change.

A high-level whistleblower has told this newspaper that America’s National Oceanic and Atmospheric Administration (NOAA) breached its own rules on scientific integrity when it published the sensational but flawed report, aimed at making the maximum possible impact on world leaders including Barack Obama and David Cameron at the UN climate conference in Paris in 2015.

The report claimed that the ‘pause’ or ‘slowdown’ in global warming in the period since 1998 – revealed by UN scientists in 2013 – never existed, and that world temperatures had been rising faster than scientists expected. Launched by NOAA with a public relations fanfare, it was splashed across the world’s media, and cited repeatedly by politicians and policy makers.

But the whistleblower, Dr John Bates, a top NOAA scientist with an impeccable reputation, has shown The Mail on Sunday irrefutable evidence that the paper was based on misleading, ‘unverified’ data.

It was never subjected to NOAA’s rigorous internal evaluation process – which Dr Bates devised.

His vehement objections to the publication of the faulty data were overridden by his NOAA superiors in what he describes as a ‘blatant attempt to intensify the impact’ of what became known as the Pausebuster paper. …….


NOAA data manipulation (from David Rose – Mail on Sunday)

There will be more whistle-blowers now coming out of the woodwork.


 

