Posts Tagged ‘environment’

Breaking weather records from a century ago only shows that it was hotter before CO2 emissions began

July 14, 2012

I am off again on an assignment for a few days and blogging will be light.

It’s summer and where I’m going, torrential rain or blistering sunshine with temperatures over 45°C are quite normal for this time of year. If it is raining the temperature may be down to 25°C. So I’m prepared for a possible variation of some 20°C. It’s just weather.

I note the usual summer stories from around the world of heat waves in some places and “coldest” Junes in 100 years in others. Some farmers are complaining about droughts and others are complaining about floods. Where societies have ignored repairs or have not built up their infrastructure to match the changing concentrations of urban populations, disasters occur. But I also note that when parts of the US declare that they have just had the hottest period for 50 years or 100 years or whatever, and that this is “proof” of global warming, they conveniently forget that 50 years ago or 100 years ago or whenever, man-made emissions of carbon dioxide were orders of magnitude lower. When weather records from a hundred years ago are broken it only proves that it was hotter/colder/stormier/wetter/drier or whatever long before the modern industrial age and before any significant man-made carbon dioxide emissions. Breaking an old record only shows cyclic behaviour – not “runaway” behaviour!

It’s summer and people are on vacation and journalists are looking for stories and the silly season has begun!

A slight shift of focus

May 12, 2012

It has been just over two years since I started this blog – my first – and I now feel sufficiently comfortable to move away from the general and to try and focus just on the topics that interest me most. I have changed the sub-heading to reflect this.

My opinions on aspects of energy and power generation and climate and environment will now take centre stage on this blog. I shall have to try to address my interests in technology and materials and behaviour and management and anthropology and politics elsewhere. The advent of accessible electric power has been the single most liberating force for the human condition – ever. For the foreseeable future humanity will continue to use – and need to use – electric power. And virtually all our sources of electric power – except perhaps some nuclear fuels – derive from the Sun.

Sol Invictus.

The blog image is of sunrise on a very cold day in February last year.

How many years of global cooling are needed to disprove AGW?

September 26, 2011

I am travelling this week.

I had an interesting – if rather depressing – discussion with a fellow traveller (a patent lawyer) at the airport yesterday. The discussion turned to the manner in which science that happened to be “in fashion” became a political movement, and the manner in which the science itself then took on politically correct dimensions.

Sometimes – as with eugenics – the political movement came first and the science followed to fit the movement. In fact, his contention was that even where the science had come first, the development of a political movement would always lead to subsequent science being constrained to support the imperatives of the movement.

I brought up the case of AGW and how an uncertain science – in my opinion – had been hijacked by a political movement such that one particular hypothesis – which has still to be proven – had become the only politically correct or allowable science. I suggested that real observations might change what was considered politically correct. Since global temperature – if such a thing can be defined – has been declining for the last decade even though carbon dioxide has been increasing, I expected that new science would have to take these real observations into account in its mathematical modelling and that the strength of the dogma would eventually decrease.

My companion however disagreed. He suggested that all political movements had to be fundamentally and economically viable to survive. If the movement was lucrative – as AGW had become – then there would be a vested interest in maintaining the science it was based on even if the facts said otherwise. This would be achieved, he argued, by the “Science” allowing or accounting for some deviations – as for example with explanations made up for why a decade or two of cooling could occur without disturbing the central thesis of the “Science”. He cited medical science and examples of purported treatments which were continued for long periods after they were discredited because of the revenues that they were generating. He suggested that the chemical industry was the prime driver for the banning of some refrigerants (based on now outdated ozone depletion science) just so that production could be shifted to newer refrigerants with much higher margins. Similarly he felt that the environmental benefits of switching to low-energy lamps were minuscule, but the lighting industry much preferred the margins and revenues generated by these to those generated by incandescent light bulbs, which were suffering from intense competition.

His conclusion was that since the AGW “industry” was generating large revenues, whether through carbon trading schemes or by the extraction of subsidies from taxpayer money for so-called “green” energy or “green” fuels, the vested interest in showing that any conflicting measurements were a temporary aberration would be very strong. Since the timescales of climate change are of the order of hundreds of years, he felt that a mere 20 or 30 years of inconvenient measurements would do little to dent the momentum of a successful revenue-generating “science”!

He made some good points. I am afraid that even three decades of cooling or the start of a mini ice age will probably not suffice to dampen the ardour of global warming enthusiasts as long as the revenues from growing bio-fuels or from subsidies for “green” energy keep rolling in. The AGW religion and its corresponding “science” will stop only if the revenues stop.

New Scientist blog: CEO of “Good” Energy complains that sceptics are resorting to emotion rather than science

September 23, 2011

Juliet Davenport, founder and CEO of something carrying the subjective and emotive name “Good” Energy, writes in the New Scientist blog today bemoaning the fact that climate sceptics are winning the argument by the use of emotion rather than science!!

Scientists – she believes – are not doing enough to help her cause. But she might carry a little more credibility if she attempted to use science rather than dogma and consensus. And of course if she did not have a vested interest in extorting subsidies from taxpayers. Clearly Al Gore has failed her in being “charismatic and campaigning” – but then he is no scientist and perhaps he does not count.

A charismatic campaigning voice from the scientific community would make a huge difference in helping to combat the small but vocal minority of sceptics who tend to resort to emotion rather than science to make their arguments. …….. 

…. I can’t help but think it would be better to see all government departments arguing more loudly about the long term benefits of tackling climate change and the transition to a low carbon economy. To do that convincingly, however, they need to have information at their fingertips. Scientists have a huge role to play here, debating and responding to claims made through the media and simplifying messages for the public. They need to make the case that a low-carbon economy is not only necessary for tackling climate change, but also that it is technologically possible.

If we are going to act in time on climate change, it is vital that we keep up the pressure on the government to form a policy framework that we can then deliver.

The coming gas glut and the availability of shale gas – now even in the UK – must be giving her nightmares. Without climate change alarmism and the demonisation of carbon dioxide, the cost of wind and solar power would make them non-starters.

But the tide is turning.

Nonsense speculation posing as science

August 17, 2011

Another example of nonsense speculation which gets published and then drives headlines only because it invokes the magic words “climate change”. A case of speculative IPCC model results being used as inputs for another speculative model about fish extermination and coming to a mildly alarmist conclusion.

Not a measurement in sight. But many pages, lots of statistics, 4 tables, 2 figures and 66 references to come to the amazing conclusion, and state the obvious, that coldwater fish may die out if they are forced to live in warm water.

As my son would put it “Duh”!!

A new paper in PLOS One.

Comparing Climate Change and Species Invasions as Drivers of Coldwater Fish Population Extirpations

 Sharma S, Vander Zanden MJ, Magnuson JJ, Lyons J (2011), PLoS ONE 6(8): e22906. doi:10.1371/journal.pone.0022906

The “researchers” actually measured nothing. They took a database of the conditions in which populations of a particular kind of fish (the cisco) exist. These parameters include air temperature among many others. They then did a statistical regression to infer how the populations might depend upon air temperature. They then took temperature increase assumptions from the climate change scenarios of the IPCC and applied them to the database to speculate about what that might do to the fish populations. They then reached their conclusion that climate change would extirpate a large section of the fish population and that this would be worse than the impact of invasive species.
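For what it is worth, the exercise described above amounts to little more than the following sketch: fit a statistical model of presence/absence against lake variables, then re-run the fitted model with air temperature shifted by an assumed warming scenario. The file name, the column names and the +3°C shift below are my own placeholders for illustration, not values taken from the paper.

```python
# A minimal sketch of a regress-then-project exercise (hypothetical data).
import pandas as pd
from sklearn.linear_model import LogisticRegression

lakes = pd.read_csv("wisconsin_lakes.csv")            # hypothetical lake database
predictors = ["surface_area", "max_depth", "secchi_depth",
              "ph", "conductivity", "mean_air_temp"]
X, y = lakes[predictors], lakes["cisco_present"]       # presence/absence (0/1)

model = LogisticRegression(max_iter=1000).fit(X, y)    # regression on current data

# Apply an assumed scenario: raise mean annual air temperature by 3 degC and
# predict occurrence probabilities again. No new measurement enters anywhere;
# the "extirpations" are entirely a product of the fitted coefficients and
# the assumed temperature shift.
future = X.copy()
future["mean_air_temp"] += 3.0
p_now = model.predict_proba(X)[:, 1]
p_future = model.predict_proba(future)[:, 1]
print("lakes predicted to lose cisco:", ((p_now > 0.5) & (p_future <= 0.5)).sum())
```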

And this is considered peer-reviewed science!!!!

They write in their paper:

Coldwater fishes, such as cisco [Coregonus artedii] require cold water temperatures, high dissolved oxygen concentrations, and oligotrophic conditions, and thereby are sensitive indicators of environmental change. In Wisconsin, cisco are close to the southern edge of their range and are listed as a species of special concern. Cisco live in larger and deeper inland lakes with cold, well-oxygenated deep waters. Under climate change scenarios, as air temperatures increase, epilimnion and hypolimnion water temperatures are expected to increase. As water temperatures increase, the duration of the lake stratification period is expected to increase, isolating the deep waters from exchanges with the atmosphere, making it more likely that metabolic activity will reduce dissolved oxygen concentrations in the hypolimnion to stressful or lethal levels. The combination of warmer water temperatures and lower dissolved oxygen concentrations under climate change scenarios in larger, deeper lakes typically suitable for coldwater fishes could result in their extirpation.

Cisco are sensitive to the introduction of non-native rainbow smelt (Osmerus mordax). Rainbow smelt is native to the northeastern coast of North America and was introduced to the Laurentian Great Lakes in the 1920s. In Wisconsin, rainbow smelt have been introduced into lakes deliberately by anglers for sport fishing purposes. Furthermore, fertilized eggs of rainbow smelt may have been unintentionally introduced into lakes by residents cleaning smelt on their piers. When rainbow smelt invade a system, they negatively interact with native species through predation and competition. Invasion of rainbow smelt has been linked directly to changes in zooplankton community composition, decline in recruitment of walleye (Sander vitreus), and extirpation of cisco and yellow perch (Perca flavescens). For example in Sparkling Lake, Wisconsin, the cisco population was extirpated through predation-induced recruitment failure within eight years of detection of rainbow smelt. ……

Geo-referenced lake-specific data were collected for 13,052 lakes in Wisconsin from a variety of sources including the North Temperate Lakes Long Term Ecological Research (NTL-LTER) program, Wisconsin GAP (Geographic Approach to Planning for Biological Diversity) database, Wisconsin Department of Natural Resources databases, refereed publications, government reports, and dissertations. From the aforementioned databases, a suite of variables describing lake morphology, water chemistry, physical habitat, and fish species occurrence were compiled. Environmental variables retained in the final dataset were: surface area (hectares), maximum depth (metres), perimeter (kilometres), Secchi depth (metres), pH, conductivity (µS/cm), and mean annual air temperatures (°C). For water chemistry variables, annual averages were used in the dataset. 

Current air temperatures and scenarios of future mean annual air temperatures were obtained from the Wisconsin Initiative on Climate Change Impacts (WICCI) Climate Working Group. Mean annual air temperatures were statistically downscaled for Wisconsin on a 0.1° latitude × 0.1° longitude grid. Climate data were summarised for three time periods: 1961–2000, 2046–2065, and 2081–2100 and averaged over these three sets of years as suggested by the Intergovernmental Panel on Climate Change (IPCC) to reduce temporal variation in climate. Then, projected air temperatures from 15 general circulation models and the IPCC’s A1, A2 and B1 scenarios (although not all general circulation models incorporated all three scenarios) totalling 78 climate change scenarios were used to develop future projections of cisco occurrence. The A1, A2 and B1 scenarios incorporate a range of variation in greenhouse gas emissions inferred for various time periods in the 21st century. The A1 scenario is the most extreme and assumes the highest greenhouse gas concentrations, followed by the A2 and B1 scenarios. …….

Our results highlight the threats to coldwater fish species. The probability of cisco extirpations could be reduced in Wisconsin through three interventions. First, the mitigation of climate change through the reduction of greenhouse gas emissions could significantly reduce the worst case losses of cisco. ….

Oh dear!!

Needless to say this non-science – since it uses the magic phrase “climate change” and has an alarmist theme – has no difficulty in being published and generating nonsense headlines in Science Daily:

Climate Change Could Drive Native Fish out of Wisconsin Waters 

ScienceDaily (Aug. 16, 2011) — The cisco, a key forage fish found in Wisconsin’s deepest and coldest bodies of water, could become a climate change casualty and disappear from most of the Wisconsin lakes it now inhabits by the year 2100, according to a new study. In a report published online in the journal Public Library of Science One, researchers from the University of Wisconsin-Madison and the Wisconsin Department of Natural Resources project a gloomy fate for the fish — an important food for many of Wisconsin’s iconic game species — as climate warms and pressure from invasive species grows. ………

In addition to the ecological change that would be prompted by a warmer Wisconsin climate, Sharma notes, the impoverishment of aquatic ecosystems will have potential socio-economic implications, especially in a setting like Wisconsin where recreational fishing is an iconic pastime, not to mention an important industry.

“This could very well impact the fishing experiences we have,” avers the Wisconsin researcher.

But rather than make me concerned about climate change, this nonsense report based on idle – but fashionable – speculation makes me much more concerned about the predominance of modelling over measurement and what passes in some quarters for science.

Global Warming has “paused” and climate change forecasts are “flawed”

August 6, 2011

The Times They Are a-changin’

1. The UK Met Office is an ardent follower of the Global Warming Doctrine, but even they have now had to admit that global warming has “paused”.

“Two research papers shed new light on why the upper layers of the world’s oceans have seen a recent pause in warming despite continued increases in greenhouse gases.”

But the religion of global warming need not worry. The pause is – conveniently – only due to “natural variability”. The Met Office does however admit that the science is a long way from being settled and that with more measurements (and perhaps with a little less slavish acceptance of model results) “it would be possible to account for movement of heat within the ocean and do a better job of monitoring future climate change”. One can hope that they may be returning to a science based on observations leading to models leading to further measurements to validate the models, but religions are not cast aside so easily!

The independent studies from the Royal Netherlands Meteorological Institute (KNMI) and the Met Office show how natural climate variability can temporarily mask longer-term trends in upper ocean heat content and sea surface temperature.

The upper 700 metres of the global ocean has seen a rise in temperature since reliable records began in the late 1960s. However, there has been a pause in this warming during the period from 2003 to 2010. The papers published this week offer explanations for this.

Climate model simulations from KNMI show that such pauses in upper ocean warming occur regularly as part of the climate system’s natural variability. … A different set of model simulations from the Met Office supports the idea of heat moving to the deeper ocean explaining the recent pause in upper ocean warming.

The same research also suggests that with deeper ocean observations it would be possible to account for movement of heat within the ocean and do a better job of monitoring future climate change.

GRL website (KNMI paper) (Katsman, C.A. and G.J. van Oldenborgh)

GRL website (Met Office paper) (Palmer, M. D., D. J. McNeall, and N. J. Dunstone)

2. In the meantime, a study at Lancaster University charges that “inaccurate climate forecasts cost the world considerable money” and that “the overwhelming focus on limiting greenhouse gases alone may well be misguided”.

Climate change forecasts used to set policy and billions of pounds in investment are flawed, according to new research from Lancaster University Management School (LUMS).

Complex climate models have been used by scientists to reach a consensus (through the Intergovernmental Panel on Climate Change, the IPCC) of global warming of 0.2 °C per decade. But this fundamental finding for governments and the global population continues to be fiercely contested by sceptics of the role of human activity in climate change. The competing interest groups involved have led to a decline in confidence generally in the wake of claims of manipulated data from the University of East Anglia, and incorrect projections – such as Himalayan glaciers disappearing by 2035.

The new study by Robert Fildes and Nikolaos Kourentzes at the Lancaster Centre for Forecasting applies the latest thinking on forecasting to the work of climate change scientists, in a bid to make 10- and 20-year-ahead climate predictions more accurate and trustworthy for policy-makers, and help address growing doubts over the realities of climate change. Such decadal forecasts have the most relevance to current thinking and policy plans and, if they are to be credible and useful, they need to demonstrate their accuracy. But the forecasts produced by the current models do not achieve this.

The authors set out a new basis for ‘decadal’ forecasting, which is to be a major component of the next IPCC assessment report. Using a combination of models, with statistical benchmarking as checks, current forecasts prove almost certainly less accurate than they could be. Inaccurate climate forecasts cost the world considerable money. The implication is that the climate modelling community needs to open up its research agenda. As yet it has not demonstrated that it can produce better forecasts than simpler statistical methods. A consequence of this, explored by Fildes and Kourentzes, is that the overwhelming focus on limiting greenhouse gases alone may well be misguided. The hydrologist Keith Beven’s work on modelling carried out in the Lancaster Environment Centre leads to the same conclusion. In short, eclectic forecasting methods and a wide range of policy responses are what is needed if we are to overcome the problems of emerging warming.
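The “statistical benchmarking” the Lancaster authors call for is easy enough to sketch: hold back part of a series, forecast the held-back portion both with the model and with a naive statistical baseline, and compare the errors. The temperature series and the “model” forecast below are synthetic placeholders of my own, purely to illustrate the check.

```python
# A rough illustration of benchmarking a decadal forecast against a naive baseline.
import numpy as np

rng = np.random.default_rng(0)
obs = np.cumsum(rng.normal(0.01, 0.1, 50))          # synthetic annual temperature anomalies
train, test = obs[:40], obs[40:]                     # hold out the last decade

persistence = np.full(len(test), train[-10:].mean())    # naive benchmark: last decadal mean
model_forecast = train[-1] + 0.02 * np.arange(1, 11)    # stand-in "climate model" trend

def mae(forecast):
    return np.abs(forecast - test).mean()               # mean absolute error vs. observations

print(f"benchmark MAE: {mae(persistence):.3f}   model MAE: {mae(model_forecast):.3f}")
# A model that cannot beat so crude a benchmark on held-out data has not
# demonstrated the accuracy that decadal forecasts need for policy use.
```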


What population problem? More brains and hands could well cater for the extra mouths to feed

July 30, 2011

Malthus’ ideas haven’t quite been discredited but his alarmism certainly has. As world population increases from around seven billion today and approaches some 9 to 10 billion by 2100, studies suggest that population growth can have economic and environmental benefits.

A new article in Science, 29 July 2011: Vol. 333, no. 6042, pp. 544–546, DOI: 10.1126/science.333.6042.544

Are More People Necessarily a Problem?

by David Malakoff

In 1937, a bureaucrat serving in the British Empire’s Kenya Colony penned an alarming memo to his bosses about conditions in the Machakos Reserve, a hilly, drought-prone farming region 50 kilometers south of Nairobi. “Benevolent British rule” had encouraged the explosive “multiplication” of the “natives,” he reported, leading to massive environmental degradation. “Every phase of misuse of land is vividly and poignantly displayed in this Reserve, the inhabitants of which are rapidly drifting to a state of hopeless and miserable poverty and their land to a parching desert of rocks, stones and sand.” The apocalyptic warning came as the region’s population approached 250,000.

Today, more than 1.5 million people call Machakos home. Rather than a cautionary example of the perils of overpopulation, however, for some experts Machakos has become a symbol of something very different: the idea that rapid human population growth, even in some of Earth’s driest, most challenging environments, is not necessarily a recipe for disaster—and can even bring benefits. They argue that, over the past 75 years, population growth in Machakos and nearby Nairobi has triggered social and economic shifts that have made it possible for residents to regreen once-barren hillsides, reinvigorate failing soils, reduce birth rates, and increase crop production and incomes. “A landscape that was once declared good for nothing is now like a garden when the rain falls,” says Michael Mortimore, a geographer with Drylands Research, a United Kingdom–based nonprofit organization, who helped document the turnaround in More People, Less Erosion, a 1994 study that is still influential—and controversial—today. “Too many people still have the simplistic notion that too many people is a problem,” he says. “What happened in Machakos challenges that pessimism.”…..

……. Many see crisis looming in those numbers for people and the environment. Others, however, see some hope for a transition to more sustainable livelihoods and cite Ester Boserup, a Danish economist who died in 1999, as one source of their optimism. In 1965, the then-little-known Boserup, who spent most of her career consulting for international development institutions, published a slim volume titled The Conditions of Agricultural Growth: The Economics of Agrarian Change under Population Pressure (pdf: Boserup 1965). It examined the history of subsistence farming and offered a theory that essentially turned Malthus upside down. Instead of rising population density leading to barren fields and starvation, Boserup suggested it could naturally trigger “intensification”: the use of new technologies and more labor to get bigger harvests from less land.

“The idea was that people weren’t just mouths to feed but also brains that could think and hands and legs that could work very hard”.

….. In some parts of Africa, meanwhile, researchers are documenting a notable, Machakos-like “regreening” of arid areas with fast-growing populations. …. There’s some evidence that the extra greenery is helping to make poor farm communities more resilient to droughts and economic setbacks, but the long-term outlook remains at best unclear.

In the forest frontiers of South and Central America, researchers have found both Malthusian and Boserupian forces at work in deforestation. Depending on local circumstances, families faced with growing population densities have responded by both migrating to clear new farms in forested areas, the agricultural “extensification” predicted by Malthus, and intensified land use à la Boserup, a team led by David Carr of the University of California, Santa Barbara, reported in a 2009 study in Population and Development. Paradoxically, the result is that areas with relatively low population densities can have much higher deforestation rates than those with higher densities.

Read the whole article 

Related:

Sustainable growth in Machakos, by Francis Gichuki, Mary Tiffen and Michael Mortimore, ILEIA Newsletter, Vol. 9, No. 4, December 1993

More People, Less Erosion: Environmental Recovery in Kenya, by Mary Tiffen, Michael Mortimore and Francis Gichuki, Overseas Development Institute, UK, 1994

 

Reducing sulphur emissions caused post-1970 global warming!!!!

July 5, 2011

Whether warming or cooling, it would seem that anthropogenic effects and man’s burning of coal are responsible.

“The post 1970 period of warming, which constitutes a significant portion of the increase in global surface temperature since the mid 20th century, is driven by efforts to reduce air pollution in general and acid deposition in particular”.

That’s the conclusion of a new paper from the “peer-reviewed” literature confirming the obvious that global temperatures have plateaued since 1998.

Reconciling anthropogenic climate change with observed temperature 1998–2008

Robert K. Kaufmann, Heikki Kauppi, Michael L. Mann, James H. Stock

PNAS 201102467

PDF from WUWT

And though the paper cuts off its data in 2008, this temperature stability certainly continues till 2010 and, it seems – on my own empirical observations – even into 2011.

As the paper title shows, this real stabilisation of temperatures, which is not predicted by any climate model and which may well be a precursor of a few decades of global cooling, is of some concern to the Anthropogenic Global Warming enthusiasts. The presumption is that the model results are supreme and that reality must be reconciled with them by invoking further anthropogenic effects.

Needless to say, any global cooling is not acknowledged, since that would be heretical, and instead short-term anthropogenic factors (sulphur emissions from coal burning in China) are blamed for this cessation of global warming!!

Given the widely noted increase in the warming effects of rising greenhouse gas concentrations, it has been unclear why global surface temperatures did not rise between 1998 and 2008. We find that this hiatus in warming coincides with a period of little increase in the sum of anthropogenic and natural forcings. Declining solar insolation as part of a normal eleven-year cycle, and a cyclical change from an El Nino to a La Nina dominate our measure of anthropogenic effects because rapid growth in short-lived sulfur emissions partially offsets rising greenhouse gas concentrations. As such, we find that recent global temperature records are consistent with the existing understanding of the relationship among global surface temperature, internal variability, and radiative forcing, which includes anthropogenic factors with well known warming and cooling effects.
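Stripped of the modelling, the argument in that abstract is simple forcing arithmetic: if the extra cooling from sulphate aerosols and a declining solar cycle roughly cancels the extra warming from CO2 over the decade, the net forcing barely changes and flat temperatures are “reconciled” with the theory. The numbers below are illustrative placeholders of my own, not values from Kaufmann et al.

```python
# Illustrative forcing arithmetic only; all values are assumed, not the paper's.
co2_forcing_change     = +0.30   # W/m^2 over the decade from rising CO2 (assumed)
sulfate_forcing_change = -0.25   # W/m^2 from rising sulphur emissions (assumed)
solar_cycle_change     = -0.10   # W/m^2, declining phase of the 11-year cycle (assumed)

net = co2_forcing_change + sulfate_forcing_change + solar_cycle_change
print(f"net forcing change: {net:+.2f} W/m^2")   # close to zero, hence flat temperatures
```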

The conclusion is formulated to avoid any semblance of heresy and to ensure publication no doubt.

The finding that the recent hiatus in warming is driven largely by natural factors does not contradict the hypothesis: “most of the observed increase in global average temperature since the mid 20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations (14)”. As indicated in Figure 1, anthropogenic activities that warm and cool the planet largely cancel after 1998, which allows natural variables to play a more significant role. ……

The post 1970 period of warming, which constitutes a significant portion of the increase in global surface temperature since the mid 20th century, is driven by efforts to reduce air pollution in general and acid deposition in particular, which cause sulfur emissions to decline while the concentration of greenhouse gases continues to rise. 

That reality is being acknowledged is heartening, but relying on the anthropogenic effects of coal burning alone (carbon dioxide emissions causing warming and sulphur emissions causing cooling) with only a passing reference to solar effects is not just naive – it is denying the obvious.

Related:

http://wattsupwiththat.com/2011/07/04/a-peer-reviewed-admission-that-global-surface-temperatures-did-not-rise-dr-david-whitehouse-on-the-pnas-paper-kaufmann-et-al-2011/

Reality: Quality and price are the buying criteria, not the environment

May 11, 2011

Reality is based on what people do and not on what do-gooders, alarmists and scaremongers say.

Svenska Dagbladet reports that:

Small companies ignore the environment

Sweden’s small- and medium-sized companies are primarily looking for quality and price when making purchases. The environment is least important according to the Visma Purchase Barometer, and low price is the most important. Environmentally friendly products are usually slightly more expensive and are at a disadvantage when companies chase low prices. Only 1.5 percent of Sweden’s small and medium-sized enterprises consider the environment as the most important criterion when making purchases. Quality and price are the most crucial according to the Visma survey of more than 1600 small and medium-sized businesses.

“But 1.5 percent is still a remarkably low figure given that climate change has been so hot the last few years”, says Henrik Salwen, CEO of Visma Advantage. The companies were asked to specify one of six criteria, and quality was the most important, followed by price and punctual delivery. The environment was the least important.

Another perversion of science: Confirmation bias in the name of global warming dogma is also scientific misconduct

January 25, 2011

A new paper has been published in Ecology Letters

Ran Nathan, Nir Horvitz, Yanping He, Anna Kuparinen, Frank M. Schurr, Gabriel G. Katul. Spread of North American wind-dispersed trees in future environments. Ecology Letters, 2011; DOI: 10.1111/j.1461-0248.2010.01573.

In this paper the authors have assumed that climate change will cause changes to CO2 concentration and wind speed. They have assumed also that increased CO2 will “increase fecundity and advance maturation”. They have then modelled the spread of 12 species as a function of wind speed.

So far so good – they have actually modelled only the effect of wind speed, which they assume will decrease due to climate change.

Their results basically showed no effect of wind speed:

“Future spread is predicted to be faster if atmospheric CO2 enrichment would increase fecundity and advance maturation, irrespective of the projected changes in mean surface windspeed”.

And now comes the perversion!

From their fundamental conclusion that wind speed has no effect and that therefore any CO2 increase resulting from climate change will enhance the spread of the trees, they invoke “expected” effects to deny what they have just shown:

“Yet, for only a few species, predicted wind-driven spread will match future climate changes, conditioned on seed abscission occurring only in strong winds and environmental conditions favouring high survival of the farthest-dispersed seeds. Because such conditions are unlikely, North American wind-dispersed trees are expected to lag behind the projected climate range shift.”

This final conclusion is based on absolutely nothing that their modelling showed, and yet the paper was accepted for publication. I have no problem with a result showing “no effect of wind speed” being published, but I suspect that it needed the nonsensical, speculative conclusion to comply with current dogma.

Science Daily then produces the headline: Climate Change Threatens Many Tree Species

when the reality is

This Study Shows No Effect of Wind Speed, Yet We Believe that Climate Change Threatens Many Tree Species

“Our research indicates that the natural wind-driven spread of many species of trees will increase, but will occur at a significantly lower pace than that which will be required to cope with the changes in surface temperature,” said Prof. Nathan. “This will raise extinction risk of many tree populations because they will not be able to track the shift in their natural habitats which currently supply them with favorable conditions for establishment and reproduction. As a result, the composition of different tree species in future forests is expected to change and their areas might be reduced, the goods and services that these forests provide for man might be harmed, and wide-ranging steps will have to be taken to ensure seed dispersal in a controlled, directed manner.”

Whether the perversion is by the authors themselves, anticipating what is needed to get a paper published, or whether it is due to pressure from the journal Ecology Letters or from its referees, is unclear.

Abstract:

Despite ample research, understanding plant spread and predicting their ability to track projected climate changes remain a formidable challenge to be confronted. We modelled the spread of North American wind-dispersed trees in current and future (c. 2060) conditions, accounting for variation in 10 key dispersal, demographic and environmental factors affecting population spread. Predicted spread rates vary substantially among 12 study species, primarily due to inter-specific variation in maturation age, fecundity and seed terminal velocity. Future spread is predicted to be faster if atmospheric CO2 enrichment would increase fecundity and advance maturation, irrespective of the projected changes in mean surface windspeed. Yet, for only a few species, predicted wind-driven spread will match future climate changes, conditioned on seed abscission occurring only in strong winds and environmental conditions favouring high survival of the farthest-dispersed seeds. Because such conditions are unlikely, North American wind-dispersed trees are expected to lag behind the projected climate range shift.

In essence this paper is based only on belief, and the results actually obtained are denied. It seems to me that denying or twisting or “moulding” the results actually obtained to fit preconceived notions is not just a case of confirmation bias but comes very close to scientific misconduct.

