Archive for the ‘Science’ Category

The never ending wonders of Carbon

January 27, 2011

Not just all life as we know it, but also coal and diamonds and graphite and carbon nanotubes and now the new wonder-world of graphene.

Carbon also has the highest melting and sublimation points of all elements. At atmospheric pressure it has no melting point, since its triple point lies at 10.8 ± 0.2 MPa and 4600 ± 300 K; instead it sublimates at about 3900 K.
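Since atmospheric pressure (about 0.1 MPa) lies far below carbon's triple-point pressure of 10.8 MPa, heating at one atmosphere crosses the solid–vapour boundary directly. A minimal sketch of that rule of thumb (the function name is illustrative; the triple-point figure is the one quoted above):

```python
ATM_MPA = 0.101325              # standard atmospheric pressure, MPa
CARBON_TRIPLE_POINT_MPA = 10.8  # carbon triple-point pressure quoted above

def phase_change_on_heating(pressure_mpa, triple_point_mpa):
    """Below the triple-point pressure a heated solid sublimes;
    above it, it melts before it can boil."""
    return "sublimes" if pressure_mpa < triple_point_mpa else "melts"

print(phase_change_on_heating(ATM_MPA, CARBON_TRIPLE_POINT_MPA))  # sublimes
print(phase_change_on_heating(12.0, CARBON_TRIPLE_POINT_MPA))     # melts
```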


Theoretical phase diagram of carbon: Wikipedia

Evidence is mounting that a new crystal form of carbon – body-centered tetragonal (bct), something between graphene and diamond – must exist. Simulations show that it must. It is now up to experimentalists to prove it.

Image: From "Ab Initio study of the formation of transparent carbon under pressure," by Xiang-Feng Zhou et al., in Physical Review B, Vol. 82, No. 13; October 29, 2010

From Scientific American:

Now evidence is mounting that there is yet another crystal structure to add to carbon’s catalogue of wonders: a material that could find applications in mechanical components whose hardness varies depending on the pressure to which they are exposed.

This new type of carbon was first observed in 2003, when researchers placed graphite, a stacking of chicken-wire-shaped networks of carbon atoms, under high pressure at room temperature. Under this “cold” compression, the graphite began to assume a hybrid form, between that of graphene and of diamond, but its exact nature was unknown.

Two computer simulation studies now suggest that cold-compressed graphite contains crystals of a structure called body-centered tetragonal, or bct, in addition to another type called M carbon. In bct, groups of four atoms are arranged in a square. The squares are stacked in an offset manner, and each square forms chemical bonds with four squares in the layers above and four below. A team led by Hui-Tian Wang of Nankai University in Tianjin, China, showed that during cold compression the transition to bct carbon results in a release of energy, which means it is likely to happen in the real world.

A Japanese and American team also conducted a simulation in which bct carbon produced x-ray patterns similar to those seen in the 2003 study. …. Whether bct carbon exists or can be synthesized in its pure form “is still a task for experimentalists to test.” 

Vladimir Nabokov vindicated – about butterflies

January 27, 2011

Polyommatus blue: Image via Wikipedia

Vladimir Nabokov (yes, he of Lolita fame) lived in Cambridge, Massachusetts from 1942 to 1948 and, while teaching at Wellesley, was curator of lepidoptery at Harvard University’s Museum of Comparative Zoology. His career in entomology was almost as distinguished – but not as lucrative – as his career in literature. In his spare time he composed chess problems. His work in all these fields was characterised by his “love of detail and contemplation and symmetry”.

In 1945 he published a theory of butterfly evolution claiming that butterflies came to the New World in five waves of migration, through Asia across the Bering Strait into Alaska and then southward through North and then South America (much as humans migrated). Other butterfly experts scoffed at the idea, and Nabokov’s theory was not taken seriously until after his death in 1977. Then, in the past decade, gene-sequencing technology showed that Nabokov had been right all along – ironically, since during his life “Nabokov never accepted that genetics or the counting of chromosomes could be a valid way to distinguish species of insects, and relied on the traditional (for lepidopterists) microscopic comparison of their genitalia”.

Now his theory stands vindicated by work reported in a new paper in the Proceedings B of the Royal Society:

“Phylogeny and palaeoecology of Polyommatus blue butterflies show Beringia was a climate-regulated gateway to the New World”, by Roger Vila, Charles D. Bell, Richard Macniven, Benjamin Goldman-Huertas, Richard H. Ree, Charles R. Marshall, Zsolt Bálint, Kurt Johnson, Dubi Benyamini and Naomi E. Pierce. Proc. R. Soc. B, doi:10.1098/rspb.2010.2213

Abstract: ….. By integrating molecular phylogeny, historical biogeography and palaeoecology, we test a bold hypothesis proposed by Vladimir Nabokov regarding the origin of Neotropical Polyommatus blue butterflies, and show that Beringia has served as a biological corridor for the dispersal of these insects from Asia into the New World. We present a novel method to estimate ancestral temperature tolerances using distribution range limits of extant organisms, and find that climatic conditions in Beringia acted as a decisive filter in determining which taxa crossed into the New World during five separate invasions over the past 11 Myr. Our results reveal a marked effect of the Miocene–Pleistocene global cooling, and demonstrate that palaeoclimatic conditions left a strong signal on the ecology of present-day taxa in the New World. …..

From Neatorama:

There were several plausible hypotheses for how the butterflies might have evolved. They might have evolved in the Amazon, with the rising Andes fragmenting their populations. If that were true, the species would be closely related to one another.

But that is not what Dr. Pierce found. Instead, she and her colleagues found that the New World species shared a common ancestor that lived about 10 million years ago. But many New World species were more closely related to Old World butterflies than to their neighbors. Dr. Pierce and her colleagues concluded that five waves of butterflies came from Asia to the New World — just as Nabokov had speculated.

“By God, he got every one right,” Dr. Pierce said. “I couldn’t get over it — I was blown away.”

Dr. Pierce and her colleagues also investigated Nabokov’s idea that the butterflies had come over the Bering Strait. The land surrounding the strait was relatively warm 10 million years ago, and has been chilling steadily ever since. Dr. Pierce and her colleagues found that the first lineage of Polyommatus blues that made the journey could survive a temperature range that matched the Bering climate of 10 million years ago. The lineages that came later are more cold-hardy, each with a temperature range matching the falling temperatures.


Measurement standards which can no longer be touched or seen or felt….

January 27, 2011

I have a sense of loss.

The Royal Society hosted a conference on the 24th and 25th to consider how to bring the kilogram – the last of the seven base units of measurement still defined by a physical artefact – into line with the other six. The meeting discussed proposals for defining the kilogram in terms of the “fundamental” constants and for moving away from a lump of metal, stored very carefully, as the standard of mass. The General Conference on Weights and Measures first met in 1889 and now meets every 4 years. The 24th Conference, to be held in October this year, will table a proposal for the new definition of the kilogram; by the 25th Conference in 2015 the new definition may be adopted.

And when this happens there will no longer be any standard of measure left which can be seen or touched or felt. There will no longer be a King’s foot to refer to, or an “Iron Ulna of our Lord the King” to signify a yard, or some standard stones stored carefully to represent mass. The Mètre des Archives gave way to the International Prototype Metre. The Imperial Standard Yard, like the IPM, was the distance between markings on specified bars of metal, carefully stored. By 2015 all these standard definitions may be based only on the “fundamental constants” of nature (in the hope that they will truly remain constant across the reaches of space and time).

The International System of Units (SI) defines seven units of measure as a basic set from which all other SI units are derived. These SI base units and their physical quantities are:

  • metre for length
  • kilogram for mass
  • second for time
  • ampere for electric current
  • kelvin for temperature
  • candela for luminous intensity
  • mole for the amount of substance.
The seven SI base units and the interdependency of their definitions: for example

There used to be a time when measurements could easily be related to everyday experience. Length, mass (weight), light and temperature were all given units of practical, everyday use.

A candle-power was the light from one candle. A foot was a foot, an inch was either the width of your thumb or the distance from the tip of your index finger to the first knuckle, and a grain was the mass of a barley-corn. Water froze at zero °C and boiled one hundred divisions higher at 100 °C. Alternatively, Daniel Fahrenheit set zero °F to be the coldest stable temperature he could reach with a particular brine solution (ice, water and ammonium chloride, a frigorific mixture) and set 100 °F to the temperature of his wife’s armpit. Later, others set the boiling point of water to be exactly 180 divisions above the freezing point, at 212 °F. This made normal body temperature 98.6 °F instead of the original 100, and the point at which water freezes then happened to fall 32 divisions above zero. A comfortable temperature – inside or out – was 80 °F, while 60 °F was chilly and 100 °F was a hot day.

A year was set by the seasons and the sun, and the month was set by the moon. An average day was set by the sun rising and setting, and this day was divided – arbitrarily – into day and night of 12 parts each, with each part further divided into 60 and 60 again, probably first by the Babylonians. It is only in our times that we have needed to split the second, and incongruously these further subdivisions of the second follow the metric system with milliseconds and microseconds and nanoseconds. A mile used to be 1,000 paces (a pace being a left step and a right step) of a standard Roman legionary.
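The 100-division Celsius scale and the 180-division Fahrenheit scale described above are related by a simple linear map, which reproduces the familiar fixed points (a quick sketch):

```python
def c_to_f(celsius):
    """Celsius to Fahrenheit: 180 Fahrenheit divisions span the same
    interval as 100 Celsius divisions, offset so freezing sits at 32."""
    return celsius * 9 / 5 + 32

print(c_to_f(0))             # 32.0, water freezes
print(c_to_f(100))           # 212.0, water boils
print(round(c_to_f(37), 1))  # 98.6, "normal" body temperature
```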

But science and industry have moved on. Machines and instruments and medicine and electronics and computers and going to the moon can no longer manage with the old rules and measures of everyday living.

From the purpose of the meeting at the  Royal Society:

From the origins of the metric system, when the metre was a fraction of the arc of the Paris meridian and the kilogram the weight of a cubic decimetre of water, the ultimate goal has been a system of measurement based on invariant quantities of nature. After more than 200 years we are now within reach of achieving this. While the kilogram is still defined as the mass of a Pt-Ir cylinder kept in a vault in Sèvres, serious plans now exist to redefine the kilogram by fixing the numerical value of the Planck constant h; and the ampere, kelvin and mole by fixed numerical values for e, k and NA. With the metre already being defined by the speed of light and the second by an atomic microwave transition, but likely soon to be redefined by an optical transition of much higher frequency, we shall have at last achieved what the savants of the 18th century had sought.

Today all the units except the kilogram are defined by natural constants:

  1. The metre is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second with the speed of light in a vacuum being the natural constant. (And I can’t help wondering if this will remain constant under changing gravity conditions or the changing state of the expanding – or contracting – universe).
  2. The second is the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom at rest at a temperature of 0 K.
  3. The ampere is that constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed 1 metre apart in vacuum, would produce between these conductors a force equal to 2 × 10⁻⁷ newton per metre of length.
  4. The kelvin, unit of thermodynamic temperature, is the fraction 1/273.16 of the thermodynamic temperature of the triple point of water having the isotopic composition defined exactly by the following amount-of-substance ratios: 0.000 155 76 mole of ²H per mole of ¹H, 0.000 379 9 mole of ¹⁷O per mole of ¹⁶O, and 0.002 005 2 mole of ¹⁸O per mole of ¹⁶O.
  5. The mole is the amount of substance of a system which contains as many elementary entities as there are atoms in 0.012 kilogram of carbon 12; the definition refers to unbound atoms of carbon 12, at rest and in their ground state.
  6. The candela is the luminous intensity, in a given direction, of a source that emits monochromatic radiation of frequency 540 × 10¹² hertz and that has a radiant intensity in that direction of 1/683 watt per steradian.
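With the metre and the second pinned to exact numbers, simple consequences can be read off directly. A sketch (using Fraction so the exact defined values stay exact; the derived period count is just an illustration):

```python
from fractions import Fraction

C = 299_792_458                  # m/s: exact, defines the metre
CS_HYPERFINE_HZ = 9_192_631_770  # periods/s: exact, defines the second

# One metre is the distance light travels in 1/299 792 458 of a second:
metre = C * Fraction(1, C)
print(metre)  # 1

# Caesium hyperfine periods elapsing while light crosses one metre:
print(round(CS_HYPERFINE_HZ / C, 3))  # about 30.663
```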

The plan is to base the kilogram on the Planck constant.


A computer-generated image of the International Prototype kilogram (IPK): Wikipedia

Physics World reports:

In the current system, the kilogram, ampere, kelvin and the mole are all linked to exact numerical values of the mass of the international prototype kilogram in Paris, the permeability of the vacuum, the triple-point temperature of water, and to the molar-mass of carbon-12 respectively. The plan is to change all that so that these four units are linked to exact numerical values of the Planck constant, the charge of the electron, the Boltzmann constant and to the Avogadro constant respectively.

All of this is no doubt a great advance, and necessary, but I have difficulty relating to the new definitions. I cannot invoke any image of 9 192 631 770 periods of a radiation or the 299 792 458th part of a second, and I feel that something is being lost……

The strange and murky case of Silvia Bulfone-Paus: 12 retractions so far …..

January 25, 2011

Twelve papers on which Silvia Bulfone-Paus was the senior author have been retracted.

Silvia Bulfone-Paus: image Retraction Watch

That itself is sufficiently unusual and remarkable. But the story seems to go back a long way. Retraction Watch has been following the story.

In September 2010 Nature carried the story of a formal investigation which had started in July 2010 of scientific misconduct being carried out at the Research Center Borstel in Germany (Forschungszentrum Borstel, Leibniz Gemeinschaft, Deutsche Forschungsgemeinschaft, Medizinischen Fakultäten der Universitäten Lübeck und Kiel). The story however was not about the investigation but about a destabilising influence:

But events around such an investigation in Germany have taken a troubling and damaging turn from such good practice in the past few months. An unknown agitator using the presumed pseudonym Marco Berns is engaged in an e-mail and Internet offensive against two biomedical researchers whom he accuses of scientific fraud.

Berns’s libellous messages are targeted at dermatologist Ralf Paus and immunologist Silvia Bulfone-Paus, a married couple who both hold joint positions at the University of Manchester, UK, and the University of Lübeck, Germany.

The trial-by-Internet is disturbing a formal investigation, organized by the Research Center Borstel in Germany and begun in July, into some of the pair’s publications.

“Marco Berns” and his accomplice (or alter ego?) “Martin Frost” had been posting articles on the internet since at least the end of 2009 and had also been subjected to “cease and desist” demands from lawyers representing the Research Centre. Some of these articles questioned Silvia Bulfone-Paus’s use of academic titles, among other wrongdoings.

It does seem that the formal investigation started in July 2010 partly – if not wholly – because of the allegations made by Berns and Frost in 2009. Formally, the reason for the investigation was that Bulfone-Paus herself had reported some manipulation of data in papers published about research carried out under her supervision, but how and when such manipulation was discovered is not very clear.

In any case the investigation was completed and the results were published by the Research Centre on 2nd December 2010.

The Commission concludes that scientific misconduct occurred within the laboratory group “Immunobiology” over a period of years. In 6 out of 8 of the analyzed publications on which the two former research assistants, Dr. E. Bulanova and Dr. V. Budagian, are recorded as first authors, manipulations of images were found which distorted the presentation of the scientific results. Data falsification in the sense of the free invention of results was not found. Given the primary responsibility of first authors for data collection and data presentation in a scientific publication, these two first authors bear the main responsibility for the scientific misconduct.

The blame was put squarely on Drs. Bulanova and Budagian – both Russian – and as their supervisor, Bulfone-Paus received a firm slap on the wrist:

The 6 publications complained of were produced under the senior authorship of Prof. Dr. Bulfone-Paus……… The Commission considers that this reflects a lack of the supervision which must be expected from the senior author / group leader, even if the first authors of the offending publications are experienced scientists. The senior author / research group leader therefore carries a key responsibility for the scientific wrongdoings within their work group.

Nature reported:

An external investigation, launched in July and chaired by Werner Seeger, a biomedical researcher at the University of Giessen, Germany, found that two former postdocs with the centre’s immunology group were guilty of using pictures of protein blots from unrelated experiments to support their findings on signalling in cells involved in allergic reactions such as asthma. The pair’s supervisor, Silvia Bulfone-Paus, who chairs the centre’s immunology and cell biology department, bears “substantial responsibility” for the manipulations, the committee found, but added that they found no evidence of data fabrication.

The three Directors of the Research Centre are Prof. Dr. Dr. S. Bulfone-Paus, Prof. Dr. U. Schaible and Prof. Dr. P. Zabe. And the elusive Martin Frost has continued his writings, implying that the Russian scientists could actually just be scapegoats for the more senior authors involved and that some of the wrongdoing could be traced back to 1999!

Now the six retractions have grown to 12 papers so far.

This story is not over yet……..

Another perversion of science: Confirmation bias in the name of global warming dogma is also scientific misconduct

January 25, 2011

A new paper has been published in Ecology Letters

Ran Nathan, Nir Horvitz, Yanping He, Anna Kuparinen, Frank M. Schurr, Gabriel G. Katul. Spread of North American wind-dispersed trees in future environments. Ecology Letters, 2011; DOI: 10.1111/j.1461-0248.2010.01573

In this paper the authors have assumed that climate change will alter CO2 concentration and wind speed. They have also assumed that increased CO2 will “increase fecundity and advance maturation”. They have then modelled the spread of 12 species as a function of wind speed.

So far so good – they have actually modelled only the effect of wind speed, which they assume will decrease due to climate change.

Their results basically showed no effect of wind speed:

“Future spread is predicted to be faster if atmospheric CO2 enrichment would increase fecundity and advance maturation, irrespective of the projected changes in mean surface windspeed”.

And now comes the perversion!

From their fundamental conclusion that wind speed has no effect and that therefore any CO2 increase resulting from climate change will enhance the spread of the trees, they invoke “expected” effects to deny what they have just shown:

“Yet, for only a few species, predicted wind-driven spread will match future climate changes, conditioned on seed abscission occurring only in strong winds and environmental conditions favouring high survival of the farthest-dispersed seeds. Because such conditions are unlikely, North American wind-dispersed trees are expected to lag behind the projected climate range shift.”

This final conclusion is based on absolutely nothing – their modelling showed nothing of the kind – and yet this paper was accepted for publication. I have no problem with a result showing “no effect of wind speed” being published, but I suspect that it needed the nonsensical, speculative conclusion to comply with current dogma.

Science Daily then produces the headline: Climate Change Threatens Many Tree Species

when the reality is

This study Shows No Effect of Wind Speed But Yet We Believe that Climate Change Threatens Many Tree Species

“Our research indicates that the natural wind-driven spread of many species of trees will increase, but will occur at a significantly lower pace than that which will be required to cope with the changes in surface temperature,” said Prof. Nathan. “This will raise extinction risk of many tree populations because they will not be able to track the shift in their natural habitats which currently supply them with favorable conditions for establishment and reproduction. As a result, the composition of different tree species in future forests is expected to change and their areas might be reduced, the goods and services that these forests provide for man might be harmed, and wide-ranging steps will have to be taken to ensure seed dispersal in a controlled, directed manner.”

Whether the perversion is by the authors themselves, anticipating what is needed to get a paper published, or whether it is due to pressure from the journal Ecology Letters or its referees, is unclear.

Abstract:

Despite ample research, understanding plant spread and predicting their ability to track projected climate changes remain a formidable challenge to be confronted. We modelled the spread of North American wind-dispersed trees in current and future (c. 2060) conditions, accounting for variation in 10 key dispersal, demographic and environmental factors affecting population spread. Predicted spread rates vary substantially among 12 study species, primarily due to inter-specific variation in maturation age, fecundity and seed terminal velocity. Future spread is predicted to be faster if atmospheric CO2 enrichment would increase fecundity and advance maturation, irrespective of the projected changes in mean surface windspeed. Yet, for only a few species, predicted wind-driven spread will match future climate changes, conditioned on seed abscission occurring only in strong winds and environmental conditions favouring high survival of the farthest-dispersed seeds. Because such conditions are unlikely, North American wind-dispersed trees are expected to lag behind the projected climate range shift.

In essence this paper is based only on belief, and the results actually obtained are denied. It seems to me that denying or twisting or “moulding” actual results to fit pre-conceived notions is not just a case of confirmation bias but comes very close to scientific misconduct.

Cold Fusion: Another fraud or a breakthrough?

January 22, 2011

In March 1989, Stanley Pons and Martin Fleischmann claimed to have achieved cold fusion at room temperature, but their experiment could not be reproduced.


Cold fusion lab (igloo) under construction: image Wikipedia

While cold fusion is considered highly improbable, it is not impossible, and there remains a nagging suspicion (hope?) that some “miracle” machine may suddenly appear in the most unlikely place, perhaps even outside main-stream science.

Physorg reports on another claim, this time from Bologna, Italy:

Despite the intense skepticism, a small community of scientists is still investigating near-room-temperature fusion reactions. The latest news occurred last week, when Italian scientists Andrea Rossi and Sergio Focardi of the University of Bologna announced that they developed a cold fusion device capable of producing 12,400 W of heat power with an input of just 400 W. Last Friday, the scientists held a private invitation press conference in Bologna, attended by about 50 people, where they demonstrated what they claim is a nickel-hydrogen fusion reactor. Further, the scientists say that the reactor is well beyond the research phase; they plan to start shipping commercial devices within the next three months and start mass production by the end of 2011.

Rossi and Focardi say that, when the atomic nuclei of nickel and hydrogen are fused in their reactor, the reaction produces copper and a large amount of energy. The reactor uses less than 1 gram of hydrogen and starts with about 1,000 W of electricity, which is reduced to 400 W after a few minutes. Every minute, the reaction can convert 292 grams of 20°C water into dry steam at about 101°C. Since raising the temperature of water by 80°C and converting it to steam requires about 12,400 W of power, the experiment provides a power gain of 12,400/400 = 31. As for costs, the scientists estimate that electricity can be generated at a cost of less than 1 cent/kWh, which is significantly less than coal or natural gas plants……….
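The quoted power figure is easy to sanity-check from handbook values (the specific and latent heats below are standard textbook numbers, not from the article):

```python
# Heating 292 g of water from 20 °C to steam at ~101 °C, every minute.
mass_kg = 0.292
c_water = 4186       # J/(kg·K), specific heat of water
latent_vap = 2.26e6  # J/kg, latent heat of vaporisation
dT = 101 - 20        # K, temperature rise

energy_j = mass_kg * (c_water * dT + latent_vap)
power_w = energy_j / 60  # joules per minute -> watts
gain = power_w / 400     # against the claimed 400 W electrical input

print(round(power_w))  # roughly 12,600 W, close to the quoted 12,400 W
print(round(gain, 1))  # roughly 31.6, close to the quoted gain of 31
```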

…… Rossi and Focardi’s paper on the nuclear reactor has been rejected by peer-reviewed journals, but the scientists aren’t discouraged. They published their paper in the Journal of Nuclear Physics, an online journal founded and run by themselves, which is obviously cause for a great deal of skepticism. They say their paper was rejected because they lack a theory for how the reaction works. According to a press release in Google translate, the scientists say they cannot explain how the cold fusion is triggered, “but the presence of copper and the release of energy are witnesses.”

http://www.physorg.com/news/2011-01-italian-scientists-cold-fusion-video.html

But not everybody is dismissing this latest claim.

Steven Krivit of the New Energy Times describes why he believes that the Rossi and Focardi LENR device is probably real and is an advancement on the Piantelli process.

But there seems to be a vested interest here and I remain unconvinced.

Especially since they claim that they cannot fully explain what happens but are going to produce “commercial units” anyway, it sounds like a scam. They will probably sell some units to the gullible before they disappear from view.

Just another fraud.

Indian Environment Ministry challenges IPCC and CO2 conclusions

January 21, 2011

Jairam Ramesh: Image via Wikipedia

That there is little love lost between Rajendra Pachauri and the Indian Minister of Environment Jairam Ramesh is no secret. (Pachauri made the ill-advised and stupid remark about “voodoo science” regarding Ramesh and the Ministry’s claims debunking the IPCC statements on Himalayan glaciers). Now, according to the Hindustan Times, the Ministry of Environment has produced a paper concluding that solar effects on clouds represent about half the warming effects attributed to CO2:

India has once again challenged the UN’s climate science body – the Intergovernmental Panel on Climate Change (IPCC) – through a new scientific paper. The Environment ministry-sponsored paper says that human-induced global warming is much less than what the R K Pachauri-headed IPCC had said. The cause is the reduced impact of Galactic Cosmic Rays (GCRs) on the formation of low clouds over earth in the last 150 years, says a paper by U R Rao, former chairman of the Indian Space Research Organisation, released by Environment minister Jairam Ramesh. ….

Analyzing the data between 1960 and 2005, Rao found that fewer GCRs were reaching the earth due to an increase in the solar magnetic field, thereby leading to an increase in global warming. “Consequently the contribution of increased CO2 emission to the observed global warming of 0.75 degree Celsius would only be 0.42 degree Celsius, considerably less than that predicted by IPCC,” said the paper, to be published in the Indian journal Current Science. This is about 44% less than what the IPCC had said.
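The “about 44%” figure follows directly from the two numbers in the quote (a trivial check):

```python
observed_warming = 0.75   # °C, observed global warming
co2_contribution = 0.42   # °C, Rao's revised CO2 share

# Fractional reduction relative to attributing all 0.75 °C to CO2:
reduction = (observed_warming - co2_contribution) / observed_warming
print(f"{reduction:.0%}")  # 44%
```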

Ramesh in 2009 had released a similar scientific paper saying that the IPCC’s claim that most Himalayan glaciers would melt by 2035 was wrong. A few months later, after a review, the IPCC regretted the error. If Ramesh’s latest bid gains global recognition, it could alter the rules of the UN-run climate negotiations of 200 nations.

Impact of GCRs on global warming has been highly controversial since 1998, when Henrik Svensmark of the Danish National Space Center said it was causing global warming. A decade later a joint European study debunked the claim, saying there was no correlation. …

“I just want to expand scientific debate on impact of non-Green House Gases on climate change,” Ramesh said, when asked whether he was again challenging the IPCC. “Science is all about raising questions.”

International climate science is mainly western-driven and corroborates the view of the rich world that gases such as carbon dioxide (CO2) are the main contributors to global warming. Any scientific work challenging this view has been debunked as the work of a sceptic.

“Climate science is much more complex than attributing everything to CO2,” said Subodh Verma, climate change advisor in the Environment ministry.

And its first impact has come from IPCC chairperson R K Pachauri, who has told the government that the impact of GCRs on global warming will be studied in depth in the fifth assessment report, to be published in 2013-14. In its earlier four assessment reports, the IPCC had not studied the impact of GCRs in detail.

The global warming establishment does not much like this paper. Its members have now been reduced to claiming that everything – even directly conflicting evidence – supports the theory of man-made, CO2-driven global warming:

V Ramanathan of the US-based Scripps Institution of Oceanography at the University of California said that Rao’s paper strengthens the case for greenhouse gases as a primary driver of global warming. “The observed rapid warming trends during the last 40 years cannot be accounted for (by) the trends in GCRs,” he said in his comments on Rao’s paper.

Science is indeed all about asking questions.

It seems to have been forgotten that anyone who is not, deep down, a sceptic is not – and cannot be – a scientist.

NASA reduces forecast for Solar Cycle 24 again

January 19, 2011

In December NASA had reduced its forecast for SC24 to a peak sunspot number of 64, to be reached in June 2013. Now, less than two months later, the latest forecast has been reduced again, to a peak sunspot number of 59 to be reached in June/July 2013.

We find a starting time of May 2008 with minimum occurring in December 2008 and maximum of about 59 in June/July of 2013.

NASA SC24 forecast - January 2011: image NASA

At these levels the current Landscheidt minimum is comparable to SC5 and SC6 – the Dalton Minimum of 1790 to 1830 – where peak sunspot numbers were just over 50. (The earlier Maunder Minimum – 1645 to 1715 – predates the modern period of sunspot number measurement and was nominally a period with no significant sunspots, presumably at sunspot numbers of less than 20 in today’s measurement values.)

From my previous post:

It is not inconceivable that the SC24 will not peak till early 2014 and will only achieve peak sunspot numbers around 55. Solar cycle 24 could well have a length of 150+ months instead of the nominal 132 months.

The development of the NASA predictions is shown in the table below:

NASA Forecasts for SC24

Date of forecast     Expected date of peak     Expected peak sunspot number
March 2006           June 2010                 168
October 2008         March 2012                137
January 2009         June 2012                 104
January 2010         June 2013                  90
December 2010        June 2013                  64
January 2011         June / July 2013           59
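The shrinking forecasts in the table can be summarised in a few lines (the dates and numbers below are just the table above, transcribed):

```python
# (forecast date, expected peak sunspot number), from the table above
forecasts = [
    ("March 2006", 168), ("October 2008", 137), ("January 2009", 104),
    ("January 2010", 90), ("December 2010", 64), ("January 2011", 59),
]

first_peak = forecasts[0][1]
latest_peak = forecasts[-1][1]
drop_pct = 100 * (first_peak - latest_peak) / first_peak
print(f"Peak forecast cut from {first_peak} to {latest_peak}: -{drop_pct:.0f}%")
```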


Related:

https://ktwop.wordpress.com/2010/12/23/is-the-landscheidt-minimum-a-precursor-for-a-grand-minimum/

When plagiarism is not plagiarism : part 2

January 17, 2011

Plagiarism has never been considered misconduct in the political arena. And it would seem it is not considered misconduct when purported science is used in a political or religious cause.

I posted earlier about how plagiarism is not plagiarism in the eyes of a journal editor when it is done in his own journal.

https://ktwop.wordpress.com/2010/12/31/ethics-of-journals-when-plagiarism-is-not-plagiarism/

Steve McIntyre reports on another case where science is subordinated to political and religious beliefs.

http://climateaudit.org/2011/01/16/trenberth-and-lifting-text-verbatim-2/

Apparently plagiarism is not plagiarism when carried out by Kevin Trenberth in support of his religious beliefs. But the actions reported here – hastily introducing attributions wherever plagiarism had been detected – suggest that Trenberth realises that if his scientific misconduct is demonstrated, his religious positions are undermined and discredited.

Since last ice age, warming and cooling have been caused by ocean currents

January 16, 2011

A new paper in Science gives ocean currents in the Atlantic their due (without finding it necessary to appeal to tales of carbon dioxide). Perhaps the science is not so settled after all!

File:Oceanic gyres.png

The five major ocean-wide gyres — the North Atlantic, South Atlantic, North Pacific, South Pacific, and Indian Ocean gyres. Each is flanked by a strong and narrow “western boundary current,” and a weak and broad “eastern boundary current”: Wikimedia

The Deglacial Evolution of North Atlantic Deep Convection, by D. J. R. Thornalley, S. Barker, W. S. Broecker, H. Elderfield and I. N. McCave. Science, 2011; 331 (6014): 202. DOI: 10.1126/science.1196812

Science Daily reports:

… Scientists have long suspected that far more severe and longer-lasting cold intervals have been caused by changes to the circulation of the warm Atlantic ocean currents themselves.

Now new research led by Cardiff University, with scientists in the UK and US, reveals that these ocean circulation changes may have been more dramatic than previously thought. The findings, published January 14, 2011 in the journal Science, show that as the last Ice Age came to an end (10,000–20,000 years ago) the formation of deep water in the North-East Atlantic repeatedly switched on and off. This caused the climate to warm and cool for centuries at a time.

The circulation of the world’s ocean helps to regulate the global climate. One way it does this is through the transport of heat carried by vast ocean currents, which together form the ‘Great ocean conveyor’. Key to this conveyor is the sinking of water in the North-East Atlantic, a process that causes warm tropical waters to flow northwards in order to replace the sinking water. Europe is kept warmer by this circulation, so that a strong reduction in the rate at which deep water forms can cause widespread cooling of up to 10 degrees Celsius. ….. The new results suggest that the Atlantic ocean is capable of radical changes in how it circulates on time scales as short as a few decades.

Dr Thornalley said: “These insights highlight just how dynamic and sensitive ocean circulation can be. Whilst the circulation of the modern ocean is probably much more stable than it was at the end of the last Ice Age, and therefore much less likely to undergo such dramatic changes, it is important that we keep developing our understanding of the climate system and how it responds when given a push.”

Paper Abstract:

Deepwater formation in the North Atlantic by open-ocean convection is an essential component of the overturning circulation of the Atlantic Ocean, which helps regulate global climate. We use water-column radiocarbon reconstructions to examine changes in northeast Atlantic convection since the Last Glacial Maximum. During cold intervals, we infer a reduction in open-ocean convection and an associated incursion of an extremely radiocarbon (14C)–depleted water mass, interpreted to be Antarctic Intermediate Water. Comparing the timing of deep convection changes in the northeast and northwest Atlantic, we suggest that, despite a strong control on Greenland temperature by northeast Atlantic convection, reduced open-ocean convection in both the northwest and northeast Atlantic is necessary to account for contemporaneous perturbations in atmospheric circulation.