Archive for the ‘Science’ Category

Solar Cycle 24 double peak now clearly evident

May 9, 2013

Already in March there were signs that Solar Cycle 24 would exhibit a double peak. NASA’s latest sunspot prediction for Solar Cycle 24, as of 1st May 2013, clearly shows that sunspot activity is into its “double peak” for this cycle. A double peak was also evident in Cycles 22 and 23, as well as in Cycles 5 and 14. The levels for SC24 are still going to be the lowest for 100 years, and predictions for SC25 are that they will be lower still. Most second peaks have been somewhat smaller than the first – though not in SC5 – and seem to add around 6 months to the cycle time.

If this is indeed a double peak then I expect that solar maximum will be delayed by a few months from the NASA prediction of Fall 2013. The end of 2013 now seems more likely.

SC24 may 2013

The Dalton minimum spanned Solar Cycles 5 and 6 from 1790 to 1820. The Maunder Minimum from 1645 to 1715 preceded the numbering of Solar Cycles (Solar Cycle 1 started in 1755). The likelihood that SC24 and 25 may be similar to SC5 and 6 is growing, and so is the likelihood that we will see 2–3 decades of global cooling. It is more likely that for the next 20–30 years this Landscheidt Minimum will resemble the Dalton Minimum period, but if SC25 is a very small cycle then we may even approach the conditions of the Little Ice Age during the Maunder Minimum. Landscheidt’s prediction was that this minimum would last from 2000 to 2060, and the global temperature stand-still of the last 15 years gives greater credence to his forecasts.

NASA: The Sunspot Cycle —

The Maunder Minimum

Early records of sunspots indicate that the Sun went through a period of inactivity in the late 17th century. Very few sunspots were seen on the Sun from about 1645 to 1715. Although the observations were not as extensive as in later years, the Sun was in fact well observed during this time and this lack of sunspots is well documented. This period of solar inactivity also corresponds to a climatic period called the “Little Ice Age” when rivers that are normally ice-free froze and snow fields remained year-round at lower altitudes. There is evidence that the Sun has had similar periods of inactivity in the more distant past. The connection between solar activity and terrestrial climate is an area of on-going research.

Cheers (hic)! Champagne research at the University of Reading.

May 9, 2013

Back in 2009, Dr Jeremy Spencer from the University of Reading published a paper in the British Journal of Nutrition about how drinking champagne was good for your heart.

image – LiveScience

Research from the University of Reading suggests that two glasses of Champagne a day may be good for your heart and circulation. The researchers have found that drinking Champagne wine daily in moderate amounts causes improvements in the way blood vessels function. …… 

….. Dr Jeremy Spencer, from the Department of Food and Nutritional Sciences said: “Our research has shown that drinking around two glasses of Champagne can have beneficial effects on the way blood vessels function, in a similar way to that observed with red wine. We always encourage a responsible approach to alcohol consumption, but the fact that drinking Champagne has the potential to reduce the risks of suffering from cardiovascular diseases such as heart disease and stroke, is very exciting news.” 

University of Reading

Four years on, he is now Professor of Nutritional Biochemistry and Medicine and – presumably – many cases of champagne later, he has just published another paper on the benefits of champagne in improving memory and holding back dementia.

New research shows that drinking one to three glasses of champagne a week may counteract the memory loss associated with ageing, and could help delay the onset of degenerative brain disorders, such as dementia.

Scientists at the University of Reading have shown that the phenolic compounds found in champagne can improve spatial memory, which is responsible for recording information about one’s environment, and storing the information for future navigation. …. 

….. Professor Jeremy Spencer, Department of Food and Nutritional Sciences, University of Reading, said: “These exciting results illustrate for the first time that the moderate consumption of champagne has the potential to influence cognitive functioning, such as memory.  Such observations have previously been reported with red wine, through the actions of flavonoids contained within it. 

“However, our research shows that champagne, which lacks flavonoids, is also capable of influencing brain function through the actions of smaller phenolic compounds, previously thought to lack biological activity. …

The paper is published in Antioxidants and Redox Signalling.

I have a very clear “vision” of what Professor Spencer’s lab might look like. A lot more genteel than a pub or a bar — since it’s champagne! I don’t suppose Prof. Spencer has much difficulty in recruiting post-grads and post-docs (whose alcohol consumption capacity is legendary and insatiable). The Department of Food and Nutritional Sciences boasts that it has over 100 PhD and MPhil students!

We currently have over 100 PhD and MPhil students, who each belong to one or more of our 3 research groups:

  • Food and Bioprocessing Sciences
  • Food Microbial Sciences
  • Human Nutrition

Since 1998, we have enjoyed a 97-100% pass rate. Sponsorship comes from research councils, Government departments, the European Union, charities, industry, the Reading Endowment Trust Fund and overseas scholarships.

Note — a 97–100% pass rate!

Peer review for funding is different to that for publication

May 8, 2013

I note that battle lines are being drawn in the US between the parties concerning peer review and the NSF. The Republicans are questioning a number of NSF grants and demanding some justification of the review process for funding awards. The Democrats are taking this as a heretical attack on SCIENCE. But I also note that one important distinction is not being drawn.

Choosing projects for funding from the public purse is fundamentally a political process and requires justification in simple terms to the providers of that funding (the taxpayer). While peer review – for all its faults – may be used to select projects the reviewers cannot escape the responsibility to justify their selections to the funders (and not just to the funding organisation – NSF – set up to channel the funds). Of course the NSF would prefer that they have complete freedom in disbursing the funds allocated to them in any way they please – but that won’t wash. The acceptance of public funds demands public accountability.

Peer review for publication is a very different thing. In engineering terms, it should be a “quality gate”: a check of the quality of the work done and of its independence. But here reviewers also carry a “fiduciary” responsibility which is not always met. The reviewers carry an obligation of trust and ethical propriety not only to the journals they serve but also to the readers and subscribers of those journals. Where funding is involved, this “fiduciary” responsibility extends to the providers of the funds. Unlike reviewers for funding selection, who – I think – must be able to justify their choices to a wider audience than the “in-crowd”, the publication reviewer does not need to provide explanations for his opinions. But his opinions cannot be secret opinions – and that requires that such reviewers not be anonymous and that their opinions be available. Journal editors have the final responsibility for what is published or not, but reviewers should not escape being held responsible and accountable for their share of such decisions. They cannot escape ownership of – and the consequences of – their own opinions and judgements on which decisions to publish or reject may be based.

Financial auditors cannot escape their fiduciary responsibilities (though they often escape accountability). Can the scientific community continue to take – or appear to take –  less responsibility than the financial community? Accountability is quite another matter.

ScienceInsider: 

The new chair of the House of Representatives science committee has drafted a bill that, in effect, would replace peer review at the National Science Foundation (NSF) with a set of funding criteria chosen by Congress. For good measure, it would also set in motion a process to determine whether the same criteria should be adopted by every other federal science agency.

The legislation, being worked up by Representative Lamar Smith (R-TX), represents the latest—and bluntest—attack on NSF by congressional Republicans seeking to halt what they believe is frivolous and wasteful research being funded in the social sciences. Last month, Senator Tom Coburn (R-OK) successfully attached language to a 2013 spending bill that prohibits NSF from funding any political science research for the rest of the fiscal year unless its director certifies that it pertains to economic development or national security. Smith’s draft bill, called the “High Quality Research Act,” would apply similar language to NSF’s entire research portfolio across all the disciplines that it supports.

Nature: 

In a brief 15-minute speech today, US President Barack Obama championed independence for the peer-review process, in front of an audience of elite researchers at the 150th annual meeting of the National Academy of Sciences in Washington DC.

“In order for us to maintain our edge, we’ve got to protect our rigorous peer review system,” Obama said. His support comes on the heels of draft legislation, dated 18 April, that ScienceInsider reports is being discussed by the chairman of the US House of Representatives Science Committee, Lamar Smith (Republican, Texas). That legislation would overhaul peer review of grants submitted to the National Science Foundation (NSF) and require the NSF director to certify each funded project as benefitting the economic or public health of the United States.

“Climate science” now hunting for cooling effects – and finds the brightness of clouds

May 6, 2013

How is it that – for a settled science – all these new “cooling” mechanisms are suddenly being found? Could it have something to do with trying to rescue climate models which have failed to predict the slowdown in global warming? “Climate science” is now hunting for previously unidentified cooling effects to explain the warming that has not happened.

This time it is the brightness of clouds! Apparently manmade pollution in the form of organics can enhance the formation of clouds which happen to be brighter from above and which reflect more of the sun’s radiation. Voilà! An as yet unidentified cooling effect.

But this conclusion comes not from measurements but from yet another model!

From the University of Manchester (via Alpha Galileo):

Organic vapours affect clouds leading to previously unidentified climate cooling

University of Manchester scientists, writing in the journal Nature Geoscience, have shown that natural emissions and manmade pollutants can both have an unexpected cooling effect on the world’s climate by making clouds brighter.

Clouds are made of water droplets, condensed on to tiny particles suspended in the air. When the air is humid enough, the particles swell into cloud droplets. It has been known for some decades that the number of these particles and their size control how bright the clouds appear from the top, controlling the efficiency with which clouds scatter sunlight back into space. A major challenge for climate science is to understand and quantify these effects which have a major impact in polluted regions.

The tiny seed particles can either be natural (for example, sea spray or dust) or manmade pollutants (from vehicle exhausts or industrial activity). These particles often contain a large amount of organic material and these compounds are quite volatile, so in warm conditions exist as a vapour (in much the same way as a perfume is liquid but gives off an aroma when it evaporates on warm skin).

The researchers found that the effect acts in reverse in the atmosphere as volatile organic compounds from pollution or from the biosphere evaporate and give off characteristic aromas, such as the pine smells from forest, but under moist cooler conditions where clouds form, the molecules prefer to be liquid and make larger particles that are more effective seeds for cloud droplets.

“We discovered that organic compounds such as those formed from forest emissions or from vehicle exhaust, affect the number of droplets in a cloud and hence its brightness, so affecting climate,” said study author Professor Gordon McFiggans, from the University of Manchester’s School of Earth, Atmospheric and Environmental Sciences.

“We developed a model and made predictions of a substantially enhanced number of cloud droplets from an atmospherically reasonable amount of organic gases.

“More cloud droplets lead to brighter cloud when viewed from above, reflecting more incoming sunlight. We did some calculations of the effects on climate and found that the cooling effect on global climate of the increase in cloud seed effectiveness is at least as great as the previously found entire uncertainty in the effect of pollution on clouds.”

  • ‘Cloud droplet number enhanced by co-condensation of organic vapours,’ by Gordon McFiggans et al, will be published in Nature Geoscience on Sunday 5 May 2013.

Fifty shades of fraud in Flanders

May 3, 2013

Following on from the publicity surrounding the Diederik Stapel case, a new survey of medical researchers in Flanders confirms that fraud is fairly prevalent. This takes the form of making up data, manipulating data to make it match a hypothesis, plagiarism, double publishing (self-plagiarism), withholding undesirable research results, undeserved authorships, or dividing research into as many separate science articles as possible (“salami slicing”). The article by Reinout Verbeke and Joeri Tijdink was produced with the support of the Pascal Decroos Fund for Investigative Journalism. One in twelve medical scientists admits to making up or ‘massaging’ data in order for it to match a hypothesis. And almost six in twelve see such fraudulent practices happening around them. They identify high publication pressure as one of the causes. Ivan Oransky of Retraction Watch points out that for the medical research fraternity the high rewards from pharmaceutical companies can also play a role.

In November and December 2012, Belgian science journalist Reinout Verbeke (editor of Eos Magazine) distributed an anonymous survey on fraud and publication pressure among scientists of the Medical Science faculties of all Flemish universities. ……..  Psychiatrist and researcher Joeri Tijdink (VU University Medical Center Amsterdam) collaborated on the survey. He had run a similar survey in 2011 in the Netherlands, before the scandal broke around Diederik Stapel – the social psychologist who had made up data and experiments. For years nobody had been on to him. Stapel and his unsuspecting doctoral students and co-authors even got their fictitious studies into top journals. Luckily, though, such large-scale fraud is rather rare. ……..

Fifty shades of fraud


The results of the Flemish survey are striking. Of the 315 participating scientists, four (1.3%) admit to having made up data at least once in the last three years. If what they say is true, this probably concerns fraud that is still undiscovered. Twenty-three respondents (7.3%) admit to having selectively removed data or results to make research match a hypothesis, so-called ‘data massaging’. Overall, about 8% of the Flemish medical scientists admit to having recently made up and/or massaged data. The figures are worse than the international average. A meta-analysis of 18 scientific studies on fraud by Daniele Fanelli showed that on average 2% of all scientists (from different fields of study) admitted to having engaged in similar practices at least once (PLoS ONE, 2009).

Why are the results among Flemish respondents even worse? “That doesn’t surprise me, because we are talking about medical scientists”, says American journalist and fraud expert Ivan Oransky from RetractionWatch.com. “Cooperating with the pharmaceutical industry gains researchers financial rewards. That could pressurise scientists to cut corners.” André Van Steirteghem, a pioneer in reproductive medicine and secretary of the Committee on Publication Ethics (COPE), thinks there is something else at play. “There is a significant lack of openness on fraud and malpractice at Flemish universities. This survey asked scientists about their perceptions for the very first time. They were able to vent their feelings. I think that explains the high figures in Flanders.”

We can even suspect malpractices in Flanders to be more widespread still. “Surveys have their limits”, says Daniele Fanelli. “Many cheaters won’t admit to having done it, or will falsely assume they have a clean conscience.” ……
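As a quick sanity check on the survey arithmetic quoted above, here is a short sketch using the counts reported in the article (315 participants, 4 fabricators, 23 "massagers"):

```python
# Checking the survey percentages quoted in the article.
participants = 315
fabricated = 4     # admitted to making up data at least once
massaged = 23      # admitted to selectively removing data or results

print(round(100 * fabricated / participants, 1))  # 1.3 (%)
print(round(100 * massaged / participants, 1))    # 7.3 (%)

# If the two groups do not overlap, the combined share is at most:
print(round(100 * (fabricated + massaged) / participants, 1))  # 8.6 (%)
```

The combined figure of up to 8.6% is consistent with the article's "about 8%", which allows for some overlap between the two groups.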

Scientific American reports on this story here.

“Consensus science” – by definition – is not “science” and is a dangerous thing

April 30, 2013

The internet is full of polls that I generally find irritating. How many believe that “A” will happen? Or that “B” will win? Or that “C” is better than “D”? Whatever the result of the poll may be, it shows nothing more than where the preponderance of belief lies. Polls are evidence only of what people believe; they are not evidence about the subject being voted upon.

Either something is or it is not.

If we don’t know whether it is or is not, we can formulate it as a hypothesis and address it by the scientific method. The hypothesis must be formulated to be falsifiable, and we then predict what data might be observed if the hypothesis were false. We then collect data, and where data is not available we design and carry out experiments to provide it. In the classical scientific method, these data and their analysis are tested to see if the hypothesis is false (not – it should be noted – to show that the hypothesis is true). Where the data cannot show the hypothesis to be false, it means only that the hypothesis is still unproven; but the data set adds to the body of evidence in favour of the hypothesis in the particular circumstances in which that data set was collected.
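The asymmetry described above can be sketched in a few lines of toy code (my illustration, not from the post): the test only looks for counterexamples, and surviving the test never "proves" the hypothesis.

```python
# Toy sketch of the falsification loop: "every swan is white" is falsifiable,
# because a single non-white observation refutes it.

def is_falsified(hypothesis, observations):
    # Look only for counterexamples; passing never proves the hypothesis true.
    return any(not hypothesis(obs) for obs in observations)

def all_white(swan):
    return swan == "white"

print(is_falsified(all_white, ["white", "white", "white"]))  # False: survives, still unproven
print(is_falsified(all_white, ["white", "black"]))           # True: refuted by one counterexample
```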

When we don’t know, we can still suppose the hypothesis to be true or false. But that is just a supposition and lies in the realms of belief and religion. We can take a vote within some group and see how many believe it to be true or false. Commercial and other interests may be vested in the supposition. Lobbying and persuasion can be applied in favour of or against it. Voters can be influenced and cajoled and persuaded to vote for or against. A completely democratic and transparent system of voting may be applied. And the result may be overwhelmingly in favour of or against the supposition. But even where a majority – even an overwhelming majority of, say, 97% – of some group believes the proposed hypothesis to be true, the vote adds not one iota of evidence in favour of or against the hypothesis. An overwhelming vote that a hypothesis is true when it is actually false makes it no less false. All the vote can show is the preponderance of belief (and belief – by definition – comes into play when and because evidence is lacking).

And all that democratic process to establish what people believe brings us no closer to answering the question of whether the supposition is true.

But it gets worse.

Once a “democratic” majority has confirmed its belief in the supposed “truth” of a supposition, there is an immense societal pressure against proving the supposition to be false. Falsifiable hypotheses are reformulated to be no longer falsifiable. The scientific method is perverted – for reasons of vested interest – to produce anecdotal evidence trying to “prove the hypothesis” rather than to collect data that might show the hypothesis to be false. Evidence against the majority belief is not collected because it is no longer expedient to do so. Not only is it not collected, it is ignored even when it is plain and obvious. The moment a scientific hypothesis invokes – or has to invoke – a majority vote or a consensus in its support, it leaves the scientific arena and enters the political universe. Truth becomes whatever the majority believes. Proper scientific effort directed at falsifying the supposition is not just discouraged, it is penalised and attracts sanctions in the form of reduced funding and rejection of publications. It becomes heresy. Even where the believed supposition is actually true, it remains belief and cannot easily be brought back into the rational world.

As Judith Curry wrote recently:

With genuinely well-established scientific theories, ‘consensus’ is not discussed and the concept of consensus is arguably irrelevant.  For example, there is no point to discussing a consensus that the Earth orbits the sun, or that the hydrogen molecule has less mass than the nitrogen molecule.  While a consensus may arise surrounding a specific scientific hypothesis or theory, the existence of a consensus is not itself the evidence. ……. 

Given the complexity of the climate problem, ‘expert judgments’ about uncertainty and confidence levels are made by the IPCC on issues that are dominated by unquantifiable uncertainties. It is difficult to avoid concluding that the IPCC consensus is manufactured and that the existence of this consensus does not lend intellectual substance to their conclusions.

“Consensus science” has no option but to become science by majority vote. Polls replace evidence. And where the belief is false, the belief itself prevents a return to the truth. “Consensus science” as belief cannot be “science”. The simple fact is that whenever a “scientific hypothesis” invokes a consensus in its support it is – perforce – just a belief. It becomes religion and not science. And that is a dangerous thing.

Related: Climate change: no consensus on consensus

Resentment and charges of misconduct and bias at the Delhi component of the International Centre for Genetic Engineering and Biotechnology (ICGEB)

April 30, 2013

It is not so easy to judge whether the charges of bias and misconduct at the New Delhi component of the International Centre for Genetic Engineering and Biotechnology (ICGEB) arise just because

  • some disgruntled junior researchers are envious of the much higher salaries of their seniors, or
  • it is because of resentment by mediocre scientists when their work is not considered of a quality and significance sufficient to earn them authorship of scientific papers, or
  • because senior scientists are exploiting junior post-doctoral researchers.

The ICGEB is part of the United Nations System where of course officials tend to take care of their own.

But whatever the real reason, a “scientific institution” which establishes and perpetuates two classes of scientists, where the salary scales of one are double those of the other, seems a particularly ill-thought-out scheme and – at best – just plain stupid. It not only invites resentment but also implies that the quality of the research done is judged by the salary paid to the researcher.

ICGEB-ND_Building

New Delhi Component of the ICGEB

The ICGEB is an international, nonprofit research organization. Established as a special project of UNIDO, it became fully autonomous in 1994 and now counts over 60 Member States. … With Components in Trieste, Italy, New Delhi, India and Cape Town, South Africa, the Centre forms an interactive network with Affiliated Centres in ICGEB Member States. The New Delhi component is dedicated to advanced research and training in molecular biology, infectious disease biology, and biotechnology.

The Calcutta Telegraph reports: 

Allegations of discrimination, academic misconduct and lack of transparency over dramatic differences in researchers’ salaries have tainted a 25-year-old international research centre here that is hailed for its excellence in science.

Indian and foreign scientists are trying to resolve what they say is a dual crisis gripping the New Delhi component of the International Centre for Genetic Engineering and Biotechnology (ICGEB): loss of foreign funding and discontent among researchers.

A panel of Indian scientists set up by the department of biotechnology is examining options to resolve the issue of future funding. ICGEB director-general Francisco Baralle from Italy is expected to meet department of biotechnology secretary Krishnaswamy VijayRaghavan and the research institution’s staff here on April 30. ……

Twenty-four of the 30 senior scientists at the ICGEB, New Delhi, have asked Baralle to remove the Delhi director, Virander Chauhan, correspondence between the scientists and Baralle between September 2012 and February 2013 shows.

Also, a grievance committee report from within the ICGEB shows that two former researchers have complained that a senior scientist at the institution, Kanury Venkata Subba Rao, denied them authorship on a research paper.

Both Chauhan and Rao have denied any wrongdoing. ……. 

……. Some of the discontent appears to stem from differences in the salaries of scientists. The ICGEB has a two-tier pay structure — an international scale where a post-doctoral scientist could start at Rs 150,000 per month, paid in US dollars, and a national scale where a similarly qualified scientist would begin at about Rs 75,000 a month.

“The original idea at the ICGEB’s creation in 1988 was to draw the best from international faculty,” said a senior Indian scientist involved in the efforts to resolve the crisis.

“But all the 10 international-grade scientists’ positions there are now held by Indians. There seems to be discord now because sections of scientists feel there should not be huge salary differences between similarly performing and similarly qualified researchers.” ……

Disconnect between man-made CO2 and atmospheric levels of CO2

April 28, 2013

The evidence grows that

  1. temperature drives carbon dioxide, and
  2. man-made carbon dioxide is a minor contributor to the carbon dioxide concentration in the atmosphere.

Atmospheric verification of anthropogenic CO2 emission trends, Roger J. Francey et al, Nature Climate Change 3, 520–524 (2013), doi:10.1038/nclimate1817

The Hockey Schtick reports:

A recent paper published in Nature Climate Change finds a disconnect between man-made CO2 and atmospheric levels of CO2, demonstrating that despite a sharp 25% increase in man-made CO2 emissions since 2003, the growth rate in atmospheric CO2 has slowed sharply since 2002/2003. The data shows that while the growth rate of man-made emissions was relatively stable from 1990-2003, the growth rate of atmospheric CO2 surged up to the record El Nino of 1997-1998. Conversely, growth in man-made emissions surged ~25% from 2003-2011, but the growth rate of atmospheric CO2 has flatlined since 1999 along with global temperatures. The data demonstrates temperature drives CO2 levels due to ocean outgassing, man-made CO2 does not drive temperature, and that man is not the primary cause of the rise in CO2 levels.

Criegee intermediates further unsettle climate science

April 26, 2013

Far from being a “settled” science, global warming in particular and “climate science” in general are looking decidedly shaky these days!

What is not in doubt is that clouds and their formation are of critical importance for our climate. But clouds can both “warm” and “cool”. They can attenuate the sun’s radiation that reaches the earth during the day, and they can prevent the radiation of heat from the earth into space during the night. They can absorb some of the sun’s radiation, transfer that heat into the atmosphere and radiate some of it back into space as well. The net effect of clouds is uncertain. Solar effects themselves can affect the formation of clouds (Svensmark’s theory), as has been confirmed recently by experiments at CERN. Current climate models speculate that carbon dioxide can affect moisture levels and therefore increase clouds in the atmosphere. But no mechanisms are known, and these assumptions are more fanciful than based on any evidence. Moreover, the assumed enhanced warming due to the increased moisture (positive forcing) is even more fanciful, since it is not known whether any such extra moisture results, or whether any of it exists as clouds. Computer models – in the absence of any known mechanisms for such forcing – merely assume some “net, resultant” level of forcing which (of course) causes warming and can be attributed to carbon dioxide. These assumptions effectively presuppose the forcing and are little more than “fudge factors”.

But even the chemistry of – and the chemical reactions in – the upper atmosphere are uncertain. A new paper provides new evidence of how Criegee intermediate molecules in the atmosphere could help in aerosol and cloud formation and contribute to cooling in the atmosphere.

A Criegee intermediate is a carbonyl oxide with two free-radical centres which act independently of each other. These molecules could help to break down sulfur dioxide and nitrogen dioxide in the atmosphere. The formation of Criegee biradicals was first postulated in the 1950s by Rudolf Criegee.

Rudolf Criegee (1902–1975) was a German chemist. He studied in Tübingen, Greifswald, and Würzburg and received his doctorate at Würzburg in 1925. He proposed a reaction mechanism for ozonolysis in 1953. The Criegee intermediate and the Criegee rearrangement are named after him. In this context, his research on cyclic reactions and cyclic rearrangement mechanisms led him, independently of the Nobel-Prize-winning work of R. B. Woodward and R. Hoffmann (Woodward–Hoffmann rules), to the same conclusions as theirs, but he failed to publish his findings in time.


Carbonyl oxide (Criegee zwitterion): wikipedia

The new paper is published in Science:

Direct Measurements of Conformer-Dependent Reactivity of the Criegee Intermediate CH3CHOO, Craig A. Taatjes et al, Science, 12 April 2013, Vol. 340 no. 6129, pp. 177–180, DOI: 10.1126/science.1234689

Abstract: Although carbonyl oxides, “Criegee intermediates,” have long been implicated in tropospheric oxidation, there have been few direct measurements of their kinetics, and only for the simplest compound in the class, CH2OO. Here, we report production and reaction kinetics of the next larger Criegee intermediate, CH3CHOO. Moreover, we independently probed the two distinct CH3CHOO conformers, syn- and anti-, both of which react readily with SO2 and with NO2. We demonstrate that anti-CH3CHOO is substantially more reactive toward water and SO2 than is syn-CH3CHOO. Reaction with water may dominate tropospheric removal of Criegee intermediates and determine their atmospheric concentration. An upper limit is obtained for the reaction of syn-CH3CHOO with water, and the rate constant for reaction of anti-CH3CHOO with water is measured as 1.0 × 10⁻¹⁴ ± 0.4 × 10⁻¹⁴ cm³ s⁻¹.

From the Manchester University Press Release:

Scientists have discovered further evidence for the existence of new molecules in the atmosphere that have the potential to off-set global warming by reacting with airborne pollutants.

Researchers from The University of Manchester, Bristol University, Southampton University and Sandia National Laboratories in California have detected the second simplest Criegee intermediate molecule – acetaldehyde oxide – and measured its reactivity.

Intermediates are molecules that are formed during a chemical reaction and react further to produce the final chemicals of the reaction. Criegee intermediates – carbonyl oxides – were first identified by the team in January last year and shown to be powerful oxidisers, reacting with pollutants such as nitrogen dioxide and sulphur dioxide.

The authors, whose latest study is again published in the journal Science, believe Criegee intermediates have the potential to cool the planet by converting these pollutants into sulphate and nitrate compounds that will lead to aerosol and cloud formation.

Professor Carl Percival, who led the Manchester team in the University’s School of Earth, Atmospheric and Environmental Sciences, said: “We have carried out the first ever studies on the second simplest Criegee intermediate and were able to show that it also reacts extremely quickly with sulphur dioxide to produce sulphates under experimental conditions.

“We can therefore say that the reaction of these intermediates with sulphur dioxide will have a significant impact on sulphuric acid production in the atmosphere if they follow the pattern established by these two studies.

He continued: “One of the main questions from our first study was if this increased reactivity would be observed for other Criegee intermediates, so with these findings we now have additional evidence that Criegee intermediates are indeed powerful oxidisers of pollutants such as nitrogen dioxide and sulphur dioxide.

“What this study suggests is that the biosphere could have a significant impact on aerosol production and thus potentially climate cooling via the formation of Criegee intermediates. The next steps will be to carry out modelling studies to quantify the impact of Criegee intermediates on climate and to quantify the level of alkene present in various environments.”

The formation of Criegee intermediates or biradicals was first postulated by the German chemist Rudolf Criegee in the 1950s but, despite their importance, it had not been possible to study the chemicals in the laboratory. The detection of the molecules was made possible through a unique apparatus that uses light from a third-generation synchrotron facility at the Lawrence Berkeley National Laboratory.

The latest study has also revealed which of the two isomers of acetaldehyde oxide is the most reactive. Isomers are molecules that contain the same atoms but arranged in different combinations, while conformational isomerism refers to the way the atoms of a molecule are rotated around a single chemical bond.

“In this new paper we have been able to show that the reactivity depends on the conformer of acetaldehyde oxide in a dramatic way,” said Professor Percival. “The ‘anti’ conformer is much more reactive than the ‘syn’ conformer, which we believe more likely to be formed in the atmosphere. This enabled us to measure the rate coefficient for reaction with water for the first time; the removal, via reaction with water, is of vital importance if we want to understand the role of Criegee intermediates in the atmosphere.”

Sandia combustion chemist Craig Taatjes, the lead author on the paper, added: “Observing conformer-dependent reactivity represents the first direct experimental test of theoretical predictions. The work will be of tremendous importance in validating the theoretical methods that are needed to accurately predict the kinetics for reactions of Criegee intermediates that still cannot be measured directly.”

Related: 

Criegee Intermediates Found to Have Big Impact On Troposphere

Offsetting Global Warming: Molecule in Earth’s Atmosphere Could ‘Cool the Planet’

 

“Irreproducible results and spurious claims” in neuroscience

April 26, 2013

The practice of science in today’s “publish or die” world together with the headlong pursuit of funding leaves me somewhat cynical.

My gut feeling has always been that it is the “social sciences” which are plagued most by the “irreproducible study” sickness, but it seems to be prevalent across many more disciplines than I would have thought. Poor studies in neuroscience – it would seem – are followed by “meta-studies” summarising the poor studies, which are in turn followed by analyses showing that the results are not significant. Poor studies with irreproducible results would seem to be the norm and not the exception.

Gary Stix blogs at The Scientific American:

Brain Lobes : image Scientific American

New Study: Neuroscience Research Gets an “F” for Reliability

Brain studies are the current darling of the sciences, research capable of garnering tens or even hundreds of millions in new funding for ambitious new projects, the kind of money that was once reserved only for big physics projects.

Except the house of neuroscience, which attracts tens of thousands of attendees each year to the annual meeting of the Society for Neuroscience, may be built on a foundation of clay. Those are the implications of an analysis published online April 10 in Nature Reviews Neuroscience, which questions the reliability of much of the research in the field.

The study—led by researchers at the University of Bristol—looked at 48 neuroscience meta-analyses (studies of studies) from 2011 and found that their statistical power reaches only 21 percent, meaning that there is only about a one in five chance that any effect being investigated by the researchers—whether a compound acts as an anti-depressant in rat brains, for instance—will be discovered. Anything that does turn up, moreover, is more likely to be false. …..
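As a rough illustration of what 21 percent power means in practice (the effect size and group size below are assumed illustrative values, not numbers taken from the Bristol analysis), the standard normal-approximation power formula shows how small samples land in this range:

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(d, n, ):
    """Approximate power of a two-sided two-sample test (normal
    approximation) for effect size d (Cohen's d) with n subjects
    per group, at the conventional alpha = 0.05."""
    z_crit = 1.959963984540054          # z for alpha = 0.05, two-sided
    delta = d * sqrt(n / 2.0)           # non-centrality of the test statistic
    return normal_cdf(delta - z_crit) + normal_cdf(-delta - z_crit)

# An assumed "medium" effect (d = 0.5) studied with only ~11 subjects
# per group yields power of roughly one in five:
print(round(two_sample_power(0.5, 11), 2))
```

With about 11 subjects per group and a medium effect, power comes out near the 21 percent the meta-analysis reports, so most real effects of that size would go undetected; pushing n to 100 per group raises power above 90 percent.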

John Ioannidis of Stanford University School of Medicine says ….. “Neuroscience has tremendous potential and it is a very exciting field. However, if it continues to operate with very small studies, its results may not be as credible as one would wish. A combination of small studies with the high popularity of a highly-funded, bandwagon-topic is a high-risk combination and may lead to a lot of irreproducible results and spurious claims for discoveries that are out of proportion.”

Update: Moses Chao, a former president of the Society for Neuroscience and a professor of cell biology at New York University Medical School, got back to me with a comment after I posted the blog, which is excerpted here:

“I agree that many published papers in neuroscience  are based upon small effects or changes.  One issue is that many studies have not been blinded.  There have been numerous reports in my field which have not been reproduced, some dealing with small molecule receptor agonists.  This has set back progress.  The lack of reproducibility is one of the reasons that pharmaceutical companies have reduced their effort in neuroscience research. But irreproducibility also applies to other fields, such as cancer…