Archive for the ‘Science’ Category

Hamza subterranean “river” criticised by Petrobras geologist – a case of “sour grapes”?

August 28, 2011

It seems that a Petrobras geologist – Dr. Jorge Figueiredo – does not like all the attention that Hamza and Pimentel are getting. He may well have a point, but there seems to be a touch of “sour grapes” about his carping. The data were, after all, Petrobras data available to its own geologists. Perhaps if the river had been named after him Dr. Figueiredo would not be quite so petulant.

If the flow is regular and serves to drain the Amazon basin then I think it can be called a river, even if it is only a tiny seepage through the pores in rocks. The key is that the flow should be regular. Continuity of flow or the velocity of flow cannot be the arbiter: even surface rivers can be seasonal and discontinuous, and can dry up completely at times, but they all regularly return, flowing from the same source to the same sink.

BBC:

…. Valiya Hamza and Elizabeth Tavares Pimentel, from the Brazilian National Observatory, deduced the existence of the “river” by using temperature data from boreholes across the Amazon region. The holes were dug by the Brazilian oil company Petrobras in the search for new oil and gas fields, and Petrobras has since released its data to the scientific community.

Using mathematical models relating temperature differences to water movement, the scientists inferred that water must be moving downwards through the ground around the holes, and then flowing horizontally at a depth of several km.

They concluded that this movement had to be from West to East, mimicking the mighty Amazon itself. A true underground river on this scale – 6,000km (4,000 miles) long – would be the longest of its kind in the world by far. But Professor Hamza told BBC News that it was not a river in the conventional sense. “We have used the term ‘river’ in a more generic sense than the popular notion,” he said.

In the Amazon, he said, water was transported by three kinds of “river” – the Amazon itself, as water vapour in atmospheric circulation, and as moving groundwater. “According to the lithologic sequences representative of Amazon [underground sedimentary] basins, the medium is permeable and the flow is through pores… we assume that the medium has enough permeability to allow for significant subsurface flows.”

The total calculated volume of the flow – about 4,000 cubic metres per second – is significant, although just a few percent of the amount of water transported by the Amazon proper. The underground flow could be confirmed with coastal measurements, scientists suggest. But the speed of movement is even slower than glaciers usually display, never mind rivers.

And whether water really is transported right across the region in this way is disputed by Jorge Figueiredo, a geologist with Petrobras. “First of all, the word ‘river’ should be burned from the work – it’s not a river whatsoever,” he told BBC News.

Water and other fluids could indeed flow through the porous sedimentary rock, he said, but would be unlikely to reach the Atlantic Ocean because the sedimentary basins containing the porous rock were separated by older rock deposits that would form an impermeable barrier. “But the main problem is that at depths of 4,000m, there is no possibility that we have fresh water – we have direct data that this water is saline,” said Dr Figueiredo. “My colleagues and I think this work is very arguable – we have a high level of criticism.” 

The research – Indications of an Underground “River” beneath the Amazon River: Inferences from Results of Geothermal Studies – was presented at the 12th International Congress of the Brazilian Geophysical Society in Rio de Janeiro, and has not been published in a peer-reviewed scientific journal.

The team has named the underground flow the “Hamza River”.
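The borehole-temperature technique mentioned in the quote above has a classical textbook form: a steady vertical Darcy flux bends the otherwise linear geothermal profile by an amount set by a Péclet number (the Bredehoeft–Papadopulos analysis). The sketch below uses invented numbers and is not Hamza and Pimentel’s actual model – just an illustration of how a temperature profile encodes a water flux:

```python
import math

def temperature_profile(z, L, T_top, T_bot, peclet):
    """Steady conductive-advective temperature at depth z in a layer of
    thickness L (Bredehoeft-Papadopulos form).  A non-zero Peclet number
    (downward water flow) bends the profile away from the straight
    conduction-only gradient."""
    if abs(peclet) < 1e-12:                      # pure conduction: linear
        return T_top + (T_bot - T_top) * z / L
    return T_top + (T_bot - T_top) * (math.expm1(peclet * z / L)
                                      / math.expm1(peclet))

# Illustrative (made-up) numbers: a 4 km layer, 25 C at top, 125 C at bottom.
L, T_top, T_bot = 4000.0, 25.0, 125.0

# A downward Darcy flux q sets the Peclet number: Pe = rho_w * c_w * q * L / lambda.
rho_c_water = 4.18e6      # J/(m^3 K), volumetric heat capacity of water
conductivity = 2.5        # W/(m K), assumed rock thermal conductivity
q = 1e-9                  # m/s, assumed downward Darcy flux (~3 cm/yr)
peclet = rho_c_water * q * L / conductivity

mid = temperature_profile(L / 2, L, T_top, T_bot, peclet)
linear = temperature_profile(L / 2, L, T_top, T_bot, 0.0)
print(f"Pe = {peclet:.2f}; mid-depth T = {mid:.1f} C vs {linear:.1f} C conductive")
```

Fitting the measured curvature of a real borehole profile to this equation yields Pe, and hence the water flux – which is presumably the kind of inference the “mathematical models relating temperature differences to water movement” perform.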
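The magnitudes quoted by the BBC are also easy to sanity-check with order-of-magnitude arithmetic. The cross-section dimensions below are assumptions for illustration, not figures from the paper, but they show why a 4,000 m³/s flow spread through a porous region hundreds of kilometres wide must creep along at geological speeds:

```python
# Order-of-magnitude check: 4,000 m^3/s through a huge porous cross-section
# implies a tiny bulk velocity.  Width and thickness are assumed values for
# illustration only, not numbers taken from the paper.
SECONDS_PER_YEAR = 3.156e7

discharge = 4000.0          # m^3/s, the reported subsurface flow
width = 400e3               # m, assumed width of the flow zone
thickness = 4000.0          # m, assumed thickness of the permeable zone

darcy_flux = discharge / (width * thickness)   # m/s, bulk "velocity"
per_year = darcy_flux * SECONDS_PER_YEAR       # m/yr

amazon_discharge = 2.0e5    # m^3/s, approximate mean discharge of the Amazon
share = 100.0 * discharge / amazon_discharge

print(f"bulk flow ~ {per_year:.0f} m/yr ({share:.0f}% of the Amazon's discharge)")
```

With these assumed dimensions the bulk flow works out to a few tens of metres per year – against metres per second for the surface Amazon – which is why calling it a “river” stretches the popular notion of the word.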

Jatinder Ahluwalia – End-game in progress

August 27, 2011

Jatinder Ahluwalia’s career of scientific misconduct has cut a swathe through academia over the last 15 years but is now approaching its end-game as Imperial College reviews the award of his PhD.

At Cambridge University he lost his studentship funding from the Engineering and Physical Sciences Research Council at the end of 1997, and was dismissed from the graduate studies programme in 1998. He then went on to “earn” his PhD at Imperial College, after which he was employed at University College London. An investigation at UCL found that not only had he faked experimental results but also that he had sabotaged the experiments of some of his colleagues. He resigned or was dismissed by UCL in 2009 but then turned up as a senior lecturer at the University of East London. As retractions of his papers and allegations by co-workers mounted, UEL also investigated, and Imperial College started checking the experiments which had led to the award of his PhD. Earlier this year he “left” UEL. Retraction Watch has documented the entire sorry story.

This week another paper of his was retracted and Imperial College announced that the results on which his PhD was based could not be replicated. Imperial will now set up a committee to review the award of his doctorate.

The academics asked to independently re-run the experiments were unable to replicate the findings published in the paper “Activation of capsaicin-sensitive primary sensory neurones induces anandamide production and release” and so the authors decided to withdraw this from the Journal of Neurochemistry. The findings also formed the basis of Dr Ahluwalia’s PhD. The College has therefore written to Dr Ahluwalia to notify him that it believes it has grounds to investigate the validity of the data in his PhD. It will be convening a panel to review the award in accordance with its policy for investigating allegations of research misconduct.

I find it an incredible waste that in so many cases of scientific misconduct there is such a great deal of misplaced creativity and ingenuity – and even hard work – which goes into the misconduct and in then covering it up.

“It’s the Sun, stupid” > Svensmark vindicated: CERN shows cosmic rays do influence cloud formation

August 25, 2011

The much-awaited results from the CLOUD experiment at CERN have now been published in Nature and show that cosmic rays can influence cloud formation – as Henrik Svensmark has hypothesised. This mechanism – ultimately dependent upon the sun – is far more credible as an explanation of the climate variations seen in recent times than dubious computer models based on implausible forcings due to carbon dioxide in the atmosphere.

Cloud formation may be linked to cosmic rays: Kirkby, J. et al. Nature 476, 429–433 (2011).

Nigel Calder (via GWPF) writes:

Long-anticipated results of the CLOUD experiment at CERN in Geneva appear in tomorrow’s issue of the journal Nature (25 August). The Director General of CERN stirred controversy last month, by saying that the CLOUD team’s report should be politically correct about climate change (see my 17 July post below). The implication was that they should on no account endorse the Danish heresy – Henrik Svensmark’s hypothesis that most of the global warming of the 20th Century can be explained by the reduction in cosmic rays due to livelier solar activity, resulting in less low cloud cover and warmer surface temperatures.

Willy-nilly the results speak for themselves, and it’s no wonder the Director General was fretful.

Henrik Svensmark (born 1958) is a physicist at the Danish National Space Center in Copenhagen who studies the effects of cosmic rays on cloud formation. His work presents hypotheses about solar activity as an indirect cause of global warming; his research has suggested a possible link through the interaction of the solar wind and cosmic rays. His conclusions have been controversial as the prevailing scientific opinion on climate change considers solar activity unlikely to be a major contributor to recent warming, though it is thought to be the primary driver of many earlier changes in climate.

I cannot do any better than reproduce Nigel Calder’s explanation of the CERN experimental results:

Jasper Kirkby of CERN and his 62 co-authors, from 17 institutes in Europe and the USA, announce big effects of pions from an accelerator, which simulate the cosmic rays and ionize the air in the experimental chamber. The pions strongly promote the formation of clusters of sulphuric acid and water molecules – aerosols of the kind that may grow into cloud condensation nuclei on which cloud droplets form. What’s more, there’s a very important clarification of the chemistry involved.

(more…)

Solar Cycle 24: Still on track to be smallest sunspot number cycle in 100 years

August 25, 2011

The August solar cycle 24 forecast from NASA is unchanged from the previous month though the maximum has increased to 69 from the 64 forecast about 6 months ago.

The current prediction for Sunspot Cycle 24 gives a smoothed sunspot number maximum of about 69 in June of 2013 (same as last month). We are currently over two and a half years into Cycle 24. Four out of the last five months with average daily sunspot numbers above 40 have raised the predicted maximum above the 64.2 for the Cycle 14 maximum in 1907. This predicted size still makes this the smallest sunspot cycle in over 100 years.

NASA - Solar Cycle 24 forecast
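For reference, the “smoothed sunspot number” in the NASA forecast conventionally means a 13-month running mean centred on a month, with the two end months given half weight. A minimal sketch, with made-up monthly values:

```python
def smoothed_sunspot_number(monthly, i):
    """Classic 13-month smoothed sunspot number centred on month i: a
    running mean over months i-6 .. i+6 with the two end months given
    half weight (total weight 12)."""
    if i < 6 or i > len(monthly) - 7:
        raise ValueError("need 6 months of data on each side of i")
    window = monthly[i - 6:i + 7]                # 13 monthly means
    return (0.5 * window[0] + sum(window[1:12]) + 0.5 * window[12]) / 12.0

# Example with invented monthly means rising through a cycle:
monthly = [10, 12, 15, 20, 26, 33, 41, 50, 58, 65, 71, 76, 80, 83, 85]
print(round(smoothed_sunspot_number(monthly, 7), 1))
```

This half-weight smoothing is why a cycle’s official maximum is only confirmed more than six months after the fact.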

Solar Cycle 24 continues to invite comparisons with Solar Cycle 5.

SC24 versus SC5 - from http://sc25.com

Number of species on the planet is an unknown unknown – so what is the importance of biodiversity?

August 24, 2011

The importance of biodiversity and the loss of species as humans take over their habitats is one of the favourite themes of the environmental movement. But I have yet to see a clear exposition as to why the natural loss of species unable to cope with change is something to be opposed. The diversity of life is certainly one of the most striking aspects of our planet and it is not hard to accept that knowing how many species inhabit Earth is a fundamental question in science. In fact, without knowing this number any comments – let alone conclusions – about the danger of loss of species or the importance of biodiversity to humanity can only be speculation.

The problem is not only that we have not identified all the eukaryote species in existence (about 1.3 million have been classified and named) but that we have no idea whether the number in existence is to be measured in millions or in hundreds of millions. About 15,000 new species are identified and catalogued each year. If Bacteria and Archaea are added to eukaryotes, the total number of species could be in the billions.

A new paper in PLoS Biology using a relatively new methodology predicts the total number of species that exist. They claim that “the higher taxonomic classification of species (i.e., the assignment of species to phylum, class, order, family, and genus) follows a consistent and predictable pattern from which the total number of species in a taxonomic group can be estimated”. Using this approach they conclude that “there are ~8.7 million (±1.3 million SE) eukaryotic species globally, of which ~2.2 million (±0.18 million SE) are marine. …. (and) some 86% of existing species on Earth and 91% of species in the ocean still await description”.

Mora C, Tittensor DP, Adl S, Simpson AGB, Worm B (2011) How Many Species Are There on Earth and in the Ocean? PLoS Biol 9(8): e1001127. doi:10.1371/journal.pbio.1001127 
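The idea behind the estimate can be caricatured in a few lines: counts of described taxa thin out predictably as one climbs from genus to phylum, so a regression fitted on the higher ranks can be extrapolated one level below genus to predict species. The counts below are invented for illustration, and the real paper fits asymptotic discovery curves for each rank rather than the raw log-linear fit sketched here:

```python
import math

# Hypothetical counts of described higher taxa (rank 1 = phylum ... 5 = genus).
ranks  = [1, 2, 3, 4, 5]
counts = [30, 100, 500, 6000, 110000]   # invented numbers for illustration

# Least-squares fit of log10(count) against rank: the "taxonomic pyramid".
n = len(ranks)
mean_x = sum(ranks) / n
mean_y = sum(math.log10(c) for c in counts) / n
slope = (sum((x - mean_x) * (math.log10(c) - mean_y)
             for x, c in zip(ranks, counts))
         / sum((x - mean_x) ** 2 for x in ranks))
intercept = mean_y - slope * mean_x

# Extrapolate one level below genus (rank 6 = species).
species_estimate = 10 ** (intercept + slope * 6)
print(f"extrapolated species count ~ {species_estimate:,.0f}")
```

The circularity objection is visible right in the sketch: the regression only ever sees what taxonomists have already described, so the extrapolation inherits any bias in their activity.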

But the paper is already facing objections. The methodology – some claim – actually estimates only human activity in classifying species and not the species themselves. It does seem like an extrapolation from “what has been found” to predict “all that can possibly be found”, and the argument is somewhat circular and not fully convincing. It brings to mind the quotation from Donald Rumsfeld, for which he was castigated but which I am finding increasingly profound:

There are known knowns; there are things we know we know.
We also know there are known unknowns; that is to say we know there are some things we do not know.
But there are also unknown unknowns – the ones we don’t know we don’t know.

I think we are still in the state of not knowing what we do not know.

NY Times

“It’s astounding that we don’t know the most basic thing about life,” said Boris Worm, a marine biologist at Dalhousie University in Nova Scotia. On Tuesday, Dr. Worm, Dr. Mora and their colleagues presented the latest estimate of how many species there are, based on a new method they have developed. They estimate there are 8.7 million species on the planet, plus or minus 1.3 million.

In 1833, a British entomologist named John Obadiah Westwood made the earliest known estimate of global biodiversity by guessing how many insect species there are. He estimated how many species of insects lived on each plant species in England, and then extrapolated that figure across the whole planet. “If we say 400,000, we shall, perhaps, not be very wide of the truth,” he wrote. Today, scientists know the Westwood figure is far too low. They’ve already found more than a million insect species, and their discovery rate shows no signs of slowing down.

In 1988, Robert May, an evolutionary biologist at the University of Oxford, observed that the diversity of land animals increases as they get smaller. He reasoned that we probably have found most of the species of big animals, like mammals and birds, so he used their diversity to calculate the diversity of smaller animals. He ended up with an estimate of 10 to 50 million species of land animals.

For the new estimate, the scientists came up with a method of their own, based on how taxonomists classify species. Each species belongs to a larger group called a genus, which belongs to a larger group called a family, and so on. … In 2002, researchers at the University of Rome published a paper in which they used these higher groups to estimate the diversity of plants around Italy. There were fewer higher-level groups than lower ones at each site, like the layers of a pyramid. The scientists could estimate how many species there were at each site, much as it’s possible to estimate how big the bottom layer of a pyramid is based on the rest of it. … The scientists built a taxonomic pyramid to estimate the total number of species in well-studied groups, like mammals and birds. (They) then used it on all major groups of species, coming up with estimates of 7.7 million species of animals, for example, and 298,000 species of plants.

Terry Erwin, an entomologist at the Smithsonian Institution, thinks there’s a big flaw in the study. There’s no reason to assume that the diversity in little-studied groups will follow the rules of well-studied ones. “They’re measuring human activity, not biodiversity,” he said. David Pollock, an evolutionary biologist at the University of Colorado who studies fungi — a particularly understudied group — agrees. “This appears to be an incredibly ill-founded approach,” he said. There are 43,271 cataloged species of fungi, based on which Dr. Mora and his colleagues estimate there are 660,000 species of fungi on Earth. But other studies on fungus diversity suggest the number may be as high as 5.1 million species. …

Jonathan Eisen, an expert on microbial diversity at the University of California, Davis, said he found the new paper disappointing. “This is akin to saying, ‘Dinosaurs roamed the Earth more than 500 years ago,’ ” he said. “While true, what is the point of saying it?”

At least it could be argued that there are no fewer than 8.7 million eukaryotic species. This is not likely to be the end of the story. Nevertheless it does at least show the scale of the problem, and that the number of all living species (and not just eukaryotes) is still in the realm of the unknown.

And the importance to humanity of the disappearance of unsuccessful species, and of the resulting biodiversity, is still an unknown unknown.

Nonsense speculation posing as science

August 17, 2011

Another example of nonsense speculation which gets published and then drives headlines only because it invokes the magic words “climate change”. A case of speculative IPCC model results being used as inputs for another speculative model about fish extirpation, coming to a mildly alarmist conclusion.

Not a measurement in sight. But many pages, lots of statistics, 4 tables, 2 figures and 66 references to arrive at the amazing conclusion – and statement of the obvious – that cold-water fish may die out if they are forced to live in warm water.

As my son would put it “Duh”!!

A new paper in PLOS One.

Comparing Climate Change and Species Invasions as Drivers of Coldwater Fish Population Extirpations

 Sharma S, Vander Zanden MJ, Magnuson JJ, Lyons J (2011), PLoS ONE 6(8): e22906. doi:10.1371/journal.pone.0022906

The “researchers” actually measured nothing. They took a database of the conditions in which populations of a particular kind of fish (the cisco) existed. These parameters include air temperature among many others. They then did a statistical regression to infer how the populations might depend upon air temperature. They then took temperature-increase assumptions from the climate change scenarios of the IPCC and applied them to the database to speculate what that might do to the fish populations. They then reach their conclusion that climate change would extirpate a large section of the fish population and that this would be worse than the impact of invasive species.

And this is considered peer-reviewed science!!!!
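For what it’s worth, the workflow described above – regress occurrence on climate covariates, then re-predict under warmed temperatures – can be caricatured in a few lines of Python. Everything here (the lakes, the temperatures, the +3 °C shift) is invented for illustration; the actual paper used far richer covariates and 78 scenarios:

```python
import math

# Invented lakes: mean annual air temperature (degC) and cisco presence (1/0).
# All numbers are illustrative, not data from the paper.
temps    = [3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0, 8.5]
presence = [1,   1,   1,   1,   1,   1,   1,   0,   0,   0,   0,   0  ]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Step 1: the "statistical regression" - fit a one-covariate logistic model
# P(present) = sigmoid(b0 + b1*T) by plain gradient ascent on the
# log-likelihood (no libraries, so the fit is deliberately crude).
b0 = b1 = 0.0
lr = 0.2
for _ in range(50000):
    g0 = sum(y - sigmoid(b0 + b1 * t) for t, y in zip(temps, presence))
    g1 = sum((y - sigmoid(b0 + b1 * t)) * t for t, y in zip(temps, presence))
    b0 += lr * g0 / len(temps)
    b1 += lr * g1 / len(temps)

def lakes_occupied(ts):
    return sum(sigmoid(b0 + b1 * t) > 0.5 for t in ts)

# Step 2: apply an assumed warming scenario to the same lakes and re-predict.
warming = 3.0                                   # degC, assumed scenario shift
now   = lakes_occupied(temps)
later = lakes_occupied([t + warming for t in temps])
print(f"lakes predicted occupied: {now} today, {later} after +{warming} degC")
```

The sketch makes the objection concrete: the “result” is baked in before any computation happens, because the fitted temperature response plus an assumed warming can only ever produce fewer predicted cold-water lakes.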

They write in their paper:

Coldwater fishes, such as cisco [Coregonus artedii] require cold water temperatures, high dissolved oxygen concentrations, and oligotrophic conditions, and thereby are sensitive indicators of environmental change. In Wisconsin, cisco are close to the southern edge of their range and are listed as a species of special concern. Cisco live in larger and deeper inland lakes with cold, well-oxygenated deep waters. Under climate change scenarios, as air temperatures increase, epilimnion and hypolimnion water temperatures are expected to increase. As water temperatures increase, the duration of the lake stratification period is expected to increase, isolating the deep waters from exchanges with the atmosphere, making it more likely that metabolic activity will reduce dissolved oxygen concentrations in the hypolimnion to stressful or lethal levels. The combination of warmer water temperatures and lower dissolved oxygen concentrations under climate change scenarios in larger, deeper lakes typically suitable for coldwater fishes could result in their extirpation.

Cisco are sensitive to the introduction of non-native rainbow smelt (Osmerus mordax). Rainbow smelt is native to the northeastern coast of North America and was introduced to the Laurentian Great Lakes in the 1920s. In Wisconsin, rainbow smelt have been introduced into lakes deliberately by anglers for sport fishing purposes. Furthermore, fertilized eggs of rainbow smelt may have been unintentionally introduced into lakes by residents cleaning smelt on their piers. When rainbow smelt invade a system, they negatively interact with native species through predation and competition. Invasion of rainbow smelt has been linked directly to changes in zooplankton community composition, decline in recruitment of walleye (Sander vitreus), and extirpation of cisco and yellow perch (Perca flavescens). For example in Sparkling Lake, Wisconsin, the cisco population was extirpated through predation-induced recruitment failure within eight years of detection of rainbow smelt. …

Geo-referenced lake-specific data were collected for 13,052 lakes in Wisconsin from a variety of sources including the North Temperate Lakes Long Term Ecological Research (NTL-LTER) program, Wisconsin GAP (Geographic Approach to Planning for Biological Diversity) database, Wisconsin Department of Natural Resources databases, refereed publications, government reports, and dissertations. From the aforementioned databases, a suite of variables describing lake morphology, water chemistry, physical habitat, and fish species occurrence were compiled. Environmental variables retained in the final dataset were: surface area (hectares), maximum depth (metres), perimeter (kilometres), Secchi depth (metres), pH, conductivity (µS/cm), and mean annual air temperatures (°C). For water chemistry variables, annual averages were used in the dataset. 

Current air temperatures and scenarios of future mean annual air temperatures were obtained from the Wisconsin Initiative on Climate Change Impacts (WICCI) Climate Working Group. Mean annual air temperatures were statistically downscaled for Wisconsin on a 0.1° latitude × 0.1° longitude grid. Climate data were summarised for three time periods: 1961–2000, 2046–2065, and 2081–2100 and averaged over these three sets of years as suggested by the Intergovernmental Panel on Climate Change (IPCC) to reduce temporal variation in climate. Then, projected air temperatures from 15 general circulation models and the IPCC’s A1, A2 and B1 scenarios (although not all general circulation models incorporated all three scenarios) totalling 78 climate change scenarios were used to develop future projections of cisco occurrence. The A1, A2 and B1 scenarios incorporate a range of variation in greenhouse gas emissions inferred for various time periods in the 21st century. The A1 scenario is the most extreme and assumes the highest greenhouse gas concentrations, followed by the A2 and B1 scenarios. …

Our results highlight the threats to coldwater fish species. The probability of cisco extirpations could be reduced in Wisconsin through three interventions. First, the mitigation of climate change through the reduction of greenhouse gas emissions could significantly reduce the worst case losses of cisco. ….

Oh dear!!

Needless to say this non-science – since it uses the magic phrase “climate change” and has an alarmist theme – has no difficulty in being published and generating nonsense headlines in Science Daily:

Climate Change Could Drive Native Fish out of Wisconsin Waters 

ScienceDaily (Aug. 16, 2011) — The cisco, a key forage fish found in Wisconsin’s deepest and coldest bodies of water, could become a climate change casualty and disappear from most of the Wisconsin lakes it now inhabits by the year 2100, according to a new study. In a report published online in the journal Public Library of Science One, researchers from the University of Wisconsin-Madison and the Wisconsin Department of Natural Resources project a gloomy fate for the fish — an important food for many of Wisconsin’s iconic game species — as climate warms and pressure from invasive species grows. ………

In addition to the ecological change that would be prompted by a warmer Wisconsin climate, Sharma notes, the impoverishment of aquatic ecosystems will have potential socio-economic implications, especially in a setting like Wisconsin where recreational fishing is an iconic pastime, not to mention an important industry.

“This could very well impact the fishing experiences we have,” avers the Wisconsin researcher.

But rather than make me concerned about climate change, this nonsense report based on idle – but fashionable – speculation makes me much more concerned about the predominance of modelling over measurement and what passes in some quarters for science.

Idiot research to show that global warming can be solved by cutting obesity!

August 16, 2011

That researchers need to use “fashionable” catch phrases to ensure funding is not uncommon. That “global warming” is one such catch phrase which has been exploited by a variety of disciplines to justify the most inane work which has then been passed off as cutting-edge research is not new. It has been particularly evident for the last 15 years or so. Linking any research project in any discipline to “global warming” has increased the probability of getting funded.  Linking obesity via human respiration to global warming is one such example of trivialising the already trivial.

Even IF global warming is a problem (which I doubt) and IF carbon dioxide emissions are a cause (which is unlikely) and IF human production of carbon dioxide is significant (which it is not) and IF human respiration produces sufficient carbon dioxide to matter (and it is hardly measurable) and IF general obesity in the human population increases the total of vegetable and animal matter on the planet (which it does not), THEN this so-called research would come up to the level of being just silly.  

As such it is just high-quality idiot-research.

The latest nonsense is from the Robert Gordon University in Scotland. But the International Journal of Obesity will not gain much in reputation by publishing  such drivel.

Global warming: is weight loss a solution? A Gryka, J Broom and C Rolland, International Journal of Obesity (26 July 2011) | doi:10.1038/ijo.2011.151

But even such nonsense – which is not new – can still capture headlines.

2011: Researchers Suggest Link Between Obesity & Global Warming

2008: Obesity as a cause of global warming? 

2006: Global warming and obesity: the links revealed

If you can’t kill the virus, kill the cells that contain the virus

August 11, 2011

An ingenious way of getting around the problem of attacking viruses. An MIT press release describes a development that could transform how viral infections are treated. A team of researchers at MIT’s Lincoln Laboratory has designed a drug that can identify cells that have been infected by any type of virus, and then kill those cells to terminate the infection.

Rider TH, Zook CE, Boettcher TL, Wick ST, Pancoast JS, et al. (2011) Broad-Spectrum Antiviral Therapeutics. PLoS ONE 6(7): e22572. doi:10.1371/journal.pone.0022572

Todd Rider invented the PANACEA and DRACO antiviral therapeutics, and previously invented the CANARY (Cellular Analysis and Notification of Antigen Risks and Yields) sensor for rapid pathogen detection and identification: Image MIT

In a paper published July 27 in the journal PLoS One, the researchers tested their drug against 15 viruses, and found it was effective against all of them — including rhinoviruses that cause the common cold, H1N1 influenza, a stomach virus, a polio virus, dengue fever and several other types of hemorrhagic fever.

The drug works by targeting a type of RNA produced only in cells that have been infected by viruses. “In theory, it should work against all viruses,” says Todd Rider, a senior staff scientist in Lincoln Laboratory’s Chemical, Biological, and Nanoscale Technologies Group who invented the new technology.

Because the technology is so broad-spectrum, it could potentially also be used to combat outbreaks of new viruses, such as the 2003 SARS (severe acute respiratory syndrome) outbreak, Rider says.

Other members of the research team are Lincoln Lab staff members Scott Wick, Christina Zook, Tara Boettcher, Jennifer Pancoast and Benjamin Zusman.

McGill University reprimands Professor for medical ghostwriting

August 11, 2011

Something stinks when academics are “helped” to write their papers by professional ghostwriters who are paid for by pharmaceutical companies. It is even worse when the papers are written by the pharmaceutical companies and academics in the field are flattered or otherwise persuaded by their agents to put their names to the papers. McGill University has “reprimanded” a senior professor, Barbara Sherwin, for the practice but is at pains to point out that she has not been “sanctioned”.

What exactly does a reprimand – which is no sanction – accomplish?

The ghostwriting for what was ostensibly a peer-reviewed scientific article was essentially just promotional literature for Wyeth Pharmaceuticals’ hormone replacement therapy (HRT). Wyeth paid a New Jersey professional-writing firm, DesignWrite, to help Sherwin produce a paper on treatment options for age-associated memory loss that was eventually published in 2000 in the Journal of the American Geriatrics Society, just when critics were starting to raise doubts about hormone replacement therapy. Sherwin was listed as the sole author of that paper, even though Karen Mittleman, an employee of DesignWrite, was involved in the process.

Wyeth – through DesignWrite – had commissioned at least 40 scientific papers endorsing the therapy. During 2001, Wyeth sold hormones for HRT worth $2.1 billion.

Apparently Dr. Sherwin is no longer a member of the Quebec Order of Psychologists, which means she can no longer practice under the title of psychologist.

The Montreal Gazette has the full story.

Even more worrying is the Macleans story that Karen Mittleman of DesignWrite – on behalf of Wyeth – actually solicited this paper. There is also a hint of a rather cozy relationship between the Journal of the American Geriatrics Society and DesignWrite.

The stink is more of a stench!

Her alleged transgression came to light in a class-action suit involving 8,400 women against the drug company Wyeth (now part of Pfizer). Lawyers representing the women, who claim they were harmed by their hormone replacement therapy (HRT) drugs, discovered that scientific research papers extolling the virtues of the treatment while downplaying potential harm appeared to have been written, not by the academics who signed their name to the papers, but by writers hired by the pharmaceutical company.
According to court documents filed by the plaintiffs, Wyeth paid the Princeton, New Jersey-based medical communications company DesignWrite to produce articles on HRT for publication in academic journals between 1997 and 2003. DesignWrite would write the papers, then approach leading academics to claim authorship for them.

Sherwin’s relationship with the pharmaceutical company started innocently enough. In the early 1990s, she was invited to give a presentation about her work on androgens and psychological functioning in women. There, she met a woman named Karen Mittleman during the lunch break. Mittleman introduced herself as a PhD and a former academic who worked in medical communications. The pair hit it off, and kept in touch. “I liked her, and considered her a casual friend,” Sherwin told Maclean’s over the phone from her office at McGill.
Several years later, in 1998, Mittleman called Sherwin to ask if she wanted to write a paper for the Journal of the American Geriatrics Society at the invitation of the journal’s editor. The subject was pharmacological treatment options for age-associated memory loss. Sherwin, an expert on hormones and how they influence memory and mood in people, had just completed a grant proposal on the subject, and said she’d be happy to write the article. 
“[Mittleman] told me she would provide support by typing the manuscript and formatting it in the style of that particular journal,” explains Sherwin. The work itself would be based on Sherwin’s notes. In return, Mittleman, a senior writer at DesignWrite, promised to send Sherwin typed drafts for editing, and hard copies of references the professor requested. “I was completely under the impression that [Mittleman] was working for the journal, that it was the journal who hired her.” 

What Mittleman never revealed was that her employer, DesignWrite, had a business relationship with Wyeth and other pharmaceutical companies.

Karen Mittleman, as Antidote has noted, has the perfect Dickensian name for her job as the go-between finding researchers willing to sign their names to papers written by drug companies.

The reprimand by McGill seems little more than a very mild slap on the wrist.

Related: McGill sets bad example on integrity

Marc Hauser’s lobbyists get to work but only end up excusing scientific misconduct

August 9, 2011

Marc Hauser’s friends have started on the process of repairing some of the damage to his reputation brought about by his own misconduct. He has “resigned” from Harvard but – with a little help from his friends – he will no doubt pop up with a fancy title at some other institution soon.

The Harvard Crimson reports that a group of academics have written a “letter” criticising Harvard’s investigation of Hauser’s misconduct. The letter was written by Pierre Pica, a scientist at the National Center for Scientific Research, Bert Vaux, director of studies in linguistics at King’s College in the University of Cambridge, and Jeffrey Watumull, a Ph.D. student at the University of Cambridge. Watumull previously worked in Hauser’s lab. Eight other academics including Noam Chomsky have added their signatures.

But they protest too much about one of their own. Reading their letter, I felt that while they accuse Harvard of a witch-hunt and express concern about the undermining of scientific inquiry, they actually end up trivialising ethical behaviour and excusing scientific misconduct. Their concern does not ring true. The letter talks about a media frenzy against Hauser but ignores the fact that nothing came up in the media until after the three-year investigation had shown the misconduct and Hauser had taken a year’s gardening leave.

Harvard Crimson: Monday, August 08, 2011

The letter—which was signed by MIT Linguistics Professor Noam Chomsky, one of Hauser’s mentors—criticizes the scope of the inquiry into Hauser’s research, the media frenzy that followed the release of Harvard’s findings, and insinuations that Hauser’s body of work has been thrown into question by the investigation. ….

Eight academics from the United States, the United Kingdom, France, and Brazil signed the letter, including Harvard Professor of Molecular and Cellular Biology Florian Engert. It has been circulated among top academics.

The Crimson obtained a copy of the letter—titled “Could the Process of Investigating Scientific Misconduct Undermine Scientific Inquiry?”—from the authors.

Following allegations that Hauser falsified research data, a three-year investigation into Hauser’s research found him “solely responsible for eight counts of scientific misconduct,” Dean of the Faculty of Arts and Sciences Michael D. Smith wrote in a letter last August. Reports attributed the source of those allegations to his graduate students.

In the fallout from the investigation, Hauser took a year-long leave of absence, was then barred from teaching for another year, and ultimately resigned from his tenured position this summer.

Related: Hausergate posts