Archive for the ‘Science’ Category

Microplastic misconduct: Swedish paper about fish larvae eating microplastics was fabricated

April 28, 2017

A paper claiming evidence that fish larvae eat micro-plastics to their detriment was fabricated. It seems that, to be published, any paper about the impact of humans on the environment must be negative. Exaggerated and even fabricated data are rarely questioned, while studies which are positive about human impact are, by the definition of “political correctness”, never publishable. There is clearly “politically incorrect” and “politically correct” science.

This is another case of made-up work being passed off as “politically correct” science.

Swedish Radio reports today that

A study about fish larvae eating micro-plastics contains such serious flaws that it should be retracted from Science, where it was published, says Sweden’s Central Ethics Review Board’s expert panel for misconduct in research.

The panel believes that two researchers at Uppsala University are guilty of misconduct. The case concerns a remarkable study from last year, which found that perch young seem to prefer eating micro-plastics to regular fish food.

After criticism by external researchers, an investigation was made by the Central Ethics Review Board, which today delivered an opinion. The researchers have been found guilty of misconduct in several cases.

“The most serious is the lack of original data,” says Jörgen Svidén, Department Head at the Central Ethics Review Board.

The study was published in the journal Science last year. The Central Ethics Review Board writes in its opinion that it is remarkable that the article was ever accepted. The opinion has been sent to Uppsala University, which must now make a decision on the matter. 

The researchers claimed that a laptop containing the data had been stolen. Really? And it was not backed up? Uppsala University had rejected the claims of misconduct against its staff in the wake of the serious allegations made in 2016. How gullible can a university be?

ScienceMag wrote then:

When Fredrik Jutfelt and Josefin Sundin read a paper on a hot environmental issue in the 3 June issue of Science, the two researchers immediately felt that something was very wrong. Both knew Oona Lönnstedt, the research fellow at Sweden’s Uppsala University (UU) who had conducted the study, and both had been at the Ar research station on the island of Gotland around the time that Lönnstedt says she carried out the experiments, which showed that tiny particles called microplastics can harm fish larvae. Jutfelt, an associate professor at the Norwegian University of Science and Technology in Trondheim, and Sundin, a UU postdoc, believed there was no way that Lönnstedt had been able to carry out the elaborate study.

Less than 3 weeks later, the duo wrote UU that they had “a strong suspicion of research misconduct” and asked for an investigation. Their letter, initially reported by Retraction Watch in August, was cosigned by five scientists from Canada, Switzerland, and Australia, who hadn’t been at the research station but also had severe misgivings about the paper and who helped Sundin and Jutfelt build their case. 

This week, Science is publishing an “Editorial expression of concern” about the paper, because Lönnstedt and her supervisor at UU, Peter Eklöv, have been unable to produce all of the raw data behind their results. Lönnstedt says the data were stored on a laptop computer that was stolen from her husband’s car 10 days after the paper was published, and that no backups exist. ……

…… The paper, which received a lot of press attention, focused on plastic fragments of less than half a millimeter in size that result from the mechanical breakdown of bags and other products. There’s increasing evidence that these microplastics collect in rivers, lakes, and oceans around the world, but so far, little is known about their effects on aquatic organisms and ecosystems. What Lönnstedt and Eklöv reported was alarming: They had exposed larvae of European perch maintained in aquaria at the research station to microplastics and found that they had decreased growth and altered feeding and behavior. Microplastics made the larvae less responsive to chemical warning signals and more likely to be eaten by pike in a series of predation experiments, the pair further reported. In an accompanying Perspective, Chelsea Rochman of the University of Toronto in Canada wrote that the study “marks an important step toward understanding of microplastics” and was relevant to policymakers. ……

….. In the report of its “preliminary investigation,” the UU panel sided with Lönnstedt. She and Eklöv had explained everything “in a satisfactory and credible manner,” wrote the panel, which asked UU to “take diligent steps to restore the reputation of the accused.” But the panel’s report didn’t provide detailed rebuttals of the long list of problems provided by Sundin and Jutfelt, who say that the investigation was superficial. ….. 

Much may now depend on the conclusions of an expert group on misconduct at Sweden’s Central Ethical Board, which is doing its own, independent investigation. Jutfelt says he’s hopeful because it appears that the group is “doing a more thorough job.” Lönnstedt says she’s not worried about the outcome. A spokesperson for the board says it is not clear when it will wrap up the inquiry. 

Photo: Uppsala universitet

The Ethics Review Board has now reported, and it is clear that this “politically correct” paper was fabricated. Uppsala University’s own so-called investigation is shown to have been less than serious: little more than a whitewash of its own staff.


Second European Mars lander (Schiaparelli) also lost (after Beagle 2 in 2003)

October 20, 2016

While the ExoMars Trace Gas Orbiter of the European and Russian space agencies (ESA/Roscosmos) seems to have successfully entered the correct orbit around Mars, ESA’s Mars lander, Schiaparelli, seems to have been lost on its way down to the surface.

Schiaparelli descent (image: ESA)


There are growing fears a European probe that attempted to land on Mars on Wednesday has been lost. Tracking of the Schiaparelli robot’s radio signals was dropped less than a minute before it was expected to touch down on the Red Planet’s surface.

Satellites at Mars have attempted to shed light on the probe’s status, so far without success. One American satellite even called out to Schiaparelli to try to get it to respond. The fear will be that the robot has crashed and been destroyed. The European Space Agency, however, is a long way from formally calling that outcome. Its engineers will be running through “fault trees” seeking to figure out why communication was lost and what they can do next to retrieve the situation.

This approach could well last several days. 

One key insight will come from Schiaparelli’s “mothership” – the Trace Gas Orbiter (TGO). As Schiaparelli was heading down to the surface, the TGO was putting itself in a parking ellipse around Mars. But it was also receiving telemetry from the descending robot.

If the lander is indeed lost, it will be the second European Mars lander to be lost, after Beagle 2 in 2003.

Beagle 2 was a British landing spacecraft that formed part of the European Space Agency’s 2003 Mars Express mission. The craft lost contact with Earth during its final descent and its fate was unknown for over twelve years. Beagle 2 is named after HMS Beagle, the ship used by Charles Darwin.

The spacecraft was successfully deployed from Mars Express on 19 December 2003 and was scheduled to land on the surface of Mars on 25 December; however, no contact was received at the expected time of landing, and ESA declared the mission lost in February 2004 after numerous attempts to contact the spacecraft.

Beagle 2‘s fate remained a mystery until January 2015, when it was located intact on the surface of Mars in a series of images from NASA’s Mars Reconnaissance Orbiter HiRISE camera. The images suggest that two of the spacecraft’s four solar panels failed to deploy, blocking the spacecraft’s communications antenna.

The ESA’s plans and budget for landing a six-wheeled roving vehicle on Mars in 2021 will face further critical scrutiny. The rover is expected “to use some of the same technology as Schiaparelli, including its doppler radar to sense the distance to the surface on descent, and its guidance, navigation and control algorithms”.

ESA has an annual budget of about €5.25 billion.

Of course the EU sees ESA as a matter of prestige first (and science only second), which does help to protect the budget.

Perhaps some “frugal engineering” (a la ISRO) is called for.


What is “statistically significant” is not necessarily significant

October 12, 2016

“Statistical significance” is “a mathematical machine for turning baloney into breakthroughs, and flukes into funding” – Robert Matthews.

Tests for statistical significance generate the p-value: the probability of making the observations (or more extreme ones) if the null hypothesis were true, that is, if the observations were nothing but randomness. A low p-value means the observations would be improbable under chance alone, and the result is then labelled “statistically significant”. Quite arbitrarily, it has become the custom to use 0.05 (5%) as the threshold p-value separating the “statistically significant” from the rest. Why 5% should be the holy number which separates acceptance for publication from rejection, or success from failure, is a little irrational. What “statistically significant” actually means is that “the observations may or may not be a real effect but there is a low probability that they are entirely due to chance”.
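That deductive calculation can be made concrete. For a coin-tossing experiment the p-value can be computed exactly from the binomial distribution; a minimal sketch (my own illustration, not taken from any of the sources quoted here):

```python
from math import comb

def binom_p_value(heads, tosses):
    """One-sided p-value: the probability of seeing at least `heads`
    heads in `tosses` tosses of a FAIR coin (the null hypothesis)."""
    return sum(comb(tosses, k) for k in range(heads, tosses + 1)) / 2**tosses

# 60 heads out of 100 tosses: improbable under the null, but not impossible
p = binom_p_value(60, 100)
print(f"p = {p:.4f}")  # about 0.028, i.e. "statistically significant"
```

A result like 60 heads in 100 tosses clears the 5% bar, yet fair coins will produce such runs routinely if enough of them are tossed.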

Even when observations are just “statistically significant” at the 5% level, there is a 1-in-20 chance of seeing such data by chance alone. Moreover, it is conveniently forgotten that statistical significance is called for only when we do not know. In a coin toss there is certainty (100% probability) that the outcome will be heads, tails or a “lands on its edge”. Assigning a probability to one of only three possible outcomes can then be helpful, but it is a probability constrained within the 100% certainty of those three outcomes. If a million people take part in a lottery, the 1-in-1,000,000 probability of a particular individual winning has significance because there is 100% certainty that someone will win. But when conducting clinical trials of a new drug, there is often no such certainty anywhere to provide a framework and a boundary within which to apply a probability.
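What the 5% threshold does guarantee can be shown by simulation. In the sketch below (my own illustration), thousands of experiments are run in which the null hypothesis is true by construction, i.e. both groups are drawn from the same distribution:

```python
import random
from math import erf, sqrt

random.seed(1)

def null_experiment_p(n=50):
    """Compare two groups drawn from the SAME N(0,1) population with a
    two-sided z-test (known sigma): any 'effect' found is pure chance."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    z = (sum(a) / n - sum(b) / n) / sqrt(2.0 / n)
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value

trials = 10_000
significant = sum(null_experiment_p() < 0.05 for _ in range(trials))
print(f"'Significant' with no real effect: {significant / trials:.1%}")
```

About one experiment in twenty comes out “statistically significant” even though no effect exists, which is exactly what p < 0.05 promises, and exactly why a single such result proves little.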

A new article in Aeon by David Colquhoun, Professor of pharmacology at University College London and a Fellow of the Royal Society, addresses The Problem with p-values.

In 2005, the epidemiologist John Ioannidis at Stanford caused a storm when he wrote the paper ‘Why Most Published Research Findings Are False’, focusing on results in certain areas of biomedicine. He’s been vindicated by subsequent investigations. For example, a recent article found that repeating 100 different results in experimental psychology confirmed the original conclusions in only 38 per cent of cases. It’s probably at least as bad for brain-imaging studies and cognitive neuroscience. How can this happen?

The problem of how to distinguish a genuine observation from random chance is a very old one. It’s been debated for centuries by philosophers and, more fruitfully, by statisticians. It turns on the distinction between induction and deduction. Science is an exercise in inductive reasoning: we are making observations and trying to infer general rules from them. Induction can never be certain. In contrast, deductive reasoning is easier: you deduce what you would expect to observe if some general rule were true and then compare it with what you actually see. The problem is that, for a scientist, deductive arguments don’t directly answer the question that you want to ask.

What matters to a scientific observer is how often you’ll be wrong if you claim that an effect is real, rather than being merely random. That’s a question of induction, so it’s hard. In the early 20th century, it became the custom to avoid induction, by changing the question into one that used only deductive reasoning. In the 1920s, the statistician Ronald Fisher did this by advocating tests of statistical significance. These are wholly deductive and so sidestep the philosophical problems of induction.

Tests of statistical significance proceed by calculating the probability of making our observations (or the more extreme ones) if there were no real effect. This isn’t an assertion that there is no real effect, but rather a calculation of what would be expected if there were no real effect. The postulate that there is no real effect is called the null hypothesis, and the probability is called the p-value. Clearly the smaller the p-value, the less plausible the null hypothesis, so the more likely it is that there is, in fact, a real effect. All you have to do is to decide how small the p-value must be before you declare that you’ve made a discovery. But that turns out to be very difficult.

The problem is that the p-value gives the right answer to the wrong question. What we really want to know is not the probability of the observations given a hypothesis about the existence of a real effect, but rather the probability that there is a real effect – that the hypothesis is true – given the observations. And that is a problem of induction.

Confusion between these two quite different probabilities lies at the heart of why p-values are so often misinterpreted. It’s called the error of the transposed conditional. Even quite respectable sources will tell you that the p-value is the probability that your observations occurred by chance. And that is plain wrong. …….

……. The problem of induction was solved, in principle, by the Reverend Thomas Bayes in the middle of the 18th century. He showed how to convert the probability of the observations given a hypothesis (the deductive problem) to what we actually want, the probability that the hypothesis is true given some observations (the inductive problem). But how to use his famous theorem in practice has been the subject of heated debate ever since. …….

……. For a start, it’s high time that we abandoned the well-worn term ‘statistically significant’. The cut-off of P < 0.05 that’s almost universal in biomedical sciences is entirely arbitrary – and, as we’ve seen, it’s quite inadequate as evidence for a real effect. Although it’s common to blame Fisher for the magic value of 0.05, in fact Fisher said, in 1926, that P = 0.05 was a ‘low standard of significance’ and that a scientific fact should be regarded as experimentally established only if repeating the experiment ‘rarely fails to give this level of significance’.

The ‘rarely fails’ bit, emphasised by Fisher 90 years ago, has been forgotten. A single experiment that gives P = 0.045 will get a ‘discovery’ published in the most glamorous journals. So it’s not fair to blame Fisher, but nonetheless there’s an uncomfortable amount of truth in what the physicist Robert Matthews at Aston University in Birmingham had to say in 1998: ‘The plain fact is that 70 years ago Ronald Fisher gave scientists a mathematical machine for turning baloney into breakthroughs, and flukes into funding. It is time to pull the plug.’ ………
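Colquhoun’s point about the transposed conditional can be made concrete with Bayes’ theorem. The sketch below uses illustrative numbers of my own choosing (a 10% prior that the tested hypothesis is true, 80% statistical power), not figures from the quoted article:

```python
def false_discovery_rate(prior=0.1, alpha=0.05, power=0.8):
    """Of all experiments that reach p < alpha, what fraction are false
    alarms?  `prior` is the fraction of tested hypotheses that are really
    true; `power` is the chance a real effect reaches significance."""
    true_hits = prior * power          # real effects that get detected
    false_hits = (1 - prior) * alpha   # no effect, yet "significant"
    return false_hits / (true_hits + false_hits)

print(f"False discovery rate: {false_discovery_rate():.0%}")  # 36%
```

With these inputs, 36% of “discoveries” at p < 0.05 are false, far worse than the 5% the threshold seems to promise, which is the heart of the case against the term “statistically significant”.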

Related: Demystifying the p-value


2016 Ig Nobels are more embarrassing than satirical

September 23, 2016

The problem with the Ig Nobels is that they have begun to take themselves seriously. What should be satirical and irreverent has become an embarrassing exercise in politically correct science “humour”. The awards have turned into a glorification of non-science, science fraud and stupidity.

The 2016 Ig Nobel Prize Winners

The 2016 Ig Nobel Prizes were awarded on Thursday night, September 22, 2016 at the 26th First Annual Ig Nobel Prize Ceremony, at Harvard’s Sanders Theatre. The ceremony was webcast live.

REPRODUCTION PRIZE [EGYPT] — The late Ahmed Shafik, for studying the effects of wearing polyester, cotton, or wool trousers on the sex life of rats, and for conducting similar tests with human males.

REFERENCE: “Effect of Different Types of Textiles on Sexual Activity. Experimental study,” Ahmed Shafik, European Urology, vol. 24, no. 3, 1993, pp. 375-80.

REFERENCE: “Contraceptive Efficacy of Polyester-Induced Azoospermia in Normal Men,” Ahmed Shafik, Contraception, vol. 45, 1992, pp. 439-451.

ECONOMICS PRIZE [NEW ZEALAND, UK] — Mark Avis, Sarah Forbes, and Shelagh Ferguson, for assessing the perceived personalities of rocks, from a sales and marketing perspective.

REFERENCE: “The Brand Personality of Rocks: A Critical Evaluation of a Brand Personality Scale,” Mark Avis, Sarah Forbes, Shelagh Ferguson, Marketing Theory, vol. 14, no. 4, 2014, pp. 451-475.

WHO ATTENDED THE CEREMONY: Mark Avis and Sarah Forbes

PHYSICS PRIZE [HUNGARY, SPAIN, SWEDEN, SWITZERLAND] — Gábor Horváth, Miklós Blahó, György Kriska, Ramón Hegedüs, Balázs Gerics, Róbert Farkas, Susanne Åkesson, Péter Malik, and Hansruedi Wildermuth, for discovering why white-haired horses are the most horsefly-proof horses, and for discovering why dragonflies are fatally attracted to black tombstones.

REFERENCE: “An Unexpected Advantage of Whiteness in Horses: The Most Horsefly-Proof Horse Has a Depolarizing White Coat,” Gábor Horváth, Miklós Blahó, György Kriska, Ramón Hegedüs, Balázs Gerics, Róbert Farkas and Susanne Åkesson, Proceedings of the Royal Society B, vol. 277, no. 1688, June 2010, pp. 1643-1650.

REFERENCE: “Ecological Traps for Dragonflies in a Cemetery: The Attraction of Sympetrum species (Odonata: Libellulidae) by Horizontally Polarizing Black Grave-Stones,” Gábor Horváth, Péter Malik, György Kriska, Hansruedi Wildermuth, Freshwater Biology, vol. 52, no. 9, September 2007, pp. 1700–9.


CHEMISTRY PRIZE [GERMANY] — Volkswagen, for solving the problem of excessive automobile pollution emissions by automatically, electromechanically producing fewer emissions whenever the cars are being tested.

REFERENCE: “EPA, California Notify Volkswagen of Clean Air Act Violations”, U.S. Environmental Protection Agency news release, September 18, 2015.

MEDICINE PRIZE [GERMANY] — Christoph Helmchen, Carina Palzer, Thomas Münte, Silke Anders, and Andreas Sprenger, for discovering that if you have an itch on the left side of your body, you can relieve it by looking into a mirror and scratching the right side of your body (and vice versa).

REFERENCE: “Itch Relief by Mirror Scratching. A Psychophysical Study,” Christoph Helmchen, Carina Palzer, Thomas F. Münte, Silke Anders, Andreas Sprenger, PLoS ONE, vol. 8, no 12, December 26, 2013, e82756.


PSYCHOLOGY PRIZE [BELGIUM, THE NETHERLANDS, GERMANY, CANADA, USA] — Evelyne Debey, Maarten De Schryver, Gordon Logan, Kristina Suchotzki, and Bruno Verschuere, for asking a thousand liars how often they lie, and for deciding whether to believe those answers.

REFERENCE: “From Junior to Senior Pinocchio: A Cross-Sectional Lifespan Investigation of Deception,” Evelyne Debey, Maarten De Schryver, Gordon D. Logan, Kristina Suchotzki, and Bruno Verschuere, Acta Psychologica, vol. 160, 2015, pp. 58-68.


PEACE PRIZE [CANADA, USA] — Gordon Pennycook, James Allan Cheyne, Nathaniel Barr, Derek Koehler, and Jonathan Fugelsang for their scholarly study called “On the Reception and Detection of Pseudo-Profound Bullshit”.

REFERENCE: “On the Reception and Detection of Pseudo-Profound Bullshit,” Gordon Pennycook, James Allan Cheyne, Nathaniel Barr, Derek J. Koehler, and Jonathan A. Fugelsang, Judgment and Decision Making, Vol. 10, No. 6, November 2015, pp. 549–563.

WHO ATTENDED THE CEREMONY: Gordon Pennycook, Nathaniel Barr, Derek Koehler, and Jonathan Fugelsang

BIOLOGY PRIZE [UK] — Awarded jointly to: Charles Foster, for living in the wild as, at different times, a badger, an otter, a deer, a fox, and a bird; and to Thomas Thwaites, for creating prosthetic extensions of his limbs that allowed him to move in the manner of, and spend time roaming hills in the company of, goats.

REFERENCE: GoatMan; How I Took a Holiday from Being Human, Thomas Thwaites, Princeton Architectural Press, 2016, ISBN 978-1616894054.

REFERENCE: Being a Beast, by Charles Foster, Profile Books, 2016, ISBN 978-1781255346.

WHO ATTENDED THE CEREMONY: Charles Foster, Thomas Thwaites. [NOTE: Thomas Thwaites’s goat suit was kindly released for Ig Nobel purposes from the exhibition ‘Platform – Body/Space’ at Het Nieuwe Instituut in Rotterdam, and will be back on display at the museum from 4 October 2016 till 8 January 2017.]

LITERATURE PRIZE [SWEDEN] — Fredrik Sjöberg, for his three-volume autobiographical work about the pleasures of collecting flies that are dead, and flies that are not yet dead.

REFERENCE: “The Fly Trap” is the first volume of Fredrik Sjöberg’s autobiographical trilogy, “En Flugsamlares Vag” (“The Path of a Fly Collector”), and the first to be published in English. Pantheon Books, 2015, ISBN 978-1101870150.


PERCEPTION PRIZE [JAPAN] — Atsuki Higashiyama and Kohei Adachi, for investigating whether things look different when you bend over and view them between your legs.

REFERENCE: “Perceived size and Perceived Distance of Targets Viewed From Between the Legs: Evidence for Proprioceptive Theory,” Atsuki Higashiyama and Kohei Adachi, Vision Research, vol. 46, no. 23, November 2006, pp. 3961–76.



Elementary particle turns out to have been a mirage as CERN serves up more inelegant physics

August 6, 2016

Current physics, and the Standard Model of the Universe it describes, are no longer elegant. I have no doubt that the underlying structure of the universe is simple and beautiful. But models which require more than 61 elementary particles, numerous fudge factors (dark energy and dark matter) and ever-increasing complexity are ugly and do not convince. Especially when they cannot explain the four “magical” forces we observe (gravity, electromagnetism, and the strong and weak nuclear forces).

I have a mixture of admiration and contempt for “Big Physics” as practised by CERN and their experiments at the Large Hadron Collider. So I was actually quite relieved to hear CERN announce that, after much publicity, they had not in fact detected yet another elementary particle unpredicted by the Standard Model. Since finding some anomalous data last December, they have hyped the possibility of a new, extra-heavy elementary particle. Over 500 papers have been written (and published) postulating explanations of the data anomaly and fantasising about the nature of this particle. But the anomaly has simply disappeared. The postulated particle does not exist.

I remain convinced that 90% of modern physics is all about raising questions – some genuine and some fantasised – to ensure that funding for Sledgehammer Science continues. So not to worry. CERN may not have found another elementary particle this time. But they will soon come up with another unexpected particle, preceded by much publicity and hype, which will spawn much further speculation, and, most importantly, keep the funds flowing.

New York Times:

A great “might have been” for the universe, or at least for the people who study it, disappeared Friday.  

Last December, two teams of physicists working at CERN’s Large Hadron Collider reported that they might have seen traces of what could be a new fundamental constituent of nature, an elementary particle that is not part of the Standard Model that has ruled particle physics for the last half-century.  

A bump on a graph signaling excess pairs of gamma rays could, they cautioned, be merely a statistical fluke. But physicists have been holding their breath ever since.

If real, the new particle would have opened a crack between the known and the unknown, affording a glimpse of quantum secrets undreamed of even by Einstein. Answers to questions like why there is matter but not antimatter in the universe, or the identity of the mysterious dark matter that provides the gravitational glue in the cosmos. In the few months after the announcement, 500 papers were written trying to interpret the meaning of the putative particle.

Science Alert:

CERN made the announcement this morning at the International Conference of High Energy Physics (ICHEP) in Chicago, alongside a huge slew of new Large Hadron Collider (LHC) data.

“The intriguing hint of a possible resonance at 750 GeV decaying into photon pairs, which caused considerable interest from the 2015 data, has not reappeared in the much larger 2016 data set and thus appears to be a statistical fluctuation,” CERN announced in a press release sent via email.

Why did we ever think we’d found a new particle in the first place?

Back in December, researchers at CERN’s CMS and ATLAS experiments smashed particles together at incredibly high energies, sending subatomic particles flying out as debris.

Among that debris, the researchers saw an unexpected blip of energy in the form of an excess in pairs of photons, which had a combined energy of 750 gigaelectron volts (GeV).

The result led to hundreds of journal article submissions on the mysterious energy signature – and prompted many physicists to hypothesise that the excess was a sign of a brand new fundamental particle, six times more massive than the Higgs boson – one that wasn’t predicted by the Standard Model of particle physics.

But, alas, the latest data collected by the LHC show no evidence that this particle exists: despite further experiments, no sign of the 750 GeV bump has emerged since the original reading.

So, we’re no closer to finding a new particle – or evidence of a new model that could explain some of the more mysterious aspects of the Universe, such as how gravity works (something the Standard Model doesn’t account for).
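Part of what makes such “statistical fluctuations” so common is the look-elsewhere effect: scan enough mass bins and some bin will fluctuate high purely by chance. A toy simulation (my own illustration with made-up bin counts, not CERN’s analysis):

```python
import random
from math import sqrt

random.seed(7)

BINS = 500          # mass bins scanned for a bump
MEAN = 100.0        # expected background count per bin
TRIALS = 2_000      # independent background-only "experiments"
sigma = sqrt(MEAN)  # Poisson spread, normal approximation

bump_seen = 0
for _ in range(TRIALS):
    counts = [random.gauss(MEAN, sigma) for _ in range(BINS)]
    # does ANY bin show a >= 3-sigma excess, with no real signal present?
    if max(counts) >= MEAN + 3 * sigma:
        bump_seen += 1

print(f"Fake '3-sigma bumps' in background-only data: {bump_seen / TRIALS:.0%}")
```

Roughly half of these signal-free experiments still show at least one locally impressive bump, which is why particle physicists insist on a global five-sigma standard, and on the bump reappearing in new data, before claiming a discovery.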

The Large Hadron Collider is the world’s largest and most powerful particle accelerator (Image: CERN)

The Higgs boson that CERN claimed to have found in 2012 turned out to be not quite the boson predicted by the Standard Model. So while the Higgs boson was supposed to be the God particle, the boson found only indicated that there were more bosons to be found. I dislike the publicity and hype that CERN generates, which is entirely about securing further funding. (The LHC cost about $4.75 billion to build and costs on the order of a billion dollars a year to run.)

Constantly adding complexity to a mathematical model and the increasing use of fudge factors is usually a sign that the model is fundamentally wrong. But some great insight is usually needed to simplify and correct a mathematical model. Until that insight comes, the models are the best available and just have to be fudged and added to in an ad hoc manner, to correct flaws as they are found.

The Standard Model and its 61+ particles will have to be replaced at some point by something more basic and more simple, but that will require some new Einstein-like insight, and who knows when that might come. The LHC is expected to operate for another 20 years, and the very weight of the investment in it means that physicists cannot build a career by being heretical or by questioning the Standard Model itself.

I miss the elegance that Physics once chased:

Physics has become a Big Science where billion dollar sledgehammers are used to crack little nuts. Pieces of nut and shell go flying everywhere and each little fragment is considered a new elementary particle. The Rutherford-Bohr model still applies, but its elementary particles are no longer considered elementary. Particles with mass and charge are given constituent particles, one having mass and no charge, and one having charge and no mass. Unexplainable forces between particles are assigned special particles to carry the force. Particles which don’t exist, but may have existed, are defined and “discovered”. Errors in theoretical models are explained away by assigning appropriate properties to old particles or defining new particles. Every new particle leaves a new slime trail across the painting. It is as if a bunch of savages are doodling upon a masterpiece. The scribbling is ugly and hides the masterpiece underneath, but it does not mean that the masterpiece is not there.

The “standard model” does not quite fit observations, so new theories of dark energy and dark matter are postulated (actually just invented as fudge factors) and further unknown particles are defined. The number of elementary particles has proliferated and is still increasing. The “standard model” of physics now includes at least 61 elementary particles (48 fermions and 13 bosons). Even the ancient civilisations knew better than to try to build with too many “standard” bricks. Where did simplicity go? Just the quarks can be red, blue or green. They can be up, down, charm, strange, top or bottom quarks. For every type of quark there is an antiquark. Electrons, muons and taus each have their corresponding neutrinos. And they all have their anti-particles. Gluons come in eight colour combinations. There are four electroweak bosons, and there ought to be only one Higgs boson. But who knows? CERN could find some more. I note that fat and thin or warm and cool types of particles have yet to be defined. Matter and antimatter particles, on meeting each other, produce a burst of energy as they are annihilated. If forces are communicated by particles, gravity by gravitons and light by photons, then perhaps all energy transmission can give rise to a whole new family of elementary particles.

The 61 particles still do not include the graviton or sparticles or any other unknown, invisible, magic particles that may go to making up dark matter and dark energy. Some of the dark matter may be stealthy dark matter and some may be phantom dark matter. One might think that when dark matter goes phantom, it ought to become visible, but that would be far too simple.  The level of complexity and apparent chaos is increasing. Every new particle discovered requires more money and Bigger Science to find the next postulated elementary particle.

When CERN claimed to have found the God Particle, the Higgs boson, they still added the caveat that it was just one kind of Higgs boson and there could be more, as yet unknown, ones to come. So the ultimate elementary particle was certainly not the end of the road. Good grief! The end of the road can never be found; that might end the funding. And after all, even if the God Particle has been found, who created God? Guess how much all that is going to cost?


Nobel week starts today

October 5, 2015

This is the schedule (Swedish time) for announcing the awards:

PHYSIOLOGY OR MEDICINE – Monday 5 October, 11:30 a.m. at the earliest
PHYSICS – Tuesday 6 October, 11:45 a.m. at the earliest
CHEMISTRY – Wednesday 7 October, 11:45 a.m. at the earliest
PEACE – Friday 9 October, 11:00 a.m.
ECONOMIC SCIENCES – Monday 12 October, 1:00 p.m. at the earliest
LITERATURE – The date will be set later

I find the economics and peace prizes little more than nonsense (with a few exceptions). The literature prize is not so bad, but it does get bogged down by political correctness from time to time. But to each his own.

Some Nobel trivia:

105 Nobel Prizes in Physiology or Medicine have been awarded between 1901 and 2014.

38 Medicine Prizes have been given to one Laureate only.

11 women have been awarded the Medicine Prize so far.

32 years was the age of the youngest Medicine Laureate ever, Frederick G. Banting, who was awarded the 1923 Medicine Prize for the discovery of insulin.

87 years was the age of the oldest Medicine Laureate ever, Peyton Rous, when he was awarded the Medicine Prize in 1966 for his discovery of tumour-inducing viruses.

58 is the average age of the Nobel Laureates in Physiology or Medicine the year they were awarded the prize.

The Thomson Reuters prediction page is here. The medicine predictions are for work on human gut microbes, proteins and T-cells.

Nasa set to announce detection of flowing water on Mars

September 27, 2015


NASA’s live stream is down (due to configuration errors) but Nature Geoscience has released this paper from embargo. (Probably water on the brain.)

Spectral evidence for hydrated salts in recurring slope lineae on Mars

Looks like the earlier speculation was correct.

NASA is hyping an announcement to be made tomorrow about Mars.

Press Release:

NASA to Announce Mars Mystery Solved

NASA will detail a major science finding from the agency’s ongoing exploration of Mars during a news briefing at 11:30 a.m. EDT on Monday, Sept. 28 at the James Webb Auditorium at NASA Headquarters in Washington. The event will be broadcast live on NASA Television and the agency’s website.

News conference participants will be: 

  • Jim Green, director of planetary science at NASA Headquarters
  • Michael Meyer, lead scientist for the Mars Exploration Program at NASA Headquarters
  • Lujendra Ojha of the Georgia Institute of Technology in Atlanta
  • Mary Beth Wilhelm of NASA’s Ames Research Center in Moffett Field, California and the Georgia Institute of Technology
  • Alfred McEwen, principal investigator for the High Resolution Imaging Science Experiment (HiRISE) at the University of Arizona in Tucson

A brief question-and-answer session will take place during the event with reporters on site and by phone.

It will probably be connected with this paper to be presented this week at the European Planetary Science Congress. Three of the authors are to be at the Press Announcement. Even if not specifically about this paper the announcement is likely to be about water on Mars.

Recurring slope lineae observed in HiRISE images of Mars. The RSL form on Sun-facing slopes during the warm season and fade during the cold season.

L. Ojha et al., Spectral Evidence for Hydrated Salts in Seasonal Brine Flows on Mars, Vol. 10, EPSC 2015-838-1, European Planetary Science Congress 2015.

Abstract: Recurring Slope Lineae (RSL) are seasonal flows on warm Martian slopes initially proposed, but not confirmed, to be caused by briny water seeps. Here we report spectral evidence for hydrated salts on RSL slopes from four different RSL locations from the Compact Reconnaissance Imaging Spectrometer for Mars on board Mars Reconnaissance Orbiter. These results confirm the hypothesis that RSL are due to present-day activity of briny water.

It would suggest that the dark streaks observed periodically on the surface of Mars are caused by the seasonal flow of salt-laden water across the surface. The salt levels would have to be high enough to allow the water to remain liquid long enough to create the streaks before it freezes. Some of the streaks seem to be of the order of several hundred metres in length.
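As a rough illustration of why the salt matters, the freezing-point depression of a brine can be sketched with the ideal colligative-property relation. This is my own sketch, not a calculation from the paper: the function name and the saturated-NaCl molality are illustrative, and real Martian perchlorate brines deviate strongly from ideality (their eutectics lie near −70 °C), but the trend is the same.

```python
# Idealized freezing-point depression for a brine (a colligative-property
# sketch, not a model from the paper). Real perchlorate brines deviate
# strongly from ideality, but the trend holds: more dissolved salt ->
# lower freezing point -> liquid water persists longer before freezing.

KF_WATER = 1.86  # cryoscopic constant of water, K*kg/mol

def freezing_point_c(molality, ions_per_unit=2):
    """Freezing point (deg C) of an ideal aqueous solution.

    molality: mol of solute per kg of water
    ions_per_unit: van 't Hoff factor (2 for fully dissociated NaCl)
    """
    return 0.0 - ions_per_unit * KF_WATER * molality

# A saturated NaCl brine is roughly 6.1 mol/kg:
print(freezing_point_c(6.1))  # ideal estimate, about -22.7 C
```

Even this crude estimate shows how dissolved salts can keep water liquid well below 0 °C, which is the mechanism invoked for the RSL.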

If there actually is sub-surface ice on Mars, then it is not an unthinkable geo-engineering step (terraforming) to achieve a Martian atmosphere which, in time, could contain free oxygen in addition to water vapour and carbon dioxide. It is still not clear how Earth got its nitrogen, which provides a stable ballast, and it is unclear whether a similar “ballast gas” could be engineered around Mars.

The paper continues:

Pure water would rapidly evaporate and/or freeze on the present-day surface of Mars at most times and places; however brines are far less volatile compared to pure water due to their lower freezing points and evaporation rates. Various salts (e.g. sulfates, chlorides and perchlorates) have been detected on the surface of Mars from remote and in situ investigations. These salts can lower the freezing point of water by up to 80 K, lower the evaporation rate of water by an order of magnitude, and can be hygroscopic (i.e. able to easily absorb atmospheric moisture), thus increasing the possibility of forming and stabilizing liquid water on the surface of present day Mars. Recurring Slope Lineae (RSL) are narrow, low reflectance features forming on present-day Mars that have been hypothesized to be due to the transient flow of liquid water. …. 

and concludes:

The origin of water forming the RSL is not understood, given the extreme aridity of Mars’ surface environment. Water could form by the surface/sub-surface melting of ice, but the presence of near-surface equatorial ice is highly unlikely. Water could also form via deliquescence by hygroscopic salts, although it is unclear how the Martian atmosphere can sufficiently supply water vapor every year to create RSL. The absence of concentrated deliquescent salts would rule out this hypothesis. Another hypothesis is seasonal discharge of a local aquifer, which concentrates salt deposits as the brine evaporates, but then lineae emanating from the tops of local peaks are difficult to explain. It is conceivable that RSL are forming in different parts of Mars via different formation mechanisms. The new compositional insights reported here from widely separated sites provide essential new clues.

Water on Mars not only gives a higher probability of some life-form having existed, or existing, on Mars but also increases the probability of human life coming to exist on Mars. That would be something to be around for, but it will be after I am long gone.

Physics invokes magical “stealthy dark matter”

September 25, 2015

Physics is just a branch of the ultimate science – that of “Magic”. All the fundamental unknowns of physics are given special names (in lieu of explanation) and are assumed to have just those properties (often fantastical) which allow theoretical models of cosmology to maintain some credibility and come close to matching observations.

“Spacetime”, “gravitation”, “dark matter”, “dark energy” and even “phantom dark matter” can all just be termed “the mellifluous aether”, “magical attraction”, “magic matter”, “magic energy”, and “even more magic energy” without any loss of whatever rigour exists in Physics.

The methodology is quite simple.

  1. Invent a theory to explain what we don’t know.
  2. Do some fancy maths to back up the theory.
  3. Whenever the theory fails, define a magic particle or event or property which brings credibility back to the theory.
  4. Spend vast amounts of money on Big Science experiments to find the magic particle or event or property.
  5. Find something other than the magic particle or property or event that was predicted.
  6. Claim that what was found was a special case of the predicted magic “thing”, due to some new magic particle or event or property.
  7. Demand more money to do more and bigger Big Science experiments.

And so on, ad infinitum.

But magic demands more magic – deeper and more profound.

And so we have a new theory of stealthy dark matter to explain why it is undetectable.

Thomas Appelquist et al., Direct Detection of Stealth Dark Matter through Electromagnetic Polarizability, Physical Review Letters, 2015.

Press Release: Lawrence Livermore National Laboratory (LLNL) scientists have come up with a new theory that may identify why dark matter has evaded direct detection in Earth-based experiments.

A group of national particle physicists known as the Lattice Strong Dynamics Collaboration, led by a Lawrence Livermore National Laboratory team, has combined theoretical and computational physics techniques and used the Laboratory’s massively parallel 2-petaflop Vulcan supercomputer to devise a new model of dark matter. It identifies it as naturally “stealthy” (like its namesake aircraft, difficult to detect) today, but it would have been easy to see via interactions with ordinary matter in the extremely high-temperature plasma conditions that pervaded the early universe. ……. 

Dark matter makes up 83 percent of all matter in the universe and does not interact directly with electromagnetic or strong and weak nuclear forces. Light does not bounce off of it, and ordinary matter goes through it with only the feeblest of interactions. Essentially invisible, it has been termed dark matter, yet its interactions with gravity produce striking effects on the movement of galaxies and galactic clusters, leaving little doubt of its existence. …….. 

The key to stealth dark matter’s split personality is its compositeness and the miracle of confinement. Like quarks in a neutron, at high temperatures these electrically charged constituents interact with nearly everything. But at lower temperatures they bind together to form an electrically neutral composite particle. Unlike a neutron, which is bound by the ordinary strong interaction of quantum chromodynamics (QCD), the stealthy neutron would have to be bound by a new and yet-unobserved strong interaction, a dark form of QCD. …..

But there is something more than a little circular in the argument that “its interactions with gravity produce striking effects on the movement of galaxies and galactic clusters, leaving little doubt of its existence”.

This is just magical mumbo jumbo. (Not that there is anything wrong with the magical incantations of physicists, which are just as valid as any magical incantation by a shaman or a High Priest.)

Various universe evolution scenarios: a universe with too much density collapses in on itself, a critical density universe stays static, while a universe with not enough density keeps expanding at a steady (coasting) rate. However, today’s cosmology puts emphasis upon the cosmological constant, which gives an accelerating expansion. Does this mean that density is irrelevant? Credit: NASA.

The universe is accelerating instead of slowing down, therefore “dark energy must exist”. Because objects moving very fast in some clusters of galaxies do not escape the clusters, it becomes necessary to invent magical “dark matter” exercising gravitational effects. The gravitation/speed anomaly is used to postulate that dark matter exists, but actually all it says is that gravitation theory alone is insufficient to explain the observations. We cannot detect dark matter, so we generate theories for why it must now be “phantom” or “stealthy”. We infer it and its properties because the magic invoked to explain gravitation (relativity and spacetime) is not up to the task.

Note (diagram above) that all theories about the shape of the Universe have it surrounded by an infinite, unbounded, unknown, unknowable space of Deep, Dark Something. Let’s call it Magic.

Physics appears to come first in the hierarchy of Science. But Magic probably comes before Physics. Perhaps the most fundamental law of the Universe is actually the Conservation of Magic (and energy and mass and the curvature of spacetime are merely facets of Magic). Before the Big Bang there was first a critical accumulation of Magic which caused the Big Bang. And the quantity of magic gives the cosmological constant because of the Deep, Dark Magic underlying simple Magic …….

Physics/Magic posts:

  1.  The fundamentals of physics are just magic
  2. Magic is to physics as Heineken is to the human body
  3. Physics and cosmology are more magical than alchemy as dark energy goes phantom
  4. Gravitation could just as well be called “magical attraction”
  5. Dark energy and dark matter are just fudge factors for cosmic models that don’t work
  6. Physics came first and then came chemistry and later biology

12,000 years of agriculture has only reduced tree numbers by 46%

September 3, 2015

There is a new paper in Nature which tries to count the number of trees in the world.

Crowther, T. W. et al., Mapping tree density at a global scale, Nature (2015)

They find that there are 3.04 trillion trees, which is about seven times more than was previously thought. They further estimate that, in spite of 12,000 years of agriculture, the number of trees in the world has been reduced by only 46%.

But then I heard the lead author, Thomas Crowther, being interviewed on the BBC today. I was less than impressed by his apparent lack of common sense. “The scale of human impact is astonishing,” he says.


There has been a less than 50% reduction in tree numbers over 12,000 years of agriculture, with all the forest clearing that entails, to feed 7 billion people. A human impact of less than 0.0038% per year (or 3.8% per 1,000 years) is claimed to be “astonishing”.
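The arithmetic behind that per-year figure can be checked directly. This is a sketch using only the two numbers quoted from the paper (a 46% decline over roughly 12,000 years); the variable names are mine.

```python
# Check the per-year human impact figure (assumed inputs: the 46%
# reduction over ~12,000 years of agriculture quoted from the paper).
reduction = 0.46
years = 12_000

# Simple linear average, as used in the text above:
linear_rate = reduction / years
print(f"{linear_rate:.6%} per year")    # 0.003833% per year

# A compound (exponential) decline gives the same order of magnitude:
compound_rate = 1 - (1 - reduction) ** (1 / years)
print(f"{compound_rate:.6%} per year")  # about 0.005135% per year
```

Whether one averages linearly or compounds the decline, the implied annual impact stays in the thousandths of a percent.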

He also claims that their results do not impact “carbon science”. Really?

Seven times as many trees as previously thought means seven times as much biomass existing as trees, which leads to seven times as much carbon inventory locked up in trees as previously thought. It also means seven times more trees die every year than was previously thought, which would make man-made carbon emissions an even smaller fraction (c. 4%) of natural carbon emissions.

Crowther sounded like a vacuum cleaner salesman talking up his product. He kept emphasising the importance of his paper while denying that it had any implications for “politically correct science”. The estimate of the number of trees is interesting and could be something to build on for carbon cycle calculations. It also suggests that alarmism about the bio-diversity of trees is ill-founded. But some of their conclusions are just stupid and geared to getting more funding by demonstrating “political correctness”.

Reasonably interesting science but with idiot conclusions.

Oh dear!

Abstract: The global extent and distribution of forest trees is central to our understanding of the terrestrial biosphere. We provide the first spatially continuous map of forest tree density at a global scale. This map reveals that the global number of trees is approximately 3.04 trillion, an order of magnitude higher than the previous estimate. Of these trees, approximately 1.39 trillion exist in tropical and subtropical forests, with 0.74 trillion in boreal regions and 0.61 trillion in temperate regions. Biome-level trends in tree density demonstrate the importance of climate and topography in controlling local tree densities at finer scales, as well as the overwhelming effect of humans across most of the world. Based on our projected tree densities, we estimate that over 15 billion trees are cut down each year, and the global number of trees has fallen by approximately 46% since the start of human civilization.
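The abstract’s figures can be given a quick sanity check. This sketch uses only the numbers quoted in the abstract above; nothing here comes from outside the paper.

```python
# Sanity-check the abstract's figures (inputs are the numbers quoted
# in the abstract: 3.04 trillion trees today, a 46% decline, and the
# three biome subtotals).
total = 3.04e12            # trees today
decline = 0.46             # fall since the start of human civilization

original = total / (1 - decline)   # implied pre-civilization total
lost = original - total
print(f"{original / 1e12:.2f} trillion trees originally")  # 5.63
print(f"{lost / 1e12:.2f} trillion trees lost")            # 2.59

# The three biome subtotals leave the remainder in other biomes:
tropical, boreal, temperate = 1.39e12, 0.74e12, 0.61e12
other = total - tropical - boreal - temperate
print(f"{other / 1e12:.2f} trillion trees elsewhere")      # 0.30
```

The implied pre-civilization total of about 5.6 trillion trees is consistent with the 46% decline the abstract reports.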

The fundamentals of physics are just magic

September 1, 2015

Physicists would like to think that they deal in reality and are cold, rational, objective observers of the physical universe we live in. But deep, deep down, they just rely on magic. The Universe is nothing but a place of pervasive magic. Gravity is just a magical attraction. Spacetime is just an attractiferous aether. Physicists are thus practitioners of magic and may even be able to use the forces of magic, but they have no inkling as to why the magical forces exist.

Just replace:

  1. “gravity” or “gravitation” by “magical attraction”
  2. “spacetime” by “the attractiferous aether”
  3. “electromagnetic” by “electromagical”
  4. the “strong force” by the “strong magic force”
  5. the “weak force” by the “weak magic force”

and the Wikipedia entry for Gravity then reads as follows:

Magical attraction is a natural phenomenon by which all things are brought towards one another – irrespective of size, i.e. stars, planets, galaxies and even light and sub-atomic particles. Magical attraction has an infinite range, and it cannot be absorbed, transformed, or shielded against. Magical attraction is responsible for the formation of structures within the universe (namely by creating spheres of hydrogen, igniting them with enough pressure to form stars and then grouping them together into galaxies), as without magical attraction, the universe would be composed only of equally spaced particles. On Earth, magical attraction is commonly recognized in the form of weight, where physical objects are harder to pick up and carry the ‘heavier’ they are.

Magical attraction is most accurately described by the general theory of relativity (proposed by Albert Einstein in 1915) which describes the force of magical attraction, not as a force, but as a consequence of the curvature of the attractiferous aether caused by the uneven distribution of mass/energy; and resulting in time dilation, where time lapses more slowly under strong magical attraction. However, for most applications, magical attraction is well approximated by Newton’s law of Universal Magical Attraction, which postulates that magical attraction is a force where two bodies of mass are directly drawn to each other according to a mathematical relationship, where the attractive magical force is proportional to the product of their masses and inversely proportional to the square of the distance between them. This is considered to occur over an infinite range, such that all bodies (with mass) in the universe are drawn to each other no matter how far they are apart.

Magical attraction is the weakest of the four fundamental magical interactions of nature. The force of magical attraction is approximately 10^−38 times the strength of the strong magic force (i.e. magical attraction is 38 orders of magnitude weaker), 10^−36 times the strength of the electromagical force, and 10^−29 times the strength of the weak magic force. As a consequence, magical attraction has a negligible influence on the behavior of sub-atomic particles, and plays no role in determining the internal properties of everyday matter (but see quantum magical attraction). On the other hand, magical attraction is the dominant force at the macroscopic scale, and is the cause of the formation, shape, and trajectory (orbit) of astronomical bodies, including those of asteroids, comets, planets, stars, and galaxies. It is responsible for causing the Earth and the other planets to orbit the Sun; for causing the Moon to orbit the Earth; for the formation of tides; for natural convection, by which fluid flow occurs under the influence of a density gradient and magical attraction; for heating the interiors of forming stars and planets to very high temperatures; for solar system, galaxy, and stellar formation and evolution; and for various other phenomena observed on Earth and throughout the universe.

In pursuit of a theory of everything magical, the merging of general relativity and quantum mechanics (or quantum field theory) into a more general theory of quantum magical attraction has become an area of research.

Of course it is still not clear if magic is a continuous thing or composed of discrete magical quanta. One theory has it that all things are connected by invisible, undetectable magical strings and it is the elastic nature of these strings which gives rise to the forces of magical attraction.

The reality is that the Universe came into being by magic and the fundamental forces which have governed, and still govern, are all magical. If there ever was a Big Bang it was a magical event. And every sunrise and sunset which occurs is just due to the magical forces of attraction which apply. We live in a world of magic. Magic is normal.
