Archive for the ‘Physics’ Category

New tests support previous result of faster than light neutrinos

November 18, 2011

The CERN results from September indicating faster-than-light neutrinos have stood up to one set of tests designed to check them.

Reuters:

The new experiment at the Gran Sasso laboratory, using a neutrino beam from CERN in Switzerland, 720 km (450 miles) away, was held to check findings in September by a team of scientists which were greeted with some skepticism. Scientists at the Italian Institute for Nuclear Physics (INFN) said in a statement on Friday that their new tests aimed to exclude one potential systematic effect that may have affected the original measurement. ….

(more…)

Nature’s laws may vary across the universe – and so what if they should

November 5, 2011

There were headlines last week because, according to a new paper published in Physical Review Letters, one of the “laws of nature” may vary across the Universe. Observations from two large telescopes pointed in different directions on the sky seem to show that the strength of the electromagnetic force, which is measured by the fine structure constant α, may be different in different parts of the universe.

Indications of a spatial variation of the fine structure constant, by J. K. Webb, J. A. King, M. T. Murphy, V. V. Flambaum, R. F. Carswell and M. B. Bainbridge, Phys. Rev. Lett., 107, 191101, 2011, http://arxiv.org/abs/1008.3907 
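For reference, the fine structure constant is the dimensionless number that sets the strength of the electromagnetic interaction:

α = e² / (4πε₀ħc) ≈ 1/137.036

The variations reported in the paper are tiny – fractional changes in α of order one part in 100,000 – but even a variation that small would mean the “constant” is not constant everywhere.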

Since the “laws of nature” and the “laws of physics” are merely expressions of observed regularities in our observable time and space they are – of necessity – empirical conclusions. Since we – as yet – have no idea “why” the “laws” we observe should be as they are and why the “fundamental constants” take the values they do, it seems to me unremarkable that there should be areas of time or space (not observed as yet) where these “laws” – as we have formulated them – do not hold exactly. There may well be errors of observation of course but observations made correctly must trump theories and models – no matter how simple or beautiful they might be.

(more…)

Rossi and his cold fusion E-Cat still smell like a fraud

October 31, 2011

Rossi & Co. announced their nickel-hydrogen fusion reactor back in January and have made regular press releases since then to keep the interest alive. They seem to have gathered a following of staunch believers and, of course, there is a large body of disbelievers and a smaller group of sceptics.

But they are getting some attention even in business media as with this Forbes article:

On October 28th the biggest test of Rossi’s system, which is called the E-Cat, was conducted in Italy and some results were made public ….. Rossi’s E-Cat is claimed to use a secret catalyst to react hydrogen with nickel and, in the process, transmute the nickel into copper producing considerable heat. Whether this reaction works or not and if it does, exactly how it works, has been enormously contentious and the subject of numerous learned and amateur debates.

(more…)

Physics Nobel goes to Perlmutter, Schmidt and Riess

October 4, 2011

Staffan Normark has just announced that the Physics Nobel has been awarded half to Prof. Saul Perlmutter and half jointly to Prof. Brian P. Schmidt and Prof. Adam G. Riess “for the discovery of the accelerating expansion of the Universe through observations of distant supernovae”. Working separately, the two teams discovered that the expansion of the universe is accelerating, not slowing down.

http://www.nobelprize.org/

The Press release is here:

“Some say the world will end in fire, some say in ice…” *
What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year’s Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role. ….. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

…. For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion will continue to speed up the Universe will end in ice.

The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

None of the winners were among the Thomson Reuters predictions.

http://science.thomsonreuters.com/nobel/2011predictions/#physics

Hope still alive for faster than light travel as Einsteinian physics is challenged (maybe)

September 23, 2011

The news is packed today with reports about the CERN measurements which apparently show that some neutrinos have travelled faster than the speed of light. Since the conclusion depends crucially on a time difference of 60 nanoseconds in a total travel time of 2.43 milliseconds, it may well be found to be in error.

But I hope not.

Wired News: If it’s true, it will mark the biggest discovery in physics in the past half-century: Elusive, nearly massless subatomic particles called neutrinos appear to travel just faster than light, a team of physicists in Europe reports. If so, the observation would wreck Einstein’s theory of special relativity, which demands that nothing can travel faster than light. …. 

Over three years, OPERA researchers timed the roughly 16,000 neutrinos that started at CERN and registered a hit in the detector. They found that, on average, the neutrinos made the 730-kilometer, 2.43-millisecond trip roughly 60 nanoseconds faster than expected if they were traveling at light speed. “It’s a straightforward time-of-flight measurement,” says Antonio Ereditato, a physicist at the University of Bern and spokesperson for the 160-member OPERA collaboration. “We measure the distance and we measure the time, and we take the ratio to get the velocity, just as you learned to do in high school.” Ereditato says the uncertainty in the measurement is 10 nanoseconds. However, even Ereditato says it’s way too early to declare relativity wrong. “I would never say that,” he says. Rather, OPERA researchers are simply presenting a curious result that they cannot explain and asking the community to scrutinize it. “We are forced to say something,” he says. “We could not sweep it under the carpet because that would be dishonest.” The results will be presented at a seminar tomorrow at CERN.
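The arithmetic behind the headline is simple enough to check for oneself. A minimal sketch in Python, using only the figures quoted above (730 km baseline, 60 ns early arrival):

    # Time-of-flight check using the figures from the Wired quote above
    distance_m = 730_000.0             # CERN to Gran Sasso baseline, ~730 km
    c = 299_792_458.0                  # speed of light, m/s

    expected_s = distance_m / c        # trip time at light speed, ~2.43 ms
    measured_s = expected_s - 60e-9    # neutrinos arrived ~60 ns early

    v = distance_m / measured_s        # implied neutrino speed
    print(f"trip time at c: {expected_s * 1e3:.3f} ms")
    print(f"fractional excess over c: {(v - c) / c:.2e}")   # ~2.5e-05

The claimed effect is thus only a part in 40,000 or so of the speed of light, which is why the quoted 10 nanosecond measurement uncertainty matters so much.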

The concept of light having a maximum speed is acceptable, but the notion that nothing can exceed this speed is somehow depressing; it lacks elegance and it kills hope. It is even more confining and depressing if the universe is expanding. It “settles” science when science needs to be unsettled.

For the sake of wonder and discovery and challenge I hope that the measurements are correct and that some part of Einsteinian physics is turned on its head and that the dream of FTL travel remains alive.

“Make it so” – Star Trek

The Guardian: Faster than light particles found, claim scientists

Wall Street Journal: Roll over Einstein: Law of physics challenged

 

PAMELA finds anti-matter in the Van Allen belt

August 9, 2011

I had no difficulty as a student and later as an engineer in using imaginary and complex numbers involving i, where

i² = −1

and I am reasonably confident that I grasp the general concept of imaginary numbers. It took me a while when I was a student to realise that “imaginary” here meant “capable of being imagined” and not something that “did not exist and could only be imagined”.
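Python, for what it is worth, treats these numbers as ordinary arithmetic (a trivial illustration):

    # Python has complex numbers built in; engineering's j plays the role of i
    i = 1j
    print(i ** 2)              # (-1+0j), i.e. i² = −1

    z = 3 + 4j
    print(abs(z))              # 5.0, the modulus √(3² + 4²)
    print(z * z.conjugate())   # (25+0j), |z|² returned as a complex value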

I have much greater difficulty in following the concepts of “anti-matter” and why it is rational and necessary that anti-matter must exist. But I am no high energy physicist. On the other hand, I have no difficulty in “imagining” an alternative universe composed of anti-matter, subject to anti-gravity, lit up with anti-light and which presumably began with an anti-Big Bang (an implosion)! But why anti-matter must exist in our universe is something I am content to leave to physicists. Like black holes, though, anti-matter makes me vaguely uncomfortable, and I suppose it’s a good thing that it does not exist naturally on the earth’s surface. Of course, if the physicists could suggest how I could use anti-matter to annihilate about 20 kg of my mass, I would sign on immediately!

Anti-matter

The modern theory of antimatter begins in 1928, with a paper by Paul Dirac. Dirac realised that his relativistic version of the Schrödinger wave equation for electrons predicted the possibility of antielectrons. These were discovered by Carl D. Anderson in 1932 and named positrons (a contraction of “positive electrons”). Although Dirac did not himself use the term antimatter, its use follows on naturally enough from antielectrons, antiprotons, etc. A complete periodic table of antimatter was envisaged by Charles Janet in 1929.
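The root of Dirac’s prediction is that the relativistic energy–momentum relation has two branches:

E = ±√(p²c² + m²c⁴)

Rather than discarding the negative-energy solutions as unphysical, Dirac took them seriously, and interpreting them consistently is what led to the positron.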

Antimatter cannot be stored in a container made of ordinary matter because antimatter reacts with any matter it touches, annihilating itself and an equal amount of the container. Antimatter that is composed of charged particles can be contained by a combination of an electric field and a magnetic field in a device known as a Penning trap.

In any case, when cosmic rays smash into molecules in the Earth’s upper atmosphere, a shower of smaller particles is created. Physicists have assumed that a small number of those resulting particles will be anti-protons. Most of those will be instantly annihilated when they collide with particles of ordinary matter. But those which don’t collide should get trapped in the Earth’s torus-shaped Van Allen radiation belt, and form a layer of antimatter in the Earth’s atmosphere.

Van Allen radiation belts: image stars.astro.illinois.edu

Wired reports on the paper in The Astrophysical Journal Letters – The discovery of geomagnetically trapped cosmic ray antiprotons, by O. Adriani et al., 2011, ApJ 737 L29, doi: 10.1088/2041-8205/737/2/L29

Data from the cosmic ray satellite PAMELA has added substantial weight to the theory that the Earth is encircled by a thin band of antimatter. The satellite, named Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics, was launched in 2006 to study the nature of cosmic rays — high-energy particles from the Sun and beyond the solar system which barrel into Earth.  ……

It was one of PAMELA’s goals to hunt out those tiny numbers of antimatter particles among the ludicrously more abundant normal matter particles, like protons and the nuclei of helium atoms. To find them, the satellite regularly moved through a particularly dense section of the Van Allen belt called the South Atlantic Anomaly. Over a period of 850 days — from July 2006 to December 2008 — sensors aboard PAMELA detected 28 anti-protons. That might not sound like much, but it’s three times more than would be found from a random sample of the solar wind, and is the most abundant source of anti-protons ever seen near the Earth.

But what does this discovery mean, other than proving that a bunch of theorizing physicists were correct? The discovery opens the doors to harnessing those anti-protons for a variety of medical, sensing and, most importantly, rocket-propelling applications.

In a 2006 NASA-funded study by Draper Laboratory, researchers wrote, “it has been suggested that tens of nanograms to micrograms of anti-protons can be used to catalyze nuclear reactions and propel spacecraft to velocities up to 100 km/sec.”

Ununquadium = Flerovium and Ununhexium = Moscovium?

June 9, 2011

In June last year it was reported that element 114 – with the temporary name ununquadium – had been manufactured in the lab.

Periodic table gets bigger: Element 114 Ununquadium

Now a joint working party of the International Union of Pure and Applied Chemistry (IUPAC) and the International Union of Pure and Applied Physics (IUPAP) has concluded that elements 114 and 116 have fulfilled the criteria for official inclusion in the periodic table.

Discovery of the elements with atomic numbers greater than or equal to 113

doi:10.1351/PAC-REP-10-05-01

Abstract: The IUPAC/IUPAP Joint Working Party (JWP) on the priority of claims to the discovery of new elements 113–116 and 118 has reviewed the relevant literature pertaining to several claims. In accordance with the criteria for the discovery of elements previously established by the 1992 IUPAC/IUPAP Transfermium Working Group (TWG), and reinforced in subsequent IUPAC/IUPAP JWP discussions, it was determined that the Dubna-Livermore collaborations share in the fulfillment of those criteria both for elements Z = 114 and 116. A synopsis of experiments and related efforts is presented.

The discovery of both elements has been credited to a collaborative team based at the Joint Institute for Nuclear Research in Dubna, Russia, and Lawrence Livermore National Laboratory in California, US. The collaborative parties have proposed the name flerovium for 114, after Soviet scientist Georgy Flyorov, and moscovium for 116, after the region in Russia.

In recent years, there have been several claims by laboratories for the discovery of elements at 113, 114, 115, 116 and 118 in the periodic table. The working party concluded that elements 114 and 116 now fulfilled criteria for official inclusion in the table.


Periodic Table with the Unun series: image BBC

The two new elements are radioactive and exist for less than a second before decaying into lighter atoms. Element 116 quickly decays into 114, and 114 transforms into the slightly lighter copernicium (element 112) as it sheds an alpha particle.
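The decay chain described above is simple bookkeeping: each alpha decay carries away two protons and two neutrons. A minimal sketch (the element names follow the proposals mentioned in this post; mass numbers are omitted since several isotopes were produced):

    # Alpha decay: atomic number Z drops by 2 per emitted alpha particle
    NAMES = {116: "ununhexium (proposed: moscovium)",
             114: "ununquadium (proposed: flerovium)",
             112: "copernicium"}

    def alpha_chain(z: int, stop: int = 112) -> None:
        """Follow successive alpha decays from element z down to `stop`."""
        while z >= stop:
            print(f"Z = {z}: {NAMES.get(z, 'unknown')}")
            z -= 2   # one alpha particle = 2 protons + 2 neutrons

    alpha_chain(116)   # prints the 116 -> 114 -> 112 chain described above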

Molybdenite to challenge graphene?

January 31, 2011

Mineral molybdenite: Image via Wikipedia

Researchers at the Ecole Polytechnique Federale de Lausanne have published a new paper about a material which could challenge graphene for transistors.

Single-layer MoS2 transistors, by B. Radisavljevic, A. Radenovic, J. Brivio, V. Giacometti & A. Kis, Nature Nanotechnology (2011) doi:10.1038/nnano.2010.279

Physorg reports:

Smaller and more energy-efficient electronic chips could be made using molybdenite. In an article appearing online January 30 in the journal Nature Nanotechnology, EPFL’s Laboratory of Nanoscale Electronics and Structures (LANES) publishes a study showing that this material has distinct advantages over traditional silicon or graphene for use in electronics applications.

A model showing how molybdenite can be integrated into a transistor. Credit: EPFL

Research carried out in the Laboratory of Nanoscale Electronics and Structures (LANES) has revealed that molybdenite, or MoS2, is a very effective semiconductor. This mineral, which is abundant in nature, is often used as an element in steel alloys or as an additive in lubricants. But it had not yet been extensively studied for use in electronics.

“It’s a two-dimensional material, very thin and easy to use in nanotechnology. It has real potential in the fabrication of very small transistors, light-emitting diodes (LEDs) and solar cells,” says EPFL Professor Andras Kis, whose LANES colleagues M. Radisavljevic, Prof. Radenovic and M. Brivio worked with him on the study. He compares its advantages with two other materials: silicon, currently the primary component used in electronic and computer chips, and graphene, whose discovery in 2004 earned University of Manchester physicists Andre Geim and Konstantin Novoselov the 2010 Nobel Prize in Physics.

One of molybdenite’s advantages is that it is less voluminous than silicon, which is a three-dimensional material. “In a 0.65-nanometer-thick sheet of MoS2, the electrons can move around as easily as in a 2-nanometer-thick sheet of silicon,” explains Kis. “But it’s not currently possible to fabricate a sheet of silicon as thin as a monolayer sheet of MoS2.” Another advantage of molybdenite is that it can be used to make transistors that consume 100,000 times less energy in standby state than traditional silicon transistors. A semi-conductor with a “gap” must be used to turn a transistor on and off, and molybdenite’s 1.8 electron-volt gap is ideal for this purpose.

The existence of this gap in molybdenite also gives it an advantage over graphene.
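As a back-of-envelope illustration of why a 1.8 electron-volt gap is attractive for LEDs: a photon of that energy sits at the red end of the visible spectrum, via λ = hc/E. (Graphene, by contrast, has no band gap at all, which is why graphene transistors are hard to switch fully off.)

    # Wavelength of a photon matching molybdenite's ~1.8 eV band gap
    h = 6.626e-34          # Planck constant, J·s
    c = 2.998e8            # speed of light, m/s
    eV = 1.602e-19         # joules per electron-volt

    gap = 1.8 * eV                       # band gap in joules
    wavelength_nm = h * c / gap * 1e9
    print(f"{wavelength_nm:.0f} nm")     # ~689 nm, red light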

Read Article


Cold Fusion: Another fraud or a breakthrough?

January 22, 2011

In March 1989, Stanley Pons and Martin Fleischmann claimed to have achieved cold fusion at room temperature but their experiment could not be reproduced.


Cold fusion lab (igloo) under construction: image Wikipedia

While cold fusion is considered highly improbable, it is not impossible, and there remains a nagging suspicion (hope?) that some “miracle” machine may suddenly appear in the most unlikely place, perhaps even from outside mainstream science.

Physorg reports on another claim this time from Bologna, Italy:

Despite the intense skepticism, a small community of scientists is still investigating near-room-temperature fusion reactions. The latest news occurred last week, when Italian scientists Andrea Rossi and Sergio Focardi of the University of Bologna announced that they developed a cold fusion device capable of producing 12,400 W of heat power with an input of just 400 W. Last Friday, the scientists held a private invitation press conference in Bologna, attended by about 50 people, where they demonstrated what they claim is a nickel-hydrogen fusion reactor. Further, the scientists say that the reactor is well beyond the research phase; they plan to start shipping commercial devices within the next three months and start mass production by the end of 2011.

Rossi and Focardi say that, when the atomic nuclei of nickel and hydrogen are fused in their reactor, the reaction produces copper and a large amount of energy. The reactor uses less than 1 gram of hydrogen and starts with about 1,000 W of electricity, which is reduced to 400 W after a few minutes. Every minute, the reaction can convert 292 grams of 20°C water into dry steam at about 101°C. Since raising the temperature of water by 80°C and converting it to steam requires about 12,400 W of power, the experiment provides a power gain of 12,400/400 = 31. As for costs, the scientists estimate that electricity can be generated at a cost of less than 1 cent/kWh, which is significantly less than coal or natural gas plants……….

…… Rossi and Focardi’s paper on the nuclear reactor has been rejected by peer-reviewed journals, but the scientists aren’t discouraged. They published their paper in the Journal of Nuclear Physics, an online journal founded and run by themselves, which is obviously cause for a great deal of skepticism. They say their paper was rejected because they lack a theory for how the reaction works. According to a press release (via Google Translate), the scientists say they cannot explain how the cold fusion is triggered, “but the presence of copper and the release of energy are witnesses.”

http://www.physorg.com/news/2011-01-italian-scientists-cold-fusion-video.html
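The power figure in the quote above is easy to verify with textbook values for water – a sanity check of the arithmetic, not of the experiment:

    # Checking the claimed ~12,400 W output from the quoted steam figures
    g_per_s = 292.0 / 60.0    # 292 g of water per minute
    c_water = 4.186           # specific heat of water, J/(g·K)
    L_vap = 2257.0            # latent heat of vaporization, J/g

    heating_W = g_per_s * c_water * 80.0   # heat 20 °C water to 100 °C
    boiling_W = g_per_s * L_vap            # then convert it to steam
    total_W = heating_W + boiling_W

    print(f"output: {total_W:.0f} W")      # ~12,600 W, near the quoted 12,400 W
    print(f"gain: {total_W / 400:.1f}")    # ~31.5, matching the claimed gain of 31

Note that the arithmetic only holds if the output really is dry steam; if liquid water were being carried along with it, the true output power would be far lower, which is exactly where much of the scepticism has been directed.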

But not everybody is dismissing this latest claim.

Steven Krivit of the New Energy Times describes why he believes that the Rossi and Focardi LENR device is probably real and is an advancement on the Piantelli process.

But there seems to be a vested interest here and I remain unconvinced.

Especially since they claim that they cannot fully explain what happens but are going to produce “commercial units” anyway, it sounds like a scam. They will probably sell some units to the gullible before they disappear from view.

Just another fraud.

Ozone layer hits record thickness in Sweden: Was there ever an ozone hole problem?

January 9, 2011

Lately there has been an increasing view that some of the catastrophe scenarios about the ozone hole which led to the Montreal Protocol of 1987 were exaggerated and based on poor science. The effects of humans on ozone variations, as opposed to natural variations, may have been overstated. In fact there are now some suggestions that the actions taken were not only unnecessary but that they have not had much to do with the natural increase in ozone layer thickness observed in recent times.

The Local reports:

Sweden’s government weather agency reported on Friday that the ozone layer over southern Sweden reached its thickest levels at the end of last year, surpassing the previous record set in 1991.

Sweden’s Meteorological and Hydrological Institute (Sveriges Meteorologiska och Hydrologiska Institut, SMHI) explained that the weather was particularly favourable at the end of 2010, and this explains why the ozone layer was especially thick at the time. “It is a step in the right direction, but it is still too early to say that the ozone layer has recovered. The favourable weather situation over the last few months has contributed to a record high,” said Weine Josefsson, a meteorologist at SMHI, in a statement on Friday.
The annual value of the ozone layer’s thickness over Norrköping in 2010 stood at a new high of 351.7 Dobson units (DU). The previous record was set in 1991 at 341.8 DU. The November and December values in particular set new records among the measurements regularly made at SMHI since 1988. ……….  Even in Norrland in the country’s north, the values have been positive in the last year. The ozone layer has been measured regularly in Vindeln northwest of Umeå in northern Sweden since 1991 and the latest results were also positive in this area.
However, it is not possible to record complete ozone measurements in the winter, so it is uncertain whether a record was set there as well at the end of last year. In November and December, air flows were affected by a special weather situation over western Europe, resulting in an extra thick ozone layer over this part of the world in these two months.
It is possible that the restrictions on ozone-depleting substances proposed in the Montreal Protocol in 1987 have also contributed to the thickening of the ozone layer. However, this type of measure is effective over a long period of time and it is difficult to distinguish the effect of natural variations in this case.
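For a sense of scale, one Dobson unit corresponds to a layer of pure ozone 0.01 mm thick at standard temperature and pressure, so even the record column quoted above is remarkably thin:

    # The record ozone reading expressed as a physical thickness
    MM_PER_DU = 0.01            # 1 Dobson unit = 0.01 mm of pure ozone at STP

    record_du = 351.7           # Norrköping, 2010
    previous_du = 341.8         # previous record, 1991

    print(f"record: {record_du * MM_PER_DU:.2f} mm")                         # ~3.52 mm
    print(f"over previous: {(record_du - previous_du) * MM_PER_DU:.2f} mm")  # ~0.10 mm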