## Archive for the ‘Physics’ Category

### The universe shouldn’t exist

October 28, 2017

Even if the Standard Model is right and equal amounts of matter and anti-matter were produced at the Big Bang, it still does not explain why the matter and anti-matter did not annihilate each other in a huge flash of energy. Why a huge amount of energy was first absorbed to create matter and anti-matter at the Big Bang is brushed aside as a question belonging to the singularity, before the laws of physics existed and perhaps before even time existed.

CERN has a new press release showing that apart from sign – as the standard model requires – no difference can be detected between a proton and an antiproton.

It is not the new measurements which say the universe should not exist. It is in fact the standard model which says that the universe should not exist. Maybe the standard model has to be modified.

Riddle of matter remains unsolved: Proton and antiproton share fundamental properties

Scientists are still in search of a difference between protons and antiprotons which would help to potentially explain the existence of matter in our universe. However, physicists in the BASE collaboration at the CERN research center have been able to measure the magnetic force of antiprotons with almost unbelievable precision. Nevertheless, the data do not provide any information about how matter formed in the early universe as particles and antiparticles would have had to completely destroy one another. The most recent BASE measurements revealed instead a large overlap between protons and antiprotons, thus confirming the Standard Model of particle physics. Around the world, scientists are using a variety of methods to find some difference, regardless of how small. The matter-antimatter imbalance in the universe is one of the hot topics of modern physics. …

The BASE collaboration published high-precision measurements of the antiproton g-factor back in January 2017 but the current ones are far more precise. The current high-precision measurement determined the g-factor down to nine significant digits. This is the equivalent of measuring the circumference of the earth to a precision of four centimeters. The value of 2.7928473441(42) is 350 times more precise than the results published in January. “This tremendous increase in such a short period of time was only possible thanks to completely new methods,” said Ulmer. The process involved scientists using two antiprotons for the first time and analyzing them with two Penning traps.
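The arithmetic behind the precision claim is easy to check. A back-of-envelope sketch in Python, using only the numbers quoted above, computes the fractional uncertainty of the measured g-factor and scales it to the earth's circumference; the crude scaling lands at a few centimeters, the same order of magnitude as the press release's four:

```python
# Back-of-envelope check of the BASE precision claim.
# Value and uncertainty as quoted: g-factor = 2.7928473441(42)
value = 2.7928473441
uncertainty = 0.0000000042   # the "(42)" on the last two digits

relative = uncertainty / value   # fractional precision, ~1.5 parts per billion

# Scale the same fractional precision to the earth's circumference (~40,075 km)
earth_circumference_m = 40_075_000
equivalent_cm = relative * earth_circumference_m * 100

print(f"relative precision: {relative:.2e}")
print(f"on earth's circumference: {equivalent_cm:.1f} cm")
```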

We have to bear in mind that CERN has a massive confirmation bias. Their primary reason for existence is to confirm the standard model.

### The edge of the universe is to humans as the surface of the water is to fish

July 18, 2017

Things become weird and wonderful when physicists or cosmologists or astronomers talk about the “edge of the universe” or the “finitely bounded but infinite universe” or the “expanding universe” which does not expand into anything but creates space as it expands (the balloon analogy). I read that the Big Bang occurred 13.8 billion years ago but the size of the universe is said to be a diameter of (around) 93 billion light-years. Some photons, they say, have traveled 46-47 billion light-years since the Big Bang. Now how did they do that in just 13.8 billion years? The answer, say the physicist-philosophers, is that things aren’t moving away from each other so much as that the space between them is expanding. Really!

How can the universe be 93 billion light-years across if it is only 13.8 billion years old? Light hasn’t had enough time to travel that far…? Ultimately, understanding this facet of physics is the key to understanding what lies beyond the edge of the observable universe and whether we could ever get there.

To break this down, according to special relativity, objects that are close together cannot move faster than the speed of light with respect to one another; however, there is no such law for objects that are extremely distant from one another when the space between them is, itself, expanding. In short, it’s not that objects are traveling faster than the speed of light, but that the space between objects is expanding, causing them to fly away from each other at amazing speeds.
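The quoted figures can actually be reproduced with a short calculation. The sketch below is a rough numerical integration assuming a flat Lambda-CDM universe; the parameter values (H0 ≈ 67.7 km/s/Mpc, Omega_m ≈ 0.31) are illustrative assumptions, not figures from this post. Integrating the comoving distance out to very high redshift lands near the quoted 46-47 billion light-years:

```python
# Rough sketch: how a 13.8-billion-year-old universe ends up with a
# ~46-billion-light-year comoving radius. Assumes a flat Lambda-CDM
# cosmology; H0 and Omega_m below are illustrative values, not measurements.
import numpy as np

H0 = 67.7            # Hubble constant, km/s/Mpc (assumed)
omega_m = 0.31       # matter density parameter (assumed)
c = 299792.458       # speed of light, km/s

def E(z):
    """Dimensionless expansion rate H(z)/H0 for flat matter + Lambda."""
    return np.sqrt(omega_m * (1 + z) ** 3 + (1 - omega_m))

# Comoving distance D = (c/H0) * integral_0^z dz'/E(z'), taken to high z
z = np.linspace(0.0, 3000.0, 1_000_000)
f = 1.0 / E(z)
integral = np.sum((f[:-1] + f[1:]) / 2) * (z[1] - z[0])   # trapezoidal rule

D_mpc = (c / H0) * integral
D_gly = D_mpc * 3.2616 / 1000.0   # 1 Mpc ≈ 3.2616 million light-years
print(f"comoving distance to z=3000: {D_gly:.0f} billion light-years")
```

Light travelling toward us since shortly after the Big Bang has covered 13.8 billion light-years of travel time, but expansion has stretched the space behind it, so its source is now roughly 46 billion light-years away in comoving terms.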

Ultimately, this means that we could only reach the edge of the observable universe if we develop a method of transport that allows us either 1) to travel faster than the speed of light (something which most physicists think is impossible) or 2) to transcend spacetime (by using wormholes or warp drive, which most physicists also think is impossible).

The reality, I think, is that human cognition is limited. I reject the converse, that human cognition is unlimited, because, if it was, we would not have imponderable questions. Stephen Hawking has often said that “outside the universe” makes no sense, because if the universe came from nothing and brought everything into existence, then asking what lies beyond the universe is an invalid question. When physicists invoke dark energy and dark matter and, in the same breath, point out that they are unknown and undetectable, then it follows that human understanding is incomplete because of the limits to human cognition.

If human cognition is limited, whether at the level we have reached or ten times that level, our understanding of the universe around us is, and will be – and must be – also limited. We will always have a “conceptual edge” to the universe around us corresponding to our cognitive limits. Beyond this edge lies what is “unknowable”. The edge of the real universe lies at the furthest reaches of our cognitive abilities.

As most fish (with exceptions for flying fish and lungfish) cannot conceive of the world beyond the surface of the water, so humans cannot conceive of a universe beyond the “conceptual edge” defined by their cognitive ability.

We can never observe what is beyond the “observable universe” because light will never get there. But it isn’t just light that doesn’t get there. Our minds don’t reach there either. There may be a multiverse out there – or maybe not. There may be just the ultimate void being converted into space-time as our universe eats into it. Or maybe there is The Restaurant at the End of the Universe awaiting the intrepid few who get there. Or maybe there is a Thing with a long white beard observing us to see if any human-fish manage to leap through the “edge of the universe”.

### Science (and the gods) rely equally on magic

July 3, 2017

The fundamental assumptions of science can be written in various ways but, for me, seem to boil down to four:

1. The Universe exists
2. Laws of nature (science) exist
3. All phenomena are constrained to obey the laws of nature (science)
4. The laws of nature (science) apply everywhere in the universe

The laws of nature are such that compliance with these laws is inbuilt. If there is any non-compliance it is not a law of nature. If compliance is all that we observe then it is a law of nature. But why the laws are what they are is usually beyond explanation.

Assumptions are not amenable to further question. You could apply an “if” to them or question “why” the assumption is true, but that is futile for there are no answers. They are just taken as self-evident and the starting point of rational thought. They are never, in themselves, self-explanatory except in the trivial form. (Assume that 1+1=2. Therefore 2+2=4 and that proves that 1+1=2).

I apply the word “magic” to all that is inexplicable. And all the fundamental laws of nature (science) are built on a foundation of inexplicable magic. How many fundamental particles exist and why? It’s magic. If the laws of science only apply after the Big Bang but don’t apply at the Big Bang singularity itself, what laws did? It’s magic. If the laws apply to a supernova but not inside a black hole, it’s magic. (Never mind that a black hole seems to be a part of the universe where the laws of science do not apply, which violates Assumption 4 above that the laws apply everywhere in the universe.) Why are there 4 – and only 4 – fundamental forces in nature? It’s magic. How did time begin? It’s magic. Can empty space exist without even the property of dimensions? It’s magic. Can time be a dimension and not have negative values? It’s magic. Dark energy and dark matter are merely labels invoking magic. All science which relies on fundamental assumptions is ultimately built upon and dependent upon a set of inexplicable, fundamental statements. They are just magic.

A corollary of the claim of physics, that all of history back to just after the Big Bang is explainable by the laws of science, must be that all of the future is also fixed and determined by the laws of science applied to conditions now. What will happen was therefore fixed for all time by the Big Bang itself. And that, too, is indistinguishable from magic.

Religions do not just rely on magic, they claim the magic for their gods. Modern, “with-it” religions, which try to be “compatible” with the latest knowledge discovered by science, merely claim that their God(s) pushed the button which caused the Big Bang. That my God is greater than your God is magic. That there is a life after death, or reincarnation, or rebirth or an ultimate state of grace is also just magic.

Shiva, Kali, Jesus, Allah, nirvana, dark energy, dark matter and the Big Bang singularity are all labels for different facets of magic.

Magic, by any other name, is just as inexplicable.

### Why does the earth rotate in 24 hours? It’s just magic

June 26, 2017

The rotational speed of a planetary body around its own axis is primarily set by the angular momentum the mass of matter making up the body had when it first coalesced into a planet. What determined that initial angular momentum is unknown. All known effects thereafter (mainly tidal and all fundamentally gravitational effects) slow this rotation. For the last 3,000 years the earth’s rotation has been slowing down to cause the day to lengthen by about 2 milliseconds per century.

Currently the mean solar day is about 2 milliseconds longer than 86,400 seconds, while the stellar (sidereal) day – measured relative to the fixed stars – has a mean value of about 86,164 seconds.
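The roughly 236-second gap between those two figures is just geometry: over one year the earth makes one more rotation relative to the stars than relative to the sun. A minimal sketch:

```python
# Why the sidereal (stellar) day is ~86,164 s while the mean solar day is
# 86,400 s: in one year the earth completes one extra rotation relative to
# the fixed stars, so each sidereal day is shorter by about 1 part in 366.
solar_day_s = 86_400.0
solar_days_per_year = 365.2422    # mean tropical year, in solar days

sidereal_rotations_per_year = solar_days_per_year + 1
sidereal_day_s = solar_day_s * solar_days_per_year / sidereal_rotations_per_year

print(f"sidereal day ≈ {sidereal_day_s:.1f} s")   # ~86,164 s
```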

But we have no real understanding of why it is what it is. We can observe that the approximate day lengths (rotation periods) of the planets are:

- Mercury: 58.6 Earth days
- Venus: 243 Earth days (retrograde)
- Earth: 23.9 hours
- Mars: 24.6 hours
- Jupiter: 9.9 hours
- Saturn: 10.7 hours
- Uranus: 17.2 hours (retrograde)
- Neptune: 16.1 hours

We have no real explanation for why Mercury and Venus rotate as slowly as they do. It is believed that at coalescence their angular momenta must have been similar to those of the other planets, but that subsequent gravitational effects (solar gravitational effects on Mercury and “tidal” effects on Venus and its thick atmosphere) have drastically slowed the rotation. But this is mainly speculation. It is now thought that even distant Jupiter may be having an effect on Mercury’s orbit and spin.

Mercury spins three times on its axis for every two revolutions around the sun. It was natural to assume the sun was influencing Mercury’s spin. Now scientists have learned that distant Jupiter – largest planet and second-largest body in our solar system – may also be influencing Mercury’s orbit and spin, which is more complex than scientists realized.
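The 3:2 spin-orbit resonance quoted above fixes Mercury's rotation period to exactly two-thirds of its orbital period, which a quick check reproduces:

```python
# Mercury's 3:2 spin-orbit resonance: 3 rotations for every 2 orbits,
# so the sidereal rotation period is 2/3 of the orbital period.
orbital_period_days = 87.969                    # Mercury's year, in Earth days
rotation_period_days = orbital_period_days * 2 / 3

print(f"Mercury rotation period ≈ {rotation_period_days:.2f} Earth days")  # ~58.65
```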

Among the outer planets there is a very rough correlation between the size of the planet and rotational speed. But there are no apparent correlations with mass, density, distance from the sun or any other parameter. All we can say about any planet’s spin is that it depends on the angular momentum of the material which coalesced to form the planet and thereafter it changed due to collisions as the planet formed, subsequent gravitational interactions with other bodies, and tidal interactions.

SciAm: In our solar system, the giant gas planets (Jupiter, Saturn, Uranus, and Neptune) spin more rapidly on their axes than the inner planets do and possess most of the system’s angular momentum. The sun itself rotates slowly, only once a month. The planets all revolve around the sun in the same direction and in virtually the same plane. In addition, they all rotate in the same general direction, with the exceptions of Venus and Uranus. These differences are believed to stem from collisions that occurred late in the planets’ formation. (A similar collision is believed to have led to the formation of our moon.)

Planetary spin (Pinterest)

The laws of physics (as we know them) did not apply at the Big Bang singularity. All the energy (dark, imaginary and real) in the universe and all the momentum in all the matter (dark or otherwise) making up the universe were determined in the singularity when the laws of physics did not apply. How the Big Bang caused matter to gain spin in the first place is also unknown. So the simple answer to why earth’s day is 24 hours long (and why any planet’s rotational speed is what it is) is that we haven’t a clue.

It’s just magic.

### First nothingness was not, then came the Big Bang and the Gods came later

June 12, 2017

The Rig Veda was probably written between 1500 and 1200 BC and consists of 10 mandalas (books). The first and tenth books were probably written last. The 129th verse of the tenth mandala contains what is called The Hymn of Creation. Nasadiya sukta

It begins:

Then even nothingness was not, nor existence,
There was no air then, nor the heavens beyond it.
What covered it? Where was it? In whose keeping?

It is not difficult to equate this “then” to “before” the Big Bang and the “it” to all the compressed matter which participated in the Big Bang. (Accepting, of course, that “before” is meaningless when time does not flow).

Then there was neither death nor immortality
Nor was there then the torch of night and day.
The One breathed windlessly and self-sustaining.
There was that One then, and there was no other.

At first there was only darkness wrapped in darkness.
All this was only unillumined water.
That One which came to be, enclosed in nothing,
arose at last, born of the power of heat.

“arose at last, born of the power of heat” sounds very like a modern description of the Big Bang.

Even though the Rig Veda’s main 8 mandalas are in praise of various deities, the first and tenth books take a much more agnostic position – perhaps written to bring some balance. The plethora of gods are effectively made subservient to an unknowable, unfathomable creation event. “An atheist interpretation sees the Creation Hymn as one of the earliest accounts of skeptical inquiry and agnosticism”.

Who really knows?
Who will here proclaim it?
Whence was it produced? Whence is this creation?
The gods came later, with the creation of this universe.
Who then knows whence it has arisen?”

First even nothingness was not and existence was not. Then came the creation of the Universe whether by Big Bang or otherwise. And the Gods came later (made by man in the image of man).

### The Big Bang singularity is indistinguishable from an Act of Creation

June 11, 2017

Most modern physicists and cosmologists who believe (note – believe) in the Big Bang theory of the Universe believe implicitly in an Act of Creation (the Big Bang Singularity) but then usually ignore the question of how and why the singularity occurred. They focus on the Act of Creation and after but do not address the cause of the singularity or a Creator. Religions of all kinds have their own Creation myths but focus on the presumed Creator much more than on the Act(s) of Creation.

(My own belief is that all religions live in the space of ignorance and physics – like all religions – is ultimately dependent upon Magic).

Stephen Hawking describes the Big Bang Singularity thus:

The situation was different, however, when it was realised that the universe is not static, but expanding. Galaxies are moving steadily apart from each other. This means that they were closer together in the past. One can plot the separation of two galaxies, as a function of time. If there were no acceleration due to gravity, the graph would be a straight line. It would go down to zero separation, about twenty billion years ago. One would expect gravity, to cause the galaxies to accelerate towards each other. This will mean that the graph of the separation of two galaxies will bend downwards, below the straight line. So the time of zero separation, would have been less than twenty billion years ago.

At this time, the Big Bang, all the matter in the universe, would have been on top of itself. The density would have been infinite. It would have been what is called, a singularity. At a singularity, all the laws of physics would have broken down. This means that the state of the universe, after the Big Bang, will not depend on anything that may have happened before, because the deterministic laws that govern the universe will break down in the Big Bang. The universe will evolve from the Big Bang, completely independently of what it was like before. Even the amount of matter in the universe, can be different to what it was before the Big Bang, as the Law of Conservation of Matter, will break down at the Big Bang.

Since events before the Big Bang have no observational consequences, one may as well cut them out of the theory, and say that time began at the Big Bang. Events before the Big Bang, are simply not defined, because there’s no way one could measure what happened at them. This kind of beginning to the universe, and of time itself, is very different to the beginnings that had been considered earlier. These had to be imposed on the universe by some external agency.

He goes on, however, to make an unsupportable conclusion.

There is no dynamical reason why the motion of bodies in the solar system can not be extrapolated back in time, far beyond four thousand and four BC, the date for the creation of the universe, according to the book of Genesis. Thus it would require the direct intervention of God, if the universe began at that date. By contrast, the Big Bang is a beginning that is required by the dynamical laws that govern the universe. It is therefore intrinsic to the universe, and is not imposed on it from outside.

Genesis requires time to begin at 4004 BC and the Big Bang is no different in concept. It too defines the start of time and takes us back to 13.8 (give or take a few) billion years ago. Time is not defined before the Act of Creation – whether by the Big Bang or by the hand of God.

(Note that if the flow of time has a beginning then the concept of a before or an after has no meaning before the beginning of time.  The magical speed of an inconstant time).

Hawking concludes:

The conclusion of this lecture is that the universe has not existed forever. Rather, the universe, and time itself, had a beginning in the Big Bang, about 15 billion years ago. The beginning of real time, would have been a singularity, at which the laws of physics would have broken down. Nevertheless, the way the universe began would have been determined by the laws of physics, if the universe satisfied the no boundary condition. This says that in the imaginary time direction, space-time is finite in extent, but doesn’t have any boundary or edge. The predictions of the no boundary proposal seem to agree with observation. The no boundary hypothesis also predicts that the universe will eventually collapse again. However, the contracting phase, will not have the opposite arrow of time, to the expanding phase. So we will keep on getting older, and we won’t return to our youth. Because time is not going to go backwards, I think I better stop now.

It seems to me that he contradicts himself when he says “The beginning of real time, would have been a singularity, at which the laws of physics would have broken down. Nevertheless, the way the universe began would have been determined by the laws of physics, …..”

The Big Bang singularity where the laws of physics do not apply is just another Act of Creation. If the laws of physics do not apply at the singularity then, which laws or whose laws do? Or do the laws of physics change? Do they vary in different universes such as that which may have existed before the Big Bang?

Even a singularity must follow some laws. It is disingenuous of physicists and cosmologists to claim that the laws of physics break down at the Big Bang singularity and not address which or whose laws apply at the singularity. If, however, no laws apply at the singularity then the Singularity is Omnipotent (or Magic or God or whatever other label suits you).

I prefer to think it’s Magic.

The fundamentals of physics are just magic.

### The speed of light may have been faster

May 4, 2017

I have speculated before that the rate at which time dissipates may not be constant (The Magical Speed of an Inconstant time).

Now come suggestions that the speed of light may have been faster at the time of the Big Bang. That is perfectly consistent with the speed of time being slower at the time of the Big Bang.

In 2015, scientists at the Laser Interferometer Gravitational-Wave Observatory (LIGO) confirmed evidence of gravitational waves. These ripples in space-time, formed by the merger of two black holes, were exactly what Einstein had described with his theory of general relativity. But physicists studying the LIGO data found evidence of “echoes” that seem to contradict the predictions made by general relativity.

“Theoretical physicists Jahed Abedi, Hannah Dykaar, and Niayesh Afshordi, published a new paper explaining that the group believes they have detected the first evidence of gravitational effects not explained by general relativity in the data,” reported Inverse in the wake of the LIGO announcement. In this way, Einstein’s vindication could also prove his theory’s undoing. And this isn’t the only evidence that could disrupt the theory of relativity.

Physicists studying the early origins of the universe hypothesize that light has not always traveled at the same speed. This directly challenges special relativity.

“João Magueijo from Imperial College London and Niayesh Afshordi of the Perimeter Institute in Canada proposed a new experiment proving Einstein wrong and demonstrating that the speed of light actually isn’t a constant,” Inverse reported in November 2016. “The pair thinks light may have moved faster in the past, around the time of Big Bang, and that it’s actually slowed down since.”

They suspect that the lumpy density of the early universe caused light to behave differently. As the universe expanded and smoothed out, these lumpy areas disappeared. But there still may be some areas at the edge of the universe where the lumpiness persists, and in these areas, faster-than-light travel could be possible.

That’s why Einstein’s theory of relativity, which provides the foundation of most of modern physics, may soon be proven wrong as advanced technologies enable us to peer farther into the expanding universe than we ever have before: Once we finally peer into a black hole, we might find that Einstein was wrong about general relativity.

And so I distinguish between perceived time and ℜeal time. ℜeal time, of course, is magical. It is only by definition that we take the passage of time to be constant. Of course this is just perceived time. And we perceive time only as a consequence of change. But ℜeal time does not have to elapse at a constant rate.

The Big Bang does not, apparently, mathematically permit of a time older than 13.8 billion years. Magical ℜeal time, of course, goes back to infinitely long ago. All can be resolved merely by accepting that ℜeal time elapsed at zero rate at the Big Bang and then gradually built up to the rate of elapse we are subject to now. …

At the Big Bang, even change had to get started. All change, all motion, all vibrations, all oscillations and all radiation had to start from zero. The atoms and the elements had to come into being. Cesium had to have come much later. The cycles of these oscillations of even the very first atoms may be regular now. But they would all have had to start somewhere (somewhen) and start from zero. The speed of oscillation had to build up from nothing (implying an infinite period) to that applying today. Which means that close to the Big Bang as atoms were ratcheting up their oscillations, the period between cycles would have been longer, starting infinitely long and reducing rapidly (in apparent time) to what is observed today. Closer to the Big Bang, ℜeal time, as opposed to apparent time, would have elapsed more slowly and the period between cycles of all radiation would have had to start from infinity. The very speed of time would have been slower.

At the Big Bang, the speed of ℜeal time would have been zero. A perceived picosecond of elapsed time would actually have been after the elapse of many, many trillions of ℜeal time years. The perceived age of the universe of 13.8 billion years of perceived time would have been infinitely long ago in ℜeal time.

Ultimately physics is just magic.

### Elementary particle turns out to have been a mirage as CERN serves up more inelegant physics

August 6, 2016

Current physics and the Standard Model of the Universe it describes are no longer elegant. I have no doubt that the underlying structure of the universe is simple and beautiful. But models which require more than 61 elementary particles and numerous fudge factors (dark energy and dark matter) and an increasing complexity, are ugly and do not convince. Especially when they cannot explain the four “magical” forces we observe (the gravitational, electromagnetic, strong nuclear and weak nuclear forces).

I have a mixture of admiration and contempt for the “Big Physics” as practised by CERN and their experiments at the Large Hadron Collider. So, I was actually quite relieved to hear that CERN has just announced that, after much publicity, they hadn’t actually detected yet another elementary particle which was not predicted by the Standard Model. Since they found some anomalous data last December they have hyped the possibility of a new extra-heavy, elementary particle. Over 500 papers have been written (and published) postulating explanations of the data anomaly and fantasising about the nature of this particle. But the data has just disappeared. The postulated particle does not exist.

I remain convinced that 90% of modern physics is all about raising questions – some genuine and some fantasised – to ensure that funding for Sledgehammer Science continues. So not to worry. CERN may not have found another elementary particle this time. But they will soon come up with another unexpected particle, preceded by much publicity and hype, which will spawn much further speculation, and, most importantly, keep the funds flowing.

A great “might have been” for the universe, or at least for the people who study it, disappeared Friday.

Last December, two teams of physicists working at CERN’s Large Hadron Collider reported that they might have seen traces of what could be a new fundamental constituent of nature, an elementary particle that is not part of the Standard Model that has ruled particle physics for the last half-century.

A bump on a graph signaling excess pairs of gamma rays was most likely a statistical fluke, they said. But physicists have been holding their breath ever since.

If real, the new particle would have opened a crack between the known and the unknown, affording a glimpse of quantum secrets undreamed of even by Einstein. Answers to questions like why there is matter but not antimatter in the universe, or the identity of the mysterious dark matter that provides the gravitational glue in the cosmos. In the few months after the announcement, 500 papers were written trying to interpret the meaning of the putative particle.

CERN made the announcement this morning at the International Conference of High Energy Physics (ICHEP) in Chicago, alongside a huge slew of new Large Hadron Collider (LHC) data.

“The intriguing hint of a possible resonance at 750 GeV decaying into photon pairs, which caused considerable interest from the 2015 data, has not reappeared in the much larger 2016 data set and thus appears to be a statistical fluctuation,” CERN announced in a press release sent via email.

Why did we ever think we’d found a new particle in the first place?

Back in December, researchers at CERN’s CMS and ATLAS experiments smashed particles together at incredibly high energies, sending subatomic particles flying out as debris.

Among that debris, the researchers saw an unexpected blip of energy in form of an excess in pairs of photons, which had a combined energy of 750 gigaelectron volts (GeV).

The result led to hundreds of journal article submissions on the mysterious energy signature – and prompted many physicists to hypothesise that the excess was a sign of a brand new fundamental particle, six times more massive than the Higgs boson – one that wasn’t predicted by the Standard Model of particle physics.

But, alas, the latest data collected by the LHC shows no evidence that this particle exists – despite further experiments, no sign of this 750 GeV bump has emerged since the original reading.

So, we’re no closer to finding a new particle – or evidence of a new model that could explain some of the more mysterious aspects of the Universe, such as how gravity works (something the Standard Model doesn’t account for).

The Large Hadron Collider is the world’s largest and most powerful particle accelerator (Image: CERN)

The Higgs boson that CERN claimed to have found last year has turned out to be not quite the boson predicted by the Standard Model. So while the Higgs boson was supposed to be the God particle, the boson found only indicated that there were more bosons to be found. I dislike the publicity and hype that CERN generates – which is entirely about securing further funding. (The LHC cost $4.75 billion to build and sucks up about $5 billion annually to conduct their experiments).

Constantly adding complexity to a mathematical model and the increasing use of fudge factors is usually a sign that the model is fundamentally wrong. But some great insight is usually needed to simplify and correct a mathematical model. Until that insight comes, the models are the best available and just have to be fudged and added to in an ad hoc manner, to correct flaws as they are found.

The Standard Model and its 61+ particles will have to be replaced at some point by something more basic and more simple. But that will require some new Einstein-like insight, and who knows when that might occur. But the Standard Model is inelegant. The LHC is expected to operate for another 20 years. But the very weight of the investment in the LHC means that physicists cannot build a career by being heretical or by questioning the Standard Model itself.

I miss the elegance that Physics once chased:

Physics has become a Big Science where billion dollar sledgehammers are used to crack little nuts. Pieces of nut and shell go flying everywhere and each little fragment is considered a new elementary particle. The Rutherford-Bohr model still applies, but its elementary particles are no longer considered elementary. Particles with mass and charge are given constituent particles, one having mass and no charge, and one having charge and no mass. Unexplainable forces between particles are assigned special particles to carry the force. Particles which don’t exist, but may have existed, are defined and “discovered”. Errors in theoretical models are explained away by assigning appropriate properties to old particles or defining new particles. Every new particle leaves a new slime trail across the painting. It is as if a bunch of savages are doodling upon a masterpiece. The scribbling is ugly and hides the masterpiece underneath, but it does not mean that the masterpiece is not there.

The “standard model” does not quite fit observations so new theories of dark energy and dark matter are postulated (actually just invented as fudge factors) and further unknown particles are defined. The number of elementary particles has proliferated and is still increasing. The “standard model” of physics now includes at least 61 elementary particles (48 fermions and 13 bosons). Even the ancient civilisations knew better than to try and build with too many “standard” bricks. Where did simplicity go? Just the quarks can be red, blue or green. They can be up, down, charm, strange, top or bottom quarks. For every type of quark there is an antiquark. Electrons, muons and taus have each their corresponding neutrinos. And they all have their anti-particles. Gluons come in eight colour combinations. There are four electroweak bosons and there ought to be only one higgs boson. But who knows? CERN could find some more. I note that fat and thin or warm and cool types of particles have yet to be defined. Matter and antimatter particles on meeting each other, produce a burst of energy as they are annihilated. If forces are communicated by particles, gravity by gravitons and light by photons then perhaps all energy transmission can give rise to a whole new family of elementary particles.

The 61 particles still do not include the graviton or sparticles or any other unknown, invisible, magic particles that may go to making up dark matter and dark energy. Some of the dark matter may be stealthy dark matter and some may be phantom dark matter. One might think that when dark matter goes phantom it ought to become visible, but that would be far too simple. The level of complexity and apparent chaos is increasing. Every new particle discovered requires more money and Bigger Science to find the next postulated elementary particle.

When CERN claimed to have found the God Particle – the higgs boson – they still added the caveat that it was just one kind of the higgs boson and there could be more as yet unknown ones to come. So the ultimate elementary particle was certainly not the end of the road. Good grief! The end of the road can never be found. That might end the funding. And after all, even if the God Particle has been found, who created God? Guess how much all that is going to cost?

### Physics uses new magic to define the kilogram

June 22, 2016

Some fifty years ago my Maths and Physics Professors instilled in me the concept of elegance being the hallmark of “rightness” in science. For my Maths Professor, there was nothing more admirable or elegant than being just “necessary and sufficient”. I cannot shake off the gut feeling that unnecessary complexity of explanation is an indicator of “wrongness”. Modern Physics is no longer characterised by elegance – only by complexities which are not necessarily necessary. Fifty-seven fundamental particles (why only 57?), magical dark energy and dark matter, even stealth dark energy, are all “fudge factors” to cover the flaws of unsatisfactory theories, and they make modern physics grossly inelegant.

A new paper from the National Institute of Standards and Technology:

D. Haddad, F. Seifert, L. S. Chao, S. Li, D. B. Newell, J. R. Pratt, C. Williams, and S. Schlamminger. A precise instrument to determine the Planck constant, and the future kilogram. Review of Scientific Instruments, 2016. DOI: 10.1063/1.4953825

There used to be a time when units made common sense. A day was the time from sunrise to sunrise. That one day was a little shorter or longer than the next or that it was a different length in different parts of the world, was of little practical significance. Why the earth rotates around its own axis in its orbit around the sun, even in the most advanced physics theories, remains a mystery and a consequence of fundamental magic. Nowadays, of course, modern physics cannot conceive of using something as elegant and simple as the interval from one sunrise to the next to define time. That interval was too variable, too localised to the earth-sun system to be suitable for the flights of fancy of modern physics and cosmology. The magic involved was just too unsophisticated – too crude, too simple.

So now the unit of time is no longer a day but is a second. The second used to be the 86,400th part of a “standard” day, but now the reference interval is the second, defined as the

duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom, at rest, and approaching the theoretical temperature of absolute zero, including corrections for ambient radiation.

Day-magic is now replaced by a more sophisticated atomic magic. All radiation or vibration requires energy. It follows that the radiation of any atom must eventually cease but physicists are happy enough to invoke the magical acquisition of energy by the reference atom such that its radiation remains magically “constant”.
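The arithmetic connecting the old and new definitions is at least simple. A minimal sketch (my own illustration; the caesium frequency is the exact value from the definition quoted above) re-expresses the old 86,400-second day in caesium periods:

```python
# Illustrative arithmetic: relating the old day-based second to the
# caesium-based second of the SI definition quoted above.
CAESIUM_PERIODS_PER_SECOND = 9_192_631_770  # exact, by definition
SECONDS_PER_DAY = 86_400                    # the old "standard" day

# One "standard" day, re-expressed in caesium hyperfine periods:
periods_per_day = CAESIUM_PERIODS_PER_SECOND * SECONDS_PER_DAY
print(f"{periods_per_day:,}")  # 794,243,384,928,000
```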

It is a similar story with the kilogram. Once upon a more common-sensical time, it was the weight (under the force of earth’s gravity) of a mass of one litre of water at 4ºC. Since the litre needed defining, and the measurement was of weight rather than mass, physics needed something more sophisticated. So was born the International Prototype Kilogram (IPK). But that mass of platinum/iridium (90/10) alloy was found to be losing mass (about 50 μg over 120 years), and so a more “independent” and “absolute” measure was needed. Two methods were proposed:

One would define the kilogram in terms of the mass of a silicon atom by counting the number of atoms in a 1 kg sphere of ultra-pure silicon-28. (See Silicon Kilogram.)

The other …..  proposed assigning a fixed value to the Planck constant as the basis for a new definition. Mohr and Taylor reasoned that if a watt balance could use an exactly defined mass to measure the unknown value of h, then the process could be reversed: By setting an exact fixed value of h, the same system could be used to measure an unknown mass.

The idea, which came to be known as the “electric” or “electronic” kilogram, was widely discussed and finally endorsed in principle in 2011 by the international General Conference on Weights and Measures (CGPM), with a few provisions. One of them was that, prior to re-definition, at least one instrument, and preferably more, would have to measure h to a benchmark uncertainty of 2 parts in a hundred million (10⁸). NIST’s most recent measurement has a stated relative standard uncertainty of 3.4 × 10⁻⁸. In addition, the values obtained by the watt balances should be in reasonable agreement with those from scientists using the atom-counting approach to defining the kilogram.

……. The measured values from different groups will have to be in very good agreement in order to set an official fixed value for h.

To get from Planck’s constant to mass is not that simple:

….. the connection between mass …  and a constant deriving from the very earliest days of quantum mechanics may not be immediately obvious. The scientific context for that connection is suggested by a deep underlying relationship between two of the most celebrated formulations in physics.

One is Einstein’s famous E = mc², where E is energy, m is mass, and c is the speed of light. The other expression, less well known to the general public but fundamental to modern science, is E = hν, the first “quantum” expression in history, stated by Max Planck in 1900. Here E is energy, ν is frequency, and h is what is now known as the Planck constant.

Einstein’s equation reveals that mass can be understood and even quantified in terms of energy. Planck’s equation shows that energy, in turn, can be calculated in terms of the frequency (ν) of some entity such as a photon — or alternatively, with some mathematical substitutions, a significant mass — times an integer multiple of h. The integer aspect is what makes the relationship “quantized.”

Taking the two equations together yields a counterintuitive but hugely valuable insight: Mass – even on the scale of everyday objects – is inherently related to h, which Planck first used to describe the vanishingly small energy content of individual photons emitted by the atoms in hot objects. The value of h is about 0.6 trillionths of a trillionth of a billionth of 1 joule-second. The joule is the SI unit of energy.
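To make the scale concrete, here is a minimal sketch (my own illustration, not from the NIST paper) combining the two equations: setting mc² = hν gives the frequency that corresponds to a given mass.

```python
# Combining E = m*c^2 with E = h*nu gives nu = m*c^2 / h:
# any mass corresponds to an (absurdly large) frequency.
c = 299_792_458        # speed of light, m/s (exact)
h = 6.626_070_15e-34   # Planck constant, J*s

def mass_to_frequency(m_kg: float) -> float:
    """Frequency (Hz) whose single quantum h*nu carries energy m*c^2."""
    return m_kg * c**2 / h

nu = mass_to_frequency(1.0)  # for a 1 kg mass
print(f"{nu:.3e} Hz")        # ~1.356e+50 Hz
```

The number itself matters less than the point the passage makes: an everyday mass is tied to h through nothing more than these two equations.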

As a practical matter, experiments linking mass to h with extraordinary precision became possible in the late 20th century as the result of two separate discoveries which led to two different physical constants related to voltage and resistance respectively.*

*These are the Josephson constant (K_J = 2e/h) and the von Klitzing constant (R_K = h/e²). …. Both constants also involve e, the fundamental charge of the electron. Because of the way the watt balance measures electrical power (albeit indirectly), e cancels out of the equations. That leaves h as the sole quantity of interest.
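The cancellation of e can be checked in a few lines. A minimal sketch (the numerical values of e and h are the present-day exact SI values, which postdate this post; only the algebra matters here):

```python
# K_J = 2e/h (Josephson) and R_K = h/e^2 (von Klitzing) combine as
# K_J^2 * R_K = (4e^2/h^2) * (h/e^2) = 4/h -- the charge e cancels.
e = 1.602_176_634e-19  # elementary charge, C
h = 6.626_070_15e-34   # Planck constant, J*s

K_J = 2 * e / h        # ~4.836e14 Hz/V
R_K = h / e**2         # ~25812.807 ohms

h_recovered = 4 / (K_J**2 * R_K)  # e has cancelled algebraically
print(h_recovered)                # equals h up to rounding
```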

The new NIST paper describes new measurements of h with a watt balance:

A high-tech version of an old-fashioned balance scale at the National Institute of Standards and Technology (NIST) has just brought scientists a critical step closer toward a new and improved definition of the kilogram. The scale, called the NIST-4 watt balance, has conducted its first measurement of a fundamental physical quantity called Planck’s constant to within 34 parts per billion – demonstrating the scale is accurate enough to assist the international community with the redefinition of the kilogram, an event slated for 2018.

But the Planck constant itself is unexplained and relies on magic.

Classical statistical mechanics requires the existence of h (but does not define its value). Eventually, following upon Planck’s discovery, it was recognized that physical action cannot take on an arbitrary value. Instead, it must be some multiple of a very small quantity, the “quantum of action”, now called the Planck constant. Classical physics cannot explain this fact.

Why Planck’s constant is a constant or has to be a constant is unknown. It’s magic. Why the radiation of a caesium atom would remain constant is also counter-intuitive and just magic. Advances in physics only delve down to deeper layers of magic. Ultimately they all rely on invoking the four fundamental magical forces of the universe. Giving some magic a name and a label does not explain it.

Fifty-seven fundamental particles is just inelegant and unsatisfactory. It is complication for the sake of complication. (Has CERN ever actually discovered anything? Every question it addresses is answered by two more questions – without ever answering the first. The God of the God particle turned out to be just a deity rather than a God.)

The universe is not that messy. It is just magical.

Far simpler to take a kilogram as being the mass of a litre of water where a litre is twice the amount of beer I can drink in one gulp (when I am parched).

### Gravitational “constant” is not constant but varies periodically

February 1, 2016

Newton’s gravitational constant, G, is surprisingly variable and varies periodically. The period is 5.899 ± 0.062 years, which is the same period by which the length of day varies, and is also about half the 11-year solar cycle.

The reasons for this are unknown and speculations about currents in the earth’s core and magnetic effects abound.

The simplest explanation is that it is the same magic which causes gravity (and calling it space-time does not reduce its magical qualities) which also causes the solar cycle and is also the same magic which governs the movement of the earth around the sun and the corresponding length of day.

John D. Anderson, Gerald Schubert, Virginia Trimble, Michael R. Feldman, Measurements of Newton’s gravitational constant and the length of day, EPL 110 (2015) 10002, doi:10.1209/0295-5075/110/10002

Abstract: About a dozen measurements of Newton’s gravitational constant, G, since 1962 have yielded values that differ by far more than their reported random plus systematic errors. We find that these values for G are oscillatory in nature, with a period of P = 5.899 ± 0.062 yr, an amplitude of (1.619 ± 0.103) × 10⁻¹⁴ m³ kg⁻¹ s⁻², and mean-value crossings in 1994 and 1997. However, we do not suggest that G is actually varying by this much, this quickly, but instead that something in the measurement process varies. Of other recently reported results, to the best of our knowledge, the only measurement with the same period and phase is the Length of Day (LOD – defined as a frequency measurement such that a positive increase in LOD values means slower Earth rotation rates and therefore longer days). The aforementioned period is also about half of a solar activity cycle, but the correlation is far less convincing. The 5.9 year periodic signal in LOD has previously been interpreted as due to fluid core motions and inner-core coupling. We report the G/LOD correlation, whose statistical significance is 0.99764 assuming no difference in phase, without claiming to have any satisfactory explanation for it. Least unlikely, perhaps, are currents in the Earth’s fluid core that change both its moment of inertia (affecting LOD) and the circumstances in which the Earth-based experiments measure G. In this case, there might be correlations with terrestrial magnetic field measurements.
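As an illustration of what the abstract's numbers imply, here is a sketch of the reported oscillation (my own illustration; the phase is assumed from the quoted 1994 mean-value crossing, and the authors stress that the oscillation is probably in the measurement process, not in G itself):

```python
import math

# Sinusoidal model built from the parameters quoted in the abstract.
G0 = 6.674e-11  # nominal G, m^3 kg^-1 s^-2 (approximate)
A  = 1.619e-14  # reported oscillation amplitude
P  = 5.899      # reported period, years
T0 = 1994.0     # assumed rising mean-value crossing

def g_model(year: float) -> float:
    """Modelled G measurement at a given decimal year."""
    return G0 + A * math.sin(2 * math.pi * (year - T0) / P)

# The peak-to-peak swing is 2*A, i.e. roughly 5 parts in 10,000 of G:
print(f"{2 * A / G0:.1e}")  # ~4.9e-04
```

That swing dwarfs the stated errors of the individual G experiments, which is exactly why the pattern stands out in the data.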

A set of 13 measurements of G exhibit a 5.9-year periodic oscillation (solid curve) that closely matches the 5.9-year oscillation in LOD measurements (dashed curve). The two outliers are a 2014 quantum measurement and a 1996 measurement known to suffer from drift. The green dot is an estimate of the mean value of G after the 5.9-year periodicity is removed. Credit: J. D. Anderson, et al. ©2015 EPLA

Physics is impossible without final recourse to various magics; Big Bang Magic, gravitational magic, weak force magic, strong force magic and electromagical magnetics. There is something very inelegant – bordering on ugly – when modern physics needs over 50 different “fundamental” particles and unknown, unseen, undetectable forms of dark matter and dark energy to make their models feasible.

If there is a fundamental particle then there can be only one and it is called the Ultimion.