I like this one:
π = 3
For the error in this one and some other similar math errors go HERE.

[Image: Lorenz attractor]
From Jos Leys, Étienne Ghys and Aurélien Alvarez, the makers of Dimensions, comes CHAOS, a math movie with nine 13-minute chapters. It is a film about dynamical systems, the butterfly effect and chaos theory, intended for a wide audience.
The nine movies are available here and the first, Panta Rhei – Everything moves, is below:
Being able to distinguish between “more” and “less” is – most likely – a capability that is a prerequisite for the evolutionary development of the ability to count, which in turn must have led to the invention of numbers. Recent experiments with baboons demonstrate that they have a clear ability to make quite complex more/less distinctions.
Allison M. Barnard, Kelly D. Hughes, Regina R. Gerhardt, Louis DiVincenti, Jenna M. Bovee and Jessica F. Cantlon. Inherently Analog Quantity Representations in Olive Baboons (Papio anubis). Frontiers in Comparative Psychology, 2013. DOI: 10.3389/fpsyg.2013.00253
From the University of Rochester press release:
… Now a new study with a troop of zoo baboons and lots of peanuts shows that a less obvious trait—the ability to understand numbers—also is shared by man and his primate cousins.
“The human capacity for complex symbolic math is clearly unique to our species,” says co-author Jessica Cantlon, assistant professor of brain and cognitive sciences at the University of Rochester. “But where did this numeric prowess come from? In this study we’ve shown that non-human primates also possess basic quantitative abilities. In fact, non-human primates can be as accurate at discriminating between different quantities as a human child.”
“This tells us that non-human primates have in common with humans a fundamental ability to make approximate quantity judgments,” says Cantlon. “Humans build on this talent by learning number words and developing a linguistic system of numbers, but in the absence of language and counting, complex math abilities do still exist.” ……
……… The baboons’ choices, conclude the authors, clearly relied on this latter “more than” or “less than” cognitive approach, known as the analog system. The baboons were able to consistently discriminate pairs with numbers larger than three as long as the relative difference between the peanuts in each cup was large. Research has shown that children who have not yet learned to count also depend on such comparisons to discriminate between number groups, as do human adults when they are required to quickly estimate quantity.
Studies with other animals, including birds, lemurs, chimpanzees, and even fish, have also revealed a similar ability to estimate relative quantity, but scientists have been wary of the findings because much of this research is limited to animals trained extensively in experimental procedures. The concern is that the results could reflect more about the experimenters than about the innate ability of the animals. …… To rule out such influence, the study relied on zoo baboons with no prior exposure to experimental procedures. Additionally, a control condition tested for human bias by using two experimenters—each blind to the contents of the other cup—and found that the choice patterns remained unchanged.
A final experiment tested two baboons over 130 more trials. The monkeys showed little improvement in their choice rate, indicating that learning did not play a significant role in understanding quantity.
“What’s surprising is that without any prior training, these animals have the ability to solve numerical problems,” says Cantlon. The results indicate that baboons not only use comparisons to understand numbers, but that these abilities occur naturally and in the wild, the authors conclude. …….
I spent a large part of my early career in mathematical modelling (of combustion systems and of heat flow) and have a very clear idea of what models can do and what they can’t. Models, after all, are used primarily to simplify complex systems which are otherwise intractable. They are – always – severely limited by the assumptions and simplifying approximations that have to be introduced. Models are a powerful tool for investigation, but they are only as good as their most inaccurate assumption, and they become dangerous when their imperfect predictions are used as the basis for decision making. The spectacular failures of mathematical models of the global economy are a case in point. It is worth noting that, in spite of the great strides made in weather forecasting for example – much of which is empirical – the simple statement that “tomorrow’s weather will be like today’s” is as correct – statistically – as the most complex model running on some supercomputer somewhere.
It has therefore always amazed me that so-called “scientists” would be so certain about their approximate models of climate systems – which are perhaps as complex, chaotic and “unknown” as any systems one could study. This certainty has been a boon for politicians looking for new ways of raising revenue, and it has been exploited by the alarmists precisely because alarmist predictions cannot be tested. The wide spread of results from climate models is rarely mentioned.
When reality does not match model forecasts it is time to back off and rethink the models, and hopefully they will be better next time. And it is time to backtrack from all the political decisions made on the basis of patently incomplete and inaccurate models.
The simple reality about climate is that, rather than being a “settled science”, it remains very far from being understood.
This is from Dr. Roy Spencer, and the ridiculously wide spread of the model results and the obvious deviation of reality from the models are particularly striking:
Global Warming Slowdown: The View from Space
Since the slowdown in surface warming over the last 15 years has been a popular topic recently, I thought I would show results for the lower tropospheric temperature (LT) compared to climate models calculated over the same atmospheric layers the satellites sense.
Courtesy of John Christy, and based upon data from the KNMI Climate Explorer, below is a comparison of 44 climate models versus the UAH and RSS satellite observations for global lower tropospheric temperature variations, for the period 1979–2012 from the satellites, and 1975–2025 for the models:
Clearly, there is increasing divergence over the years between the satellite observations (UAH, RSS) and the models. The reasons for the disagreement are not obvious, since there are at least a few possibilities:
………
The dark line in the above plot is the 44-model average, and it approximately represents what the IPCC uses for its official best estimate of projected warming. Obviously, there is a substantial disconnect between the models and observations for this statistic.
I find it disingenuous for those who claim that, because not ALL of the individual models disagree with the observations, the models are somehow vindicated. What those pundits fail to mention is that the few models which support weaker warming through 2012 are usually those with lower climate sensitivity.
So, if you are going to claim that the observations support some of the models, at least be honest and admit they support the models that are NOT consistent with the IPCC best estimates of warming.
A new paper playing probabilistic games – this time about the Doomsday Argument.
Universal Doomsday: Analyzing Our Prospects for Survival, by Austin Gerig, Ken D. Olum and Alexander Vilenkin, arxiv.org/abs/1303.4676, Cosmology and Extragalactic Astrophysics (astro-ph.CO)
The full version of the paper (pdf) is here
The Doomsday Argument is the idea that we can estimate the total number of humans that will ever exist from the number that have lived so far. Since around 100 billion humans have ever lived, and assuming there is a 95% probability that we are among the last 95% of humans who will ever live, the argument concludes that there is a 95% probability that the total number of humans who will ever live lies between 1.4 and 2.0 trillion. It is a fairly trivial conclusion, since any probability greater than 0 and less than 100% would be equally valid for the exercise.
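To see just how mechanical the arithmetic is, here is a minimal sketch (my own illustration with the round numbers above, not the formalism of the paper): the upper bound falls straight out of whatever confidence level you feed in, which is precisely why any probability between 0 and 100% would “work”.

```python
# Classic Doomsday Argument arithmetic -- an illustration only, not the
# Gerig/Olum/Vilenkin formalism. Assumed input: ~100 billion humans born so far.

def doomsday_upper_bound(born_so_far, confidence):
    """If, with the given confidence, our birth rank r lies in the last
    `confidence` fraction of all N humans ever born, then r >= (1 - confidence) * N,
    so N <= r / (1 - confidence) with that same confidence."""
    return born_so_far / (1.0 - confidence)

born = 100e9  # rough number of humans who have ever lived
for conf in (0.50, 0.80, 0.95):
    bound = doomsday_upper_bound(born, conf)
    print(f"{conf:.0%} confidence -> at most {bound / 1e12:.1f} trillion humans ever")
# The 95% case gives the familiar 2.0 trillion upper bound; pick a different
# confidence level and the "prediction" changes with it.
```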
In this paper the authors try to formalise the probability calculations and introduce the effect of known existential threats. Just as in Drake’s equation for the number of extra-terrestrial civilisations that may exist in the Milky Way, all the probabilities are unknown and could be assumed to be anything you like. The Doomsday Argument, like Drake’s equation, is really no more than a probability game based on nothing at all. But it is fascinating to consider which terms are relevant and necessary in any such game, and that is what makes these games interesting.
The authors’ conclusions could be considered a trifle obvious and almost clichéd – but nonetheless they are perfectly true!! The Earth will surely experience catastrophic events in the future which threaten human existence – whether by earthquakes, volcanoes or meteors – and even if we survive all of these, eventually by the inevitable death of our sun. In fact you could play another – and equally valid – probability game and calculate how many humans will have lived if humanity continues to survive till the death of our sun. And this probability is surely not zero.
To avoid Doomsday, humanity needs to make sure that asteroids don’t crash into earth and that catastrophic earthquakes, volcano eruptions or the like don’t occur until such time as humanity has spread into space and developed colonies on other planets.
From the Conclusions:
With the priors that we considered, the fraction of civilizations that last long enough to become large is not likely to exceed a few percent. If there is a message here for our own civilization, it is that it would be wise to devote considerable resources (i) for developing methods of diverting known existential threats and (ii) for space exploration and colonization. Civilizations that adopt this policy are more likely to be among the lucky few that beat the odds. Somewhat encouragingly, our results indicate that the odds are not as overwhelmingly low as suggested by earlier work.
Abstract (Submitted on 19 Mar 2013)
Given a sufficiently large universe, numerous civilizations almost surely exist. Some of these civilizations will be short-lived and die out relatively early in their development, i.e., before having the chance to spread to other planets. Others will be long-lived, potentially colonizing their galaxy and becoming enormous in size. What fraction of civilizations in the universe are long-lived? The “universal doomsday” argument states that long-lived civilizations must be rare because if they were not, we should find ourselves living in one. Furthermore, because long-lived civilizations are rare, our civilization’s prospects for long-term survival are poor. Here, we develop the formalism required for universal doomsday calculations and show that while the argument has some force, our future is not as gloomy as the traditional doomsday argument would suggest, at least when the number of early existential threats is small.
Acharya Sennimalai Kalimuthu strikes again! And Elsevier as publishers do look like idiots.
Back in April I posted about a paper by Kalimuthu which was first published in Computers & Mathematics with Applications and then retracted because it “lacked scientific content”.
This time he managed to get a paper published in Applied Mathematics Letters:
For the origin of new geometry, by S. Kalimuthu, 2/394, Kanjampatti P.O., Pollachi Via, Tamil Nadu 642003, India. http://dx.doi.org/10.1016/j.aml.2010.08.006
He has 12 references – all self-citations. The paper has now been retracted because it “makes no sense mathematically”. The title itself should have been a giveaway, but the paper was published in December 2010 and it has taken two years to be retracted.
This paper does not meet the minimum research and mathematical standards of Applied Mathematics Letters; for example, some of this paper’s constructions and arguments make no sense mathematically. Though handled by the previous editorial office, the available records lead us to believe its publication was the result of an administrative oversight and apologies are offered to readers of the journal that this was not detected earlier.
In both cases Elsevier was the unfortunate publisher. This does not say much for the “peer-review” process at Elsevier which allowed such rubbish to be published. At first I wondered if Kalimuthu might be an unrecognised genius – until I read his two papers: the one retracted earlier from Computers & Mathematics with Applications and the Applied Mathematics Letters paper above. You do not need to be an advanced mathematician to appreciate the absurdities.
After the 5th reading of his second paper I managed to figure out the central claim:
Our constructions and proofs are consistent. We have not introduced any new hypothesis in this work. … But we have pointed out in the abstract that the fifth Euclidean postulate problem is one of the most famous mathematical impossibilities. So, although our finding is consistent, it poses a very serious question about the foundations of geometry.
… we have obtained a challenging result, namely the smaller side of triangle AHJ is equal to the larger side BC of triangle ABC. This is a problematic problem.
Further studies will certainly unlock this mathematical mystery.
No doubt the further studies will be first published and then retracted by Elsevier.
Retraction Watch covers the story and actually took the time to write to Kalimuthu for his comments on the retraction. His reply will surely go down as a classic:
“Please and please note that I do NOT agree with retraction of this relevant paper.Can you tell me WHAT IS THE FLAW? AND WHERE IS THE FLAW? A result is a result, A result is a result, A result is a result, and A result is a result,.Let us recall what Einstein told about simplicity: IF YOU CAN NOT PUT YOUR IDEA IN SIMPLE, IT SHOWS THAT YOU DO NOT KNOW THE SUBJECT. Who is expert? We are all so called experts. Only God is expert. I am going to re write this particular paper in 20 long pages and get published. Kindly note that papers rejected by referees and editors have won the NOBEL PRIZE.”
But what on earth was Elsevier playing at to publish such drivel?
Once was bad enough but twice???
Either Kalimuthu has some kind of genius for getting papers published that lack scientific content and make no mathematical sense, or the Elsevier peer-review process is a farce.
Before an intelligence can turn to counting it must first have some concept of numbers. When and how did our ancient ancestors first develop a concept of numbers and then start counting?
… the increasing complexities of co-operation and their requirements for communication were what drove the parallel – and inter-linked – development of speech and numeracy starting some 150,000 years ago.
Idle thoughts and unanswerable questions on a Saturday morning:
The Number System seems to be continuous and infinite but every number seems to be discrete. But if any number is also infinitely divisible it must also be continuous. So are numbers simultaneously both discrete and continuous?
Or is a number just a label? Perhaps a number – if just a label and representing a singularity – is discrete and the divisibility of a number is actually undefined. It is number difference – not a number – which is infinitely divisible. So – for example – the number 10, as a label, is not divisible — it is the number difference between 10 and some reference number (10-0) which is. So is our Number System then a discrete thing and made up of an infinite and continuous quantity of number differences with each number difference being discrete? It would then be rational for there to be an infinity of discrete Number Systems.
So perhaps numbers don’t exist. Only number differences do and the numbers are their labels.
The sequence of partition numbers grows fast. But a general formula for calculating the number of partitions for any number n has been elusive.
Emory mathematician Ken Ono and colleagues will be announcing results today that include a finite, algebraic formula for partition numbers thanks to discovering that the sequence of partitions is fractal.
Partition numbers: In number theory, a partition of a positive integer n is a way of writing n as a sum of positive integers, where two sums that differ only in the order of their summands are considered the same partition. The partition function p(n) is the number of such partitions of n, that is, the number of distinct (and order-independent) ways of representing n as a sum of natural numbers.
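As a concrete example (my own illustration, not from the press release): the partitions of 4 are 4, 3+1, 2+2, 2+1+1 and 1+1+1+1, so p(4) = 5. A few lines of Python make the definition explicit:

```python
def partitions(n, max_part=None):
    """Yield the partitions of n as non-increasing tuples of positive integers."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

for q in partitions(4):
    print(q)                              # (4,) (3, 1) (2, 2) (2, 1, 1) (1, 1, 1, 1)
print(sum(1 for _ in partitions(4)))      # p(4) = 5
```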
A EurekAlert press release appeared today, entitled New math theories reveal the nature of numbers, and people are already whispering “Fields Medal”. Obviously, like most press releases, this one is full of hyperbole and ridiculous sentences like “the team was determined go beyond mere theories”, but the actual work being discussed is fascinating, says one maths blogger.
The work of 18th-century mathematician Leonhard Euler led to the first recursive technique for computing the partition values of numbers. The method was slow, however, and impractical for large numbers. For the next 150 years, the method was only successfully implemented to compute the first 200 partition numbers. In the early 20th century, Srinivasa Ramanujan and G. H. Hardy invented the circle method, which yielded the first good approximation of the partitions for numbers beyond 200. They essentially gave up on trying to find an exact answer, and settled for an approximation.
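For the curious, here is a rough Python sketch (mine, purely for illustration) of the two classical approaches mentioned above: Euler’s pentagonal-number recurrence, which builds each p(n) exactly from earlier values, and the leading term of the Hardy-Ramanujan circle-method estimate, which gets close but never lands exactly on the integer.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def p(n):
    """Exact p(n) via Euler's pentagonal-number recurrence:
    p(n) = sum over k >= 1 of (-1)**(k+1) * (p(n - k(3k-1)/2) + p(n - k(3k+1)/2))."""
    if n < 0:
        return 0
    if n == 0:
        return 1
    total, k = 0, 1
    while k * (3 * k - 1) // 2 <= n:          # generalized pentagonal numbers
        sign = 1 if k % 2 else -1
        total += sign * (p(n - k * (3 * k - 1) // 2) + p(n - k * (3 * k + 1) // 2))
        k += 1
    return total

def circle_estimate(n):
    """Leading-order Hardy-Ramanujan estimate: exp(pi*sqrt(2n/3)) / (4*n*sqrt(3))."""
    return math.exp(math.pi * math.sqrt(2 * n / 3)) / (4 * n * math.sqrt(3))

print([p(n) for n in range(10)])   # [1, 1, 2, 3, 5, 7, 11, 15, 22, 30]
print(p(100))                      # 190569292 exactly
print(circle_estimate(100))        # about 1.99e8 -- close in relative terms, never exact
```

With memoisation the recurrence happily reaches values far beyond the first 200 that took a century and a half by hand, which puts the later breakthroughs described below into perspective.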
Ramanujan also noted some strange patterns in partition numbers. In 1919 he wrote: “There appear to be corresponding properties in which the moduli are powers of 5, 7 or 11 … and no simple properties for any moduli involving primes other than these three.”
The legendary Indian mathematician died at the age of 32 before he could explain what he meant by this mysterious quote, now known as Ramanujan’s congruences.
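For the record, the congruences in question are that p(5n+4) is always divisible by 5, p(7n+5) by 7 and p(11n+6) by 11. They are easy to spot-check numerically; here is a small self-contained sketch (mine) using the standard dynamic-programming count of partitions:

```python
def partition_counts(limit):
    """Return [p(0), p(1), ..., p(limit)] using the classic counting DP:
    introduce the allowed parts 1, 2, ..., limit one at a time."""
    counts = [1] + [0] * limit
    for part in range(1, limit + 1):
        for n in range(part, limit + 1):
            counts[n] += counts[n - part]
    return counts

LIMIT = 600
pvals = partition_counts(LIMIT)
for modulus, offset in ((5, 4), (7, 5), (11, 6)):
    assert all(pvals[n] % modulus == 0 for n in range(offset, LIMIT + 1, modulus))
    print(f"p({modulus}k + {offset}) is divisible by {modulus} for every case checked up to {LIMIT}")
```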
In 1937, Hans Rademacher found an exact formula for calculating partition values. While the method was a big improvement over Euler’s method, it required adding together infinitely many numbers that have infinitely many decimal places.
On Friday, Emory mathematician Ken Ono will unveil new theories that answer these famous old questions.
Ono and his research team have discovered that partition numbers behave like fractals. They have unlocked the divisibility properties of partitions, and developed a mathematical theory for “seeing” their infinitely repeating superstructure. And they have devised the first finite formula to calculate the partitions of any number.
“Our work brings completely new ideas to the problems,” says Ono, who will explain the findings in a public lecture at 8 p.m. Friday on the Emory campus. “We prove that partition numbers are ‘fractal’ for every prime. These numbers, in a way we make precise, are self-similar in a shocking way. Our ‘zooming’ procedure resolves several open conjectures, and it will change how mathematicians study partitions.” …….. “We found a function, that we call P, that is like a magical oracle,” Ono says. “I can take any number, plug it into P, and instantly calculate the partitions of that number. P does not return gruesome numbers with infinitely many decimal places. It’s the finite, algebraic formula that we have all been looking for.”