Archive for the ‘Mathematics’ Category

Pareto’s 80/20 rule is ubiquitous

June 11, 2018

I first came across and learned to use the Pareto principle in the 70s as a young engineer. It was the starting point for fault analysis of any kind.  Root cause analysis always started with a “Pareto diagram”. It was an extremely powerful tool not only for fault analysis but also then in all quality improvement actions.

The Pareto principle (also known as the 80/20 rule, the law of the vital few, or the principle of factor sparsity) states that, for many events, roughly 80% of the effects come from 20% of the causes. – Wikipedia

Pareto showed in 1896 that 80% of land in Italy was owned by 20% of the population and thus was born the 80/20 rule. It has now become almost a cliche in all business processes and in financial and economic analysis to describe the relationship where a minority of causes lead to a majority of the result.

The 80-20 rule is a business rule of thumb that states that 80% of outcomes can be attributed to 20% of all causes for a given event. In business, the 80-20 rule is often used to point out that 80% of a company’s revenue is generated by 20% of its total customers. Therefore, the rule is used to help managers identify and determine which operating factors are most important and should receive the most attention based on an efficient use of resources. – Investopedia

The 80/20 rule seems to apply in almost all fields. It applies in wealth distribution, in personal and public finance, in all kinds of manufacturing, in quality control, in experimentation and in disease control.

It is not perhaps so surprising.

Wherever a phenomenon is subject to a power-law probability distribution, something close to the 80/20 rule will apply, and power-law distributions are among the most common distributions occurring in nature and in man-made processes. Put differently, the 80/20 pattern is just what we should expect.

Of all the possible causes of an observed effect, a minority of the possible causes are usually responsible for a majority of the observed effect.

Perhaps we should be surprised only if the 80/20 “rule” does not apply. The “20%” and the “80%” should be taken as symbols for a “minority” and a “majority” respectively and then the 80/20 rule is ubiquitous.
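The link between power laws and 80/20 is easy to check numerically. A minimal sketch (my own illustration, not from the sources quoted above), sampling a Pareto distribution with the shape parameter for which theory predicts roughly an 80/20 split:

```python
import random

random.seed(42)
alpha = 1.16  # shape for which a Pareto distribution gives roughly 80/20
values = sorted((random.paretovariate(alpha) for _ in range(100_000)), reverse=True)

top_fifth = values[: len(values) // 5]   # the "vital few": top 20% of causes
share = sum(top_fifth) / sum(values)     # their share of the total effect
print(f"Top 20% of causes account for {share:.0%} of the effect")
```

For this shape parameter the printed share hovers around 80%, though any single run of so heavy-tailed a sample can wander some way from it.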


 


Decimals are too simple

April 22, 2018

Of course all attempts to create a 10-hour day with 100 minutes to each hour and 100 seconds to each minute have failed. Similarly, all attempts to divide the circle into 100 parts have not caught on.

The use of 60 is almost as basic as the use of 10.

The origins of base 60

All the non-decimal systems I learnt were embedded in memory before I was 20. I don’t expect many will remember these.

As a child I learned the use of 12 and 60 from my grandmother. The use of 12 was automatic with 4 and 3. Three pies to a pice, four pice to an anna, and 16 annas to the rupee. When driving around India with my father, miles and furlongs and yards and feet came naturally. Bushels and pecks and gallons and quarts and pints came later as an apprentice in England.

Decimals are simple. But they are also simplistic.

Perhaps too simple.

Rupee, anna, pice, pies

Pounds, shillings, pence, farthings

Ton, hundredweights, pounds, ounces

Mile, furlongs, yards, feet

Bushel, pecks, gallons, quarts, pints
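These systems are all mixed-radix numbers, and behave just like positional notation with varying bases. A sketch converting the old Indian currency to pies and back (my own illustration; 1 rupee = 16 annas, 1 anna = 4 pice, 1 pice = 3 pies):

```python
# Mixed-radix money: rupees/annas/pice/pies with bases 16, 4 and 3.
def to_pies(rupees, annas, pice, pies):
    return ((rupees * 16 + annas) * 4 + pice) * 3 + pies

def from_pies(total):
    total, pies = divmod(total, 3)
    total, pice = divmod(total, 4)
    rupees, annas = divmod(total, 16)
    return rupees, annas, pice, pies

print(to_pies(1, 0, 0, 0))  # 192 pies to the rupee
print(from_pies(200))       # (1, 0, 2, 2)
```

The same two functions, with the bases changed, handle pounds/shillings/pence or miles/furlongs/yards equally well.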


 

Counting on fingers leads naturally also to base-60

August 19, 2017

We have forgotten what it was like to count on our fingers. We have forgotten that counting itself was a mystery long before the mysteries of manipulation of numbers and the magic of mathematics. Yet the use of base-60 lies deep in our psyches. We still use it for time measurement and for geographical and spatial measurements. Attempts to use decimals for time and angle measurement have all failed miserably. Sixty still occurs in ancient Chinese and Indian calendars.

Today the use of 60 still predominates for time, for navigation and for geometry, but generally only for units already defined in antiquity. A base of 10 is used for units found to be necessary in more recent times. Subdivision of a second of time or a second of arc is always decimal, never duodecimal or sexagesimal.
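That mixed convention is easy to make concrete. A sketch (my own illustration; the function is invented for the example): sexagesimal down to the second, decimal below it.

```python
def to_dms(angle):
    """Split a decimal angle into sexagesimal degrees, minutes and seconds."""
    degrees = int(angle)
    minutes = int((angle - degrees) * 60)
    seconds = (angle - degrees) * 3600 - minutes * 60  # decimal below the second
    return degrees, minutes, seconds

# 51.4778 degrees -> 51 degrees, 28 minutes, and a decimal number of seconds
print(to_dms(51.4778))
```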

Usually the origin of sexagesimal counting systems is traced back to the Babylonians (c. 1,800 BCE) and even to the Sumerians (c. 3,000 – 2,500 BCE). But I suspect that it goes back much further, and that base-60 long precedes both.

I observe that twelve and then sixty come naturally from three factors:

  1. using fingers for counting,
  2. maximising the count with only one hand free, and
  3. the growth of trade and the need for counts greater than 20.

That five comes naturally from the fingers of one hand is self-evident. With only one free hand, a count to twelve using the thumb against the bones of the other four fingers is also self-evident. I saw my great-grandmother, and my grandmother after her, regularly count to twelve using only one hand. Sixty comes naturally from a hand of five combined with a hand of twelve. Counting to five and to twelve would have been well known to our hunter-gatherer ancestors, and it seems very plausible that a hunter would look to maximise the count on a single hand. So it is not necessary to look for the origins of base-60 in the skies, or in the length of the year, or in the number of its divisors, or in the beginnings of geometry. If the origins of counting lie some 50,000 years ago, the use of twelve and then of sixty probably goes back some 20,000 years.
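The one-hand-of-12, one-hand-of-5 scheme is easy to make concrete. A sketch (my own illustration; the function and its convention are invented for the example):

```python
def hands(n):
    """Represent 1 <= n <= 60 as (left-hand fingers raised, right-hand bones counted).

    The right hand counts 1..12 on the finger bones; each completed dozen
    raises one finger of the left hand, so five fingers of twelve reach 60.
    """
    dozens, bones = divmod(n - 1, 12)
    return dozens, bones + 1

print(hands(7))   # (0, 7): seven bones on the right hand
print(hands(13))  # (1, 1): one dozen tallied on the left, one bone on the right
print(hands(60))  # (4, 12): four dozens tallied plus a full right hand
```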

The origins of base 60

I like 60. Equilaterals. Hexagons. Easy to divide by almost anything. Simple integers for halves, quarters, thirds, fifths, sixths, tenths, 12ths, 15ths and 30ths. 3600. 60Hz. Proportions pleasing to the eye. Recurring patterns. Harmonic. Harmony.

The origins of the use of base 60 are lost in the ancient past. By the time the Sumerians used it, about 2,500 BCE, it was already well established, and it continued through the Babylonians. But the origin lies much earlier.

hand of 5

I speculate that counting – in any form more complex than “one, two, many….” – probably goes back around 50,000 years. I have little doubt that the fingers of one hand were the first counting aids that were ever used, and that the base 10 given by two hands came to dominate. Why then would the base 60 even come into being?

The answer, I think, still lies in one hand. Hunter-gatherers when required to count would prefer to use only one hand and they must – quite early on and quite often – have had the need for counting to numbers greater than five. And of course using the thumb as pointer one gets to 12 by reckoning up the 3 bones on each of the other 4 fingers.

a hand of 12 – image sweetscience

My great-grandmother used to count this way when checking the numbers of vegetables (onions, bananas, aubergines) bought by her maid at market. Counting up to 12 usually sufficed for this. When I was a little older, I remember my grandmother using both hands to check off bags of rice brought in from the fields – and of course with two hands she could get to 144. The counting of 12s most likely developed in parallel with counting in base 10 (5, 10, 50, 100). The advantageous properties of 12 as a number were fortuitous rather than by intention. But certainly the advantages helped in the persistence of using 12 as a base. And so we still have a dozen (12) and a gross (12×12) and even a great gross (12×12×12) being used today. Possibly different groups of ancient man used one or other of the systems predominantly. But as groups met and mixed and warred or traded with each other the systems coalesced.

hands for 60

And then 60 becomes inevitable. Your hand of 5, with my hand of 12, gives the 60 which also persists into the present. (There is one theory that 60 developed as 3 x 20, but I think finger counting, and the 5 x 12 it leads to, is far more compelling.) But it is also fairly obvious that the use of 12 must be prevalent before the 60 can appear. Though the use of 60 seconds and 60 minutes is all-pervasive, it is worth noting that they can only come after each day and each night is divided into 12 hours.

While the use of base 10 and 12 probably came first with the need for counting generally and then for trade purposes (animals, skins, weapons, tools…..), the 12 and the 60 came together to dominate the measuring and reckoning of time. Twelve months to a year with 30 days to a month. Twelve hours to a day or a night and 60 parts to the hour and 60 parts to those minutes. There must have been a connection – in time as well as in the concepts of cycles – between the “invention” of the calendar and the geometrical properties of the circle. The number 12 has great significance in Hinduism, in Judaism, in Christianity and in Islam. The 12 Adityas, the 12 tribes of Israel, the 12 days of Christmas, the 12 Imams are just examples. My theory is that simple sun and moon-based religions gave way to more complex religions only after symbols and writing appeared and gave rise to symbolism. ……… 

If we had six fingers on each hand the decimal system would never have seen the light of day. A millisecond would then be 1/1728th of a second. It is a good thing we don’t have 7 fingers on each hand, or – even worse – one hand with 6 fingers and one with 7. Arithmetic with a tridecimal system of base 13 does not entice me. But if I were saddled with 13 digits on my hands I would probably think differently.

 


 

The inherent logic of the universe – but not language – was established by the Big Bang

June 16, 2017

You could call this the First Law of Everything.

Logic is embedded in the universe.

At the Big Bang we have no idea what the prevailing laws were. Physicists merely call it a singularity where the known laws of physics did not apply. It was just another Creation Event. But thereafter – after the Big Bang – everything we observe in the universe is logical. We take logic to be inherent in the Universe around us. We discover facets of this embedded logic empirically and intuitively (and intuition is merely the synthesis of empiricism). We do not invent logic – we discover it. If logic was ever created it was created at the time of the Big Bang.

Language, on the other hand, is invented by man to describe and communicate the world around us. We build into the framework of our languages, rules of logic such that the use of language is consistent with the embedded logic of the universe. But language is not always equal to the task of describing the universe around us. “I have not the words to describe ….”. And then we imbue old words with new meanings or invent new words, or new grammar. But we never make changes which are not consistent with the logic of the universe.

Reasoning with language is then constrained to lie within the logical framework so constructed, and is therefore also always consistent with our empirical observations of the universe around us. Certain given assumptions – as expressed by language – always lead to the same logical inferences – also as described by that language. Such inferring, or reasoning, works and – within our observable universe – is a powerful way of extrapolating from the known to the not-yet-known. The logical framework itself ensures that the inferences drawn remain consistent with the logic of the universe.

In the sentence “If A is bigger than B, and if B is bigger than C, then A is bigger than C”, it is the logic framework of the language which constrains if, then and bigger to have meanings which are consistent with what we can observe. The logic framework is not the grammar of the language. Grammar would allow me to say: “If A is bigger than B, and if B is bigger than C, then A is smaller/louder/faster/heavier than C”, but the embedded logic framework of the language is what makes it ridiculous. The validity of the reasoning or of inferring requires that the logic framework of the language not be infringed. “If A is bigger than B, and if B is bigger than C, then A is smaller than C” is grammatically correct but logically invalid (incorrect). However, the statement “If A is bigger than B, and if B is bigger than C, then A is heavier than C” is grammatically correct, logically invalid but not necessarily incorrect.

Mathematics (including symbolic logic) also contains many languages which provide a means of describing facets of the universe that other languages cannot. But they all contain a logic framework consistent with the embedded logic of the universe. That 1 + 1 = 2 is a discovery – not an invention. That 2H₂ + O₂ → 2H₂O is also a discovery, not an invention. The rules for mathematical operations in the different branches of mathematics must always remain consistent with the embedded logic of the universe – even if the language invented has still to find actual application. Imaginary numbers and the square root of -1, for instance, long preceded the needs of the electrical engineers who put them to such widespread use. Set theory, likewise, was only used in physics and computing long after it was “invented”.

Languages (including mathematics) are invented but each must have a logical framework which itself is consistent with the inherent logic of the universe.


 

Number theory was probably more dependent upon live goats than on raindrops

June 14, 2017

It used to be called arithmetic but it sounds so much more modern and scientific when it is called number theory. It is the branch of mathematics which deals with the integers and the relationships between them. Its origins (whether one wants to call it a discovery or an invention) lie with the invention of counting itself, and it is the root from which all the various branches of mathematics derive. The origin of counting can be said to be the naming of the integers; it is intimately tied to the development of language and of writing, and perhaps goes back some 50,000 years (the oldest known tally sticks date from some 30,000 years ago).

How and why did the naming of the integers come about?  Why were they found necessary (necessity being the cause of the invention)? Integers are whole numbers, indivisible, complete in themselves. Integers don’t recognise a continuum between themselves. There are no partials allowed here. They are separate and discrete and number theory could as well be called quantum counting.

Quite possibly the need came from counting their livestock or their prey. If arithmetic took off in the Fertile Crescent, it may well have been the need for trading live goats among themselves (integral goats for integral numbers of wives or beads or whatever else they traded) which generated the need for counting integers. Counting would have come about to fit their empirical observations. Live goats rather than carcasses, I think, because a carcass can be cut into bits and is not quite so dependent upon integers. Quanta of live goat, however, would not permit fractions. It might also have been that they needed integers to count living people (number of children, number of wives …..) where fractions of a person were not politically correct.

The rules of arithmetic – the logic – could only be discovered after the integers had been named and counting could go forth. The commutative, associative and distributive properties of integers inevitably followed. And the rest is history.
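Those properties can be spelled out directly. A small sketch checking them over a range of integers (my own illustration):

```python
import itertools

# Check the commutative, associative and distributive laws for sample integers.
for a, b, c in itertools.product(range(-3, 4), repeat=3):
    assert a + b == b + a and a * b == b * a   # commutative
    assert (a + b) + c == a + (b + c)          # associative (addition)
    assert (a * b) * c == a * (b * c)          # associative (multiplication)
    assert a * (b + c) == a * b + a * c        # distributive
print("commutative, associative and distributive laws hold for all samples")
```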

But I wonder how mathematics would have developed if the need had been to count raindrops.

After all:

2 goats + 2 goats = 4 goats, and it then follows that

2 short people + 2 short people = 4 short people.

But if instead counting had been inspired by counting raindrops, they would have observed that

2 little raindrops + 2 little raindrops = 1 big raindrop.

They might then have concluded that

2 short people + 2 short people = one tall person

and history would then have been very different.


 

Counting was an invention

March 19, 2017

A new book is just out and it seems to be one I have to get. I am waiting to get hold of an electronic version.

Number concepts are a human invention―a tool, much like the wheel, developed and refined over millennia. Numbers allow us to grasp quantities precisely, but they are not innate. Recent research confirms that most specific quantities are not perceived in the absence of a number system. In fact, without the use of numbers, we cannot precisely grasp quantities greater than three; our minds can only estimate beyond this surprisingly minuscule limit.

Numbers fascinate me and especially how they came to be.

The earliest evidence we have of humans having counting ability is ancient tally sticks made of bone, dating up to 50,000 years ago. An ability to tally at least up to 55 is evident. One of the tally sticks may have been a form of lunar calendar, which suggests that by this time they had a well developed concept of time. And concepts of time lead immediately and inevitably to the identification of recurring time periods. By 50,000 years ago our ancestors counted days and months and probably years; counting numbers of people would have been child’s play. They had clearly developed some sophistication not only in “numbering” by this time but had also progressed from sounds and gestures into speech. They were well into the beginnings of language.

Marks on a tally stick tell us a great deal. The practice must have been developed in response to a need. Vocalisations – words – must have existed to describe the tally marks. These marks were inherently symbolic of something else. They are evidence of the ability to symbolise and to think in abstract terms. Perhaps they represented numbers of days or a count of cattle or of items of food or of number of people in the tribe. But their very existence suggests that the concept of ownership of property – by the individual or by the tribe – was already in place. Quite probably a system of trading with other tribes and protocols for such trade were also in place. At 50,000 years ago our ancestors were clearly on the threshold of using symbols not just on tally sticks or in cave paintings but in a general way and that would have been the start of developing a written language. …….

My time-line then becomes:

  • 8 million YBP           Human–Chimpanzee divergence
  • 6 million YBP           Rudimentary counting among archaic humans (1, 2, 3, many)
  • 2 million YBP           Stone tools
  • 600,000 YBP            Archaic human – Neanderthal divergence
  • 400,000 YBP            Physiological and genetic capability for speech?
  • 150,000 YBP            Speech and counting develop together
  • 50,000 YBP             Verbal language, counting, trading, calendars in place (tally sticks)
  • 30,000 YBP             Beginnings of written language?

Clearly our counting is dominated by base 10 and by our penchant for 12-based systems. The joints on the fingers of one hand allow us to count to 12, and that, together with the five fingers of the other hand, clearly led to our many 60-based counting systems.

I like 60. Equilaterals. Hexagons. Easy to divide by almost anything. Simple integers for halves, quarters, thirds, fifths, sixths, tenths, 12ths, 15ths and 30ths. 3600. 60Hz. Proportions pleasing to the eye. Recurring patterns. Harmonic. Harmony.

The origins of the use of base 60 are lost in the ancient past. By the time the Sumerians used it, about 2,500 BCE, it was already well established, and it continued through the Babylonians. But the origin lies much earlier. ……

Why then would the base 60 even come into being?

image sweet science

The answer, I think, still lies in one hand. Hunter-gatherers when required to count would prefer to use only one hand and they must – quite early on and quite often – have had the need for counting to numbers greater than five. And of course using the thumb as pointer one gets to 12 by reckoning up the 3 bones on each of the other 4 fingers. 

My great-grandmother used to count this way when checking the numbers of vegetables (onions, bananas, aubergines) bought by her maid at market. Counting up to 12 usually sufficed for this. When I was a little older, I remember my grandmother using both hands to check off bags of rice brought in from the fields – and of course with two hands she could get to 144. The counting of 12s most likely developed in parallel with counting in base 10 (5, 10, 50, 100). The advantageous properties of 12 as a number were fortuitous rather than by intention. But certainly the advantages helped in the persistence of using 12 as a base. And so we still have a dozen (12) and a gross (12×12) and even a great gross (12×12×12) being used today. Possibly different groups of ancient man used one or other of the systems predominantly. But as groups met and mixed and warred or traded with each other the systems coalesced.

If we had 4 bones on each finger we would be using 5 x 16 = 80 rather than 60.


 

The sum of all wisdom by the mathematics of philosophy

November 4, 2016

The sum of all wisdom is the summation across the population, of the integrals over time of the second values derivative of knowledge.

the-sum-of-all-wisdom
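One possible rendering of the quip in symbols (a guess at the formula the post’s image showed; the notation here is mine):

```latex
\text{Wisdom} \;=\; \sum_{p \,\in\, \text{population}} \int \frac{d^{2}\,(\text{Knowledge}_p)}{d(\text{Values})^{2}} \, dt
```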

Not the philosophy of mathematics but the mathematics of philosophy!


 

What is “statistically significant” is not necessarily significant

October 12, 2016

“Statistical significance” is “a mathematical machine for turning baloney into breakthroughs, and flukes into funding” – Robert Matthews.


Tests for statistical significance generate the p-value: the probability of obtaining the observed results (or more extreme ones) if the null hypothesis were true – the null hypothesis being that the observations are not a real effect and fall within the bounds of randomness. A low p-value indicates that the observations would be improbable under the null hypothesis, and the effect is then considered “statistically significant”. Quite arbitrarily it has become the custom to use 0.05 (5%) as the threshold p-value to distinguish between “statistically significant” or not. Why 5% has become the “holy number” which separates acceptance for publication from rejection, or success from failure, is a little irrational. Actually what “statistically significant” means is only that “the observations may or may not be a real effect, but there is a low probability that they are entirely due to chance”.

Even when some observations are considered just “statistically significant” there remains a 1-in-20 chance of results that extreme arising by chance alone. Moreover it is conveniently forgotten that statistical significance is called for only when we don’t know. In a coin toss there is certainty (100% probability) that the outcome will be heads, tails or a “lands on its edge”. Thereafter, assigning a probability to one of the only 3 outcomes possible can be helpful – but it is a probability constrained within the 100% certainty of the 3 outcomes. If a million people take part in a lottery, then the 1:1,000,000 probability of a particular individual winning has significance because there is 100% certainty that one of them will win. But when conducting clinical tests for a new drug, it is often the case that there is no certainty anywhere to provide a framework and a boundary within which to apply a probability.
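The worry above can be put in numbers. A sketch with assumed illustrative values (the prior, power and threshold here are my assumptions, not figures from any of the sources quoted in this post):

```python
# If only 10% of tested hypotheses are real, a p < 0.05 "discovery" is
# false surprisingly often. All numbers here are illustrative assumptions.
prior = 0.10   # fraction of tested hypotheses that are actually true
power = 0.80   # chance a real effect reaches significance
alpha = 0.05   # significance threshold

true_positives = prior * power           # 0.08 of all tests
false_positives = (1 - prior) * alpha    # 0.045 of all tests
fdr = false_positives / (true_positives + false_positives)
print(f"Fraction of 'significant' results that are false: {fdr:.0%}")  # 36%
```

Far worse than the 1-in-20 that the 0.05 threshold seems to promise.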

A new article in Aeon by David Colquhoun, Professor of pharmacology at University College London and a Fellow of the Royal Society, addresses The Problem with p-values.

In 2005, the epidemiologist John Ioannidis at Stanford caused a storm when he wrote the paper ‘Why Most Published Research Findings Are False’, focusing on results in certain areas of biomedicine. He’s been vindicated by subsequent investigations. For example, a recent article found that repeating 100 different results in experimental psychology confirmed the original conclusions in only 38 per cent of cases. It’s probably at least as bad for brain-imaging studies and cognitive neuroscience. How can this happen?

The problem of how to distinguish a genuine observation from random chance is a very old one. It’s been debated for centuries by philosophers and, more fruitfully, by statisticians. It turns on the distinction between induction and deduction. Science is an exercise in inductive reasoning: we are making observations and trying to infer general rules from them. Induction can never be certain. In contrast, deductive reasoning is easier: you deduce what you would expect to observe if some general rule were true and then compare it with what you actually see. The problem is that, for a scientist, deductive arguments don’t directly answer the question that you want to ask.

What matters to a scientific observer is how often you’ll be wrong if you claim that an effect is real, rather than being merely random. That’s a question of induction, so it’s hard. In the early 20th century, it became the custom to avoid induction, by changing the question into one that used only deductive reasoning. In the 1920s, the statistician Ronald Fisher did this by advocating tests of statistical significance. These are wholly deductive and so sidestep the philosophical problems of induction.

Tests of statistical significance proceed by calculating the probability of making our observations (or the more extreme ones) if there were no real effect. This isn’t an assertion that there is no real effect, but rather a calculation of what would be expected if there were no real effect. The postulate that there is no real effect is called the null hypothesis, and the probability is called the p-value. Clearly the smaller the p-value, the less plausible the null hypothesis, so the more likely it is that there is, in fact, a real effect. All you have to do is to decide how small the p-value must be before you declare that you’ve made a discovery. But that turns out to be very difficult.

The problem is that the p-value gives the right answer to the wrong question. What we really want to know is not the probability of the observations given a hypothesis about the existence of a real effect, but rather the probability that there is a real effect – that the hypothesis is true – given the observations. And that is a problem of induction.

Confusion between these two quite different probabilities lies at the heart of why p-values are so often misinterpreted. It’s called the error of the transposed conditional. Even quite respectable sources will tell you that the p-value is the probability that your observations occurred by chance. And that is plain wrong. …….

……. The problem of induction was solved, in principle, by the Reverend Thomas Bayes in the middle of the 18th century. He showed how to convert the probability of the observations given a hypothesis (the deductive problem) to what we actually want, the probability that the hypothesis is true given some observations (the inductive problem). But how to use his famous theorem in practice has been the subject of heated debate ever since. …….

……. For a start, it’s high time that we abandoned the well-worn term ‘statistically significant’. The cut-off of P < 0.05 that’s almost universal in biomedical sciences is entirely arbitrary – and, as we’ve seen, it’s quite inadequate as evidence for a real effect. Although it’s common to blame Fisher for the magic value of 0.05, in fact Fisher said, in 1926, that P = 0.05 was a ‘low standard of significance’ and that a scientific fact should be regarded as experimentally established only if repeating the experiment ‘rarely fails to give this level of significance’.

The ‘rarely fails’ bit, emphasised by Fisher 90 years ago, has been forgotten. A single experiment that gives P = 0.045 will get a ‘discovery’ published in the most glamorous journals. So it’s not fair to blame Fisher, but nonetheless there’s an uncomfortable amount of truth in what the physicist Robert Matthews at Aston University in Birmingham had to say in 1998: ‘The plain fact is that 70 years ago Ronald Fisher gave scientists a mathematical machine for turning baloney into breakthroughs, and flukes into funding. It is time to pull the plug.’ ………

Related: Demystifying the p-value


 

The origins of base 60

September 14, 2015

I like 60. Equilaterals. Hexagons. Easy to divide by almost anything. Simple integers for halves, quarters, thirds, fifths, sixths, tenths, 12ths, 15ths and 30ths. 3600. 60Hz. Proportions pleasing to the eye. Recurring patterns. Harmonic. Harmony.
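The divisibility claim is easy to verify. A quick sketch (my own illustration):

```python
def divisors(n):
    """All positive divisors of n."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]

# No smaller positive integer has as many divisors as 60 does.
assert all(len(divisors(n)) < len(divisors(60)) for n in range(1, 60))
```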

The origins of the use of base 60 are lost in the ancient past. By the time the Sumerians used it, about 2,500 BCE, it was already well established, and it continued through the Babylonians. But the origin lies much earlier.

hand of 5

I speculate that counting – in any form more complex than “one, two, many….” – probably goes back around 50,000 years. I have little doubt that the fingers of one hand were the first counting aids that were ever used, and that the base 10 given by two hands came to dominate. Why then would the base 60 even come into being?

The answer, I think, still lies in one hand. Hunter-gatherers when required to count would prefer to use only one hand and they must – quite early on and quite often – have had the need for counting to numbers greater than five. And of course using the thumb as pointer one gets to 12 by reckoning up the 3 bones on each of the other 4 fingers.

a hand of 12 – image sweetscience

My great-grandmother used to count this way when checking the numbers of vegetables (onions, bananas, aubergines) bought by her maid at market. Counting up to 12 usually sufficed for this. When I was a little older, I remember my grandmother using both hands to check off bags of rice brought in from the fields – and of course with two hands she could get to 144. The counting of 12s most likely developed in parallel with counting in base 10 (5, 10, 50, 100). The advantageous properties of 12 as a number were fortuitous rather than by intention. But certainly the advantages helped in the persistence of using 12 as a base. And so we still have a dozen (12) and a gross (12×12) and even a great gross (12×12×12) being used today. Possibly different groups of ancient man used one or other of the systems predominantly. But as groups met and mixed and warred or traded with each other the systems coalesced.

hands for 60

And then 60 becomes inevitable. Your hand of 5, with my hand of 12, gives the 60 which also persists into the present. (There is one theory that 60 developed as 3 x 20, but I think finger counting, and the 5 x 12 it leads to, is far more compelling.) But it is also fairly obvious that the use of 12 must be prevalent before the 60 can appear. Though the use of 60 seconds and 60 minutes is all-pervasive, it is worth noting that they can only come after each day and each night is divided into 12 hours.

While the use of base 10 and 12 probably came first with the need for counting generally and then for trade purposes (animals, skins, weapons, tools…..), the 12 and the 60 came together to dominate the measuring and reckoning of time. Twelve months to a year with 30 days to a month. Twelve hours to a day or a night and 60 parts to the hour and 60 parts to those minutes. There must have been a connection – in time as well as in the concepts of cycles – between the “invention” of the calendar and the geometrical properties of the circle. The number 12 has great significance in Hinduism, in Judaism, in Christianity and in Islam. The 12 Adityas, the 12 tribes of Israel, the 12 days of Christmas, the 12 Imams are just examples. My theory is that simple sun and moon-based religions gave way to more complex religions only after symbols and writing appeared and gave rise to symbolism.

Trying to construct a time-line is just speculation. But one nice thing about speculation is that the constraints of known facts are very loose and permit any story which fits. So I put the advent of numbers and counting at around 50,000 years ago, first with base 10 and later with base 12. The combination of base 10 with base 12 I put at around 20,000 years ago, when agricultural settlements were just beginning. The use of 60 must then coincide with the first structured astronomical observations, after the advent of writing and after the establishment of permanent settlements. It is permanent settlements, I think, which allowed regular observations of cycles, which allowed specialisations and the development of symbols and religion and the privileged priesthood. That probably puts us at about 8–10,000 years ago, as agriculture was also taking off, probably somewhere in the Fertile Crescent.

Wikipedia: The Egyptians since 2000 BC subdivided daytime and nighttime into twelve hours each, hence the seasonal variation of the length of their hours.

The Hellenistic astronomers Hipparchus (c. 150 BC) and Ptolemy (c. AD 150) subdivided the day into sixty parts (the sexagesimal system). They also used a mean hour (1/24 day); simple fractions of an hour (1/4, 2/3, etc.); and time-degrees (1/360 day, equivalent to four modern minutes).

The Babylonians after 300 BC also subdivided the day using the sexagesimal system, and divided each subsequent subdivision by sixty: that is, by 1/60, by 1/60 of that, by 1/60 of that, etc., to at least six places after the sexagesimal point – a precision equivalent to better than 2 microseconds. The Babylonians did not use the hour, but did use a double-hour lasting 120 modern minutes, a time-degree lasting four modern minutes, and a barleycorn lasting 3 1/3 modern seconds (the helek of the modern Hebrew calendar), but did not sexagesimally subdivide these smaller units of time. No sexagesimal unit of the day was ever used as an independent unit of time.
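The unit arithmetic in the quoted passages is easy to check against the modern second. A quick sketch of my own (the numbers are the standard conversions, not anything from the quoted sources):

```python
# Checking the ancient time units quoted above against modern SI seconds.
DAY = 86_400  # seconds in a mean solar day

# Hipparchus/Ptolemy: a mean hour is 1/24 day; a time-degree is 1/360 day.
assert DAY / 24 == 3_600          # one mean hour = 60 modern minutes
assert DAY / 360 == 240           # one time-degree = 4 modern minutes

# Babylonian units: double-hour, time-degree and barleycorn (helek).
assert 2 * DAY / 24 == 7_200      # double-hour = 120 modern minutes
helek = 3_600 / 1_080             # 1/1080 of an hour, the Hebrew helek
assert abs(helek - 10 / 3) < 1e-12   # = 3 1/3 modern seconds

# Six sexagesimal places of a day: divide the day by 60 six times over.
assert DAY / 60**6 < 2e-6         # better than 2 microseconds, as stated
```

Each assertion passes, so the "better than 2 microseconds" figure is exactly what six successive divisions of the day by sixty gives.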

Today the use of 60 still predominates for time, for navigation and geometry. But generally only for units already defined in antiquity. A base of 10 is used for units found to be necessary in more recent times. Subdivision of a second of time or a second of arc is always using the decimal system rather than by the duodecimal or the sexagesimal system.
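This mixed system – sexagesimal down to the second, decimal below it – is easy to illustrate with angles. A minimal sketch (function names are my own, purely illustrative):

```python
# Degrees, arcminutes and arcseconds are sexagesimal; fractions of an
# arcsecond are carried decimally, exactly as described above.
def dms_to_degrees(d, m, s):
    """Convert degrees/arcminutes/arcseconds to decimal degrees."""
    return d + m / 60 + s / 3_600

def degrees_to_dms(deg):
    """Convert back; only the seconds keep a decimal fraction."""
    d = int(deg)
    m = int((deg - d) * 60)
    s = (deg - d - m / 60) * 3_600
    return d, m, s

assert dms_to_degrees(12, 30, 0) == 12.5
d, m, s = degrees_to_dms(12.5125)
assert (d, m) == (12, 30) and abs(s - 45.0) < 1e-9
```

The decimal fraction only ever appears in the smallest unit; everything above it still counts in sixties, as it has since antiquity.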

If we had six fingers on each hand the decimal system would never have seen the light of day. A millisecond would then be 1/1728th of a second. It is a good thing we don't have 7 fingers on each hand, or – even worse – one hand with 6 fingers and one with 7. Arithmetic with a tridecimal system of base 13 does not entice me. But if I were saddled with 13 digits on my hands I would probably think differently.
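The 1/1728 figure follows directly from base-12 arithmetic: the duodecimal equivalent of "milli" is three duodecimal places, i.e. 12 cubed. A small sketch of my own (the `to_base` helper is illustrative, not from the text):

```python
# With six fingers per hand, the natural "thousandth" step would be
# 12**3 rather than 10**3.
assert 12**3 == 1_728   # a duodecimal "millisecond" = 1/1728 of a second

def to_base(n, base):
    """Render a non-negative integer in an arbitrary base, most
    significant digit first."""
    digits = []
    while True:
        n, r = divmod(n, base)
        digits.append(r)
        if n == 0:
            return digits[::-1]

assert to_base(1_728, 12) == [1, 0, 0, 0]   # written "1000" in base 12
assert to_base(60, 12) == [5, 0]            # 5 hands of 12 make 60
```

Note that 60 written in base 12 is simply "50" – five dozens – which is the finger-counting scheme described earlier in one line of arithmetic.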

 

Physics came first and then came chemistry and later biology

August 19, 2015

I generally take it that there are only three basic sciences: physics, chemistry and biology. I take logic to be the philosophical framework and background for the observation of the universe. Mathematics is then not a science but a language by which the observations of the universe can be addressed. All other sciences are combinations or derivatives of the three basic sciences. Geology, astronomy, cosmology, psychology, sociology, archaeology and all the rest derive from the basic three.

I was listening to a report today about some Japanese researchers who generated protein building blocks by recreating the impacts of comets containing water, amino acids and silicate. Some of the amino acids linked together to form peptides (chained molecules). Recurring lengths of peptide chains form proteins, and that leads to life. What interested me, though, was the element of time.

Clearly “chemistry” had to exist before “biology” came into existence. Chemistry therefore not only comes first and “higher” in the hierarchy of the existence of things but is also a necessary, though insufficient, requirement for “biology” to exist. Chemistry plus some “spark” led to biology. In that case the basic sciences are reduced to two, since biology derives from chemistry. I cannot conceive of biology preceding chemistry. The elements and atoms and molecules of chemistry had to exist before the “spark” of something brought biology into existence.

chemical reactions (chemistry) + “spark of life”(physics?) = biology

By the same token, does physics precede chemistry? I think it must. Without the universe existing (physics) and all the elements existing within it (which is also physics) and without all the forces acting upon the elements (still physics), there would be no chemistry to exist. Or perhaps the Big Bang was physics and the creation of the elements itself was chemistry? But considering that nuclear reactions (fusion or fission) and the creation of new elements are usually considered physics, it would seem that the existence of physics preceded the existence of chemistry. The mere existence of elements would be insufficient to set in motion reactions between the elements. Some other forces are necessary for that (though some of these forces are even necessary for the existence of the elements). Perhaps physics gives the fundamental particles (whatever they are) and then chemistry begins with the formation of elements? Whether chemistry starts with elements or with the fundamental particles, physics not only must rank higher as a science, it must have come first. Particles must first exist before they can react with each other.

Particles (physics) + forces (physics) = chemistry.

In any event, and by whatever route I follow, physics preceded chemistry, and physics must exist first for chemistry to come into being. That makes chemistry a derivative of physics as biology is a derivative of chemistry.

We are left with just one fundamental science – physics.


