Archive for the ‘Humans’ Category

Nutrition rather than genetics when it comes to height over the last 100 years

September 2, 2013

Nutrition – and especially nutrition in the early years of life – has dominated the development of human height over the last 100 years, during which average height grew by 11 cm. One hundred years is just over five generations and far too short a time for Darwinian genetics to have had any significant impact. This increase in height, rather than being hampered, actually accelerated during the two World Wars and the Great Depression in the fifteen European countries studied.

But now as the height impact of improved nutrition plateaus, perhaps the next 100 years and five generations of fast food will bring an 11cm increase in human width!!

In the “nature” versus “nurture” debate, this only convinces me further that for all genetic traits, the particular set of genes in an individual provides only a Bell curve of the available framework for the expression of each trait – one Bell curve for each “trait” which is genetically determined. Thereafter it is “nurture”, and/or the existing environment, which determines the level to which that trait is expressed.

Nature and Nurture

Science Codex: 

The average height of European males increased by an unprecedented 11cm between the mid-nineteenth century and 1980, according to a new paper published online today in the journal Oxford Economic Papers. Contrary to expectations, the study also reveals that average height actually accelerated in the period spanning the two World Wars and the Great Depression.

Timothy J. Hatton, Professor of Economics at the University of Essex and the Research School of Economics at Australian National University in Canberra, examined and analysed a new dataset for the average height (at the age of around 21) of adult male birth cohorts, from the 1870s to 1980, in fifteen European countries. The data were drawn from a variety of sources. For the most recent decades the data were mainly taken from height-by-age in cross sectional surveys. Meanwhile, observations for the earlier years were based on data for the heights of military conscripts and recruits. The data is for men only as the historical evidence for women’s heights is severely limited.

Professor Hatton said, “Increases in human stature are a key indicator of improvements in the average health of populations. The evidence suggests that the improving disease environment, as reflected in the fall in infant mortality, is the single most important factor driving the increase in height. The link between infant mortality and height has already been demonstrated by a number of studies.” Infant mortality rates fell from an average of 178 per thousand in 1871-5 to 120 per thousand in 1911-15. They then plummeted to 41 in 1951-5 and 14 in 1976-80.

In northern and middle European countries (including Britain and Ireland, the Scandinavian countries, Netherlands, Austria, Belgium, and Germany) there was a “distinct quickening” in the pace of advance in the period spanning the two World Wars and the Great Depression. This is striking because the period largely predates the wide implementation of major breakthroughs in modern medicine and national health services. One possible reason, alongside the crucial decline in infant mortality, for the rapid growth of average male height in this period was that there was a strong downward trend in fertility at the time, and smaller family sizes have already been linked with increasing height.

Other factors in the increase in average male height include an increased income per capita; more sanitary housing and living conditions; better general education about health and nutrition (which led to better care for children and young people within the home); and better social services and health systems.

Source: Oxford University Press

Humans are enabling some animals to evolve larger brains

August 23, 2013

As humans implement artificial selection on themselves – a process and a force for evolution operating much faster than natural selection could – they are also changing the environment for evolutionary selection in which many animals live. Some of these species are enormously successful in adapting to their new environments while others cannot cope with the change. The changes are so profound that the evolutionary trajectories of these species are shifting. It could be that the “urbanisation” of these species has led some of them to follow an evolutionary path which includes increasing brain size in their new, man-made surroundings. Carl Zimmer writes in the New York Times:

Are We Making Animal Brains Bigger? 

Evolutionary biologists have come to recognize humans as a tremendous evolutionary force. In hospitals, we drive the evolution of resistant bacteria by giving patients antibiotics. In the oceans, we drive the evolution of small-bodied fish by catching the big ones.

In a new study, a University of Minnesota biologist, Emilie C. Snell-Rood, offers evidence suggesting we may be driving evolution in a more surprising way. As we alter the places where animals live, we may be fueling the evolution of bigger brains.

Dr. Snell-Rood bases her conclusion on a collection of mammal skulls kept at the Bell Museum of Natural History at the University of Minnesota. Dr. Snell-Rood picked out 10 species to study, including mice, shrews, bats and gophers. She selected dozens of individual skulls that were collected as far back as a century ago. An undergraduate student named Naomi Wick measured the dimensions of the skulls, making it possible to estimate the size of their brains.

Two important results emerged from their research. In two species — the white-footed mouse and the meadow vole — the brains of animals from cities or suburbs were about 6 percent bigger than the brains of animals collected from farms or other rural areas. Dr. Snell-Rood concludes that when these species moved to cities and towns, their brains became significantly bigger.

…..  Studies by other scientists have linked better learning in animals with bigger brains. In January, for example, researchers at Uppsala University in Sweden described an experiment in which they bred guppies for larger brain sizes. The big-brained fish scored better on learning tests than their small-brained cousins.

Animals colonizing cities and towns have to learn how to find food in buildings and other places their ancestors hadn’t encountered. ..

Humans still have monkey feet

August 21, 2013

Humans may have split from chimpanzees at least 7 million years ago and left the trees to become ground-dwellers sometime after that. By 4 million years ago humans had developed bipedalism. Since the 1930s the prevailing view has been that the evolution of human bipedalism has led to – or has been enabled by – human feet functioning very differently to those of other apes, “due to the development of arches in the mid-foot region and the supposed rigidity of that on the outside edge of the foot”.

But it now appears that human feet are not that unique and not so different to those of our ape cousins.

“Our limbs, however, did not adapt to life on the ground anywhere near as much as those of other ground-dwelling animals such as horses, hares and dogs. Our tests showed that our feet are not as stiff as originally thought and actually form part of a continuum of variation with those of other great apes.”

Karl T. Bates et al, The evolution of compliance in the human lateral mid-foot, Proc. R. Soc. B, vol. 280, no. 1769, 20131818, 22 October 2013.

rspb.royalsocietypublishing.org/lookup/doi/10.1098/rspb.2013.1818

Abstract: Fossil evidence for longitudinal arches in the foot is frequently used to constrain the origins of terrestrial bipedality in human ancestors. This approach rests on the prevailing concept that human feet are unique in functioning with a relatively stiff lateral mid-foot, lacking the significant flexion and high plantar pressures present in non-human apes. This paradigm has stood for more than 70 years but has yet to be tested objectively with quantitative data. Herein, we show that plantar pressure records with elevated lateral mid-foot pressures occur frequently in healthy, habitually shod humans, with magnitudes in some individuals approaching absolute maxima across the foot. Furthermore, the same astonishing pressure range is present in bonobos and the orangutan (the most arboreal great ape), yielding overlap with human pressures. Thus, while the mean tendency of habitual mechanics of the mid-foot in healthy humans is indeed consistent with the traditional concept of the lateral mid-foot as a relatively rigid or stabilized structure, it is clear that lateral arch stabilization in humans is not obligate and is often transient. These findings suggest a level of detachment between foot stiffness during gait and osteological structure, hence fossilized bone morphology by itself may only provide a crude indication of mid-foot function in extinct hominins. Evidence for thick plantar tissues in Ardipithecus ramidus suggests that a human-like combination of active and passive modulation of foot compliance by soft tissues extends back into an arboreal context, supporting an arboreal origin of hominin bipedalism in compressive orthogrady. We propose that the musculoskeletal conformation of the modern human mid-foot evolved under selection for a functionally tuneable, rather than obligatory stiff structure.

Liverpool University Press Release:

In a study of more than 25,000 human steps made on a pressure-sensitive treadmill at the University’s Gait Laboratory, scientists at Liverpool have shown that despite having abandoned life in the trees long ago, our feet have retained a surprising amount of flexibility, the type seen in the feet of other great apes, such as orang-utans and chimpanzees, that have remained largely tree-dwelling.

Professor Robin Crompton, from the University’s Institute of Ageing and Chronic Disease, explains: “It has long been assumed that because we possess lateral and medial arches in our feet – the lateral one supposedly being rigid and supported in bone – that our feet differ markedly to those of our nearest relatives, whose mid-foot is fully flexible and makes regular ground contact.

“This supposed ‘uniqueness’, however, has never been quantitatively tested.  We found that the range of pressures exerted under the human mid-foot, and thus the internal mechanisms that drive them, were highly variable, so much so that they actually overlapped with those made by the great apes.”

It has previously been thought that humans who make contact with the ground with the mid-foot region are primarily those that suffer from diabetes or arthritis, both of which can impact on the structure of the feet.  Research showed, however, that two thirds of normal healthy subjects produced some footfalls where the mid-foot touches the ground, with no indication that this is other than an aspect of normal healthy walking.

 Dr Karl Bates, from the University’s Institute of Ageing and Chronic Disease, said: “Our ancestors probably first developed flexibility in their feet when they were primarily tree-dwelling, and moving on bendy branches, but as time passed and we became more and more ground-dwelling animals, some new features evolved to enable us to move quickly on the ground.

“Our limbs, however, did not adapt to life on the ground anywhere near as much as those of other ground-dwelling animals such as horses, hares and dogs. Our tests showed that our feet are not as stiff as originally thought and actually form part of a continuum of variation with those of other great apes.

“We hypothesise that despite becoming nearly exclusively ground dwelling we have retained flexibility in the feet to allow us to cope effectively with the differences in hard and soft ground surfaces which we encounter in long distance walking and running.  The next part of our study will be testing this theory, which could offer a reason why humans can outrun a horse, for example, over long distances on irregular terrain.”

Faroe Islands were colonised 300-500 years before the Vikings

August 20, 2013

Somebody got there before the Vikings did – some 300 to 500 years earlier. Norse settlers reached Iceland in the 9th century and probably reached Greenland around the 11th century. But the archaeological evidence is that some unknown colonists had already reached the Faroes in the 4th–6th centuries and again in the 6th–8th centuries. There is a theory that they could have been monks from Ireland (St. Brendan?) but I think it is still highly likely that these early explorers/colonists were sea-faring peoples out of Scandinavia.

Mike J. Church, Símun V. Arge, Kevin J. Edwards, Philippa L. Ascough, Julie M. Bond, Gordon T. Cook, Steve J. Dockrill, Andrew J. Dugmore, Thomas H. McGovern, Claire Nesbitt and Ian A. Simpson, The Vikings were not the first colonizers of the Faroe Islands, Quaternary Science Reviews (2013)

dx.doi.org/10.1016/j.quascirev.2013.06.011

Faroe Islands – Google Earth

Abstract

We report on the earliest archaeological evidence from the Faroe Islands, placing human colonization in the 4th–6th centuries AD, at least 300–500 years earlier than previously demonstrated archaeologically. The evidence consists of an extensive wind-blown sand deposit containing patches of burnt peat ash of anthropogenic origin. Samples of carbonised barley grains from two of these ash patches produced 14C dates of two pre-Viking phases within the 4th–6th and late 6th–8th centuries AD. A re-evaluation is required of the nature, scale and timing of the human colonization of the Faroes and the wider North Atlantic region.
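The dating in the abstract rests on conventional radiocarbon measurement of the carbonised barley grains. As a minimal sketch of the arithmetic involved – the fraction-modern value below is a hypothetical illustration, not a measurement from the paper – a 14C age in years Before Present (BP, i.e. before AD 1950) follows from the Libby mean-life of 8033 years:

```python
import math

# Conventional radiocarbon age uses the Libby mean-life of 8033 years:
#   age_BP = -8033 * ln(F14C)
# where F14C is the measured fraction of "modern" (AD 1950) 14C activity.
def radiocarbon_age_bp(f14c: float) -> float:
    """Conventional 14C age in years Before Present (BP = before AD 1950)."""
    return -8033 * math.log(f14c)

# Hypothetical measurement for a charred barley grain (illustrative only):
f14c = 0.82
age = radiocarbon_age_bp(f14c)
print(round(age))  # prints 1594 – roughly the 4th century AD
```

A conventional age like this must still be calibrated against a curve such as IntCal to obtain calendar years, which is how the study arrives at its 4th–6th and late 6th–8th century AD ranges.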

Durham University Press Release:

The Faroe Islands were colonised much earlier than previously believed, and it wasn’t by the Vikings, according to new research.

New archaeological evidence places human colonisation in the 4th to 6th centuries AD, at least 300-500 years earlier than previously demonstrated. 

The research, directed by Dr Mike J Church from Durham University and Símun V Arge from the National Museum of the Faroe Islands as part of the multidisciplinary project “Heart of the Atlantic”, is published in the Quaternary Science Reviews.

The research challenges the nature, scale and timing of human settlement of the wider North Atlantic region and has implications for the colonisation of similar island groups across the world.

Sandoy, Faroes - Google Maps

The Faroes were the first stepping stone beyond Shetland for the dispersal of European people across the North Atlantic that culminated on the shores of continental North America in the 11th century AD, about 500 years before Columbus made his famous voyage.

The research was carried out on an archaeological site at Á Sondum on the island of Sandoy. 

Analysis showed an extensive windblown sand deposit containing patches of burnt peat ash from human activity, dating human settlement to pre-Viking phases. These ash spreads contained barley grains which were accidentally burnt in domestic hearths and were then spread by humans onto the windblown sand surface during the 4th–6th centuries and 6th–8th centuries, a common practice identified in the North Atlantic during this period to control wind erosion.

Lead author Dr Mike Church, from Durham University’s Department of Archaeology, said: “There is now firm archaeological evidence for the human colonisation of the Faroes by people some 300-500 years before the large scale Viking colonisation of the 9th century AD, although we don’t yet know who these people were or where they came from.

“The majority of archaeological evidence for this early colonisation is likely to have been destroyed by the major Viking invasion, explaining the lack of proof found in the Faroes for the earlier settlement. This also raises questions about the timing of human activity on other island systems where similar evidence may have been destroyed.”

Co-author, Símun V Arge from the National Museum of the Faroe Islands, said: “Although we don’t know who the people were that settled here and where they came from, it is clear that they did prepare peat for use, by cutting, drying and burning it, which indicates they must have stayed here for some time.”

Ancient Sarmatian burial tomb of noble descendant of the Amazon warrior-women found intact

August 8, 2013
Sarmatian (Amazon) warrior woman (image from Realms of Gold)

It is thought that the predecessors of the ancient warrior-women of the Sauromatian culture of central Asia, dating from the 6th to the 4th century B.C., could have been the inspiration for the Amazons of Homer’s Iliad. The Iliad possibly dates from the 8th or the 7th century B.C. and describes

a race of fierce women who mated with vanquished male foes and kept only the female children they bore, were believed to occupy the area around the Black Sea. Amazon women also crop up in Greek myths. One of the labors of Hercules, for example, required him to acquire the girdle of the Amazon queen, Hippolyte.

… The works of the Greek historian Herodotus, written around the 5th century B.C., describe a group of female warriors who lost to the Greeks at the battle of Thermodon. Herodotus’ Amazons were taken prisoner and put on ships, but overwhelmed and killed the Greek crew. Unable to sail themselves, the women drifted to the shores of the Black Sea, to the territory of the Scythians, a nomadic culture of Iranian descent. The women, Herodotus says, intermarried with the Scythian men, and convinced their new husbands to move northeast across the flat grassy plains, high mountains, and searing deserts of the Russian steppes, where the group eventually evolved into the Sauromatian culture.

The Sauromatians were succeeded by the Sarmatians from around the 4th century B.C., who were also nomadic and fierce warriors, and who held their warrior-women – now evolving into “noble” women – in high regard:

The culture, which had been expanding its territory, soon shifts its focus. “They become raiders and traders, with forays to the west to interface with the Romans, and they relocate to cities and to areas along large trade routes,” …… “Their wealth increases. We see that in their burial items. We see strong, powerful women, but their role changes. We find burials of women that still retain cultic artifacts, indicating that they were a priestess of some sort, but there is much more gold and more secular ornamentation — more golden cups, more golden jewelry, elaborate things — and less weaponry. This type of evolution is a normal manifestation of culture.” 

Filippovka "Tsar Tumulus" mounds (Google Maps)

The Sarmatians held sway for about 900 years until about 400 AD, when they were overrun by barbarians from the West. Now a completely intact tomb of a Sarmatian noble woman, dating from about 2,500 years ago, has been found at the “Tsar Tumulus” mounds near Filippovka in Southern Russia, reports RIA Novosti.

MOSCOW, August 6 (RIA Novosti) – Archaeologists have found the intact burial chamber of a noble woman from a powerful tribe that roamed the Eurasian steppes 2,500 years ago in southern Russia, an official said Tuesday.

The Sarmatians were a group of Persian-speaking tribes that controlled what is now parts of southern Russia, Ukraine and Central Asia from around 500 BC until 400 AD. They were often mentioned by ancient Greek historians and left luxurious tombs with exquisite golden and bronze artifacts that were often looted by gravediggers.

Sarmatian treasures (image from en.ria.ru)

But the burial site found near the village of Filippovka in the Orenburg region has not been robbed – and contained a giant bronze kettle, jewelry, a silver mirror and what appears to be containers for cosmetics, said history professor Gulnara Obydennova, who heads the Institute of History and Legal Education in the city of Ufa.

“The find is really sensational also because the burial vault was intact – the objects and jewelry in it were found the way they had been placed by the ancient nomads,” she told RIA Novosti.

The vault – located 4 meters (13 feet) underground – was found in the “Tsar Tumulus,” a group of two dozen mounds where hundreds of golden and silver figurines of deer, griffins and camels, vessels and weapons have been found since the 1980s.

The woman’s skeleton was still covered with jewelry and decorations, and her left hand held a silver mirror with an ornamented golden handle, Obydennova said.

The descendants of the Sarmatians include Ossetians, an ethnic group living in the Caucasus region, who speak a language related to Persian.

From Realms of Gold:

Accomplished horse-breeders and horsemen, Sarmatians were nomadic Indo-European tribes closely related to the Scythians. The Roman historian Ammianus Marcellinus describes Sarmatian tribesmen as “tall and handsome, their hair inclines to blond; by the ferocity of their gaze they inspire dread. They delight in danger and warfare.” 

A fascinating feature of Sarmatian society was the high status accorded to women. Sarmatian warrior queens were renowned in antiquity. Herodotus affirmed that the Sarmatians were descendants of the Amazons and Scythians, whose women “frequently hunted on horseback with their husbands; in war taking the field; and wore the very same dress as the men.” The Sarmatian tradition had it that “no girl should wed till she had killed a man in battle.” In ancient kurgans, sumptuous female burials often included swords and arrowheads together with elegant jewelry inlaid with dazzling gems in the Hellenistic style. Eastern campaigns of Alexander the Great (356-323 BC) spread Greek influences throughout his huge empire and exposed local artisans to new styles. The composite style that emerged is known as Hellenistic. 

The Sarmatians were overrun by the invasions of the Goths and Huns in the 3rd and 4th centuries AD.

Y-chromosome study dates most recent common male ancestor to only 120–156 thousand years ago

August 2, 2013

I am not quite sure why a difference in the time when our most recent common male ancestor (Y-chromosomal Adam) lived to that when our most recent common female ancestor (Mitochondrial Eve) lived should actually be a discrepancy which needs resolving.

Except of course if the authors wish to believe that all 7 billion humans alive today actually derive from a single couple!!

G. David Poznik et al, Sequencing Y Chromosomes Resolves Discrepancy in Time to Common Ancestor of Males Versus Females, Science, 2 August 2013: 562–565. DOI: 10.1126/science.1237619

This new study claims

… that this initial paper on Y chromosome sequence diversity provides important first evidence that the male most recent common ancestor did not live more recently than the female most recent common ancestor. 

The study involved Y chromosomes obtained through the Human Genome Diversity Project, and from other sources. It included chromosomes from 69 men in several populations in sub-Saharan Africa, and from Siberia, Cambodia, Pakistan, Algeria and Mexico.

Abstract: The Y chromosome and the mitochondrial genome have been used to estimate when the common patrilineal and matrilineal ancestors of humans lived. We sequenced the genomes of 69 males from nine populations, including two in which we find basal branches of the Y-chromosome tree. We identify ancient phylogenetic structure within African haplogroups and resolve a long-standing ambiguity deep within the tree. Applying equivalent methodologies to the Y chromosome and the mitochondrial genome, we estimate the time to the most recent common ancestor (TMRCA) of the Y chromosome to be 120 to 156 thousand years and the mitochondrial genome TMRCA to be 99 to 148 thousand years. Our findings suggest that, contrary to previous claims, male lineages do not coalesce significantly more recently than female lineages.

The study seems not to have looked at recent evidence on the age of the male lineage. This other evidence is more convincing and suggests that Y-chromosomal Adam is very much older – perhaps some 237–581 thousand years ago – while Mitochondrial Eve goes back to about 200 thousand years ago.
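The wide spread among such estimates comes down to a simple molecular-clock relation: two lineages separated for T years accumulate roughly 2·μ·L·T differences across L base pairs, so the inferred TMRCA scales inversely with the assumed mutation rate μ. A back-of-the-envelope sketch, with all numbers chosen purely for illustration (not the calibrated figures of any of the studies mentioned):

```python
# Molecular-clock estimate of time to most recent common ancestor (TMRCA).
# Two lineages that diverged T years ago accumulate, on average,
#   D = 2 * mu * L * T   differences over L base pairs,
# so T = D / (2 * mu * L). All values below are illustrative assumptions.

MU = 0.8e-9        # assumed mutation rate per base pair per year
L = 10_000_000     # assumed callable Y-chromosome sequence length (bp)
D = 2_000          # assumed differences between two deeply split lineages

tmrca_years = D / (2 * MU * L)
print(f"TMRCA = {tmrca_years / 1000:.0f} thousand years")  # prints 125
```

Halving or doubling the assumed mutation rate doubles or halves the estimate, which is largely why Y-chromosomal Adam can land anywhere from around 120 thousand to several hundred thousand years ago depending on the calibration chosen.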

Silly science? Evolution does not favour the selfish

August 1, 2013

Silly season is upon us.

Two Michigan State University researchers claim that evolution will not sustain a “selfish gene” but will eventually select for cooperation.

Why am I not in the least bit convinced?

If either selfishness or cooperation were genetically determined, and if survival depended upon such a choice, then one or the other should have become extinct a long time ago. The silliness of this work lies first in the assumption that a behavioural characteristic – even if crucial for survival – is determined merely by genetics. Second, evolution never selects for excellence – whether superlative selfishness or unstinting cooperation; it merely reflects the minimum of behavioural traits needed to survive until reproduction.

Evolution couldn’t care less whether individuals are selfish or cooperative. It merely reflects which individuals were sufficiently selfish, or sufficiently cooperative, to survive until reproduction.

Of course the Daily Mail manages to put it in a remarkably silly headline:

Selfish people ‘will eventually die out’ because evolution favours cooperation

The paper is published in Nature Communications.

Christoph Adami, Arend Hintze, Evolutionary instability of zero-determinant strategies demonstrates that winning is not everything, Nature Communications, 2013; 4. DOI: 10.1038/ncomms3193

The accompanying press release trumpets

Evolution will punish you if you’re selfish and mean 

Two Michigan State University evolutionary biologists offer new evidence that evolution doesn’t favor the selfish, disproving a theory popularized in 2012.

“We found evolution will punish you if you’re selfish and mean,” said lead author Christoph Adami, MSU professor of microbiology and molecular genetics. “For a short time and against a specific set of opponents, some selfish organisms may come out ahead. But selfishness isn’t evolutionarily sustainable.”

The paper appears in the current issue of Nature Communications and focuses on game theory, which is used in biology, economics, political science and other disciplines. Much of the last 30 years of research has focused on how cooperation came to be, since it’s found in many forms of life, from single-cell organisms to people.

In 2012, a scientific paper unveiled a newly discovered strategy – called zero-determinant – that gave selfish players a guaranteed way to beat cooperative players.

“The paper caused quite a stir,” said Adami, who co-authored the paper with Arend Hintze, molecular and microbiology research associate. “The main result appeared to be completely new, despite 30 years of intense research in this area.”

Adami and Hintze had their doubts about whether following a zero determinant strategy (ZD) would essentially eliminate cooperation and create a world full of selfish beings. So they used high-powered computing to run hundreds of thousands of games and found ZD strategies can never be the product of evolution. While ZD strategies offer advantages when they’re used against non-ZD opponents, they don’t work well against other ZD opponents.

“In an evolutionary setting, with populations of strategies, you need extra information to distinguish each other,” Adami said.

So ZD strategies only worked if players knew who their opponents were and adapted their strategies accordingly. A ZD player would play one way against another ZD player and a different way against a cooperative player.

“The only way ZD strategists could survive would be if they could recognize their opponents,” Hintze said. “And even if ZD strategists kept winning so that only ZD strategists were left, in the long run they would have to evolve away from being ZD and become more cooperative. So they wouldn’t be ZD strategists anymore.” 

Game theory for an individual game, or even for a succession of games, is one thing, but evolution does not care how selfish or how cooperative an individual is.
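The instability Adami and Hintze describe can be sketched with memory-one strategies in the iterated Prisoner’s Dilemma. The cooperation probabilities below are the extortionate ZD strategy given as an example by Press and Dyson for the standard payoffs (T, R, P, S) = (5, 3, 1, 0); the simulation itself is only a minimal illustration of the head-to-head dynamic, not the authors’ evolutionary model:

```python
import random

random.seed(1)

# Payoffs (mine, theirs) for each pair of moves: R=3, S=0, T=5, P=1.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

# Memory-one strategies: probability of cooperating, given last round's
# (my_move, opponent_move).  ZD values are Press & Dyson's extortion example.
ZD_EXTORT = {("C", "C"): 11/13, ("C", "D"): 1/2,
             ("D", "C"): 7/26,  ("D", "D"): 0.0}
ALL_C = {state: 1.0 for state in ZD_EXTORT}   # unconditional cooperator

def play(p_strat, q_strat, rounds=100_000):
    """Average per-round payoffs of two memory-one strategies."""
    p_move, q_move = "C", "C"                 # both open with cooperation
    p_total = q_total = 0
    for _ in range(rounds):
        a, b = PAYOFF[(p_move, q_move)]
        p_total += a
        q_total += b
        p_next = "C" if random.random() < p_strat[(p_move, q_move)] else "D"
        q_next = "C" if random.random() < q_strat[(q_move, p_move)] else "D"
        p_move, q_move = p_next, q_next
    return p_total / rounds, q_total / rounds

zd_score, coop_score = play(ZD_EXTORT, ALL_C)   # ZD clearly outscores the cooperator
zd1, zd2 = play(ZD_EXTORT, ZD_EXTORT)           # both end up near the punishment payoff P = 1
print(f"ZD vs cooperator: {zd_score:.2f} vs {coop_score:.2f}")
print(f"ZD vs ZD:         {zd1:.2f} vs {zd2:.2f}")
```

Head-to-head the extortioner wins handsomely, but two ZD players quickly lock into mutual defection and earn far less than mutual cooperators would – which is the sense in which, per Adami and Hintze, ZD strategies “can never be the product of evolution”.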

Canada used indigenous children to “study” malnutrition

July 28, 2013

Yet another depressing story of how, in the name of “science”, the “establishment” made use of less “worthy” populations to carry out medical experiments.

This time in Canada, from 1942 to 1952.

There was no difference of principle and only one of degree between the medical experiments carried out in Nazi Germany and those carried out on native or disadvantaged populations in Australia, Canada, and the USA (among many other countries).

We may like to think that it does not happen any more. I am not so sure. The real story of Haiti and its cholera and the use of cheap, untested vaccines is yet to be told. Similarly, some of the stories about the intentional “creation” of new strains of influenza and the subsequent discovery and dissemination of new vaccines for their cure may never become public.

Mosby, I., Administering Colonial Science: Nutrition Research and Human Biomedical Experimentation in Aboriginal Communities and Residential Schools, 1942–1952, Social History 46, 145–172 (2013)

Abstract: Between 1942 and 1952, some of Canada’s leading nutrition experts, in cooperation with various federal departments, conducted an unprecedented series of nutritional studies of Aboriginal communities and residential schools. The most ambitious and perhaps best known of these was the 1947–1948 James Bay Survey of the Attawapiskat and Rupert’s House Cree First Nations. Less well known were two separate long-term studies that went so far as to include controlled experiments conducted, apparently without the subjects’ informed consent or knowledge, on malnourished Aboriginal populations in Northern Manitoba and, later, in six Indian residential schools. This article explores these studies and experiments, in part to provide a narrative record of a largely unexamined episode of exploitation and neglect by the Canadian government. At the same time, it situates these studies within the context of broader federal policies governing the lives of Aboriginal peoples, a shifting Canadian consensus concerning the science of nutrition, and changing attitudes towards the ethics of biomedical experimentation on human beings during a period that encompassed, among other things, the establishment of the Nuremberg Code of experimental research ethics.

Nature also reports:

Canadian government scientists used malnourished native populations as unwitting subjects in experiments conducted in the 1940s and 1950s to test nutritional interventions. The tests, many of which involved children at state-funded residential schools, had been largely forgotten until they were described earlier this month in the journal Social History by Ian Mosby, who studies the history of food and nutrition at the University of Guelph in Canada.

The work began in 1942, when government scientists visited several native communities in northern Manitoba and discovered widespread hunger and malnutrition. “Their immediate response was to study the problem by testing nutritional supplements,” says Mosby. From a group of 300 malnourished people selected for the tests, 125 were given vitamin supplements, and the rest served as ‘untreated’ controls. ….

Nancy Walton, a medical ethicist at Ryerson University in Toronto, Ontario, and former chairwoman of the university’s research-ethics board, says that such a project would never be allowed today, “but in the context of that time, it’s unfortunately not surprising”. Awareness of the need for informed consent in human studies was growing — informed consent was a central tenet of the Nuremberg Code, developed in the late 1940s — but the idea had not yet been adopted around the world.

“It’s not just bad ethics, it’s bad science,” Walton says of the Canadian government research. “They didn’t appear to try and prove or disprove any hypothesis that I can see, or make any statistical correlations.”

Indeed, says Mosby, very little of value came out of the research. He found no evidence that the northern Manitoba study was completed or published. The school experiments were presented at conferences and published, but they led to no important advances in nutritional science or improvements in conditions at the schools. “They mostly just confirmed what they already knew,” Mosby says. ….

 

On birth rates, abortions and “eugenics by default”

July 20, 2013

Selective breeding works.

Humans have applied it – and very successfully – for plants and animals since antiquity.

There is nothing conceptually “wrong” with eugenics as the selective breeding of humans. But the Nazis – and not only the Nazis – brought all of eugenics into disrepute by the manner in which they tried to apply the concept. Because the coercive treatment of some minorities in Europe and of the Aborigines in Australia involved forced sterilisation, forced abortions, euthanasia, genocide and mass murder as means of controlling the traits of future generations, eugenics has become inextricably associated with those methods. Even in more recent times, genocide, mass rape and mass murder have been evident, even if not openly for the purpose of controlling the genetic characteristics of the survivors.

I note that evolution by “natural selection” does not intentionally select for any particular traits. Surviving traits result from the deselection of individuals who lack the wherewithal to survive until reproduction. Natural selection in that sense is not pro-active, and evolution is merely the result of changing environments causing those individuals of a species who cannot cope with the change to perish. Evolution has no direction of its own; it is just the record of who survives an environmental change, not some great force which “selects” or leads a species towards a desired future. Species fail when the available spread of traits and characteristics among the existing individuals of that species is not sufficient to generate some individuals who can survive the environmental change. Natural selection is therefore not an intentional selection process but merely describes the survivors of change. Of course, not all traits have a direct influence on survival. All “collateral” traits are carried along – coincidentally and unintentionally – with those traits which do actually help survival in a particular environment. But as conditions change, what was once a collateral trait may become one which assists survival.

As breeding techniques go, “natural selection” relies on a wide variation of traits throwing up viable individuals able to cope however the environment changes, while “artificial selection” chooses particular traits to promote but runs the risk of unwanted collateral traits showing up (as with some bulldogs unable to breathe, or the development of killer bees). Natural selection is the shotgun to the rifle of artificial selection. The shotgun usually hits the target but may not provide a “kill”; the rifle usually kills, but it can easily miss or even kill the wrong target!

Of all the babies conceived today, about 1% are conceived by “artificial” means (IVF or surrogacy), and these include a measure of genetic selection. Even the other 99% include a measure of partner selection and – though very indirectly – a small measure of genetic selection. A significant portion (perhaps around 20%?) are through “arranged” marriages where some due diligence accompanies the “arrangement”. Such due diligence tends to focus on economic and social checks but inherently contains some “genetic selection” (for example, by excluding partners with histories of mental or other illnesses in their families). If eugenics were only about deliberate breeding programmes seeking particular traits, then we would not be very far down the eugenics road. But more importantly, around 20–25% of babies conceived are aborted, and this represents a genetic deselection. As a result, a form of “eugenics by default” is already being applied today.

(The rights and wrongs of abortion are another discussion which – in my opinion – is both needless and tainted. Abortion, I think, is entirely a matter for the pregnant female and her medical advisors. I cannot see how anybody else – male or female – can presume to impose the having or not having of an abortion on any pregnant person. Even the male sperm donor does not, I think, warrant any decisive role in what another person should or should not do. No society requires that a female obtain its approval for conceiving or having a child (with the exception of China’s one-child policy). Why then should not having a child require such approval? While society may justifiably seek to impose rules about infanticide, abortion – by any definition – is not the same as infanticide. Until the umbilical cord is severed, a foetus is essentially parasitic, totally dependent upon its host-mother, and not – in my way of thinking – an independent entity. I cannot and do not have much respect for the Pope or other religious mullahs who would determine whether I should shave or whether a woman may or may not have an abortion.)

Consider our species as we breed today.

In general the parents of children being conceived today share a geographical habitat. Apart from the necessity – so far – of the parents having to meet physically, it is geographical proximity which I think has dominated throughout history. Victors of war, conquerors, immigrants, emigres and wanderers have all succumbed to the lures of the local population within a few generations. In consequence, partners often share similar social and religious and ethnic backgrounds. But the geographical proximity takes precedence. Apart from isolated instances (Ancient Greece, the Egypt of the Pharaohs, the persecution of the Roma, European Royalty, Nazi Germany and the caste-system on the Indian sub-continent), selective breeding solely for promoting or destroying specific genetic traits has never been the primary goal of child-bearing. Even restrictive tribes where marrying outside the “community” (some Jews and Parsis for example) is discouraged have been and still are more concerned about not diluting inherited wealth than any desire to promote specific genetic traits.

But it is my contention that we are in fact – directly and indirectly –  exercising an increasing amount of genetic control in the selection and deselection of our offspring . So much so that we already have “eugenics by default” being applied to a significant degree in the children being born today.

Currently the global birth rate is around 20 per 1000 of population (2%), having been around 37 in 1950 and projected to reduce to around 14 (1.4%) by 2050.

Crude birth rate actual and forecast: UN data

Of these, the number conceived by artificial means (IVF and surrogacy) is probably around 1% (around 0.2 births per 1000 of population). For example, around 2% of live births in the UK in 2010 were conceived by IVF. In Europe the figure is probably around 1.5%, and worldwide it is still less than 1%. But this number is increasing and could more than double by 2050 as IVF spreads into Asia and Africa. By 2050 it could well be that around 3% of all live births will have been conceived by “artificial” means, with a much greater degree of genetic screening applied.

Abortion rates increased sharply after the 1950s as medical techniques developed to make abortion routine. Done properly it is a relatively risk-free procedure, though there are still many “unsafe” abortions in developing and religiously repressive countries. Since 1995, abortion rates worldwide have actually decreased, from about 35 per 1000 women of child-bearing age to about 28 today. These numbers indicate that the number of abortions taking place today is around 20–25% of the number of live births.

http://www.economist.com/blogs/graphicdetail/2012/01/daily-chart-7

Global abortion rates: graphic Economist

So of every 100 babies conceived, around 25 are deselected by abortion and 75 proceed to birth. Only one of these 75 will have been conceived by “artificial” means. The genetic deselection by abortion is both direct and indirect. The detection of genetic defects in the foetus often leads to abortion, and this proportion can be expected to increase as techniques for the early identification of defects, or of the propensity for developing a debilitating disease, are perfected. In many cases abortion is to safeguard the health of the mother and does not – at least directly – involve any deselection for genetic reasons. In many countries – especially India – abortions are often carried out to avoid a girl child, and this is a direct genetic deselection. It seems to apply particularly to a first child. The majority of abortions today are probably for convenience. But if the “maternal instinct” is in any way a genetic characteristic, then even such abortions would tend to be a deselection in favour of those who do have the instinct.
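The arithmetic above can be sketched as a back-of-envelope calculation. The figures used below are simply the rounded estimates quoted in this post (25% of conceptions aborted, ~1% of live births conceived by IVF or surrogacy), not independent data:

```python
# Notional cohort of 100 conceptions, using the post's rounded estimates.
conceptions = 100
abortion_share = 0.25        # share of conceptions ending in abortion
ivf_share_of_births = 0.01   # worldwide share of live births via IVF/surrogacy

births = conceptions * (1 - abortion_share)
ivf_births = births * ivf_share_of_births

print(f"live births per 100 conceptions: {births:.0f}")      # 75
print(f"of which conceived artificially: {ivf_births:.2f}")  # 0.75, i.e. roughly 1
```

The 0.75 simply confirms that, at roughly 1% of births, artificial conception accounts for slightly less than one baby in every 100 conceived.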

The trends, I think, are fairly clear. The proportion of “artificial” births is increasing, and the element of genetic selection by screening for desired characteristics in such cases is on the increase. The number of abortions after conception seems to be on its way to some “stable” level of perhaps 25% of all conceptions. The genetic content of the decision to abort is also increasing, and it is likely that the frequency of births where genetic disorders exist, or where the propensity for debilitating disease is high, will decrease sharply as genetic screening techniques develop further.

It is still a long way to humans breeding for specific characteristics, but even what is being practised now is the start of eugenics in all but name. And it is not difficult to imagine that eugenics without any hint of coercion – where parents or mothers-to-be select for certain characteristics, or deselect (by abortion) to avoid others, in their children-to-be – will become de rigueur.


 

Chimpanzees and orangutans have long term memories too

July 19, 2013

Image: The Telegraph

Interesting new work has been published in Current Biology. It supports my view that life is a continuum from simple to complex, with no place for – or any need to invoke – a “soul”. At what point the brain of a species becomes large enough and complex enough not only to “save” memories but also to access those data at a later time is unknown. I have little doubt, from the dogs and cats that I have known, that they can “remember” people and behaviour from many years before – even if they are often supposed to live only in the “now”. At what point in this continuum “self-awareness” emerges is also not known, but I suspect it depends on the definition of “self-awareness”, and some level of self-awareness probably lies very close to the “simple” end of the scale of life.

(Certainly the mosquito which got trapped in my study yesterday was not just “self-aware”, it was also maliciously aware of me. If it had a soul it has now been consigned to mosquito hell!!)

This work shows that chimpanzees and orangutans have the ability to “remember events that happened two weeks or three years ago, but also that they can remember them even when they are not expecting to have to recall those events at a later time”.

Gema Martin-Ordas, Dorthe Berntsen, Josep Call. Memory for Distant Past Events in Chimpanzees and Orangutans. Current Biology, 2013; DOI: 10.1016/j.cub.2013.06.017

Highlights

  • First study addressing unexpected and cued recall of both general and unique events
  • Chimpanzees and orangutans recalled events that happened weeks and years earlier
  • Subjects also showed evidence of binding
  • Chimpanzees and orangutans share this form of autobiographical memory with humans

Summary

Determining the memory systems that support nonhuman animals’ capacity to remember distant past events is currently the focus of an intense research effort and a lively debate. Comparative psychology has largely adopted Tulving’s framework by focusing on whether animals remember what-where-when something happened (i.e., episodic-like memory). However, apes have also been reported to recall other episodic components after single-trial exposures. Using a new experimental paradigm we show that chimpanzees and orangutans recalled a tool-finding event that happened four times 3 years earlier (experiment 1) and a tool-finding unique event that happened once 2 weeks earlier (experiment 2). Subjects were able to distinguish these events from other tool-finding events, which indicates binding of relevant temporal-spatial components. Like in human involuntary autobiographical memory, a cued, associative retrieval process triggered apes’ memories: when presented with a particular setup, subjects instantaneously remembered not only where to search for the tools (experiment 1), but also the location of the tool seen only once (experiment 2). The complex nature of the events retrieved, the unexpected and fast retrieval, the long retention intervals involved, and the detection of binding strongly suggest that chimpanzees and orangutans’ memories for past events mirror some of the features of human autobiographical memory.

From Science Daily:

…. “Our data and other emerging evidence keep challenging the idea of non-human animals being stuck in time,” says Gema Martin-Ordas of Aarhus University in Denmark. “We show not only that chimpanzees and orangutans remember events that happened two weeks or three years ago, but also that they can remember them even when they are not expecting to have to recall those events at a later time.” ….. 

“I was surprised to find out not only that they remembered the event that took place three years ago, but also that they did it so fast!” Martin-Ordas says. “On average it took them five seconds to go and find the tools. Again this is very telling because it shows that they were not just walking around the rooms and suddenly saw the boxes and searched for the tools inside them. More probably, it was the recalled event that enabled them to find the tools directly.”