Friday, June 21, 2024

Prehistoric famines and disease drove spread of milk-drinking gene

The ability to digest lactose in adulthood did not spread among prehistoric Europeans as a result of increased milk consumption but due to the effects of famine and disease, scientists say. 

New analysis indicates that there was no correlation between the extent of milk consumption in prehistoric Europe and the ability to digest the sugar naturally occurring in milk. In fact, the research shows that Europeans were heavy consumers of milk thousands of years before the genetic trait became common across the continent. 

Although this digestive ability would have given prehistoric Europeans only a marginally more comfortable existence in normal times, the research, published in the journal Nature, indicates that it may have saved their lives in harsh conditions and enabled them to pass on their DNA when others could not.

While most European adults can drink milk without discomfort, about two-thirds of adults around the world today are not so fortunate, and almost all adults 5,000 years ago were in the same position. That’s because the majority cannot digest lactose beyond infancy, meaning the sugar travels to their large intestine, where it may cause cramps, diarrhoea, and flatulence.

To establish how lactase persistence, or the ability to digest milk in adulthood, evolved, Professor Richard Evershed, the study’s lead author from the University of Bristol’s School of Chemistry, assembled a database of almost 7,000 organic animal fat residues from 13,181 fragments of pottery and 554 archaeological sites. His findings indicate that milk was used extensively in European prehistory from the earliest days of farming almost 9,000 years ago. Regions where milk was heavily consumed continuously from 5500 BC included the British Isles, where he said the inhabitants were “accomplished dairy farmers”, as well as the rest of Northern Europe.

Meanwhile, the team of Mark Thomas, Professor of Evolutionary Genetics at University College London and one of the study’s co-authors, assembled a database on the presence or absence of the lactase persistence variant using published DNA sequences from over 1,700 prehistoric European and Asian individuals.

By mapping patterns of milk use over time and analysing both ancient DNA and modern genetic and medical data, the study authors found that lactase persistence was uncommon until about 1000 BC — nearly 4,000 years after it was first detected around 4700–4600 BC.

“The lactase persistence genetic variant was pushed to high frequency by some sort of turbocharged natural selection. The problem is, such strong natural selection is hard to explain,” Thomas said.

His team developed a new statistical approach to examine how well changes in milk consumption through time explained the natural selection for lactase persistence. They found no relationship, even though they were able to show that they could detect one if it existed — challenging the long-held view that increasing consumption of milk drove the evolution of lactase persistence. 

Important clues came from Professor George Davey Smith’s team at the University of Bristol, who studied the UK Biobank, which comprises genetic and medical data for over 300,000 living individuals. They found only minimal differences in milk-drinking behaviour between genetically lactase persistent and non-persistent people. Significantly, the large majority of people who were genetically lactase non-persistent experienced no short- or long-term negative health effects when they consumed milk.

Davey Smith, an epidemiologist, said: “Our findings show milk use was widespread in Europe for at least 9,000 years, and healthy humans, even those who are not lactase persistent, could happily consume milk without getting ill. However, drinking milk in lactase non-persistent individuals does lead to a high concentration of lactose in the intestine, which can draw fluid into the colon, and dehydration can result when this is combined with diarrhoeal disease”.

The researchers argue that during later prehistory, as populations and settlement sizes grew, human health would have been increasingly impacted by poor sanitation and diarrhoeal diseases. Under these conditions, consuming milk would have resulted in increasing death rates, with individuals lacking lactase persistence being especially vulnerable. This situation would have been further exacerbated during periods of famine, when disease and malnutrition rates rise.

All of this would have made individuals who did not carry a copy of the lactase persistence gene variant more likely to die before or during their reproductive years, pushing up the population prevalence of lactase persistence.

Davey Smith added: “If you are healthy and lactase non-persistent and drink lots of milk, you may experience some discomfort, but you’re not going to die of it. However, if you are severely malnourished and have diarrhoea, then you’ve got life-threatening problems. When their crops failed, prehistoric people would have been more likely to consume unfermented high-lactose milk, exactly when they shouldn’t.”
