These microfossils reveal what our ancestors were eating 1.2 million years ago.


Raw meat and grass.

Dental analysis of a 1.2-million-year-old tooth that once belonged to one of our early human ancestors has yielded new information about what ancient hominins used to eat – and it doesn’t sound all that appetising.

According to a new study, tartar build-up on the tooth shows that raw meat and vegetables were on the menu, alongside bits of wood, scales from butterfly wings, and parts of an insect’s leg.

 “Evidence for plant use at this time is very limited, and this study has revealed the earliest direct evidence for foods consumed in the genus Homo,” the team, led by Karen Hardy from the Autonomous University of Barcelona in Spain, reports.

“All food was eaten raw, and there is no evidence for processing of the starch granules which are intact and undamaged.”

While an understanding of diet is helpful to piece together how our ancient ancestors lived, the key suggestion here is that ancient tartar provides evidence that fire was not in use in Europe 1.2 million years ago.

Exactly when in human prehistory fire started to be used for cooking and warming purposes has been widely debated, with some researchers suggesting that it goes back 1.8 million years.

The earliest direct archaeological evidence of fire has been dated to 800,000 years ago, so if these tartar-encrusted teeth are evidence of a lack of fire, it suggests that fire technology took off somewhere between 1.2 million and 800,000 years ago – but no one is entirely sure.

The team also suggests that, since the toothbrush wasn’t invented yet, early hominins used sticks and twigs as toothpicks to keep their teeth in check – an act that researchers previously only had evidence of from roughly 49,000 years ago.

 “Additional biographical detail includes fragments of non-edible wood found adjacent to an interproximal groove suggesting oral hygiene activities, while plant fibres may be linked to raw material processing,” the team writes.

The tooth came from a jawbone found at the Sima del Elefante excavation site in Spain’s Atapuerca Mountains, which contains some of the oldest early human remains ever found.

The researchers performed their analysis by taking a sample of dental calculus – better known as tartar, a form of hardened plaque – that they removed from the tooth using an ultrasonic scaler, a pointy, rather intimidating piece of dental equipment that buzzes plaque away.

Next, they performed various scans on the tartar to reveal microfossils hidden within it. This is where they found bits of wood, animal tissue, and even a small plant pathogen known as Alternaria, associated with asthma and hay fever.

While there isn’t enough taxonomic evidence to accurately identify the species of early human that the bones came from, the evidence suggests that this ancient hominin’s diet consisted of mainly raw foods, because they had yet to start cooking over a fire.

Even without the luxury of cooking, the tooth shows a pretty broad diet, including seeds, plants, and animals.

“It is plausible that these ancient grasses were ingested as food,” Hardy said of the fibres. “Grasses produce abundant seeds in a compact head, which may be conveniently chewed, especially before the seeds mature fully, dry out, and scatter.”

“Our evidence for the consumption of at least two different starchy plants, in addition to the direct evidence for consumption of meat and of plant-based raw materials suggests that this very early European hominin population had a detailed understanding of its surroundings and a broad diet.”

Hopefully, as the studies of the bones found at Sima del Elefante continue, we’ll gain a better understanding of how our early ancestors lived, ate, and died, providing a more complete picture of our past.

Africa In Data – Our World In Data


https://africaindata.org/?linkId=32475114#/title-slide

5 things you should do every day to increase your intelligence


If you want to witness the magic of the human brain in action, play the matching game with a 4-year-old — you know, the classic game where the tiles are arranged face down and players take turns flipping them over to find pairs.

You’ll see what I mean as you watch that wonderful little brain doing its thing. It’s amazing, and when you realize you’ve been outdueled in a brain game by a child, it’s just a bit humbling.

Obviously, young children develop, learn, and change very quickly. They show drastic improvement in academic, physical, and social skills all the time. One day they can’t, and literally the next day they can. Does it make you a little bit jealous? Imagine what you could accomplish with that amazing ability.

Well, there is hope for us adults. We’ve known for quite a while that IQ can be increased; you’re not just stuck with what you’re born with. Andrea Kuszewski explains that fluid intelligence (the capacity to learn new things, retain that information, and use it to solve new problems) can be strengthened over time. She suggests that if you build five elements into your life every day, or at least as often as possible, you can increase your cognitive capacity. Those elements are to seek novelty, challenge yourself, think creatively, do things the hard way, and network.


We could learn a thing or two from the younger generation about keeping our minds challenged. 

So, is it any wonder that children learn at the rate they do? Every sight, sound, word, taste — everything — is new and novel to a child. And, when you’re a kid, everything is challenging. Kids make use of creativity to solve problems from the time they are born. They discover the easy way via the hard way. And finally, they are constantly networking by simply meeting new people all the time.

So, kids are engaging in brain strengthening activities all the time, almost by default. For us adults, it likely takes a more concerted effort. It’s so easy to stay within our comfort zones, stick with our routines, and never change anything up. Doing that may be practical, productive and convenient… it’s just not making us any smarter.

Here are a few things to keep in mind as you’re building your intelligence:

1. Seek novelty.

Novel doesn’t need to mean outlandish; it just means new to you. New experiences, new people, new anything other than “what we’ve always done.” Be open to new experiences and take advantage of opportunities when you have them.

2. Challenge yourself.

The brain, like any muscle in your body, gets stronger when you offer it some resistance. You know you’re good at what you do, but make sure you continue to take on challenges that stimulate your mind and take you outside your comfort zone.

3. Think creatively.

Creativity has countless benefits. Not only does thinking creatively increase our intelligence, it also enhances our productivity, efficiency, success, and happiness. It’s not just artists and what we normally think of as creative types — we’re all creative. Just allow your brain to do the work, instead of just asking it to memorize and regurgitate information.

4. Do things the hard way.

This is the brain exercise. It doesn’t mean making your life an unbearable struggle at every turn. But when you can, and when it’s practical, challenge yourself a little bit at a time. Figure the tip without a calculator, find your way without GPS, or make your own sauerkraut.

The ‘hard way’ can benefit you in the long run. 

5. Network.

It probably sounds like a burden to expose yourself to novelty and find ways to be creative if you don’t exactly know where to look. This is where networking comes in. Let other people show you the way. The more people you meet, the more you’re exposed to and the more you experience.

Networking, on its own, can seem daunting for many of us and can certainly lead us outside our comfort zone. For the introspective and thoughtful among us, that only makes networking even more beneficial.

As I’ve said before, there’s a lot to be said for sameness and routine to get us through life. But consciously weaving these elements into your routines could be beneficial in many ways, not the least of which is being competitive when playing games with children.

For the cost of an iPhone, you can now buy a wind turbine that can power an entire house for a lifetime


Indian startup Avant Garde Innovations has developed a low-cost wind turbine that can generate 3-5 kWh of electricity daily


Soon after assuming office, Kerala (a southern state of India) Chief Minister Pinarayi Vijayan kicked up a storm by publicly supporting the Athirappilly hydroelectric project, which environmentalists said would, if implemented, create an ecological imbalance in the area and destroy the Athirappilly waterfalls, the largest natural waterfalls in the state.

It is not that the government is oblivious to the impact that the project could make, but it says it has no option but to leverage existing means to check the growing power crisis in Kerala, which partially depends on the private sector for electricity.

Things are no different in other states either. While Kerala has attained almost 100 per cent electrical coverage, many parts of India still remain in the dark. For a large portion of the Indian population, electricity to this day remains a distant dream.

Enter two siblings who want to make India’s energy crisis a thing of the past. The duo has developed a new solution they say will not even slightly impact the ecological balance.

Avant Garde Innovations, the startup founded by siblings Arun and Anoop George from Kerala, has come up with a low-cost wind turbine that can generate enough electricity to power an entire house for a lifetime. The size of a ceiling fan, this wind turbine can generate about 5 kWh per day for each kilowatt of rated capacity – with just a one-time cost of US$750.

“Our goal is to eliminate energy poverty, reduce dependence on struggling state power grids and create energy self sufficiency for all the needy ones through distributed, localised and affordable renewable energy. In doing so, we believe we can collectively usher in our world a cleaner environment, new economic prosperity and social change,” reads the company ‘What We Do’ statement.

“Our first offering is a highly affordable small wind turbine suitable for residential, commercial, agricultural, village electrification and other uses, which is aimed for a market launch during 2016.”

Incorporated in 2015, Avant Garde claims to be a startup with a ‘green’ heart and soul.

For the startup, the opportunity is massive. India is the world’s sixth largest energy consumer, accounting for 3.4 per cent of global energy consumption. State governments in India, and the central government for that matter, are unable to bear the huge infrastructural cost required to bring electricity to remote villages.

Erecting electric posts and power lines requires huge investment, which can run into millions of dollars.

This is where Avant Garde comes into the picture. “When small wind turbine generating 1kW energy costs INR 3-7 lakh (US$4,000-10,000), our company plans to sell it at less than INR 50,000 (about US$750). Costs will decrease further through mass production,” Arun said in an interview with The Times of India.
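
Those figures invite a quick sanity check. The sketch below works through a simple payback estimate in Python: the turbine price and daily output are taken from the article, while the grid tariff and household demand are hypothetical placeholders, so the result is indicative only.

```python
# Rough back-of-envelope check on the figures above (illustrative only).
# The INR 50,000 price and ~5 kWh/day output come from the article; the grid
# tariff and household consumption below are assumptions, not reported facts.

TURBINE_PRICE_INR = 50_000           # quoted one-time cost
DAILY_GENERATION_KWH = 5             # claimed daily output for a ~1 kW unit
ASSUMED_TARIFF_INR_PER_KWH = 6.0     # hypothetical retail grid tariff
ASSUMED_HOUSEHOLD_KWH_PER_DAY = 4.0  # hypothetical small-household daily demand

daily_saving_inr = DAILY_GENERATION_KWH * ASSUMED_TARIFF_INR_PER_KWH
payback_years = TURBINE_PRICE_INR / (daily_saving_inr * 365)

print(f"Daily saving vs. grid power: INR {daily_saving_inr:.0f}")
print(f"Simple payback period: {payback_years:.1f} years")
print(f"Covers assumed household demand: "
      f"{DAILY_GENERATION_KWH >= ASSUMED_HOUSEHOLD_KWH_PER_DAY}")
```

Under those assumptions the unit would pay for itself in roughly four to five years while covering a small household’s daily demand.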

The company launched its pilot project at a church in the capital city of Thiruvananthapuram in January this year. The small wind turbine prototype that it has developed is highly scalable for power capacities of 300 kW or even higher, Arun told e27.

“Our passionate aim is to introduce innovative, affordable and sustainable solutions that take renewable energy self sufficiency and energy empowerment to the next level through a distributed and decentralised approach using pioneering strategies the world has not witnessed yet,” the company says.

This revolutionary product has won them a spot in the Top 20 Cleantech Innovations in India. The company has also made it onto the list of 10 clean energy companies from India for the “UN Sustainable Energy For All” initiative under the one-billion-dollar clean energy investment opportunity directory.

According to the Global Wind Energy Council, the country ranks 4th in terms of global installed wind power capacity, after China, the US, and Germany.

Maybe, if Avant Garde Innovations takes off, Kerala can keep the Athirappilly waterfalls untouched.

 

 

Technology destroys people and places. I’m rejecting it.


From Wednesday, I’m going to live without my laptop, internet, phone, washing machine or television. I want my life back. I want my soul back
Mark Boyle: ‘Technology separates us from nature, while simultaneously converting life into the cash that oils consumerist society.’

I’ll never know how many people liked this article, shared it or found it irrelevant, anti-progressive or ironic. Nor will I get to read comments about my personal hygiene, or suggesting that a luddite like me needs to embrace industrialism. And that is no bad thing, for the moment writing becomes a popularity contest – rewarding sensationalism, groupthink and deceit over honest exploration of complex matters – people and places lose, and those who need to be held to account win. Win, that is, for a shortsighted moment.

The reason I won’t see any web reaction is because I live in a cabin – built with spruce, oak, hands, straw, Douglas fir, stubbornness, earth and knees – without electricity or so-called modern conveniences (I’ve never found doing the work to buy and maintain them particularly convenient).

From Wednesday, I’m rejecting the world of complex technology entirely. That means no laptop, no internet, no phone, no washing machine, no tapped water, no gas, no fridge, no television or electronic music; no anything requiring the copper-mining, oil-rigging, plastics-manufacturing essential to the production of a single toaster or solar photovoltaic system.

Having already rejected these industrial-scale, complex technologies, I intend to move fully towards what is pejoratively called primitive technology. Insofar as engaging with civilisation allows, I’m also trying to resist the modern domination of what Jay Griffiths, in Pip Pip, calls clock time – and failing daily.

That probably sounds like I’ve given up a lot of stuff. But while I intend to be clear and honest about the difficulties involved over the coming months, especially in the digital age, I’m just as fascinated by what lessons about life – myself, society, the natural world – I might learn; perhaps things my cyborg-mind cannot yet imagine. That was my experience of living without money for three fine years.

Rejecting technologies that my generation considers to be the basic necessities of life wasn’t done on a thoughtless whim. I already miss being able to pick up the phone and talk to my parents. Writing is different, my pencil unaided by both copy-and-paste and the easy delete, two word-processing functions reflective of a generic, transient and whimsical culture; and it has been a while since the media and publishing worlds worked by snail mail.

I decided to eschew complex technology for two reasons. The first was that I found myself happier away from screens and the relentless communication they generate, and instead living intimately with my locale. The second, more important, was the realisation that technology destroys, in more ways than one.

It destroys our relationship with the natural world. It first separates us from nature, while simultaneously converting life into the cash that oils consumerist society. Not only does it enable us to destroy habitat efficiently; over time this separation has led us to value the natural world less, meaning we protect and care for it less. By way of this vicious technological cycle, we are consciously causing the sixth mass extinction of species.

When I walk to the spring to collect water in the morning I meet neighbours and we talk. Yes, it takes time, something I found frustrating at first, but slowness only became a bad thing when time became money. Walking four miles to the post office to send my letters takes time too, but it ties me to people and place in a way that sitting in my bedroom on my own, writing endless emails, could never do.

Technology destroys people. We’re already cyborgs (pacemakers, hearing aids) of a sort, and are well on our way to the type of Big Brother dystopia of the techno-utopians. And look at the state of us. Our toxic, sedentary lifestyles are causing industrial-scale afflictions of cancer, mental illness, obesity, heart disease, auto-immune disorders and food intolerances, along with those slow killers, loneliness, clock-watching and meaninglessness. We seem to spend more time watching porn than we do making love, relationships are breaking down because we stare into screens instead of eyes, while social media are making us antisocial.

Living without complex technology has its own difficulties, especially for people like me who were never initiated into those ways. But already I much prefer it. Instead of making a living to pay bills, I make living my life. Contrary to expectation, my biggest issue is not being bored, but how to do all the things I’d love to do. Of course hand-washing your clothes can be a pain sometimes, but that minor inconvenience is hardly worth destroying the natural world over.

Well-intentioned friends often try to convince me to go off-grid, but in using batteries, electrical cables and photovoltaic panels (as I once did), I would still be connected, by a peculiar sort of invisible cable, to the global network of quarries, factories, courtrooms, mines, financial institutions, bureaucracies, armies, transport networks and workers needed to produce such things. They also ask me to stay on social media to speak out about the technology issue, but I say I’m denouncing complex technology simply by renouncing it. My culture made a Faustian pact, on my behalf, with those devilish tyrants Speed, Numbers, Homogeneity, Efficiency and Schedules, and now I’m telling the devil I want my soul back.

We know that, at the very least, some technologies are harming our natural world, our societies and, ultimately, ourselves. Therefore we can recognise the need to reject some technologies. If we’re to avoid technological extremism we’re going to have to draw a line in the sand somewhere. I’ve drawn mine, and I will only move it in the direction of my home.

Impact of aging on brain connections mapped in major scan study


UNIVERSITY OF EDINBURGH — Brain connections that play a key role in complex thinking skills show the poorest health with advancing age, new research suggests.

Connections supporting functions such as movement and hearing are relatively well preserved in later life, the findings show.

Scientists carrying out the most comprehensive study to date on ageing and the brain’s connections charted subtle ways in which those connections weaken with age.

Knowing how and where connections between brain cells – so-called white matter – decline as we age is important in understanding why some people’s brains and thinking skills age better than others.

Worsening brain connections as we age contribute to a decline in thinking skills, such as reasoning, memory and speed of thinking.

Researchers from the University of Edinburgh analysed brain scans from more than 3,500 people aged between 45 and 75 taking part in the UK Biobank study.

Researchers say the data will provide more valuable insights into healthy brain and mental ageing, as well as making contributions to understanding a range of diseases and conditions.

The study was published in Nature Communications journal.

Dr Simon Cox, of the University of Edinburgh’s Centre for Cognitive Ageing and Cognitive Epidemiology (CCACE), who led the study, said: “By precisely mapping which connections of the brain are most sensitive to age, and comparing different ways of measuring them, we hope to provide a reference point for future brain research in health and disease.

“This is only one of the first of many exciting brain imaging results still to come from this important national health resource.”

Professor Ian Deary, Director of CCACE, said: “Until recently, studies of brain scans with this number of people were not possible. Day by day the UK Biobank sample grows, and this will make it possible to look carefully at the environmental and genetic factors that are associated with more or less healthy brains in older age.”

Professor Paul Matthews of Imperial College London, Chair of the UK Biobank Expert Working Group, who was not involved in the study, said: “This report provides an early example of the impact that early opening of the growing UK Biobank Imaging Enhancement database for access by researchers world-wide will have.

“The large numbers of subjects in the database has enabled the group to rapidly characterise the ways in which the brain changes with age – and to do so with the confidence that large numbers of observations allow.

“This study highlights the feasibility of defining what is typical, to inform the development of quantitative MRI measures for decision making in the clinic.”

The University of Edinburgh Centre for Cognitive Ageing and Cognitive Epidemiology receives funding from the Medical Research Council (MRC) and the Biotechnology and Biological Sciences Research Council (BBSRC).

UK Biobank was established by the Wellcome Trust, MRC, Department of Health, Scottish Government and the Northwest Regional Development Agency. It has had funding from the Welsh Assembly Government, British Heart Foundation and Diabetes UK. UK Biobank is hosted by the University of Manchester and supported by the NHS.

A video explanation of the research is available at: http://www.ccace.ed.ac.uk/news-events/latest

 


How Britain plans to lead the global science race to treat dementia.


It has struck nearly a million people in the UK, yet even its cause is still unclear

CT scans of a brain showing the progress of Alzheimer’s disease. Atrophy is shown by enlarged ventricles (white areas at the centre).

Early next year, Professor Bart De Strooper will sit down in an empty office in University College London and start to plan a project that aims to revolutionise our understanding and treatment of dementia. Dozens of leading researchers will be appointed to his £250m project which has been set up to create a national network of dementia research centres – with UCL at its hub.

The establishment of the UK Dementia Research Institute – which was announced last week – follows the pledge, made in 2012 by former prime minister David Cameron, to tackle the disease at a national level and comes as evidence points to its increasing impact on the nation. Earlier this year, it was disclosed that dementia is now the leading cause of death in England and Wales. At the same time, pharmaceutical companies have reported poor results from trials of drugs designed to slow down the progress of Alzheimer’s disease, the most common form of dementia.

Current understanding of Alzheimer’s suggests the disease is triggered when beta amyloid, a protein in nerve cell membranes, starts to clump together. Slowly the brain undergoes metabolic changes as amyloid clumping continues. In particular, a protein known as tau, which is involved in memory storage, is affected. It starts to form tangles inside the brain’s neurons and these die off. Eventually, symptoms – such as severe memory loss – manifest themselves.

To date, most attempts at drug interventions have focused on medicines that could prevent beta amyloid from forming clumps, the most recent being Solanezumab, developed by the pharmaceutical company Eli Lilly. However, results of clinical trials of the drug – revealed last month – indicated that it had no significant effect on the thinking abilities of people with mild Alzheimer’s. Solanezumab had also failed in people with more advanced versions of the disease in earlier trials.

This double failure has led some scientists to argue that amyloid clumping is not a cause of the disease but is merely a symptom. By targeting it, scientists are wasting time, it is argued. Professor John Hardy, a geneticist based at UCL – who has played a key role in setting up the college’s Dementia Research Institute – does not agree. “All the evidence we have from families affected by early onset dementia indicates that the disease begins with the deposition of amyloid plaques in the brain,” he said. “The trouble is that this buildup starts 15 to 20 years before dementia’s symptoms appear. The drugs we have developed so far offer treatments that are, in effect, too little and too late.”

Hardy drew a parallel between cholesterol buildup in blood vessels that eventually leads to cardiac disease and the buildup of amyloid plaques in the brain and the onset of Alzheimer’s. “Unfortunately, we have no equivalent of a cholesterol test to assess how much amyloid is clumping in a person’s brain,” he added. “However, that could change in the near future.”

Research suggests between 20 and 30 genes are involved in predisposing people to Alzheimer’s.

Recent research has pinpointed a group of around 20 to 30 genes that are involved in predisposing individuals to Alzheimer’s. These genes come in different variants. Some variants of a gene predispose individuals to dementia more than other variants of that gene. If a person inherits a package of genes made up of variants that particularly predispose to dementia, they are very likely to develop Alzheimer’s.
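
To make the idea of a “package” of variants concrete, here is a minimal, purely illustrative sketch of an additive genetic risk score. The gene names and weights are invented for this example; they are not real Alzheimer’s genetics.

```python
# Purely illustrative additive "genetic risk score" in the spirit of the
# paragraph above: each variant a person carries adds (or subtracts) a small
# weight, and the total shifts their predisposition. The gene names and
# weights are invented and are not real Alzheimer's genetics.

VARIANT_WEIGHTS = {
    "GENE_A_risk_variant": 0.8,
    "GENE_B_risk_variant": 0.3,
    "GENE_C_protective_variant": -0.4,
}

def predisposition_score(carried_variants):
    """Sum the weights of the variants a person carries."""
    return sum(VARIANT_WEIGHTS.get(v, 0.0) for v in carried_variants)

person = ["GENE_A_risk_variant", "GENE_C_protective_variant"]
print(f"Relative predisposition score: {predisposition_score(person):+.1f}")
```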

“We are now within five years of developing a chip that will be able to tell – from a blood test – whether a person is likely to have amyloid plaques forming inside their brains in middle age,” added Hardy. “This would then be followed up by a brain scan to confirm if this is true or not.”

This would be dementia’s equivalent of a cholesterol test. The problem is that there is, as yet, no drug that halts this amyloid buildup in the way that statins, once high cholesterol is detected, block its buildup and so head off cardiac illness. For their part, researchers argue that the use of drugs like Solanezumab – although seemingly ineffective in patients in whom amyloid plaques have become established – could be far more effective in the early stage of the condition.

Many other issues complicate our understanding of dementia, however. “A good example is provided by the immune system,” said David Reynolds, chief scientific officer of Alzheimer’s Research UK. “There is a lot of evidence now that the immune system is involved in the development of Alzheimer’s after beta amyloid clumps appear.”

However, the nature of that immune response is still not fully understood. “We do not know whether the immune system tends to overreact – as with conditions like rheumatoid arthritis in which the body’s own tissue is attacked by its own immune defences – or react weakly and allow amyloid clumps to develop when they could be stopped,” added Reynolds. “Certainly it would be unwise to wade in with drugs until we know exactly what it is we want to achieve.”

And this is where the distributed nature of the Dementia Research Institute network could prove important. Based in different university cities (Edinburgh, Oxford and Cambridge are all candidates for units), these outlying centres will focus on different aspects of the disease: environmental factors, care of dementia patients – and immunology. “The creation and direction of these centres will depend on existing expertise at that university,” added Reynolds. “A centre that focuses on immunology and dementia would be particularly useful in finding new ways to tackle the condition.”

The Dementia Research Institute network is to be supported, over the next 10 years, by £150m funding from the Medical Research Council – with further inputs of £50m each being made by Alzheimer’s Research UK and by the Alzheimer’s Society. This commitment marks a significant increase in dementia research in the UK, which had already raised its annual funding from £50m in 2008 to £90m in 2012 and is now a world leader in the field.

“It is good news but we need to put it in perspective,” said James Pickett, of the Alzheimer’s Society. “In 2012 we spent more than £500m on cancer research; there are five times more researchers working on cancer in the UK; while the number of clinical trials of dementia drugs is less than 1% of those of cancer drugs.”

At the same time, the need for some form of treatment to tackle dementia is becoming increasingly urgent. More and more people are living to their 80s and 90s when their chances of getting dementia increase markedly. There are currently 850,000 people with dementia in the UK, a figure that will rise to one million by 2025 and two million by 2051.

“We are going to have to be very nuanced in understanding all the risk factors involved in dementia – and in appreciating why factors like education and general health provide some protection against its onset,” said Professor Carol Brayne, of Cambridge. “That is going to be the strength of the institute. It offers us the opportunity, for the first time, to follow so many different avenues and approaches to dealing with and understanding the dementia.”

GROWING THREAT

Dementia overtook heart disease as the leading cause of death in England and Wales last year. More than 61,000 people died of the condition in 2015, 11.6% of all recorded deaths.

The Office for National Statistics said the increase had occurred because people were living for longer while deaths from other causes, including heart disease, had gone down. In addition, doctors are now better at diagnosing dementia, and it is appearing more often on death certificates.

The bulk of dementia deaths last year were among women: 41,283, compared to 20,403 in men.

According to the Alzheimer’s Society, dementia is the only one of the top 10 causes of death that we cannot prevent or even slow down.

The leading cause of dementia is Alzheimer’s disease, which accounts for 62% of all cases in the UK: 520,000 of the 850,000 people living with dementia in the UK today. Other forms of the disease include vascular dementia and Lewy Body dementia.

Dementia costs the UK economy approximately £26bn per year, according to the Alzheimer’s Society.

If a drug could be found to slow cognitive decline in dementia, that would delay the need for paid care and reduce the financial burden on families, the NHS and social care.

The price of life


“More life with your kids, more life with your friends, more life spent on earth—but only if you pay” was the message of AA Gill’s posthumous essay published in the Sunday Times this week. His death from lung cancer, at 62, saddened and shocked readers of his column, where he had announced less than a month before that he was suffering from “an embarrassment” of cancer.

The “paying” Gill referred to was for Nivolumab, an immune checkpoint inhibitor shown to be effective as a second-line treatment in non-small-cell lung cancer (NSCLC). A 2015 NEJM RCT showed Nivolumab increased survival in NSCLC by a modest but significant three months. [1,2] But Nivolumab is expensive, costing around £60,000 per year. How much is three months of extra life worth? For any one individual desperately hoping for more life spent on earth, this is an unanswerable question. But for our population the answer is clearer: the NICE threshold for recommending treatments for use in the NHS is between £20,000 and £30,000 per quality-adjusted life year (QALY). And this rules out Nivolumab.
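
To see why the arithmetic rules it out, here is a minimal cost-per-QALY sketch. The drug cost, survival gain and NICE threshold come from the text; the treatment duration and quality-of-life weight are illustrative assumptions, not trial data.

```python
# Back-of-envelope cost-effectiveness sketch for the comparison above.
# The £60,000/year drug cost, ~3-month survival gain and £20,000-30,000/QALY
# threshold come from the text; the treatment duration and quality-of-life
# weight are illustrative assumptions.

ANNUAL_DRUG_COST_GBP = 60_000
ASSUMED_TREATMENT_YEARS = 0.5         # hypothetical: six months on the drug
SURVIVAL_GAIN_YEARS = 3 / 12          # roughly three months of extra life
ASSUMED_UTILITY_WEIGHT = 0.7          # hypothetical quality weight (0 = dead, 1 = full health)
NICE_THRESHOLD_GBP_PER_QALY = 30_000  # upper end of the NICE range

incremental_cost = ANNUAL_DRUG_COST_GBP * ASSUMED_TREATMENT_YEARS
incremental_qalys = SURVIVAL_GAIN_YEARS * ASSUMED_UTILITY_WEIGHT
icer = incremental_cost / incremental_qalys

print(f"Incremental cost-effectiveness ratio: £{icer:,.0f} per QALY")
print("Within NICE threshold" if icer <= NICE_THRESHOLD_GBP_PER_QALY
      else "Above NICE threshold")
```

Even with fairly generous assumptions, the ratio lands several times above the NICE range.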

The good news is Nivolumab isn’t the only treatment shown to be effective in advanced lung cancer. Five years before the Nivolumab RCT, another RCT involving patients with advanced NSCLC was published in the NEJM. [3] This trial, similarly, found a survival benefit for the patients in the intervention arm of around three months. What’s more, patients also had improved quality of life and less depression. But this time the intervention was not a new and expensive cancer drug. It was specialist palliative care, introduced early and collaboratively alongside standard oncology care.

Since this 2010 NEJM trial, other studies involving people with cancer and non-cancer diagnoses have found a survival benefit of specialist palliative care. The increasing body of evidence that palliative care not only helps people with advanced disease live better, but can help them live longer, has led to recent ASCO guidelines recommending that palliative care be introduced early for patients with metastatic cancer, alongside oncology care. [4] Gone are the days of palliative care as the thing that happens after all the medicine has finished.

Both interventions, Nivolumab and specialist palliative care, improve survival in advanced NSCLC by around three months. One also improves patients’ quality of life. The other can cause enterocolitis, hepatitis, and dermatitis, among other -itises. And yet, if faced with a choice between the two, my guess is that most patients would opt for the drug. Because it’s simply hard to comprehend that something as low-tech as person-centred, holistic, palliative care could be as beneficial as an expensive (and toxic) new drug.

I don’t know if Gill received specialist palliative care before he died. I hope so, if not for the potential survival benefit, then for the benefits to his quality of life and the support it provides to his family. As a middle-class, educated, relatively young person with cancer, he was more likely than many to have accessed specialist palliative care. [5] Like expensive cancer drugs, specialist palliative care is a relatively scarce resource: not everyone who might benefit from it gets it. It is estimated that, at a minimum, three-quarters of all deaths in England would benefit from palliative care. [6] That’s 375,000 people every year: overwhelmingly more people than our current hospital, community and hospice services can care for. The irony is that, in contrast to expensive cancer drugs, palliative care can actually be cost-saving: the extra expense incurred by the specialist team is offset by the fact that patients have fewer expensive interventions and trips into hospital. [7] Investing in palliative care is good for individuals, and good for society.

What level of investment is required to provide good palliative care to everyone who might benefit from it? This was the question asked by the Review of Choice in End of Life Care in 2015. The answer was £130 million, to ensure that every dying person could access the care they need whether in hospital or community settings. In response, this year, the Government made a National Commitment to improve end of life care. What was not committed, however, was any of the £130 million needed to achieve this.

Cross-cultural surveys have shown that most people, if faced with a hypothetical terminal illness, would choose to prioritise quality of life over quantity of life. [8] Gill, facing non-hypothetical and rapidly progressing metastatic cancer, was clear about his desire for quantity. He wanted more time. More life. But this doesn’t have to be an either/or choice. It’s possible to have both quality and quantity. And investing in palliative care, alongside cancer care, is part of the solution.

What you need to know about health risk calculators


Ever since researchers with the legendary Framingham Heart Study created the first calculator to gauge the chances of having a heart attack, such tools have become a routine part of medicine. But the results aren’t as straightforward to interpret as the answers you used to get from your old high-school graphing calculator. The problem has to do with the challenge of interpreting the concept of risk.

Let’s use as an illustration the heart risk calculator designed by the American College of Cardiology and the American Heart Association to accompany their latest guidelines for the use of cholesterol-lowering statins.

It asks for 10 pieces of information, from age to cholesterol levels and smoking status, in order to estimate the chance of having a first “atherosclerotic cardiovascular disease event,” better known as heart attack or the most common type of stroke. Add your information and hit the “Calculate” button. The tool returns a number that represents your chance of having an event over the next 10 years. But what does this really mean? Experts disagree about the best interpretation.

Say the number you get is 10 percent. One way to interpret it is like this: in a group of 100 people with the same risk factors as you, 10 will have an event over the course of the next decade. Experts call this the epidemiologic risk. While it can be helpful in planning treatment and prevention efforts across an entire population, it probably doesn’t fully answer your questions about yourself. Am I going to be one of the 10 with an event or the 90 without an event? What about my other risk factors not included in the calculator? For example, you may have had one or both parents die young from a heart attack or stroke, which would increase your risk, or you may exercise every day and be at the highest level of cardiorespiratory fitness, which would lower it.
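
A small simulation makes the “group of 100 people” reading concrete. This is a generic illustration, not part of the ACC/AHA tool; the 10 percent figure is simply carried over from the example above.

```python
import random

# Illustrative reading of a 10% ten-year risk at the group level: each of 100
# people with the same risk profile independently has a 0.10 chance of an
# event over the decade. Any single group varies; only the average is ~10.

random.seed(42)
RISK = 0.10
GROUP_SIZE = 100

events = sum(1 for _ in range(GROUP_SIZE) if random.random() < RISK)
print(f"Events in one simulated group of {GROUP_SIZE} people: {events}")

# Averaging over many simulated groups recovers the 10-out-of-100 figure,
# but it still says nothing about which individuals will be affected.
TRIALS = 10_000
average = sum(
    sum(1 for _ in range(GROUP_SIZE) if random.random() < RISK)
    for _ in range(TRIALS)
) / TRIALS
print(f"Average events per group across {TRIALS} simulations: {average:.1f}")
```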

Ideally, instead of knowing the risk of having a heart attack in the next 10 years, we would rather know the definitive answer — am I going to have one or not?

Individual factors like smoking or high blood pressure or diet quality have been linked with disease risks for decades. Using statistical techniques, it is possible to capture the prognostic power of such factors into a risk estimate. If one indicator sketches an individual’s portrait, many of them working together (10 in the case of the heart risk calculator) carve a statue of the same subject, adding dimensions to the prediction.
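
As a rough illustration of how statistical techniques turn several indicators into a single estimate, the sketch below fits a logistic regression to synthetic data. It is not the ACC/AHA pooled cohort equation: the variables, coefficients and outcomes are simulated purely to show the shape of the approach.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Generic sketch of combining several risk factors into one probability.
# Data are synthetic and coefficients arbitrary; this only mimics the idea
# behind tools like the heart risk calculator, not any published equation.

rng = np.random.default_rng(0)
n = 5_000
age = rng.uniform(40, 79, n)
systolic_bp = rng.normal(130, 18, n)
smoker = rng.integers(0, 2, n)

# Hypothetical "true" relationship used to simulate 10-year outcomes.
logit = -12.0 + 0.12 * age + 0.02 * systolic_bp + 0.7 * smoker
p_event = 1 / (1 + np.exp(-logit))
event = rng.random(n) < p_event

X = np.column_stack([age, systolic_bp, smoker])
model = LogisticRegression(max_iter=1000).fit(X, event)

# Estimated 10-year risk for one hypothetical person.
person = np.array([[55, 140, 1]])
print(f"Estimated event probability: {model.predict_proba(person)[0, 1]:.1%}")
```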

As my colleagues Ralph B. D’Agostino, Allan D. Sniderman, and I wrote earlier this year in JAMA, if all past, present, and future predictors of a particular disease were known — and it was possible to quantify them — one could build an algorithm that would give a definitive answer about that disease occurring for each individual.

Until then, it’s important to make the best of the useful tools that Framingham and other reputable sources have created for us. To make that happen, physicians and other experts need to convey the result that emerges from a risk calculator in ways that people can easily grasp.

The Framingham Heart Study, for example, includes in one of its calculators a “heart/vascular age” in addition to the 10-year risk of cardiovascular disease. A heart/vascular age younger than your chronological age is good, one older than your chronological age isn’t.
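
The heart/vascular age idea can be sketched the same way: find the age at which a reference person with otherwise ideal risk factors would match your 10-year risk. The reference curve below is invented for illustration; real calculators derive it from cohort data such as Framingham’s.

```python
import numpy as np

# Illustrative "heart/vascular age": the age at which a reference person with
# otherwise ideal risk factors would have the same 10-year risk as you.
# The reference curve is made up, so treat the output as a demonstration only.

def reference_risk(age):
    """Hypothetical 10-year risk for an otherwise-ideal person of a given age."""
    return 1 - np.exp(-0.00003 * age ** 2)

your_risk = 0.10  # e.g. the 10 percent example used earlier in the piece

candidate_ages = np.arange(30, 100)
gap = np.abs(reference_risk(candidate_ages) - your_risk)
vascular_age = candidate_ages[np.argmin(gap)]
print(f"Heart/vascular age: ~{vascular_age} years")
```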

An approach we are exploring at the Duke Clinical Research Institute capitalizes on people’s natural tendency to compare themselves with others. Telling an individual that she has a 15 percent chance of having a heart attack in the next 10 years may offer some motivation to adopt healthier habits, while telling her that 90 percent of women her age have a better risk profile than hers may be even more motivational.
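
That comparative framing is straightforward to compute once you have a distribution of risks for the peer group. In the sketch below the peer distribution is simulated, since the real reference data are not given in the article.

```python
import numpy as np

# Sketch of the comparative framing: where does an individual's 15% ten-year
# risk sit within the distribution of risks for women of the same age?
# The peer distribution here is simulated for illustration.

rng = np.random.default_rng(1)
peer_risks = np.clip(rng.normal(loc=0.09, scale=0.045, size=10_000), 0.01, 0.60)

your_risk = 0.15
share_with_better_profile = np.mean(peer_risks < your_risk)

print(f"Your 10-year risk: {your_risk:.0%}")
print(f"Share of peers with a lower risk: {share_with_better_profile:.0%}")
```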

The risk estimates offered by today’s models and algorithms are limited by the data on which they were developed and the information they include. With more and more data becoming available, predictive analytics become more powerful and more useful to precision medicine, which aims to tailor treatment to individual patients. Electronic health records may help us build calculators that can provide estimates in a doctor’s office or at a patient’s bedside.

It is important to keep in mind that health risk calculators don’t assess the benefit that may come from treatment. A treatment that inadequately addresses the root causes of a disease may do little to bring down the risk, while an effective therapy may offer an incremental or long-term benefit among individuals at moderate or even low risk.

The quality of the data that go into building a risk calculator matters. Not every calculator can be trusted. Turning up high in a Google search doesn’t always mean that a calculator’s validity and performance are trustworthy. A few, like those produced by the Framingham Heart Study, have been thoroughly assessed and validated.

Risk calculators will likely become more common in everyday clinical settings, partially because more data are available to build them and partially because the medical community and the public find them to be useful. Innovations such as machine learning applied to the “big data” available in electronic health records and other sources will only serve to improve the performance of these calculators and increase their value.

But even with advances such as machine learning, we are unlikely to ever create a calculator that moves us from prognostication to certainty and delivers the correct answer for each individual. That shouldn’t dissuade us from using health risk estimates. But we cannot let the appeal of a number that appears easy to understand pass for real understanding of what that number means. If we settle for that, what will be at risk is the potential benefit of these calculators for patient health.