How to Stay Sharp in Old Age


Of all the common consequences of aging, none is more frightening than memory loss. Even if you’ve never helplessly watched a loved one succumb to Alzheimer’s—which I promise is worse than it sounds—it’s natural to wonder if something similar could happen to you.

Our collective fear of aging has long been exploited for profit; cognitive decline is no exception. Most people are terrified of losing their mental faculties as they age, and corporations know it—brain power-boosting games and apps are big business these days. Their claims are bold: Lumosity promises to help users “improve memory, increase focus, and find calm.” 2014 Apple App of the Year winner Elevate touts itself as “a brain training program designed to improve focus, speaking abilities, processing speed, memory, math skills, and more.” Using fear to sell products may be an effective marketing strategy, but those products rarely solve any actual problems.

There’s so much about dementia that we still don’t know, but one thing is certain: it’s caused by a complex confluence of many, many factors. In other words, any single prevention-minded strategy—like playing a game on your phone for a few minutes a day—probably won’t make a difference, but a multi-pronged approach just might. While the majority of risk factors are beyond our control, some of them are within our power to change, and knowing the difference is your best protection.

What is dementia, and what causes it?

There are three main types of memory loss: age-related cognitive decline, mild cognitive impairment (MCI), and dementia. Although their symptoms overlap somewhat, these are distinct conditions, and it’s important to know the differences between them.

Age-related cognitive decline

Age-related cognitive decline is what we call the mild, relatively normal memory loss that comes with aging. Just like our hair, skin, and muscles, brain cells age along with us, which can impair cell function and communication. Everyone loses some neurons as a normal part of the aging process, so mild memory problems can be chalked up to getting older.

Mild cognitive impairment

MCI lies between normal aging and dementia on the severity scale. People with MCI have more memory problems than is considered normal for their age group, but can still function on their own. (As always, determining what’s “normal” is at the discretion of a qualified medical professional.) MCI makes day-to-day tasks like remembering appointments and medications more difficult, but unlike dementia, it typically doesn’t cause behavioral changes.

Dementia

Dementia, according to the National Institute on Aging, is “the loss of cognitive functioning—thinking, remembering, and reasoning—and behavioral abilities to such an extent that it interferes with a person’s daily life and activities.” People with dementia forget appointments and medications, but they can also experience impaired vision, language skills, spatial reasoning, and decision-making. They may wander or get lost. Dementia can eventually cause personality changes: irritability, paranoia, hallucinations, aggression, unusual sexual behavior, and even physical violence.

The most common cause of dementia is Alzheimer’s disease, which can be either early- or late-onset. In late-onset Alzheimer’s, the more common type, dementia symptoms set in during or after the mid to late 60s. Early-onset Alzheimer’s is rarer, accounting for roughly 10 percent of all cases, and sets in anytime between age 30 and 60.


Scientists don’t fully understand why dementia develops, but in general, cognitive issues arise when neurons stop communicating with other brain cells and eventually die. In Alzheimer’s disease specifically, amyloid proteins and neurofibrillary (or tau) fibers clump together in abnormal formations, interrupting neuron connections and killing formerly healthy tissue. These formations, called amyloid plaques and tau tangles, are believed to at least partially explain the cognitive and behavioral changes observed in Alzheimer’s patients. The areas of the brain involved with memory are usually the first to be damaged, causing forgetfulness and broader memory loss; as the disease progresses to other parts of the brain, the patient gradually loses their ability to reason, speak, and behave normally. Eventually, the damage becomes so widespread that it affects basic physical functions like breathing and swallowing.

Who’s at risk?

The exact physiological causes of dementia are largely unknown, which makes early detection all but impossible; if there’s a precursor that shows up in routine blood work or imaging, we haven’t found it yet. For most people, dementia symptoms are their only warning, so it’s important to know your risk.

The single biggest risk factor for dementia is age. Whether it’s caused by Alzheimer’s or something else, dementia is much more common in the elderly; the NIH estimates that half of people over age 85 have some form of dementia. Family history also plays a role. Some people with no family history at all develop dementia, but as with many other medical conditions, the more people in your family who have had it, the higher your risk. Additionally, mental illness, particularly depression, is associated with an increased risk of developing dementia.

Both early- and late-onset Alzheimer’s have a genetic component, but that doesn’t mean you can evaluate your risk with a DNA test—it just means that researchers have identified some of the chromosomes and genetic mutations involved in Alzheimer’s development. Your genes are just a few of many factors at play in a complex, decades-long process; plenty of Alzheimer’s patients don’t have any of the relevant mutations at all. It is worth noting, though, that most people with Down syndrome will develop Alzheimer’s. This could be because the gene that produces amyloid proteins is located on chromosome 21, of which people with Down syndrome have an extra copy.

What can we do about it?

There’s no sugarcoating this: Dementia cannot currently be prevented, and there’s no way to stop, reverse, or slow its progression. Finding a cure is a top priority, but the ultimate goal of dementia research is to prevent it altogether—ideally through easily-adopted lifestyle changes. Scientists have explored several interventions that could delay the onset of cognitive decline, but only some of them are truly promising.

Exercise may help, but we’re not sure

Of all the potential interventions, none have been studied more than exercise. The results are mostly inconclusive. While some studies suggest that increased physical activity may delay normal age-related cognitive decline, there’s no evidence that the same is true for MCI or dementia. Still, staying physically active has enough general health benefits that it’s worth your time—it’s just not the one thing that’ll keep you from developing dementia.

Brain training games may not improve your brain in real life

Another increasingly popular intervention is “cognitive training,” or playing progressively harder games designed to challenge different parts of your brain. It’s an attractive idea: play enough games and solve enough puzzles and you, too, can improve your overall cognition. Unfortunately, the research doesn’t quite back it up. Some games show more promise than others, but for the most part, brain training mainly improves your ability to play that specific game.

For cognitive training to work, any benefits gained from playing games should carry over into related tasks in what’s known as a “transfer effect.” Proving this is way harder than it sounds: scientists disagree about which aspects of cognition correspond to brain training games, as well as how to meaningfully test for improvement. As a result, very few researchers have observed transfer effects. That hasn’t stopped corporations like Lumosity from claiming otherwise, even though there’s no proof these games can stave off cognitive decline. (In 2016, Lumosity paid $2 million to settle FTC charges of deceptive advertising.)

Treating high blood pressure may help

Something that could be more helpful is aggressive hypertension treatment, which just means bringing your blood pressure into the normal range—120/80 mmHg or less. A recent randomized clinical trial of more than 9,000 hypertensive adults found a connection between intensive blood pressure management and the risk of MCI and probable dementia: people who reduced their systolic blood pressure to 120 mmHg or lower had a significantly lower rate of MCI than those whose systolic pressure was under 140 mmHg (14.6 vs 18.3 cases per 1000 person-years, respectively). Intensive blood pressure reduction also significantly reduced the combined risk of MCI and dementia. As for probable dementia on its own, researchers observed a measurable reduction—7.2 vs 8.6 cases per 1000 person-years for the 120 mmHg and 140 mmHg groups, respectively—but it was not statistically significant.
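If you're curious how those "cases per 1,000 person-years" figures work, a few lines of Python make the arithmetic concrete. This is purely an illustrative sketch: only the per-1,000-person-year rates come from the trial as reported above; the helper function and the example counts are made up for demonstration.

```python
# Back-of-the-envelope arithmetic on the blood pressure trial figures quoted above.
# Only the per-1,000-person-year rates come from the text; the helper and the
# example counts below are illustrative, not from the study.

def rate_per_1000_person_years(cases: int, person_years: float) -> float:
    """Incidence rate: events per 1,000 person-years of observation."""
    return 1000 * cases / person_years

# Hypothetical example: 73 MCI cases observed over 5,000 person-years
assert rate_per_1000_person_years(73, 5000) == 14.6

# Relative risk reduction implied by the two reported MCI rates
intensive, standard = 14.6, 18.3  # cases per 1,000 person-years
reduction = 1 - intensive / standard
print(f"Relative MCI risk reduction: {reduction:.0%}")  # roughly 20%
```

Person-year rates are used instead of raw percentages because participants are observed for different lengths of time; dividing by total observed time puts both groups on the same footing.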

That doesn’t mean this study is bunk; quite the opposite, actually. It’s the first large-scale randomized clinical trial to find a statistically meaningful link between a common, treatable physical condition and the risk of MCI. On top of that, the study was so successful at reducing cardiovascular events and overall mortality that the blood pressure management program ended after 3.3 years—more than a year and a half early. MCI and dementia assessment continued for the full five years. Given the participants’ relative youth (about 68 years on average), the short observation window, and the fact that MCI usually presents earlier than dementia, it makes sense that significant results were only observed in relation to MCI—and therefore pretty exciting that any dementia result was observed at all. It’s always possible that future research will contradict these findings, but until then, it seems like as good a reason as any to keep your blood pressure under control.

Social interaction is our most promising strategy so far

Finally, and perhaps most promisingly, there is mounting evidence that social isolation is a major risk factor for cognitive decline and dementia. A 2017 Lancet Commission report estimates that social isolation accounts for up to 2 percent of lifetime dementia risk—just as much as hypertension. Though it’s a relatively new area of research, more and more studies are exploring the intervention potential of increased socialization. To learn more, I spoke with the author of one of these studies: Dr. Hiroko Dodge, principal investigator of Oregon Health & Science University’s I-CONECT project.

In a June 2015 Alzheimer’s & Dementia paper, Dr. Dodge et al. designed a clinical trial to test the effect of “naturalistic human contact” on cognitive function in elderly (80 years, on average) adults. About half of the participants video-chatted with trained interviewers 30 minutes a day for six weeks; the others did not. Compared with baseline scores and the control group, the video chatters showed improvement in semantic fluency (being able to find and produce words in a certain category) and psychomotor speed (reaction time). The only statistically significant results were observed in subjects with normal cognition—i.e., no impairment or dementia—but subjects with MCI still showed improvement relative to controls. The study was considered a success, and a larger-scale follow-up trial is currently ongoing.

Dr. Dodge believes that the human element of video chat is key to their observed results. In the conversation sessions, interviewers were trained to prioritize eye contact and back-and-forth conversation, two important aspects of face-to-face contact that socially isolated people don’t get enough of. Plus, video chat is accessible to the people who stand to benefit from it the most: physically and socially isolated adults. I asked Dr. Dodge if FaceTiming or video chatting with isolated elderly relatives was a good thing to do regularly. “Definitely,” she said, explaining that regular face-to-face conversations could improve cognitive compensation mechanisms—the brain’s ability to work around cognitive impairments.

Of course, a cure or prevention for dementia is a ways off. The NIH calls clinical trials the “gold standard” of medical proof, but getting statistically significant results out of them is exceptionally difficult. As Dr. Dodge explained to me, this is because variability is very high in dementia research, particularly where human subjects are concerned:

“If you ask [subjects] in the morning to do tests, and then in the afternoon to do tests, even within an individual the fluctuation is so high. … When they’re feeling good, or if they slept well last night, they do much better. If they didn’t sleep well, or if they have a little cold, that really shifts around the scores.”

She also mentioned that cognitive compensation complicates things further: people with the same degree of cognitive impairment can perform differently on tests depending on how (or if) they’ve learned to cope with it.

Social isolation research is promising, but it’s just beginning—and until it studies more people of different ages, ethnicities, nationalities, genders, and socioeconomic classes, we won’t know for sure just how much it can help.

Taken together, the body of research on dementia intervention suggests that staying socially and physically active is our best bet for long, healthy lives. However, as Dr. Dodge reminded me, you can do everything “right” and still get dementia—so we have got to stop blaming people for failing to prevent an unpreventable disease. “If somebody gets dementia, others may say, ‘Oh, she didn’t do social interaction, or she didn’t do cognitive stimulation’ … unfortunately, some people will get the disease, and it’s not their fault.”

NIH-supported study identifies 11 new Alzheimer’s disease risk genes.


An international group of researchers has identified 11 new genes that offer important new insights into the disease pathways involved in Alzheimer’s disease. The highly collaborative effort involved scanning the DNA of over 74,000 volunteers—the largest genetic analysis yet conducted in Alzheimer’s research—to discover new genetic risk factors linked to late-onset Alzheimer’s disease, the most common form of the disorder.

By confirming or suggesting new processes that may influence Alzheimer’s disease development—such as inflammation and synaptic function—the findings point to possible targets for the development of drugs aimed directly at prevention or delaying disease progression.

Supported in part by the National Institute on Aging (NIA) and other components of the National Institutes of Health, the International Genomic Alzheimer’s Project (IGAP) reported its findings online in Nature Genetics on Oct. 27, 2013. IGAP comprises four consortia in the United States and Europe that have been working together since 2011 on genome-wide association studies (GWAS) involving thousands of DNA samples and shared datasets. GWAS are aimed at detecting the subtle gene variants involved in Alzheimer’s and defining how the molecular mechanisms influence disease onset and progression.

“Collaboration among researchers is key to discerning the genetic factors contributing to the risk of developing Alzheimer’s disease,” said Richard J. Hodes, M.D., director of the NIA. “We are tremendously encouraged by the speed and scientific rigor with which IGAP and other genetic consortia are advancing our understanding.”

The search for late-onset Alzheimer’s risk factor genes had taken considerable time, until the development of GWAS and other techniques. Until 2009, only one gene variant, Apolipoprotein E-e4 (APOE-e4), had been identified as a known risk factor. Since then, prior to today’s discovery, the list of known gene risk factors had grown to include other players—PICALM, CLU, CR1, BIN1, MS4A, CD2AP, EPHA1, ABCA7, SORL1 and TREM2.

IGAP’s discovery of 11 new genes, reported today, strengthens evidence about the involvement of certain pathways in the disease, such as the role of the SORL1 gene in the abnormal accumulation of amyloid protein in the brain, a hallmark of Alzheimer’s disease. It also offers new gene risk factors that may influence several cell functions, including the ability of microglial cells to respond to inflammation.

The researchers identified the new genes by analyzing previously studied and newly collected DNA data from 74,076 older volunteers with Alzheimer’s and those free of the disorder from 15 countries. The new genes (HLA-DRB5/HLA-DRB1, PTK2B, SLC24A4/RIN3, DSG2, INPP5D, MEF2C, NME8, ZCWPW1, CELF1, FERMT2 and CASS4) add to a growing list of gene variants associated with onset and progression of late-onset Alzheimer’s. Researchers will continue to explore the roles played by these genes, including:

- How SORL1 and CASS4 influence amyloid, and how CASS4 and FERMT2 affect tau, another protein hallmark of Alzheimer’s disease

- How inflammation is influenced by HLA-DRB5/DRB1, INPP5D, MEF2C, CR1 and TREM2

- How SORL1 affects lipid transport and endocytosis (or protein sorting within cells)

- How MEF2C and PTK2B influence synaptic function in the hippocampus, a brain region important to learning and memory

- How CASS4, CELF1, NME8 and INPP5D affect brain cell function

The study also brought to light another 13 variants that merit further analysis.

“Interestingly, we found that several of these newly identified genes are implicated in a number of pathways,” said Gerard Schellenberg, Ph.D., University of Pennsylvania School of Medicine, Philadelphia, who directs one of the major IGAP consortia. “Alzheimer’s is a complex disorder, and more study is needed to determine the relative role each of these genetic factors may play. I look forward to our continued collaboration to find out more about these—and perhaps other—genes.”

Schellenberg heads the Alzheimer’s Disease Genetics Consortium (ADGC), one of the four founding partners of IGAP. The ADGC is a collaborative body established and funded by the NIA with the goal of identifying genetic variants associated with risk for Alzheimer’s. Schellenberg noted that the study was made possible by the research infrastructures established and supported by the NIA over many years, including 29 Alzheimer’s Disease Centers, the National Alzheimer’s Coordinating Center, the NIA Genetics of Alzheimer’s Disease Data Storage Site, the Late-onset Alzheimer’s Disease Family Study, and the National Cell Repository for Alzheimer’s Disease. These endeavors collect, store and make available to qualified researchers DNA samples, datasets containing biomedical and demographic information about participants, and genetic analysis data.

Blood Test May Predict Ketamine’s Antidepressant Effect.


Researchers have identified biomarkers that they hope will eventually help predict which patients with bipolar depression will respond to subanesthetic doses of ketamine.

An experimental blood test using pharmacometabolomics found patterns, or metabolic “fingerprints,” that differ between people whose bipolar depression improved with intravenous ketamine and those who did not get better. Pharmacometabolomics uses sophisticated chemistry techniques to quantify and analyze metabolites that the body produces in response to drugs.

“This work is telling us what we should measure [in the blood],” study coauthor Irving Wainer, PhD, from the National Institute on Aging’s Intramural Research Program in Baltimore, Maryland, told Medscape Medical News. “Then we’ll develop the technology to predict, pretreatment, which patients will respond to ketamine treatment and find ways to individualize and optimize treatment with ketamine.”

The research findings were presented here recently at the American Society of Anesthesiologists (ASA) 2013 Annual Meeting.

Rapid Antidepressant Effect

Recent studies of this investigational use of ketamine, a glutamate N-methyl-D-aspartate (NMDA) receptor antagonist, show that many patients with treatment-refractory bipolar depression (Biol Psychiatry, 2012;71:939-946) or other types of major depression respond rapidly to intravenous infusion of ketamine — typically within several hours.

However, 1 in 3 patients do not respond to this treatment at all, said another author of the new study, Michael Goldberg, MD, an anesthesiologist from Cooper University Health Care in Camden, New Jersey.

“It’s not ethical to put everyone [with depression] on ketamine because it has risks. Hallucinations are common and may require other medications to counteract,” Dr. Goldberg told Medscape Medical News.

Knowing before ketamine administration who would be least likely to benefit would spare them from exposure to the drug and its common dissociative side effects, he said.

Toward that end, the researchers collected blood samples from 22 patients with bipolar disorder and major depression both before and after a single intravenous infusion of 0.5 mg/kg of ketamine or a placebo; the patients then were crossed over to the alternate therapy after a 1-week drug washout.

Patients also received treatment with either lithium (n = 16) or valproate (n = 6) as a mood stabilizer and responded to this therapy, according to Dr. Goldberg. Response to ketamine was identified 230 minutes after infusion using the Montgomery-Åsberg Depression Rating Scale.

The investigators analyzed the metabolomic patterns in the blood samples using liquid chromatography/quadrupole time-of-flight mass spectrometry and capillary electrophoresis/laser-induced fluorescence.

They also reportedly identified a compound that ketamine breaks down into, which they called (2S,6S)-hydroxynorketamine.

Quick Turnaround Test

In the group receiving comedication with lithium, patients who did not respond to ketamine treatment (“nonresponders”) had significant post-treatment differences in 18 metabolites compared with ketamine responders, according to the abstract (P values not reported).

These included significantly increased levels of the fatty acids phenyllactic acid and monoglyceride and significantly decreased levels of trimethyl-L-lysine, lysophosphatidylethanolamine, and lysophosphatidylcholine.

The same trend was seen in the patients receiving valproate, but the number of patients in this group was not large enough to determine statistical significance, Dr. Goldberg stated.

Independently of the mood stabilizer, nonresponders also reportedly had significantly higher levels of the amino acid D-serine. In their abstract, the authors suggested a difference in the activity of the enzyme serine racemase between responders and nonresponders, possibly owing to an indirect change in NMDA receptor activity.

Although most patterns observed in metabolites were of fatty acids, some were metabolites of ketamine, Nagendra Singh, PhD, said during the oral presentation of the findings. Dr. Singh, who was not a coauthor of the study, will soon be joining Cooper University Health Care as a research associate.

Of 323 metabolites analyzed, only 2 were reportedly the same between the lithium and valproate groups.

“Two different mood stabilizers had very different metabolite patterns in their fatty acids, which was very interesting,” Dr. Singh said. “This clearly demonstrates that lithium and valproate affect fatty acids differently.”

The researchers speculated that patients with bipolar depression respond to ketamine on the basis of various endogenous factors, including fatty acid metabolism and the plasma levels of endogenous compounds involved in neurotransmission.

A limitation of the study, said Dr. Wainer, is that it was not designed prospectively for pharmacometabolomics. He described the pharmacometabolomic process as “time-consuming and costly.”

Dr. Wainer said his team plans to limit future analyses to 5 or 6 metabolites.

“Our goal is to develop the technology so that there will be on-site rapid turnaround of blood test results,” he said.

Not Ready for Routine Use

Commenting on the study for Medscape Medical News, Timothy Lineberry, MD, associate professor of psychiatry at Mayo Clinic, Rochester, Minnesota, said that identifying who will benefit from and have good response to ketamine is important.

Dr. Lineberry, who was not involved with the study, said that there are not yet enough research data for physicians to routinely use ketamine in clinical practice for this off-label purpose.

“There is also a need for identifying what are the psychological markers associated with response to ketamine and the particular diagnoses that will respond to treatment,” he added.

Dr. Goldberg said the research team next plans to study pharmacometabolomics after ketamine infusion in patients with post-traumatic stress disorder.

Cardiac disease linked to mild cognitive impairment.


Cardiac disease is associated with an increased risk of mild cognitive impairment, such as problems with language, thinking and judgment, according to a study.

The study, by researchers with the Mayo Clinic, found the connection was stronger in women with heart disease than in men.

Known as nonamnestic because it does not include memory loss, this type of mild cognitive impairment may be a precursor to vascular and other non-Alzheimer’s dementias, the researchers noted. Mild cognitive impairment is an important stage for early detection and intervention in dementia, said Rosebud Roberts, MB, ChB, the study’s lead author and a health sciences researcher at the Mayo Clinic.

“Prevention and management of cardiac disease and vascular risk factors are likely to reduce the risk,” Roberts said in a news release.

The researchers evaluated 2,719 people ages 70 to 89 at the beginning of the study and every 15 months after. Of the 1,450 without mild cognitive impairment at the beginning, 669 had heart disease and 59 (8.8%) developed nonamnestic mild cognitive impairment. In comparison, 34 (4.4%) of the 781 who did not have heart disease developed nonamnestic mild cognitive impairment.
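For readers who want to check the math, those counts imply roughly a doubling of risk. A minimal sketch in Python (the variable names are mine; only the counts come from the study as reported):

```python
# Incidence of nonamnestic MCI in the Mayo Clinic cohort, per the figures above.
mci_with_hd, total_with_hd = 59, 669        # participants with heart disease
mci_without_hd, total_without_hd = 34, 781  # participants without heart disease

risk_with = mci_with_hd / total_with_hd           # ~0.088, i.e. 8.8%
risk_without = mci_without_hd / total_without_hd  # ~0.044, i.e. 4.4%

# Relative risk: how many times more likely the exposed group is to develop MCI
relative_risk = risk_with / risk_without
print(f"{risk_with:.1%} vs {risk_without:.1%}; relative risk about {relative_risk:.1f}")
```

Note that a relative risk computed this way is unadjusted; the study's published estimates control for age, sex, and other factors, so the raw ratio is only a rough guide.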

The association varied by sex, with cardiac disease and mild cognitive impairment appearing together more often among women than men.

Source: JAMA