Study reveals link between cannabis use and current asthma prevalence in US adolescents and adults



Asthma is more common among U.S. individuals who report cannabis use in the past 30 days, and the odds of asthma are significantly greater among individuals who reported cannabis use 20 to 30 days per month, according to a new study by researchers at Columbia University Mailman School of Public Health, the City University of New York, and Children’s National Hospital at George Washington University.

Until now, little was known about cannabis use among youth and its relationship with asthma. The findings are published in the journal Preventive Medicine.

The study results show that the more frequent the use, the higher the likelihood of asthma, and there is little change after adjusting for cigarette use.

“With the growing use of cannabis across the U.S., understanding potential links between cannabis use and asthma is increasingly relevant to population health. This relationship is an emerging area and requires thorough collaborative investigation by experts in these fields,” said corresponding author Renee Goodwin, Ph.D., in the Department of Epidemiology at Columbia Mailman School of Public Health and Epidemiology at the City University of New York.

Data were drawn from the 2020 National Survey on Drug Use and Health, a representative annual survey of 32,893 individuals aged 12 and older in the United States. The researchers used regression modeling to examine the relationship between the frequency of any cannabis and/or blunt (i.e., cannabis smoked in a hollowed-out cigar) use in the past 30 days and current asthma, adjusting for demographics and current cigarette smoking.
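To make the modeling step concrete, below is a minimal, hypothetical sketch of what such an adjusted logistic regression could look like in Python with statsmodels; the data file, column names, and covariate coding are illustrative only, and the published analysis used the survey's design weights and its own variable definitions.

```python
# Hypothetical sketch: adjusted logistic regression of current asthma on
# past-30-day cannabis use frequency, demographics, and cigarette smoking.
# Column names and the input file are illustrative, not from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nsduh_2020_subset.csv")  # hypothetical analytic extract

formula = (
    "current_asthma ~ C(cannabis_days_band, Treatment(reference='0 days'))"
    " + age_group + sex + C(race_ethnicity) + current_cigarette_smoker"
)
model = smf.logit(formula, data=df).fit()

# Exponentiated coefficients are adjusted odds ratios for each use-frequency band.
print(np.exp(model.params).round(2))
```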

Current asthma was more common among individuals who reported cannabis use in the past 30 days than among those who did not (10% vs. 7.4%). The odds of asthma were significantly greater among individuals reporting cannabis use 20-30 days per month, and blunt use 6-15 and 20-30 days per month, than among those who reported no use. Overall, the prevalence of asthma in the sample was 7.4%.

“Our findings add a significant layer to the nascent body of research on potential harms associated with cannabis use by being the first to show a link between cannabis use in the community and respiratory health risks, specifically increased asthma prevalence. Examining asthma prevalence in both adolescents and adults helps to inform public health initiatives and policies geared towards mitigating its risks, and underscores the importance of understanding the interplay between cannabis use and respiratory health,” said Goodwin.

New York Declares Social Media As ‘Public Health Hazard’, Same As Tobacco And Guns



New York City Mayor Eric Adams on Wednesday officially declared social media an ‘environmental toxin’ and ‘public health hazard’, putting it in the same category as tobacco and guns. With this, the Big Apple becomes the first major American city to issue such an advisory against social media.

Adams further criticized TikTok, YouTube, and Facebook, blaming the three platforms for mental health issues in children. His observation is based on recent surveys showing that teen depression has hit its highest level in a decade. In the advisory, the New York mayor added that parents should impose ‘tech-free times’ for children. The Democrat also urged teens to consider turning off their notifications and tracking their emotions while online.

The city’s Department of Health and Mental Hygiene also identified unrestricted access to and use of social media as a public health hazard.

“Today, Dr. Ashwin Vasan is issuing a Health Commissioner’s Advisory, officially designating social media as a public health hazard in New York City,” Adams announced during his State of the City address. The advisory cited a 2021 survey stating that on weekdays 77% of New York City high schoolers spent three or more hours per day in front of screens, not including homework.


Adams added that the platforms are “fueling a mental health crisis by designing their platforms with addictive and dangerous features.”

“We are the first major American city to take this step and call out the danger of social media like this. Just as the surgeon general did with tobacco and guns, we are treating social media like other public health hazards and ensuring that tech companies take responsibility for their products,” Adams said.

Can Taking Vitamin D Help Reduce the Risk of Dementia?


Canadian and UK-based scientists have completed a large-scale study to determine whether vitamin D supplements can help prevent the onset of dementia and uncovered some impressive results.

DEMENTIA IS A GROWING PROBLEM AROUND THE WORLD

Dementia is a growing concern worldwide; it is predicted that the number of people with dementia will triple in less than 30 years. This debilitating condition can have a profound impact on the individual suffering from it as well as on their loved ones. 

At the moment, there exists no cure for dementia. However, scientists are continuously exploring ways to slow or stop its onset and progression. One such avenue of investigation is how keeping your brain active with games may defer dementia for several years.

Another commonly discussed avenue is the link between vitamin D intake and dementia. While there is some evidence to suggest that vitamin D may have the potential to reduce the risk of dementia, studies have produced mixed results.

However, scientists at the University of Exeter and the University of Calgary have shed new light on this topic, highlighting specific groups that could benefit from vitamin D supplementation. Their findings suggest that starting vitamin D supplementation earlier, before cognitive decline sets in, may be particularly effective.

IMPRESSIVE RESULTS

The researchers explored the link between vitamin D supplements and the onset of dementia in a large-scale study with over 12,000 participants. The team’s findings were impressive, revealing a 40% reduction in dementia diagnoses among the group who took supplements.

Out of the total sample of participants, 2,696 individuals developed dementia during the 10-year study period. Of these, 2,017 (75%) had not been exposed to vitamin D during any of the visits prior to their diagnosis, while 679 (25%) had been exposed to vitamin D at baseline.

MORE EFFECTIVE FOR WOMEN

Although vitamin D intake had a helpful effect in all groups, the research team observed that the impact was significantly more pronounced in females than in males. Additionally, the study found that vitamin D had a greater effect on individuals with normal cognition than on those who reported mild cognitive impairment, which is associated with a higher risk of developing dementia.

Furthermore, the study revealed that the effects of vitamin D were notably more substantial in individuals who did not possess the APOEe4 gene, which is associated with a higher risk for Alzheimer’s dementia. 

This suggests that carriers of the APOEe4 gene may absorb vitamin D more effectively from their intestine, potentially reducing the impact of vitamin D supplementation. However, the researchers did not measure blood levels to test this hypothesis.

Nevertheless, these findings highlight the importance of considering genetic factors when developing personalized strategies for preventing or treating dementia.

Dr. Bryon Creese (one of the authors) stated in a press release that, given the rising number of people affected by dementia, preventing or delaying its onset has become a critical concern. 

The study’s association between vitamin D and dementia suggests that taking vitamin D supplements could be a useful preventative measure, but clinical trials are needed to confirm this hypothesis. 

The University of Exeter is currently conducting further research by randomly assigning participants to either take vitamin D or a placebo and examining changes in memory and thinking tests over time.

They published their findings in the peer-reviewed journal Alzheimer’s & Dementia: Diagnosis, Assessment & Disease Monitoring.

Ductal Carcinoma Quadruples Risk of Invasive Breast Cancer



Women who are diagnosed with ductal carcinoma in situ (DCIS) are around four times as likely to develop invasive breast cancer, and to die from breast cancer, as women in the general population, according to a study published by The BMJ this week. This increased risk lasted for at least 25 years after diagnosis, suggesting that DCIS survivors may benefit from regular checks for at least three decades, the researchers said.

In addition, this suggests that a more personalized, risk-based approach to breast cancer screening might be possible, one that considers family history and hereditary genetic variants.

The study’s lead author is Gurdeep S. Mannu, of the University of Oxford.

DCIS occurs when malignant breast cells are found but have not spread beyond the milk ducts. The condition isn’t immediately life-threatening, but can increase the risk of developing invasive breast cancer in the future. 

The NHS breast screening programme often detects DCIS, but some diagnoses occur outside the programme, either because women are not in the eligible 50–70 year age range, because they did not respond to a screening invitation, or because their DCIS developed between screens.

An earlier study by the same authors found that screen-detected DCIS is associated with more than twice the risk of invasive breast cancer and breast cancer-related death compared with the general population, but long-term rates after non-screen-detected disease are still unclear.

To address this, the authors used data from the National Disease Registration Service to compare rates of invasive breast cancer and death from breast cancer after non-screen detected DCIS with national rates for women of the same age in the same calendar year, and with women diagnosed with DCIS by the NHS breast screening programme.

Their findings are based on all 27,543 women in England diagnosed with DCIS outside the NHS breast screening programme from 1990 to 2018. 

They found that by December 2018, 3,651 women had developed invasive breast cancer, a rate of 13 per 1,000 per year and more than four times the number expected from national rates.

In the same group of women, 908 died from breast cancer, a rate of three per 1,000 per year and almost four times the number expected from national rates. For both invasive breast cancer and death from breast cancer, the increased risk continued for at least 25 years after DCIS diagnosis.
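As a rough illustration of how these incidence rates relate to the case counts above (a rate per 1,000 per year is the number of events divided by the total person-years of follow-up, times 1,000), the rounded figures quoted here imply on the order of 280,000 woman-years of observation. This is a back-of-the-envelope check using only the numbers in the text, not a figure reported by the study.

```python
# Back-of-the-envelope check using only the rounded figures quoted above.
# rate (per 1,000 per year) = 1,000 * events / person_years, so
# person_years ≈ 1,000 * events / rate.
invasive_events = 3_651
invasive_rate_per_1000 = 13
person_years = 1_000 * invasive_events / invasive_rate_per_1000
print(f"Implied follow-up: ~{person_years:,.0f} woman-years")          # ~280,846

# With 27,543 women in the cohort, that averages roughly a decade each.
print(f"Average follow-up: ~{person_years / 27_543:.1f} years per woman")
```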

The authors point to limited information on lifestyle and health-related behavior. But they say “we consider the overall quality of the data underpinning the conclusions in our study remains high.” They explain that, after a DCIS diagnosis, women are offered yearly mammograms for the first five years, with those who are then aged 50–70 entering the NHS breast screening programme and receiving invitations to attend for screening at three yearly intervals thereafter, until aged 70.

“We have, however, provided evidence that the increased risk of invasive disease and breast cancer death following a diagnosis of DCIS in both screen detected and non-screen detected DCIS lasts for at least 25 years,” they write.

“These findings should inform considerations regarding the frequency and duration of surveillance following a diagnosis of DCIS, particularly for women diagnosed at younger ages,” they conclude.

Researchers writing in a linked editorial ask whether it is time for risk-based screening and follow-up after a DCIS diagnosis.

Opportunities for a more personalized risk-based approach to breast cancer screening might be possible, especially for younger women, they say. Other factors that need to be considered include family history and hereditary genetic variants.

In conclusion, they say this study is highly relevant for three reasons. Firstly, to showcase the often overlooked risks of non-screen detected DCIS in the context of the ongoing debate about overdiagnosis and overtreatment of DCIS. 

Secondly, because the results suggest that longer follow-up after DCIS might be warranted, since risks remain high for a long period after diagnosis; and finally, because the study provides essential information for further development of personalized, risk-based screening strategies.

Eyes Provide Window into Many Diseases



Researchers have found that the eye can provide a window into many different diseases, with thinner retinal layers having a genetic basis and an association with ocular and cardiometabolic diseases as well as neuropsychiatric conditions.

The findings suggest that markers for systemic and ocular health could be developed from optical coherence tomography (OCT), a routinely used technique for obtaining 3D images of the eye that requires minimal training.

The results appear in the journal Science Translational Medicine.

“Our study highlights how retinal imaging may be integrated with electronic health records, genomic data, and other biomarkers to advance our understanding of disease mechanisms and help inform risk prediction and risk modification strategies,” report Seyedeh Maryam Zekavat, from Massachusetts Eye and Ear Infirmary, and colleagues.

Changes in retinal thickness have previously been linked with systemic disease.

In the current study, Zekavat and co-workers studied retinal images captured from 44,823 participants in the UK Biobank using OCT.

They then studied the relationship between the thickness of nine retinal layers and 1,866 incident conditions, 88 quantitative traits and blood biomarkers, and over 13 million genetic variants.

Specifically, the researchers performed OCT layer cross-phenotype and genome-wide association analyses to identify which phenotypes were associated with the different layers of the retina, and what genetic variants influenced these layers.

A schematic showing the structure of the retina and measurements of the thickness of nine retinal layers, based on the retinal OCT imaging data available in the UK Biobank (N=44,823).

Genome-wide association studies identified inherited genetic markers influencing retinal layer thicknesses, which were replicated among 6,313 individuals in the LIFE-Adult study.

Overall, 259 loci were linked with retinal layer thickness.

During a median of 10 years of follow up, the team identified links between retinal layer thickness and incident ocular, neuropsychiatric, and cardiometabolic diseases.

Retinal layer thickness was also associated with mortality during the decade of follow-up: after adjusting for multiple sociodemographic factors, mortality risk was greatest among individuals with thinner photoreceptor segment layers and thinner ganglion cell complex (GCC) layers.

Consistency between epidemiologic and genetic associations indicated a relationship between the thickness of particular retinal layers assessed using OCT and particular ocular and systemic conditions.

For example, thinner photoreceptor segment (PS) layers were linked with age-related macular degeneration and with poorer cardiometabolic and pulmonary function.

There were also links between a thinner retinal nerve fiber layer (RNFL) and glaucoma.

Among incident cardiometabolic diseases, each standard deviation decrease in PS layer thickness was significantly associated with the risk of future incident hypertension (hazard ratio [HR]=1.09), hypercholesterolemia (HR=1.10), myocardial infarction (HR=1.17), nonhypertensive congestive heart failure (HR=1.25), cerebrovascular disease (HR=1.15), peripheral vascular disease (HR=1.32), and abdominal aortic aneurysm (HR=1.47).
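To illustrate how a hazard ratio per standard deviation of layer thickness is typically estimated, here is a minimal, hypothetical sketch using the lifelines survival-analysis library; the data file, column names, and covariates are illustrative assumptions and do not reproduce the authors' actual pipeline.

```python
# Hypothetical sketch of estimating a hazard ratio per 1-SD thinner retinal
# layer with a Cox proportional hazards model; not the study's exact analysis.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("oct_cohort.csv")  # hypothetical table: one row per participant

# Standardize PS thickness and flip the sign so the coefficient describes the
# hazard per 1-SD *thinner* layer, matching how the results above are reported.
mean, sd = df["ps_thickness_um"].mean(), df["ps_thickness_um"].std()
df["ps_thinner_sd"] = -(df["ps_thickness_um"] - mean) / sd

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "incident_hypertension", "ps_thinner_sd", "age", "sex"]],
    duration_col="followup_years",
    event_col="incident_hypertension",
)
cph.print_summary()  # exp(coef) for ps_thinner_sd is the HR per SD thinner layer
```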

The authors add: “Other retinal layers where thinning was associated with increased risk of future circulatory disease included GCC for aortic aneurysms and congestive heart failure, RNFL for hypertension, heart failure, cerebrovascular disease, and paroxysmal supraventricular tachycardia, and [choroid-scleral interface] for ischemic heart disease and hypertrophic cardiomyopathy.”

Gene Identified That Plays a Role in People Who Recover from Dilated Cardiomyopathy



Researchers at the Mayo Clinic investigating factors related to recovery from dilated cardiomyopathy have identified a variant in a specific gene that helps distinguish people who recover from the condition from those who don’t. Results of the genome-wide association study (GWAS), published in Circulation Research, are expected to spur research targeting the gene for future dilated cardiomyopathy treatments.

“We found genetic variation in the CDCP1 gene, a gene that no one has heard of in cardiology, and its link to improvement in heart function in these patients,” says lead author Naveen Pereira, MD, a Mayo Clinic cardiologist.

Cardiomyopathy is characterized by its effect on the heart’s left ventricle, making it more difficult to pump blood throughout the rest of the body.

According to the investigators, CDCP1 genetic variation leads to differences in the protein’s structure, which may influence a person’s susceptibility to disease or response to specific drugs. Once the gene was associated with improving function in the left ventricle, the team examined why this occurs. A key finding was that the CDCP1 gene is often variably expressed in fibroblasts of people with dilated cardiomyopathy, and fibrosis—an excess of fibrous connective tissue in the heart—plays a central role in the prognosis of patients with the condition. Further, Pereira noted that genetic variation in or near CDCP1 has a significant association with heart failure death.

Strengthening the case for CDCP1 as a drug target, the Mayo Clinic researchers observed that decreasing the expression of the gene in cardiac connective tissue decreased fibroblast proliferation in the heart. It also downregulated the IL1RL1 gene, which encodes a prominent heart failure biomarker, aST2, which, when found at high levels, is associated with fibrosis and death. This suggests the importance of developing a better understanding of the relationship between CDCP1 and aST2 as researchers search for ways to treat heart failure.

This finding now comes as heart failure rates are expected to rise significantly over the rest of the decade. The American Heart Association forecasts that heart failure will affect eight million people in the U.S. by 2030, a 46% increase over current rates. Of those cases, between 30% and 40% are caused by dilated cardiomyopathy.

Building on this research, the Mayo Clinic team is continuing studies in animal models to better understand the role CDCP1 plays in heart failure, and they are developing molecules to assess their potential as treatments for dilated cardiomyopathy.

“By continuing with this research that started with a human population that we took to the molecular and now animal laboratory, we hope to find new avenues for treatments to take back to the human population we studied, to improve patients’ survival and quality of life ultimately,” says Pereira.

Revolutionizing Depression Treatment: The Psilocybin Breakthrough


Discover how psilocybin, the active compound in ‘magic mushrooms,’ is challenging conventional antidepressant therapies. Could this controversial substance hold the secret to healing deep-seated mental anguish?

When standard antidepressants fail, where can the estimated 100 million people trapped in disabling depression turn?1 Emerging research suggests an answer drawing scientists and psychiatrists back to prohibited fungi rather than forward to yet-undiscovered pills.

In a pioneering 19-patient trial, a single dose of psilocybin mushrooms brought nearly half into full remission for weeks—an accomplishment unmatched by even newly approved ‘rapid-acting’ antidepressants.2 Yet guidelines still recommend escalating conventional drugs that may never provide real relief. As lead author Dr. Guy Goodwin declares: 

“We’re witnessing no less than a paradigm shift away from just daily meds towards finally holistically healing the mind itself.”3

No Quick Fixes for Treatment-Resistant Depression…Except Maybe Psychedelics  

When multiple antidepressant trials yield little lasting improvement, a patient meets the diagnosis of “treatment-resistant” —a status held by roughly 30% today.4 Few emerge from this dark abyss through therapy tweaks or combination meds alone.5 In fact, remission plummets from nearly 40% to less than 20% after a first failed antidepressant.6  

Perhaps no population urgently needs innovative solutions more. Hopelessness and the risk of side effects compound with each new ineffective prescription. Issues earning income or parenting often rise while motivation, concentration, and sleep suffer.7 No wonder treatment-resistant depression doubles suicide attempt rates.8

Psilocybin as Rapid-Acting Antidepressant 

Psilocybin mushrooms contain psilocybin and psilocin—serotonin-like compounds triggering vivid hallucinations and spiritual experiences by stimulating 5-HT2A receptors.9 Though illegal and controversial, psilocybin intrigue erupted in 2016 when UK researchers found a single dose delivered rapid antidepressant responses lasting 6 months for cancer patients.10

Another 12-patient study reported 71% remission at 5 weeks after psilocybin-assisted therapy.11 Confirming these results, last October researchers led by Dr. Goodwin announced a randomized 233-patient trial—the largest of its kind ever conducted.12 Two doses of psilocybin bracketed by psychological support outperformed escitalopram (Lexapro) in treatment-resistant depression: after 3 weeks, 57% improved on psilocybin versus just 28% of those assigned medication.

The psilocybin groups also enjoyed nearly 4-fold higher complete remission—a holy grail combination of nearly vanished symptoms plus restored functioning lasting months. By contrast, newly approved esketamine (Spravato)—the first antidepressant innovation in decades—demonstrated 37% response and 29% remission after 6 weeks when doubly augmented by daily oral antidepressants.13

Adding Psilocybin to Antidepressants 

The recent 19-patient trial, led once more by Imperial College’s Dr. Goodwin, breaks further new ground.14 Rather than withdrawing antidepressants to substitute psilocybin, participants added a single 25 mg psilocybin dose alongside their regular SSRI, such as Prozac, Zoloft, or Lexapro. Remission still proved twice as frequent as with the best modern comparators, with 42% remaining in remission a month out.

Side effect frequency also proved comparable to placebo, with mostly mild headaches resolving quickly. This relieves concerns that combining serotonin-elevating compounds could risk serious toxicity. Beyond symptom relief, 89% rated their psilocybin session “spiritually significant,” with lasting life improvements in relationships, creativity, mindfulness, and nature connectedness.

Mechanisms: A Mystical Connection 

Researchers increasingly recognize depression as a disorder of both mood and worldview—a mind besieged by negative biases that narrows options.15 Roland Griffiths, a founder of modern psychedelic therapy research, proposes the compound “opens a window of awareness compared to normal waking consciousness” through which entrenched mental patterns loosen their grip.16

Brain scans reveal psilocybin hyperconnects neural regions while damping habitual fear and control circuits. Users describe emerging “unity” and “interconnectedness” with existence often lasting years.17

Author Michael Pollan suggests this temporary ego dissolution offers a wellspring of empathy and flexibility, refreshing rigid perspectives.18 Users feel less “apart from nature” and more appreciation towards loved ones. By flexing mental muscles of holism and awe at existence’s interconnectivity, Pollan theorizes psilocybin “trips” may act as emotional anti-inflammatories cooling searing psychological pain. The more mystical the experience, research corroborates, the further depression and anxiety recede months later.19

Beyond a Chemical Cure 

Antidepressant medications are designed to address falling serotonin or norepinephrine activity. Psilocybin, within a scaffolding of psychological support before and after, can be used to pursue no less than a wholesale mental renewal of the patient.20 Whereas daily regimens only suppress symptoms, this soul-searching expedition seeks a reconfiguration of one’s entire worldview and relationship to the spiritual domains of existence and to one’s very own soul.

By inducing a temporary ‘psychosis,’ psilocybin may facilitate relearning pathways out of stolid paralysis in a redirected brain state.21 Goodwin envisions adding psilocybin sessions

“to help reset rigid thoughts holding people back once conventional meds provide initial symptom relief.”22

The Verdict: A “Paradigm Shift” in Mental Health 

More research is still needed, but depression dynamos like Goodwin and Pollan envisage an approaching reality where therapy offices offer guided psychedelic voyages. Instead of chasing chemical tweaks to cumbersome daily meds, psilocybin provides periodic mental spring cleaning—a reboot for worldviews warped by despair and isolation.23

With replication studies underway in the UK, North America, Australia, and Europe on disorders from alcoholism to anorexia, evidence supporting psilocybin therapy mounts.24 As Pollan concludes, banned mushrooms may yet

“augur a paradigm shift in the treatment of these conditions, perhaps even replacing antidepressants and talk therapy.”25

Two-thirds of those receiving psilocybin rate it among their life’s most meaningful experiences—a gateway to self-transcendence offering liberation where medications never quite reached.

New Study Finally Sheds Light On One of the Universe’s Weirdest Objects


These weird celestial objects can form like stars or planets, but the end result is the same: a gas giant almost big enough to be a star.


It’s not entirely fair to call brown dwarfs failed stars. At least some of them are actually really ambitious planets, a new study suggests.

Brown dwarfs are known amongst astronomers as celestial misfits. They’re too massive to be a gas giant like Jupiter but not quite massive enough to be a star. (For context, the smallest stars are about 80 times Jupiter’s mass.) Astronomers often refer to these awkwardly-sized objects as “failed stars” (so rude) since they seem to form as stars do — coalescing out of dense clouds of interstellar gas and dust – but they don’t grow large enough to kickstart nuclear fusion in their cores. But a recent study found at least one brown dwarf that formed like a planet, gobbling up a wide swath of material from the disk orbiting a newborn small star.

That means there’s more than one way to make a brown dwarf, and these dim giants may not be failures after all: Some of them are wildly successful planets.

Caltech astronomer Steven Giacalone presented his work at the 243rd meeting of the American Astronomical Society.

Artist’s concept of how the brown dwarf Gliese 229 b might appear from a distance of about a half million miles.

Reach for the stars — you may almost make it

Giacalone used data from the Keck Planet Finder, an instrument at the Keck Observatory in Hawai’i, to study the brown dwarf GPX-1b, a behemoth 20 times more massive than Jupiter (and about a quarter of the size of a red dwarf, the smallest known type of star). GPX-1b orbits a red dwarf star, and Giacalone and his colleagues wanted to know whether the pair had formed from the same stellar nursery (a nebula where new stars are forming) and been caught up in each other’s gravity or whether GPX-1b had formed from the disk of gas and dust around the newborn red dwarf, like a planet.

Brown dwarfs are fascinating objects in their own right since they occupy a murky space between planethood and stardom, but understanding how they form can also shed light on the evolution of small stars and giant planets.

The tilt and shape of one object’s orbit around another can reveal a lot about their shared history. For example, in our own Solar System, Neptune’s largest moon, Triton, orbits the planet backward (in the opposite direction from Neptune’s rotation and the other moons’ orbits), and that’s a clue that Neptune’s gravity probably captured Triton and pulled it into orbit sometime in the past.

In their Keck data, the astronomers saw that GPX-1b’s orbit around its star is lined up with the star’s equator — not tilted at a wild angle. That suggests the brown dwarf formed in the disk of gas and dust orbiting the star shortly after its birth — like a planet, in other words.

Brown dwarfs that orbit around their star’s equators probably formed similarly to planets, while those tilted at wild angles probably formed more like stars.

If the brown dwarf had formed like a star in the same dense nebula as its host star, the pair would have pulled each other into a close orbital dance (but probably a wild, swinging dance that left them orbiting each other at jaunty angles). Instead, they promenade around each other with their equators neatly lined up.

“This is only one data point, and preliminary, but it suggests that the brown dwarf migrated close to its companion star in a similar manner to planets,” said Giacalone during his presentation.

GPX-1b marks the first time astronomers have actually witnessed a brown dwarf that formed like a planet, although models had suggested it was possible. Other brown dwarfs, especially those that orbit farther from their host stars, seem to have formed more like binary pairs in which one partner didn’t quite achieve stardom and is just holding onto the other’s coattails.

In other words, some brown dwarfs are near-miss failures, but others are runaway successes.

Inside the Quest to Turn Genetically Engineered Immune Cells Into An Anti-Aging Remedy


Aged mice rejuvenate. Young mice age slower.


As the years wear down on us, our bodies accumulate a whole host of damaged cells that stubbornly refuse to die. Called senescent cells, these cellular oddballs are like the moldy fruit in the basket, hastening the spoilage of the rest or, in our body’s case, ushering in a myriad of age-related diseases.

In the quest for the fountain of youth, senescent cells have become a hot ticket item. Scientists are actively researching the genes and other biological factors that make these cells so resilient, including testing a class of drugs called senolytics designed to wipe them out. Now, one group of researchers has figured out a way to reprogram a key player of the immune system to take down senescent cells like a heat-seeking missile of rejuvenation.

The process involves genetically engineering white blood cells called T cells to create what’s known as chimeric antigen receptor (or CAR) T cells. These specialized cells recognize and attack a specific cellular target. Researchers at Cold Spring Harbor Laboratory in New York tailored a CAR T cell to home in on a molecule found in large amounts on senescent cells, called urokinase plasminogen activator receptor (uPAR). When these T cells were sicced on senescent cells in aged mice with metabolic issues and in young mice fed a high-fat diet, which can trigger age-related senescence, the animals pulled a 180 with almost no side effects: losing weight, becoming more physically active, and seeing improvements in their metabolism.

“If we give it to aged mice, they rejuvenate. If we give it to young mice, they age slower. No other therapy right now can do this,” Corina Amor Vegas, the study’s first author and an assistant professor at Cold Spring Harbor Laboratory, said in a press release.

These findings were published Wednesday in the journal Nature Aging.

Weaponizing T cells

In recent decades, CAR T cells have made a name for themselves in fighting cancer as a promising immunotherapy. They are often dubbed “living drugs” because the white blood cells can be taken directly from someone through a blood sample and then reintroduced after they’ve been genetically altered, which is lately done with the gene-editing tool CRISPR.

One significant challenge with CAR T cells, however, is finding just the right molecule — or biomarker — specific to the cell you want to target, a concern especially crucial in cancer treatment, where you wouldn’t want your “living drug” to attack healthy cells instead of cancerous ones. No two cancer cells share the exact same cast of molecules on their cell surfaces, and the same is true of senescent cells.

The new study relies on earlier research that laid the groundwork for that search. In a 2020 paper published in Nature, researchers from the Memorial Sloan Kettering Cancer Center, which included Amor Vegas, found that senescent cells had more uPAR dotting their outside surfaces compared to other cell types.

When the researchers tested anti-uPAR CAR T cells in two separate groups of mice, one with lung cancer and the other with liver fibrosis (a condition where healthy liver tissue becomes scarred and can lead to cirrhosis and liver cancer), the animals, surprisingly, lived longer.

This success prompted a next step: testing whether a CAR T-cell therapy could extend longevity in regular mice.

The image shows healthy pancreatic tissue from an old mouse that was treated with CAR T cells as a young pup; senescent cells are visible in blue.

Amor Vegas and her colleagues used the same CAR T cells developed in their earlier experiment, intravenously infusing them into a group of mice between 18 and 20 months old (equivalent to 56 to almost 70 years old in humans); a control group didn’t get the treatment. These mice were fed a normal diet but, because of their advanced age, suffered from age-related metabolic dysfunction — a condition humans also develop as we age — in which they had elevated blood sugar levels and weren’t able to move around or exercise all that much.

CAR T cells targeting uPAR were also given to a group of much younger mice, about three months old, before they were started on a high-fat diet in which about 60 percent of their calories came from fat. Studies show that consuming that amount of fat can promote senescent cells through the stress and inflammation caused by obesity. While these animals didn’t yet have any metabolic disorders, the CAR T cells were given as a prophylactic in an attempt to delay the inevitable aging.

For both groups of mice, the CAR T cells did their magic. Older mice found themselves healthier with lower glucose and insulin levels and fewer inflammatory markers circulating in their blood; they were moving around and weighed less. The younger mice who were given the treatment about a month and a half before starting their high-fat diet didn’t gain as much weight, and their blood sugar levels were better compared to their counterparts who didn’t get this treatment.

Further research needed

These findings are striking in that we have a possible new path to use our own immune cells to shed the cellular damage wrought by aging, one that could be long-lasting even after a single dose. For example, in the younger mice, the researchers found that the anti-uPAR CAR T cells were still hanging around in the animals’ spleens and livers and were mostly of a subtype, known as CD8+ T cells, that has the ability to fight off harmful cells.

“T cells have the ability to develop memory and persist in your body for really long periods, which is very different from a chemical drug,” said Amor Vegas. “With CAR T cells, you have the potential of getting this one treatment, and then that’s it. For chronic pathologies, that’s a huge advantage. Think about patients who need treatment multiple times per day versus you get an infusion, and then you’re good to go for multiple years.”

However, implementing an anti-aging CAR T-cell therapy needs to go through more clinical trials with both animals and eventually humans before it can ever reach the market. The good news is that research into engineering and developing CAR T cells is pretty advanced, and there are more clinical trials, specifically for cancer, now than ever. As plans for cheaper and even off-the-shelf CAR T-cell therapies are being explored, your anti-aging “living drug” could be here sooner than you think.

Overactive Gene Linked to Heart Defects in Down Syndrome


About half of all babies born with Down syndrome have heart defects that may require high-risk surgery or ongoing monitoring depending on the severity of the condition. Now, scientists from the Francis Crick Institute and University College London have linked Dyrk1a, a gene on human chromosome 21, to heart defects in these individuals, a finding that could open the door to new therapeutic possibilities. Their findings are reported in Science Translational Medicine in a paper titled, “Increased dosage of DYRK1A leads to congenital heart defects in a mouse model of Down syndrome.”

This isn’t the first time that Dyrk1a has been linked to Down syndrome. Other studies have tied it to cognitive impairment and craniofacial dysmorphology observed in people with Down syndrome, but its link to heart defects is a novel finding. By looking at heart data from embryonic mouse models, the researchers found that Dyrk1a caused heart defects when present in three copies in mice. 

Dyrk1a codes for an enzyme called DYRK1A. The study showed that an extra copy of Dyrk1a turned down the activity of genes required for cell division in the developing heart and the function of the mitochondria. These changes were correlated with a failure to correctly separate the chambers of the heart.

Furthermore, when the researchers tested a DYRK1A inhibitor in pregnant mice carrying pups that model Down syndrome heart defects, administered while the pups’ hearts were forming, they observed that the genetic changes were partially reversed and the heart defects in the pups were less severe.

The findings do suggest a potential therapeutic approach targeting this gene could work in humans. “However, in humans the heart forms in the first 8 weeks of pregnancy, likely before a baby could be screened for Down syndrome, so this would be too early for treatment,” noted Victor Tybulewicz, PhD, group leader of the Immune Cell Biology Laboratory & Down Syndrome Laboratory at the Crick and senior author on the paper. “The hope is that a DYRK1A inhibitor could have an effect on the heart later in pregnancy, or even better after birth. These are possibilities we are currently investigating.”

They are also investigating the possible involvement of other genes in heart defect development. While Dyrk1a is an important part of the equation, the researchers suspect it isn’t the only player. This was also reflected in the study data: the evidence shows that although Dyrk1a is required in three copies to cause heart defects in mice, it was not sufficient alone. Furthermore, the inhibitor they used only partially reversed the changes in the mouse pups’ hearts. This suggests that another, as yet unknown, gene must also be involved in the origin of heart defects in Down syndrome, and the team is currently searching for it.

Alongside those studies, the researchers are working with Perha Pharmaceuticals to test the DYRK1A inhibitor for treating cognitive disorders associated with both Down syndrome and Alzheimer’s disease. But they are also exploring other potential therapeutic avenues beyond Dyrk1a.

Rifdat Aoidi, PhD, a postdoctoral project research scientist at the Crick and co-first author, added, “We don’t yet know why the changes in cell division and mitochondria mean the heart can’t correctly form chambers. Dysfunction in the mitochondria has also been linked to cognitive impairment in Down syndrome, so boosting mitochondrial function could be another promising avenue for therapy.”