5 Proven Ways to Boost Testosterone Naturally



Boosting testosterone has become all the rage today, but unless you activate your body’s innate ability to do it naturally you will have to face the possibility of serious side effects.

As men reach their mid-forties their testosterone levels begin to decline, with approximately a 1 to 2 percent decrease in measurable blood levels annually, and then dropping off precipitously after age 60 into full-blown “andropause.” This ever-increasing decline can have a wide range of adverse effects, both physical and psychological, ranging from muscle loss to insulin resistance and from low libido to depression.

Today, an increasing number of aging men are opting for testosterone replacement therapy, some with dramatic results. But this approach, while often positive in the short term, can have some serious drawbacks in the long term, especially if the underlying and modifiable factors causing the deficiency are not addressed at their root.

First, testosterone replacement therapy often involves administering levels far higher than a normal physiologic dose, which can increase the risks of serious side effects, including certain cancers.

Second, when testosterone is replaced exogenously, a negative endocrine feedback loop is activated, signaling the gonads to reduce their own testosterone production further, ultimately deepening the original deficiency and even leading to testicular atrophy.

Third, when testosterone levels are suddenly increased through exogenous sources, there is often a concomitant increase in testosterone metabolites such as dihydrotestosterone (DHT) and estradiol, both of which can lead to particularly undesirable downstream effects, including male pattern hair loss and excessive prostate growth.

Given these risks, the alternative path is to support the body’s natural production of testosterone both by removing testosterone-blocking chemicals and supporting one’s own body’s ability to produce more testosterone endogenously.

Here Are 5 Natural Things That May Help Boost Your Testosterone:

1. Zinc

It is well known that a zinc deficiency can lead to testicular suppression, including suppression of testosterone levels.

The prostate has one of the highest concentrations of zinc of any organ in the body, indicating how important this mineral is to the male reproductive system. Also, physical activity in both normally sedentary men and elite athletes can lead to both testosterone and thyroid hormone suppression, which can be mitigated by zinc supplementation.

Zinc has also been found to protect against heavy metal (cadmium) associated DNA damage to the testicles, preserving their ability to produce testosterone. Animal research also indicates that it can improve erectile function along with optimizing levels of prolactin and testosterone.

Keep in mind that minerals are connected in a matrix of interdependence. Excess zinc can lead to copper deficiency and vice versa. This speaks to the importance of working with a licensed health professional versed in this area of expertise to help clinically ascertain your deficiencies and rectify them without causing unintended adverse effects.

When in doubt, look for food sources of the minerals you are trying to replenish, as minerals carry a far lower risk of causing imbalances when consumed in food form. You can always use the USDA-based database SELF Nutrition Data to find the top food sources of the nutrient of your choice.

2. Vitamin C

One of the most important ways to optimize testosterone levels is to preserve the hormone’s activity and regenerate it when it naturally converts to a transient hormone metabolite.

Preliminary research indicates that vitamin C, a well-known electron donor, may be able to both regenerate testosterone and reduce levels of its toxic hormone metabolite. Read the article “Sunshine Vitamin Regenerates and Detoxifies Your Hormones” on GreenMedInfo.com, to learn more.

3. Magnesium

Magnesium levels are strongly and independently associated with the anabolic hormones testosterone and IGF-1 in the elderly. This observation indicates that this mineral, which is involved in over 300 enzyme pathways, can help to positively modulate the anabolic/catabolic equilibrium, which is often disrupted in elderly people.

One proposed mechanism for magnesium’s testosterone boosting role is that it inhibits the binding of testosterone (TT) to sex hormone-binding globulin (SHBG) leading to an enhancement of bioavailable TT.

4. Saw Palmetto/Astaxanthin

One of the best ways to increase testosterone naturally is to block its conversion to dihydrotestosterone and estrogen (estradiol). This can be accomplished with natural aromatase inhibitors and 5-alpha reductase inhibitors: the enzyme 5-alpha reductase converts testosterone into dihydrotestosterone, and the aromatase enzyme converts testosterone into estradiol.

A promising study from 2009 found that in healthy males between 37 and 70 years of age, a combination of these two substances produced exactly such an improved ratio: increased testosterone alongside decreased estrogen and dihydrotestosterone.

5. Phosphatidyl Serine

This critically important cell membrane component, found mainly in meat, fish, and dairy products but also present in soy and sunflower lecithin, has been found to decrease cortisol levels and increase testosterone levels following moderate physical activity in athletes.

Here Are 5 Things That One Should Avoid to Keep Testosterone Production Optimal:

1. Statin Drugs

No category of drug is more thoroughly documented in the biomedical literature as suppressing testosterone production and/or libido in men. Not only are these drugs misrepresented as ‘lifesaving’ for cardiovascular disease, but they may contribute to over 200 different adverse health effects. Any man concerned with preserving his production of testosterone should consider avoiding this class of drug.

2. Bisphenol A

This ubiquitous endocrine disruptor, found mostly in plastics, canned foods, and thermal printer receipts, has been found to block testosterone production in the testicles and to have potentially “feminizing” estrogenic effects. Also, don’t be fooled by so-called bisphenol A-free products, because it turns out that many contain other bisphenols with toxicity profiles that are at least as concerning.

3. Phthalates

Phthalates are mainly used as plasticizers to make plastics flexible, but they are also found in pharmaceuticals as excipients and in cosmetic products. They have been found to suppress testosterone production.

4. Parabens

Another ubiquitous class of petrochemical preservatives found in a wide range of products, especially cosmetics and body care products, parabens have been found to disrupt testosterone levels.

5. Glyphosate (GMO food)

This testosterone-disrupting chemical is now found virtually everywhere in regions where GM agriculture predominates. Most GMO foods are designed to survive being sprayed with glyphosate and are therefore contaminated with significant residues. But even explicitly non-GMO foods such as oats are sprayed with it as a pre-harvest desiccant. Therefore, the best way to avoid exposure is to eat 100 percent certified organic foods.

Could a New Test That Detects Dopamine Levels Help Diagnose Neurological Diseases?


Summary: A new test that measures dopamine levels in biological fluids could help with the detection of depression, Parkinson’s disease, and other disorders marked by abnormal dopamine levels.

Source: Wiley

Altered levels of the neurotransmitter dopamine are apparent in various conditions, such as Parkinson’s disease and depression.

In research published in ChemistrySelect, investigators describe a quick, sensitive, and simple test to determine dopamine levels in biological fluids.

The method could help clinicians spot abnormal blood levels of dopamine in patients, potentially allowing for earlier disease detection.

The method relies on what are called carbon quantum dots, a type of carbon nanomaterial with photoluminescence properties, and an ionic liquid, which is composed of mineral anions and organic cations and exists in liquid form at room temperature.


“The proposed electrochemical sensor could be an exceptional step forward in dopamine detection and pave the way for the molecular diagnosis of neurological illnesses,” the authors wrote.

Dopamine (DA) is a neurotransmitter with a pivotal role in the central nervous system. Because DA levels are altered in various neurological diseases, a quick, sensitive, and simple analytical approach to determine DA in biological fluids would be highly valuable.

In this research, a novel electrochemical sensor for measuring DA in the presence of uric acid and ascorbic acid was developed, based on a carbon paste electrode (CPE) modified with an ionic liquid (IL) and carbon quantum dots (CQDs). The IL and CQDs were synthesized and characterized for specific properties such as composition, emission, size distribution, and morphology.

Then, DA at different concentrations was measured at the modified CPE via cyclic voltammetry. The modified electrode exhibited strong electrocatalytic activity for DA oxidation.

Under optimal conditions, the calibration curve for DA was linear within the range of 0.1–50 μM in phosphate buffer (pH 7.4), and the limit of detection was 0.046 μM. The electrode was successfully used to determine DA in real samples and generated acceptable results.
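
As a rough illustration of how such figures of merit are typically derived (a generic sketch with made-up numbers, not the authors’ data or code), the snippet below fits a linear calibration curve and estimates a limit of detection using the common three-sigma convention:

```python
import numpy as np

# Hypothetical calibration standards (NOT from the paper): dopamine
# concentrations in micromolar and the peak currents (microamps) a
# voltammetric sensor might report for each standard.
conc_uM = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 25.0, 50.0])
peak_uA = np.array([0.012, 0.055, 0.110, 0.540, 1.080, 2.700, 5.380])

# Least-squares fit of the linear calibration curve: current = slope * conc + intercept
slope, intercept = np.polyfit(conc_uM, peak_uA, 1)

# Limit of detection via the common 3 * sigma / slope convention, where sigma is
# the standard deviation of repeated blank measurements (placeholder values).
blank_uA = np.array([0.0012, 0.0018, 0.0009, 0.0015, 0.0011])
lod_uM = 3 * blank_uA.std(ddof=1) / slope

# Convert an unknown sample's measured current back into a concentration.
unknown_uA = 0.83
unknown_uM = (unknown_uA - intercept) / slope

print(f"slope = {slope:.4f} uA/uM, LOD ~ {lod_uM:.3f} uM, unknown ~ {unknown_uM:.2f} uM")
```

In practice, the reported 0.046 μM detection limit would come from the sensor’s actual blank noise and calibration slope rather than idealized numbers like these.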

The proposed electrochemical sensor could be an exceptional step forward in DA detection and pave the way for the molecular diagnosis of neurological illnesses.

The Impact of Vitamin D and Thyroid Hormones on Child Development


Summary: Lower levels of vitamin D in-utero were associated with delays in fine motor skill development at age five. Exposure to thyroid hormones in-utero was associated with cognitive development during childhood.

Source: Marshall University

Prenatal exposure to altered levels of vitamin D and/or thyroid hormones has the potential to impact child development long after birth, according to a new study by researchers at the Marshall University Joan C. Edwards School of Medicine.

A retrospective study analyzed the presence of 20 different elements, thyroid hormones and vitamin D levels in umbilical cord blood collected at birth. The levels were compared with how well a child met developmental milestones as part of well-child examinations conducted between birth and age 5.

The findings, published last month in Biomedicine & Pharmacotherapy, an open access, peer-reviewed medical journal focused on clinical and basic medicine and pharmacology, showed that vitamin D levels were associated with a delay in fine motor development and thyroid hormone levels were associated with cognitive development. Certain metals such as lead, mercury, copper and manganese were associated with language, cognitive or motor skill development.

“Our study demonstrates the importance of the in-utero environment,” said Jesse Cottrell, M.D., assistant professor of obstetrics and gynecology at the Joan C. Edwards School of Medicine and lead author on the study.


“The study found multiple associations between umbilical cord essential and toxic elements, thyroid levels and Vitamin D on childhood development for a pronounced time after birth.”

“Very little existing research addresses the long-term effects on child development of in utero exposure to environmental agents,” said Monica Valentovic, Ph.D., professor of biomedical sciences and toxicology research cluster coordinator at the Joan C. Edwards School of Medicine and corresponding author on the study.

“With the original umbilical cord blood samples collected in 2013, having long-term follow-up on developmental outcomes adds significantly to the literature.”

Chelsea Nelson, M.D., Catherine Waldron, Mackenzie Bergeron and Abigail Samson also served as co-authors on the abstract. The work is supported by the Robert C. Byrd Center for Rural Health at Marshall University, the West Virginia Higher Education Policy Commission, the translational research pilot grant program at the Joan C. Edwards School of Medicine and a National Institutes of Health grant (P20GM103434).

The team continues to investigate development of children beyond age 5 as well as in utero exposure to environmental metals and the impact on development of the newborn or health effects related to vitamin D levels.

Abstract

Effect of umbilical cord essential and toxic elements, thyroid levels, and Vitamin D on childhood development

Introduction

The in-utero environment has dramatic effects on childhood development. We hypothesized that prenatal levels of inorganic agents, thyroid hormones, and Vitamin D affect childhood development.

Methods

Umbilical cord blood was collected from April 3, 2013 to January 30, 2014 and analyzed for 20 different elements, thyroid and Vitamin D. A retrospective review (n = 60) was performed of well-child examinations from birth to 5 years old (y.o.).

Results

There were associations between calcium and 4-month BMI (p < 0.01) and 12-month language (p = 0.03); magnesium and 6-month language (p = 0.04) and gross motor skills at 5 years old (y.o.) (p = 0.03); copper and 12-month fine motor skills (p = 0.02); zinc and fine motor skills (p < 0.01) and language (p = 0.03) at 2 y.o.; manganese and language development at 2 y.o. (p = 0.02); molybdenum and fine motor skills at 12 months of age (p = 0.02); selenium and gross motor skills (p = 0.04) and BMI (p = 0.02) at 5 y.o.; lead and cognitive function at 4 months (p = 0.04) and 2 y.o. (p = 0.01); and mercury and gross motor skills at 4 months (p = 0.04) and language at 2 y.o. (p = 0.02). Platinum showed an association at 12 months of age (p < 0.01) as well as multiple associations at 5 y.o. (p < 0.01). Thyroid function tests for free T3 were associated with multiple cognitive and physical milestones. T3 uptake was associated with 5 y.o. gross motor skills (p = 0.02). Total and free T4 were associated with cognitive development (p < 0.01) and fine motor development, respectively. Vitamin D was associated with a delay in fine motor development (p < 0.01).

Conclusion

There were multiple associations between umbilical cord essential and toxic elements, thyroid levels, and Vitamin D on childhood development.

Could a Viral Illness Increase Chances of Developing Alzheimer’s or Other Neurodegenerative Disease?


Summary: Study reveals significant associations between certain viral illnesses, including viral encephalitis and pneumonia-causing flu, and an increased risk of developing a neurodegenerative disorder later in life. Researchers say existing vaccines against these viruses may reduce the chances of developing neurodegeneration.

Source: NIH

Some viral illnesses may increase a person’s chances of later developing Alzheimer’s disease or another neurodegenerative disorder.

Though a causal link cannot be confirmed, an NIH study in which researchers mined the medical records of hundreds of thousands of people in Finland and the United Kingdom found significant associations.

As published in Neuron, the researchers found there may be at least 22 pairings between a neurodegenerative disorder diagnosis and a previous viral infection that led to a hospital visit.

The strongest risk association was between viral encephalitis—an inflammation of the brain caused by a virus—and Alzheimer’s disease. Meanwhile, hospitalizations due to pneumonia-causing flu viruses were linked to the diagnoses of several disorders, including dementia, Parkinson’s disease, and amyotrophic lateral sclerosis (ALS).

The study results also raised the possibility that existing vaccinations may help some people reduce the chances of experiencing these disorders.

“Neurodegenerative disorders are a collection of diseases for which there are very few effective treatments and many risk factors,” said Andrew B. Singleton, Ph.D., director, NIH Center for Alzheimer’s Related Dementias (CARD); NIH Distinguished Investigator; and a study author.

“Our results support the idea that viral infections and related inflammation in the nervous system may be common—and possibly avoidable—risk factors for these types of disorders.”

Neurodegenerative disorders damage different parts of the nervous system. Typically, this happens later in life and produces a variety of problems, including with thinking, remembering, and moving. Several previous studies have suggested that certain viruses may play a role in each of these disorders.

For example, a 1991 study of autopsied brain tissue suggested there may be a link between herpes simplex virus and Alzheimer’s disease. More recently, scientists found evidence for a link between the Epstein Barr virus and multiple sclerosis by analyzing patient blood samples and medical records. The latter study sparked the CARD team to conduct this new study.

“After reading the Epstein Barr virus study we realized that for years scientists had been searching, one-by-one, for links between an individual neurodegenerative disorder and a specific virus,” said Michael Nalls, Ph.D., leader of the NIH CARD Advanced Analytics Expert Group and study senior author.

“That’s when we decided to try a different, more data science-based approach. By using medical records, we were able to systematically search for all possible links in one shot.”

Led by Kristin S. Levine, M.S. and Hampton L. Leonard, M.S., two NIH CARD data scientists, the researchers mined the medical records of 300,000 individuals stored in FinnGen, a nationwide Finnish biobank.

Specifically, they searched for individuals who had one of six neurodegenerative disorder diagnoses: Alzheimer’s disease, ALS, generalized dementia, multiple sclerosis, Parkinson’s disease, or vascular dementia; and then checked to see if a viral infection caused those individuals to make a prior visit to the hospital. Hospitalizations due to COVID-19 were not included in the study.

Initially, they found 45 significant associations between a neurodegenerative disease diagnosis and a previous viral infection. That number narrowed to 22 associations after the scientists performed a second search of UKBiobank, which contains the records of 500,000 individuals from the United Kingdom.
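
In outline, the record-mining step pairs each person’s neurodegenerative diagnosis with any viral infection recorded earlier in their file. The sketch below illustrates that logic in pandas with invented toy records and hypothetical column names; it is not the CARD team’s pipeline and does not reflect the FinnGen or UK Biobank data schemas.

```python
import pandas as pd

# Toy illustration of the general record-mining logic described above.
# Column names and data are hypothetical.
records = pd.DataFrame({
    "person_id": [1, 1, 2, 3, 3, 4],
    "diagnosis": ["viral_pneumonia", "alzheimers", "alzheimers",
                  "viral_encephalitis", "parkinsons", "influenza"],
    "date": pd.to_datetime(["2005-03-01", "2016-07-12", "2014-02-20",
                            "2009-05-05", "2018-11-30", "2012-01-15"]),
})

NEURO = {"alzheimers", "parkinsons", "als", "ms", "dementia", "vascular_dementia"}
VIRAL = {"viral_encephalitis", "viral_pneumonia", "influenza", "viral_warts"}

neuro = records[records["diagnosis"].isin(NEURO)].rename(
    columns={"diagnosis": "neuro_dx", "date": "neuro_date"})
viral = records[records["diagnosis"].isin(VIRAL)].rename(
    columns={"diagnosis": "viral_dx", "date": "viral_date"})

# Keep only viral infections recorded BEFORE the neurodegenerative diagnosis.
pairs = neuro.merge(viral, on="person_id")
pairs = pairs[pairs["viral_date"] < pairs["neuro_date"]]

# Count exposure/diagnosis pairings; in the study, each pairing was then tested
# for statistical significance and replicated in a second biobank.
print(pairs.groupby(["viral_dx", "neuro_dx"]).size())
```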

Of all the neurodegenerative disorders, generalized dementia had the most associations, with links to six different virus exposures. These exposures were categorized as viral encephalitis, viral warts, other viral diseases, all influenza, influenza and pneumonia, and viral pneumonia. Individuals who had viral encephalitis were at least 20 times more likely to be diagnosed with Alzheimer’s than those who did not experience that virus.


Severe cases of influenza were linked to the widest range of risks. Influenza and pneumonia exposures were associated with all the neurodegenerative disorder diagnoses except multiple sclerosis.

“Keep in mind that the individuals we studied did not have the common cold. Their infections made them so sick that they had to go to the hospital,” said Dr. Nalls.

“Nevertheless, the fact that commonly-used vaccines reduce the risk or severity of many of the viral illnesses observed in this study raises the possibility that the risks of neurodegenerative disorders might also be mitigated.”

Further analysis of the FinnGen data suggested that the risks associated with some viruses may wear off over time. Here the researchers analyzed 16 of the 22 associations that were common between the FinnGen and the UKBioBank data.

For all 16, the risk of being diagnosed with a neurodegenerative disorder was high within one year of an infection. However, only six of those associations remained significant if the infection happened five to 15 years before the diagnosis.

Finally, it is known that about 80% of the viruses observed in this study can invade the nervous system and trigger the immune system’s inflammatory response.

“The results of this study provide researchers with several new critical pieces of the neurodegenerative disorder puzzle,” said Dr. Nalls. “In the future, we plan to use the latest data science tools to not only find more pieces but also help researchers understand how those pieces, including genes and other risk factors, fit together.”

Seven technologies to watch in 2023


Nature’s pick of tools and techniques that are poised to have an outsized impact on science in the coming year.

The James Webb Space Telescope’s 6.5-metre primary mirror (6 of 18 segments shown) can detect objects billions of light years away.

From protein sequencing to electron microscopy, and from archaeology to astronomy, here are seven technologies that are likely to shake up science in the year ahead.

Single-molecule protein sequencing

The proteome represents the complete set of proteins made by a cell or organism, and can be deeply informative about health and disease, but it remains challenging to characterize.

Proteins are assembled from a larger alphabet of building blocks relative to nucleic acids, with roughly 20 naturally occurring amino acids (compared with the four nucleotides that form molecules such as DNA and messenger RNA); this results in much greater chemical diversity. Some are present in the cell as just a few molecules — and, unlike nucleic acids, proteins cannot be amplified, meaning protein-analysis methods must work with whatever material is available.
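
To make the ‘larger alphabet’ point concrete, here is a small back-of-the-envelope calculation (not from the article) comparing how many distinct 10-unit sequences can be built from 20 amino acids versus 4 nucleotides:

```python
# Sequence-space comparison: distinct length-10 chains built from an alphabet
# of 20 amino acids versus an alphabet of 4 nucleotides.
AMINO_ACIDS, NUCLEOTIDES, LENGTH = 20, 4, 10

peptide_space = AMINO_ACIDS ** LENGTH   # 20**10, roughly 1e13 possibilities
oligo_space = NUCLEOTIDES ** LENGTH     # 4**10, roughly 1e6 possibilities

print(f"10-mer peptides: {peptide_space:.3e}")
print(f"10-mer oligonucleotides: {oligo_space:.3e}")
print(f"ratio: {peptide_space / oligo_space:.1e}")
```

The chemical-diversity argument in the article is about side-chain chemistry as much as raw combinatorics, but the count gives a sense of scale.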

Most proteomic analyses use mass spectrometry, a technique that profiles mixtures of proteins on the basis of their mass and charge. These profiles can quantify thousands of proteins simultaneously, but the molecules detected cannot always be identified unambiguously, and low-abundance proteins in a mixture are often overlooked. Now, single-molecule technologies that can sequence many, if not all, of the proteins in a sample could be on the horizon — many of them analogous to the techniques used for DNA.

Edward Marcotte, a biochemist at the University of Texas at Austin, is pursuing one such approach, known as fluorosequencing1. Marcotte’s technique, reported in 2018, is based on a stepwise chemical process in which individual amino acids are fluorescently labelled and then sheared off one by one from the end of a surface-coupled protein as a camera captures the resulting fluorescent signal. “We could label the proteins with different fluorescent dyes and then watch molecule by molecule as we cut them away,” Marcotte explains. Last year, researchers at Quantum-Si, a biotechnology firm in Guilford, Connecticut, described an alternative to fluorosequencing that uses fluorescently labelled ‘binder’ proteins to recognize specific sequences of amino acids (or polypeptides) at the ends of proteins2.


Other researchers are developing techniques that emulate nanopore-based DNA sequencing, profiling polypeptides on the basis of the changes they induce in an electric current as they pass through tiny channels. Biophysicist Cees Dekker at Delft University of Technology in the Netherlands and his colleagues demonstrated one such approach in 2021 using nanopores made of protein, and were able to discriminate between individual amino acids in a polypeptide passing through the pore3. And at the Technion — Israel Institute of Technology in Haifa, biomedical engineer Amit Meller’s team is investigating solid-state nanopore devices manufactured from silicon-based materials that could enable high-throughput analyses of many individual protein molecules at once. “You might be able to look at maybe tens of thousands or even millions of nanopores simultaneously,” he says.

Although single-molecule protein sequencing is only a proof of concept at present, commercialization is coming fast. Quantum-Si has announced plans to ship first-generation instruments this year, for example, and Meller notes that a protein-sequencing conference in Delft in November 2022 featured a discussion panel dedicated to start-ups in this space. “It reminds me a lot of the early days before next-generation DNA sequencing,” he says.

Marcotte, who co-founded the protein-sequencing company Erisyon in Austin, Texas, is bullish. “It’s not really a question of whether it will work,” he says, “but how soon it will be in people’s hands.”

James Webb Space Telescope

Astronomers began last year on the edges of their collective seats. After a design and construction process lasting more than two decades, NASA — in collaboration with the European and Canadian space agencies — successfully launched the James Webb Space Telescope (JWST) into orbit on 25 December 2021. The world had to stand by for nearly seven months as the instrument unfolded and oriented itself for its first round of observations.

It was worth the wait. Matt Mountain, an astronomer at the Space Telescope Science Institute in Baltimore, Maryland, who is a telescope scientist for JWST, says the initial images exceeded his lofty expectations. “There’s actually no empty sky — it’s just galaxies everywhere,” he says. “Theoretically, we knew it, but to see it, the emotional impact is very different.”

JWST was designed to pick up where the Hubble Space Telescope left off. Hubble generated stunning views of the Universe, but had blind spots: ancient stars and galaxies with light signatures in the infrared range were essentially invisible to it. Rectifying that required an instrument with the sensitivity to detect incredibly faint infrared signals originating billions of light years away.

The final design for JWST incorporates an array of 18 perfectly smooth beryllium mirrors that, when fully unfolded, has a diameter of 6.5 metres. So precisely engineered are those mirrors, says Mountain, that “if you stretched a segment out over the United States, no bump could be more than a couple of inches [high].” These are coupled with state-of-the-art near- and mid-infrared detectors.

That design allows JWST to fill in Hubble’s gaps, including capturing signatures from a 13.5-billion-year-old galaxy that produced some of the first atoms of oxygen and neon in the Universe. The telescope has also yielded some surprises; for instance, being able to measure the atmospheric composition of certain classes of exoplanet.

Researchers around the world are queueing up for observation time. Mikako Matsuura, an astrophysicist at Cardiff University, UK, is running two studies with JWST, looking at the creation and destruction of the cosmic dust that can contribute to star and planet formation. “It’s a completely different order of sensitivity and sharpness” compared with the telescopes her group has used in the past, Matsuura says. “We have seen completely different phenomena ongoing inside these objects — it’s amazing.”

Volume electron microscopy

Electron microscopy (EM) is known for its outstanding resolution, but mostly at the surface level of samples. Going deeper requires carving a specimen into exceptionally thin slices, which for biologists are often insufficient for the task. Lucy Collinson, an electron microscopist at the Francis Crick Institute in London, explains that it can take 200 sections to cover the volume of just a single cell. “If you’re just getting one [section], you’re playing a game of statistics,” she says.

Now researchers are bringing EM resolution to 3D tissue samples encompassing many cubic millimetres.

Previously, reconstructing such volumes from 2D EM images — for example, to chart the neural connectivity of the brain — involved a painstaking process of sample preparation, imaging and computation to turn those images into a multi-image stack. The latest ‘volume EM’ techniques now drastically streamline this process.

Those techniques have various advantages and limitations. Serial block-face imaging, which uses a diamond-edged blade to shave off thin sequential layers of a resin-embedded sample as it is imaged, is relatively fast and can handle samples approaching one cubic millimetre in size. However, it offers poor depth resolution, meaning the resulting volume reconstruction will be comparatively fuzzy. Focused ion beam scanning electron microscopy (FIB-SEM) yields much thinner layers — and thus finer depth resolution — but is better suited to smaller-volume samples.

Collinson describes the rise of volume EM as a ‘quiet revolution’, with researchers highlighting the results of this approach rather than the techniques used to generate them. But this is changing. For example, in 2021, researchers working on the Cell Organelle Segmentation in Electron Microscopy (COSEM) initiative at Janelia Research Campus in Ashburn, Virginia, published a pair of papers in Nature highlighting substantial progress in mapping the cellular interior4,5. “It’s a very impressive proof of principle,” says Collinson.

The COSEM initiative uses sophisticated, bespoke FIB-SEM microscopes that increase the volume that can be imaged in a single experiment by roughly 200-fold, while preserving good spatial resolution. Using a bank of these machines in conjunction with deep-learning algorithms, the team was able to define various organelles and other subcellular structures in the full 3D volume of a wide range of cell types.

The sample-preparation methods are laborious and difficult to master, and the resulting data sets are massive. But the effort is worthwhile: Collinson is already seeing insights in infectious-disease research and cancer biology. She is now working with colleagues to explore the feasibility of reconstructing the entire mouse brain at high resolution — an effort she predicts will take more than a decade of work, cost billions of dollars and produce half a billion gigabytes of data. “It’s probably on the same order of magnitude as the effort to map the first human genome,” she says.

CRISPR anywhere

The genome-editing tool CRISPR–Cas9 has justifiably earned a reputation as the go-to method for introducing defined changes at targeted sites throughout the genome, driving breakthroughs in gene therapy, disease modelling and other areas of research. But there are limits as to where it can be used. Now, researchers are finding ways to circumvent those limitations.

CRISPR editing is coordinated by a short guide RNA, which directs an associated Cas nuclease enzyme to its target genomic sequence. But this enzyme also requires a nearby sequence called a protospacer adjacent motif (PAM); without one, editing is likely to fail.

At the Massachusetts General Hospital in Boston, genome engineer Benjamin Kleinstiver has used protein engineering to create ‘near-PAMless’ Cas variants of the commonly used Cas9 enzyme from the bacterium Streptococcus pyogenes. One Cas variant requires a PAM of just three consecutive nucleotide bases with an A or G nucleotide in the middle position6. “These enzymes now read practically the entire genome, whereas conventional CRISPR enzymes read anywhere between 1% and 10% of the genome,” says Kleinstiver.
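
As a toy illustration of why relaxing the PAM requirement expands the targetable fraction of a genome (a sketch with a made-up sequence, not code from Kleinstiver’s group), the snippet below counts candidate sites adjacent to a conventional NGG PAM versus a relaxed PAM of the ‘any base, A or G, any base’ form described above:

```python
import re

# Toy DNA sequence; real analyses would scan whole genomes on both strands.
seq = "ATGCGGATTACAGGCTTAGGCATCGAACGTTAGGGCATTAGACCGTA"

def count_pam_adjacent_sites(seq, pam_regex, protospacer_len=20):
    """Count positions where a 20-nt protospacer is immediately followed
    by a PAM matching pam_regex (forward strand only, for simplicity)."""
    n = 0
    pam_len = 3
    for i in range(len(seq) - protospacer_len - pam_len + 1):
        pam = seq[i + protospacer_len : i + protospacer_len + pam_len]
        if re.fullmatch(pam_regex, pam):
            n += 1
    return n

# Conventional SpCas9 PAM: NGG. Relaxed, near-PAMless-style PAM: NRN
# (any base, then A or G, then any base).
ngg_sites = count_pam_adjacent_sites(seq, r"[ACGT]GG")
nrn_sites = count_pam_adjacent_sites(seq, r"[ACGT][AG][ACGT]")
print(f"NGG-adjacent sites: {ngg_sites}, NRN-adjacent sites: {nrn_sites}")
```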

Such less-stringent PAM requirements increase the chances of ‘off-target’ edits, but further engineering can improve their specificity. As an alternative approach, Kleinstiver’s team is engineering and testing large numbers of Cas9 variants that each exhibit high specificity for distinct PAM sequences.

There are also many naturally occurring Cas variants that remain to be discovered. In nature, the CRISPR–Cas9 system is a bacterial defence mechanism against viral infection, and different microorganisms have evolved various enzymes with distinct PAM preferences. Virologist Anna Cereseto and microbiome researcher Nicola Segata at the University of Trento in Italy have combed through more than one million microbial genomes to identify and characterize a diverse set of Cas9 variants, which they estimate could collectively target more than 98% of known disease-causing mutations in humans7.

Only a handful of these will work in mammalian cells, however. “Our idea is to test many and see what are the determinants that make those enzymes work properly,” says Cereseto. Between the insights gleaned from these natural enzyme pools and high-throughput protein-engineering efforts, Kleinstiver says, “I think we’ll end with a pretty complete toolbox of editors that allow us to edit any base that we want”.

High-precision radiocarbon dating

Last year, archaeologists took advantage of advances in radiocarbon dating to home in on the precise year — and even the season — in which Viking explorers first arrived in the Americas. Working with pieces of felled timber unearthed in a settlement on the northern shore of Newfoundland, Canada, a team led by isotope-analysis expert Michael Dee at the University of Groningen in the Netherlands and his postdoc Margot Kuitems determined that the tree was likely to have been cut down in the year 1021, probably in the spring8.

Scientists have been using radiocarbon dating of organic artefacts since the 1940s to narrow down the dates of historical events. They do so by measuring traces of the isotope carbon-14, which is formed as a result of the interaction of cosmic rays with Earth’s atmosphere and which decays slowly over millennia. But the technique is usually precise only to within a couple of decades.

Precise radiocarbon dating of timber at L’Anse aux Meadows in Newfoundland, Canada, revealed that Vikings cut down a tree at the site in 1021.

Things changed in 2012, when researchers led by physicist Fusa Miyake at Nagoya University in Japan showed9 they could date a distinctive spike in carbon-14 levels in the rings of a Japanese cedar tree to AD 774–5. Subsequent research10 not only confirmed that this spike was present in wood samples around the world from this period, but also identified at least five other such spikes dating as far back as 7176 BC. Researchers have linked these spikes to solar-storm activity, but this hypothesis is still being explored.

Whatever their cause, these ‘Miyake events’ allow researchers to put a precise pin in the year in which wooden artefacts were created, by detecting a specific Miyake event and then counting the rings that formed since then. Researchers can even establish the season in which a tree was harvested, on the basis of the thickness of the outermost ring, Kuitems says.
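
The arithmetic behind this is straightforward. The sketch below uses invented per-ring values (not real measurements) to show how a felling year follows once the ring carrying a known carbon-14 spike has been identified; the Newfoundland timber study, for instance, anchored its count to the spike of AD 993.

```python
# Hypothetical per-ring radiocarbon anomaly values for a timber sample, ordered
# from the innermost measured ring to the outermost (bark-edge) ring.
# The numbers are illustrative only, not real measurements.
ring_anomaly = [2.1, 1.9, 2.3, 14.8, 2.5, 2.2, 2.0, 2.4, 1.8, 2.1]

MIYAKE_EVENT_YEAR = 993  # calendar year of the known carbon-14 spike being used

# Locate the ring carrying the anomalous spike (here simply the maximum value).
spike_index = max(range(len(ring_anomaly)), key=lambda i: ring_anomaly[i])

# Count the rings laid down after the spike year, out to the bark edge.
rings_after_spike = (len(ring_anomaly) - 1) - spike_index

# If the outermost ring sits at the bark edge, the felling year follows directly.
felling_year = MIYAKE_EVENT_YEAR + rings_after_spike
print(f"Spike found in ring {spike_index}; estimated felling year: AD {felling_year}")
```

The thickness and completeness of that final ring is what also lets researchers estimate the season in which the tree was felled.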

Archaeologists are now applying this approach to Neolithic settlements and sites of volcanic eruption, and Dee hopes to use it to study the Mayan empire in Mesoamerica. In the next decade or so, Dee is optimistic that “we will have really absolute records for a lot of these ancient civilizations to the exact year, and we’ll be able to talk about their historical development … at a really fine scale”.

As for Miyake, her search for historical yardsticks continues. “We are now searching for other carbon-14 spikes comparable to the 774–5 event for the past 10,000 years,” she says.

Single-cell metabolomics

Metabolomics — the study of the lipids, carbohydrates and other small molecules that drive the cell — was originally a set of methods for characterizing metabolites in a population of cells or tissues, but is now shifting to the single-cell level. Scientists could use such cellular-level data to untangle the functional complexity in vast populations of seemingly identical cells. But the transition poses daunting challenges.

The metabolome encompasses vast numbers of molecules with diverse chemical properties. Some of these are highly ephemeral, with subsecond turnover rates, says Theodore Alexandrov, a metabolomics researcher at the European Molecular Biology Laboratory in Heidelberg, Germany. And they can be hard to detect: whereas single-cell RNA sequencing can capture close to half of all the RNA molecules produced in a cell or organism (the transcriptome), most metabolic analyses cover only a tiny fraction of a cell’s metabolites. This missing information could include crucial biological insights.

“The metabolome is actually the active part of the cell,” says Jonathan Sweedler, an analytical chemist at the University of Illinois at Urbana-Champaign. “When you have a disease, if you want to know the cell state, you really want to look at the metabolites.”

Many metabolomics labs work with dissociated cells, which they trap in capillaries and analyse individually using mass spectrometry. By contrast, ‘imaging mass spectrometry’ methods capture spatial information about how cellular metabolite production varies at different sites in a sample. For instance, researchers can use a technique called matrix-assisted laser desorption/ionization (MALDI), in which a laser beam sweeps across a specially treated tissue slice, releasing metabolites for subsequent analysis by mass spectrometry. This also captures the spatial coordinates from which the metabolites originated in the sample.

In theory, both approaches can quantify hundreds of compounds in thousands of cells, but achieving that typically requires top-of-the-line, customized hardware costing in the million-dollar range, says Sweedler.

Now, researchers are democratizing the technology. In 2021, Alexandrov’s group described SpaceM, an open-source software tool that uses light microscopy imaging data to enable spatial metabolomic profiling of cultured cells using a standard commercial mass spectrometer11. “We kind of did the heavy lifting on the data-analysis part,” he says.

Alexandrov’s team has used SpaceM to profile hundreds of metabolites from tens of thousands of human and mouse cells, turning to standard single-cell transcriptomic methods to classify those cells into groups. Alexandrov says he is especially enthusiastic about this latter aspect and the idea of assembling ‘metabolomic atlases’ — analogous to those developed for transcriptomics — to accelerate progress in the field. “This is definitely the frontier, and will be a big enabler,” he says.

In vitro embryo models

The journey from fertilized ovum to fully formed embryo has been mapped in detail at the cellular level for mice and humans. But the molecular machinery driving the early stages of this process remains poorly understood. Now a flurry of activity in ‘embryoid’ models is helping to fill these knowledge gaps, giving researchers a clearer view of the vital early events that can determine the success or failure of fetal development.

Some of the most sophisticated models come from the lab of Magdalena Zernicka-Goetz, a developmental biologist at the California Institute of Technology in Pasadena and the University of Cambridge, UK. In 2022, she and her team demonstrated that they could generate implantation-stage mouse embryos entirely from embryonic stem (ES) cells12,13.

An embryoid made using cells engineered to resemble the eight-cell stage of an embryo. Credit: M. A. Mazid et al./Nature

Like all pluripotent stem cells, ES cells can form any cell or tissue type — but they require close interaction with two types of extra-embryonic cell to complete normal embryonic development. The Zernicka-Goetz team learnt how to coax ES cells into forming these extra-embryonic cells, and showed that these could be co-cultured with ES cells to yield embryo models that mature to stages that were previously unattainable in vitro. “It’s as faithful as you can imagine an embryonic model,” says Zernicka-Goetz. “It develops a head and heart — and it’s beating.” Her team was able to use this model to reveal how alterations in individual genes can derail normal embryonic development12.

At the Guangzhou Institutes of Biomedicine and Health, Chinese Academy of Sciences, stem-cell biologist Miguel Esteban and colleagues are taking a different tack: reprogramming human stem cells to model the earliest stages of development.

“We started with the idea that actually it might even be possible to make zygotes,” Esteban says. The team didn’t quite achieve that, but they did identify a culture strategy that pushed these stem cells back to something resembling eight-cell human embryos14. This is a crucial developmental milestone, associated with a massive shift in gene expression that ultimately gives rise to distinct embryonic and extra-embryonic cell lineages.

Although imperfect, Esteban’s model exhibits key features of cells in natural eight-cell embryos, and has highlighted important differences between how human and mouse embryos initiate the transition to the eight-cell stage. “We saw that a transcription factor that is not even expressed in the mouse regulates the whole conversion,” says Esteban.

Collectively, these models can help researchers to map how just a few cells give rise to the staggering complexity of the vertebrate body.

Research on human embryos is restricted beyond 14 days of development in many countries, but there’s plenty that researchers can do within those constraints. Non-human primate models offer one possible alternative, Esteban says, and Zernicka-Goetz says that her mouse-embryo strategy can also generate human embryos that develop as far as day 12. “We still have lots of questions to ask within that stage that we are comfortable studying,” she says.

Source: Nature

The race to supercharge cancer-fighting T cells


With a slew of tools to trick out immune cells, researchers are expanding the repertoire of CAR-T therapies.


Crystal Mackall remembers her scepticism the first time she heard a talk about a way to engineer T cells to recognize and kill cancer. Sitting in the audience at a 1996 meeting in Germany, the paediatric oncologist turned to the person next to her and said: “No way. That’s too crazy.”

Today, things are different. “I’ve been humbled,” says Mackall, who now works at Stanford University in California developing such cells to treat brain tumours. The US Food and Drug Administration approved the first modified T cells, called chimeric antigen receptor (CAR)-T cells, to treat a form of leukaemia in 2017. The treatments have become game changers for several cancers. Five similar products have been approved, and more than 20,000 people have received them. A field once driven by a handful of dogged researchers now boasts hundreds of laboratory groups in academia and industry. More than 500 clinical trials are under way, and other approaches are gearing up to jump from lab to clinic as researchers race to refine T-cell designs and extend their capabilities. “This field is going to go way beyond cancer in the years to come,” Mackall predicts.

Advances in genome editing through processes such as CRISPR, and the ability to rewire cells through synthetic biology, have led to increasingly elaborate approaches for modifying and supercharging T cells for therapy. Such techniques are providing tools to counter some of the limitations of current CAR-T therapies, which are expensive to make, can have dangerous side effects, and have so far been successful only against blood cancers. “These techniques have expanded what we’re able to do with CAR strategies,” says Avery Posey, a cancer immunology researcher at the University of Pennsylvania in Philadelphia. “It will really take this type of technology forward.”

Even so, the challenge of making such a ‘living drug’ from a person’s cells extends beyond complicated designs. Safety and manufacturing problems remain to be addressed for many of the newest candidates. “There’s an explosion of very fancy things, and I think that’s great,” says immunologist Michel Sadelain at the Memorial Sloan Kettering Cancer Center in New York City. “But the complexity cannot always be brought as described into a clinical setting.”

Revved up and ready to go

CAR-T therapies capitalize on the activities of T cells, the immune system’s natural hunters that prowl through the body looking for things that don’t belong. Foreign cells, or those infected with a virus, express unusual proteins that serve as a beacon to T cells, some of which release a toxic stew of molecules to destroy the abnormal cells. This search-and-destroy function can also target cancer cells for elimination, but tumours often have ways of disarming the immune system, such as by cloaking abnormal proteins or suppressing T-cell function.

CAR-T cells carry synthetic proteins — the chimeric antigen receptors — that span the cell membrane. On the outside is a structure that functions like an antibody, binding to specific molecules on the surface of some cancer cells. Once that has bound, the portion of the protein inside the cell stimulates T-cell activity, hot-wiring it into action. The result is a tiny, revved-up, cancer-fighting machine.

Approved CAR-T therapies target one of two proteins found on immune cells called B cells, and are used to treat certain forms of leukaemia and lymphoma that involve the unchecked proliferation of these cells. The proteins — CD19 and BCMA — are not unique to cancer, meaning that the therapies kill B cells indiscriminately. However, people can live without these cells.

T cells (blue) of the immune system attacking prostate cancer cells (pink).

There is still plenty of room for improvement in CAR-T therapies. Although the effects can be long-lasting — sometimes even curative — cancer eventually returns in most people who have been treated. Solid tumours, such as those found in lung or pancreatic cancers, have so far not responded convincingly to CAR-T cells. The therapy has safety risks and can, in rare instances, be fatal. And it must be custom-made for each recipient, using their own T cells as a starting point, resulting in a relatively slow and expensive manufacturing process.

As yet, there are no simple solutions to any of these problems. “We clearly have a long way to go,” says Mackall. “But we’re now seeing promising signals.”

Some progress is being made against solid tumours. These often contain a heterogeneous mosaic of cells that have different combinations of mutations. This means that a CAR-T therapy directed at a particular mutated protein might work for only one subset of cells. The tight mass of a solid tumour can also be difficult for T cells to penetrate, and researchers have struggled to find suitable targets that won’t wreak havoc in healthy tissues.

Despite this, some clinical trials have shown glimmers of efficacy. Mackall and her colleagues have engineered CAR-T cells to target a molecule called GD2, which is expressed at high levels by some brain and spinal-cord cancers called gliomas. The team gave one intravenous dose of CAR-T therapy to people with gliomas, then administered multiple, lower doses directly into the brain. She and her colleagues reported last year that three of four people treated in this way responded positively1. “These cells just dive right into the brain,” says Mackall. “And the body doesn’t reject them up there — it’s playing in that immune-privileged space.”

Targeting solid tumours could require T-cell therapies that recognize more than one mutated protein or that can target cancer cells expressing higher levels of a given protein than normal cells do. One clinical trial that reported results in November 2022 took this to the extreme: rather than using CARs, the team used CRISPR to engineer natural T-cell receptors (see ‘Targeting T cells’) to recognize mutated proteins found in each participant’s tumour2. The individuals received a mixture of cells targeting different proteins, in the hope that solid tumours would be less likely to develop resistance to a therapy with multiple targets. Tumours stopped growing in 5 of the 16 participants 28 days after treatment. Researchers hope to tweak the protocol, including giving higher doses, to boost effectiveness.

TARGETING T CELLS. Graphic showing how cancer treatments use T cells to kill tumours.
Source: Premier Research; adapted from https://go.nature.com/3WXCRYX

The ability to track and fine-tune T-cell activity is also improving, says immunologist Carl June at the University of Pennsylvania. Through advanced single-cell analyses, researchers can follow the fate of both the engineered cells and the tumours they are designed to kill. They can determine which T cells have become ‘exhausted’ — a dysfunctional state that can come from prolonged stimulation — and which tumour cells are becoming resistant to treatment. They can also see whether the environment surrounding a CAR-T-treated tumour has become riddled with immune-suppressing cells (such as macrophages or regulatory T cells). Overcoming that local immune suppression will be key to harnessing T cells to fight solid tumours, says Yangbing Zhao, chief scientific officer at UTC Therapeutics, a biotechnology company headquartered in Singapore that is developing CAR-T therapies. “No matter how many targets you target, if the tumour is evading the immune response, it won’t work,” he says.

June and his colleagues used a single-cell approach to study resistance to CAR-T therapies that target CD19, and found that CAR-T products that were less able to activate certain helper T cells were associated with the emergence of resistance3. They also used single-cell techniques to learn more about why CAR-T cells directed against a protein called mesothelin, found in pancreatic cancer cells, often fail. Reducing the activity of two genes in CAR-T cells might bolster the therapy4. “We’re going to be able to understand these resistance mechanisms,” says June. “And then with all of these tools like CRISPR, we’re going to engineer around them.”

In addition to editing T cells, CRISPR has been used to find more ways of modifying them. Immunologist Alexander Marson at the Gladstone Institutes in San Francisco, California, and his colleagues used CRISPR to activate or suppress thousands of genes in T cells, and then looked at the effect the changes had on the production of crucial immune-regulating proteins called cytokines5. In another screen using CRISPR, the team found that reducing the activity of a protein called RASA2 enhanced the ability of CAR-T cells to kill their targets6. “We’re learning lessons about the genes that we can turn up and turn down to tune T cells to behave as we want,” says Marson.

Synthetic biologists have also set their sights on T cells, and are engineering sophisticated cellular circuits that could allow greater control over the expression of CARs and other proteins that might increase T-cell activity. In December last year, synthetic biologist Wendell Lim at the University of California, San Francisco, and his colleagues reported7 that they had engineered T cells to express both a CAR and IL-2, an immune-regulating protein. IL-2 can improve T-cell penetration into solid tumours and overcome the immunosuppressive signals that tumours release, but it can be toxic when administered systemically. Letting the T cells produce IL-2 enables local administration of the protein, which could bypass its toxicity to other tissues.

Other synthetic circuits have been designed to allow precise regulation of CAR expression, by placing it under the control of genetic elements that activate the necessary genes in response to a drug8. So far, however, most of these complicated designs have not yet gone through the safety studies and standardization required for use in people, says Sadelain.

Researchers are learning so many lessons that a big question for the field is now determining which engineered T cells to take forwards into human studies, says oncologist Marcela Maus at Massachusetts General Hospital in Boston. “We can invent and innovate so much in the lab, but there is this funnel of translating that into clinical trials,” she says. “There’s so many things we can do. We have to figure out which are the best things to tweak and test in trials.”

Costly business

Manufacturing CAR-T cells is already wildly complex by pharmaceutical standards. So far, all approved therapies require engineering a person’s own T cells to express the CAR. That adds to the time and thus the cost of producing the therapies: in the United States, a single treatment with CAR-T cells can be about US$500,000, not including the cost of hospitalization and associated treatments.

Creating CAR-T cells that can be given to multiple people — often called off-the-shelf cells — has long been viewed as crucial to lowering the price of the therapy. But early results suggest that there is still work to do, says bioengineer Rahul Purwar at the Indian Institute of Technology Bombay. Although the cells can be edited to reduce the chance that they will themselves be eliminated by the immune system, early trials suggest that they do not survive long after infusion and might still be rejected (see, for example, ref. 9). “Off-the-shelf is a great approach,” he says. “It is coming, but right now we are not yet there.”

The therapy is also rarely available outside wealthy countries. In Brazil, haematologist Renato Luiz Guerino Cunha at Oncoclínicas Group in São Paulo was the first in the country to treat someone with CAR-T therapy in 2019. But progress has been slow, he says: he lacks the capacity to rapidly produce large quantities of cells. “In three years, we treated just six patients,” he says. “We need new technology for the processing.”

Producing a CAR-T cell therapy typically involves using a type of virus called a lentivirus as a vector to shuttle in the synthetic CAR gene. But more research into gene therapies has increased demand for clinical-grade lentiviruses. Researchers now wait months and pay top dollar to complete their experiments; Cunha produces his own but can do so only in tiny quantities. Improvements to CRISPR gene editing could help in this regard.

Despite the challenges, CAR-T therapies continue to expand, with some of the hundreds of clinical trials worldwide exploring entirely new applications. Last year, researchers reported promising results in a small trial of CAR-T therapies to treat a form of the autoimmune disease lupus10. And in a study in mice, researchers reprogrammed T cells without the usual first step of removing them from the body, creating CAR-T cells designed to clear scar tissue from the heart11.

In December, June and his colleagues unveiled a way to streamline cell production. At the American Society of Hematology’s annual meeting in New Orleans, Louisiana, the team announced12 that reducing manufacturing times and engineering CAR-T cells to express a protein called IL-18 boosted their efficacy and allowed researchers to reduce the dose of cells given to people. “Those patients had incredible responses,” says Maus of the clinical trial, “which gives you this really tantalizing hint that if you engineer the T cell better, you can make it even more powerful.”

Source: Nature

Massive health-record review links viral illnesses to brain disease


Study ties common viruses such as flu to Alzheimer’s and other conditions — but the analysis has limitations, researchers warn.

In this false-colour scanning electron microscope image, influenza virus particles (blue) stand ready to release from a burst epithelial cell (red).

An analysis of about 450,000 electronic health records has found a link between infections with influenza and other common viruses and an elevated risk of having a neurodegenerative condition such as Alzheimer’s or Parkinson’s disease later in life. But researchers caution that the data show only a possible connection, and that it’s still unclear how or whether the infections trigger disease onset.

The analysis, published in Neuron on 19 January1, found at least 22 links between viral infections and neurodegenerative diseases. Some of the viral exposures were associated with an increased risk of brain disease up to 15 years after infection.

“It’s startling how widespread these associations seem to be, both for the number of viruses and number of neurodegenerative diseases involved,” says Matthew Miller, a viral immunologist at McMaster University in Hamilton, Canada.

Mining health records

This isn’t the first time viruses have been linked to neurodegenerative disease. Infection with a type of herpes virus has been associated with the development of Alzheimer’s2, for instance. And a landmark study published in Science3 last year found the strongest evidence yet that Epstein–Barr virus is tied to multiple sclerosis. But many of these past studies examined only a single virus and a specific brain disease.

To understand whether viruses are linked to brain diseases more broadly, Kristin Levine, a biomedical data scientist at the US National Institutes of Health’s Center for Alzheimer’s Related Dementias in Bethesda, Maryland, and her colleagues analysed hundreds of thousands of medical records to look for instances in which a person had both a viral infection and a brain disease on file.

First, the team examined records from about 35,000 people with brain diseases and about 310,000 people without, sourced from FinnGen, a large Finnish database that includes health information. The team found 45 significant links between infections and brain diseases, and then tested those against more than 100,000 records from another database, the UK Biobank. After this analysis, they were left with 22 significant pairings.
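
The published analysis is more sophisticated than this, but the discovery-then-replication logic described above can be sketched in a few lines. Everything in the snippet below is illustrative: the virus–disease pairs, the counts and the choice of Fisher’s exact test are assumptions made for the sake of the example, not the study’s actual data or methods.

```python
from scipy.stats import fisher_exact

# Made-up 2x2 tables per (virus, disease) pair, in the order:
# (exposed with disease, exposed without, unexposed with disease, unexposed without).
# None of these numbers come from FinnGen or the UK Biobank.
discovery = {
    ("viral encephalitis", "Alzheimer's"): (25, 40, 600, 34000),
    ("flu with pneumonia", "Alzheimer's"): (80, 700, 545, 33340),
    ("common warts", "Parkinson's"):       (12, 300, 1300, 32700),
}
replication = {
    ("viral encephalitis", "Alzheimer's"): (9, 30, 200, 10000),
    ("flu with pneumonia", "Alzheimer's"): (30, 350, 180, 9900),
    ("common warts", "Parkinson's"):       (10, 600, 180, 9500),
}

def risk_raising_pairs(cohort, alpha=0.05):
    """Pairs whose 2x2 table shows a significant, risk-raising association."""
    hits = set()
    for pair, (a, b, c, d) in cohort.items():
        odds_ratio, p_value = fisher_exact([[a, b], [c, d]])
        if p_value < alpha and odds_ratio > 1:
            hits.add(pair)
    return hits

# Stage 1: screen in the discovery cohort; stage 2: keep only what replicates.
replicated = risk_raising_pairs(discovery) & risk_raising_pairs(replication)
print(sorted(replicated))
```

With these toy numbers, only the encephalitis and flu-with-pneumonia pairs survive both stages, mirroring the sort of filtering that took the list from 45 candidate links down to 22.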

One of the strongest associations was between viral encephalitis, a rare inflammation of the brain that can be caused by multiple types of virus, and Alzheimer’s. People with encephalitis were about 31 times more likely to develop Alzheimer’s later in life than were people who did not have encephalitis. Most other associations were more modest: people who had a bout of flu that led to pneumonia were four times more likely to develop Alzheimer’s than were people who didn’t develop the flu with pneumonia. There were no pairings that suggested a protective link between viral infection and brain disease.
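
For a sense of what a figure like “31 times more likely” means, here is a minimal sketch of a crude relative-risk calculation from a two-by-two table. The counts are invented purely to reproduce a roughly 31-fold ratio; the published figures come from the authors’ own statistical analysis, not from raw counts like these.

```python
# Hypothetical counts, chosen only to illustrate a ~31-fold relative risk.
exposed_cases, exposed_total = 24, 400          # had viral encephalitis, later diagnosed with Alzheimer's
unexposed_cases, unexposed_total = 600, 310000  # no encephalitis on record

risk_exposed = exposed_cases / exposed_total        # 0.060
risk_unexposed = unexposed_cases / unexposed_total  # ~0.0019
relative_risk = risk_exposed / risk_unexposed

print(f"Relative risk: {relative_risk:.1f}x")  # ~31x with these invented numbers
```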

“I’m very excited they’re expanding this research broader than what other studies have looked at,” says Kristen Funk, a neuroimmunologist at the University of North Carolina, Charlotte, who studies the link between herpesviruses and Alzheimer’s.

Data shortcomings

Kjetil Bjornevik, an epidemiologist at the Harvard T.H. Chan School of Public Health in Boston, Massachusetts, and an author of the Epstein–Barr paper in Science, applauds Levine and her colleagues for bringing more attention to the role of viral infections in brain diseases. But he warns that their approach of using medical records “could be problematic” because they analysed only infections that were severe enough to warrant a trip to a health practitioner. Taking milder infections into account might weaken the associations, he says.

The data are also sourced almost exclusively from people of European ancestry, which means that the findings might not be applicable to the larger global population, Funk says. Furthermore, she adds, outside Europe, “certain viruses are more prevalent”, such as Zika or West Nile virus, so the analysis might have missed links between those pathogens and brain disease. Levine acknowledges the limitations of the analysis; the team worked with the data that were available, she says.

These limitations also underscore the difficulty of untangling whether a viral infection leads to neurodegenerative disease, or whether the disease makes a person more susceptible to infection, Bjornevik says. To make it even more tricky, the authors found that the more time that elapsed between the infection and the diagnosis of brain disease, the weaker the link was. The body is known to begin changing years before symptoms of brain disease develop and a diagnosis is made4, so it’s tough to determine which is causing which, he adds. Another plausible theory is that these viral infections might be accelerating molecular changes in the body that were already ongoing, says Cornelia van Duijn, a genetic epidemiologist at the University of Oxford, UK.

If future studies add more weight to the connection between viral infection and brain disease, it could offer health officials a tangible way to delay the onset of neurodegenerative disease. Vaccines exist for many of these viruses, van Duijn says. Because multiple types of dementia are diagnosed late in life — close to the average life expectancy — if clinicians could postpone disease onset by even a couple of years, that could mean that many people might never develop the disease, she adds.

“It’s not very clear that the infections are causing brain disease,” she says. But viral infections aren’t pleasant, and if there’s any link to brain disease, “I think we owe it to people to prevent them.”

Has Earth’s inner core stopped its strange spin?


Earthquake data hint that the inner core stopped rotating faster than the rest of the planet in 2009, but not all researchers agree.

Earth’s inner core, seen here in an illustration, is made mostly of solid iron, and can rotate separately from the outer parts of the planet.

Thousands of kilometres beneath your feet, Earth’s interior might be doing something very weird. Many scientists think that the inner core spins faster than the rest of the planet — but sometime in the past decade, according to a study, it apparently stopped doing so.

“We were quite surprised,” say Yi Yang and Xiaodong Song, seismologists at Peking University in Beijing who reported the findings today in Nature Geoscience1.

The results could help to shine light on the many mysteries of the deep Earth, including what part the inner core plays in maintaining the planet’s magnetic field and in affecting the speed of the whole planet’s rotation — and thus the length of a day. But they are just the latest instalment in a long-running effort to explain the inner core’s unusual rotation, and might not be the final word on the matter.

“I keep thinking we’re on the verge of figuring this out,” says John Vidale, a seismologist at the University of Southern California in Los Angeles. “But I’m not sure.”

Mysteries of the deep

Researchers discovered the inner core in 1936, after studying how seismic waves from earthquakes travel through the planet. Changes in the speed of the waves revealed that the planet’s core, which is about 7,000 kilometres wide, consists of a solid centre, made mostly of iron, inside a shell of liquid iron and other elements. As iron from the outer core crystallizes on the surface of the inner core, it changes the density of the outer liquid, driving churning motions that maintain Earth’s magnetic field.

Researchers have learnt about the inner core’s rotation by studying decades of earthquakes that originated in the same region, such as the Kuril Islands in Russia (Ebeko Volcano on Paramushir Island is shown here).

The liquid outer core essentially decouples the 2,400-kilometre-wide inner core from the rest of the planet, so the inner core can spin at its own pace. In 1996, Song and another researcher reported2 studying earthquakes that originated in the same region over three decades, and whose energy was detected by the same monitoring station thousands of kilometres away. Since the 1960s, the scientists said, the travel time of seismic waves emanating from those earthquakes had changed, indicating that the inner core rotates faster than the planet’s mantle, the layer just beyond the outer core.

Later studies refined estimates of the rate of that ‘super-rotation’, to conclude that the inner core rotates faster than the mantle by about one-tenth of a degree per year. But not everyone agrees. Other work has suggested that super-rotation happens mostly in distinct periods, such as in the early 2000s, rather than being a continuous, steady phenomenon3. Some scientists even argue that super-rotation does not exist, and that the differences in earthquake travel times are instead caused by physical changes on the surface of the inner core4.
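
As a back-of-the-envelope check on why such a slow drift is detectable at all, the figures quoted above are enough for a rough calculation. This is only an illustration of scale, not part of any of the studies’ analyses.

```python
import math

# Rough figures taken from the text above; everything here is order-of-magnitude only.
super_rotation_deg_per_yr = 0.1   # inner core gains ~0.1 degree per year on the mantle
inner_core_radius_km = 2400 / 2   # the inner core is about 2,400 km wide
observation_window_yr = 30        # roughly the span of earthquakes in the 1996 study

years_per_extra_lap = 360 / super_rotation_deg_per_yr
extra_rotation_deg = super_rotation_deg_per_yr * observation_window_yr
shift_at_surface_km = inner_core_radius_km * math.radians(extra_rotation_deg)

print(f"One full extra lap every ~{years_per_extra_lap:,.0f} years")             # ~3,600 years
print(f"Extra rotation over {observation_window_yr} years: {extra_rotation_deg:.0f} degrees")
print(f"Displacement at the inner-core surface: ~{shift_at_surface_km:.0f} km")  # ~63 km
```

Even a few degrees of extra rotation correspond to tens of kilometres of inner-core material sliding past a fixed earthquake-to-station path, which is roughly the scale of change that can show up as shifts in seismic travel times.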

Last June, Vidale and Wei Wang, an Earth scientist also at the University of Southern California, threw another spanner into the works. Using data on seismic waves generated by US nuclear test blasts in 1969 and 1971, they reported that between those years, Earth’s inner core had ‘subrotated’, or rotated more slowly than the mantle5. Only after 1971, they say, did it speed up and begin to super-rotate.

A rotational shift

Now, Yang and Song say that the inner core has halted its spin relative to the mantle. They studied earthquakes mostly from between 1995 and 2021, and found that the inner core’s super-rotation had stopped around 2009. They observed the change at various points around the globe, which the researchers say confirms it is a true planet-wide phenomenon related to core rotation, and not just a local change on the inner core’s surface.

The data hint that the inner core might even be in the process of shifting back towards subrotation. If so, something is probably happening to the magnetic and gravitational forces that drive the inner core’s rotation. Such changes might link the inner core to broader geophysical phenomena such as increases or decreases in the length of a day on Earth.

Still, many questions remain, such as how to reconcile the slow pace of the changes that Yang and Song report with some of the faster changes reported by others. The only way out of the morass is to wait for more earthquakes to happen. A “long history of continuous recording of seismic data is critical for monitoring the motion of the heart of the planet”, say Yang and Song.

“We just have to wait,” Vidale adds.

How antidepressants help bacteria resist antibiotics


A laboratory study unravels ways non-antibiotic drugs can contribute to drug resistance.

In the presence of antidepressants, the Gram-negative bacterium E. coli (seen here in a coloured scanning electron micrograph) can fend off antibiotics.

The emergence of disease-causing bacteria that are resistant to antibiotics is often attributed to the overuse of antibiotics in people and livestock. But researchers have homed in on another potential driver of resistance: antidepressants. By studying bacteria grown in the laboratory, a team has now tracked how antidepressants can trigger drug resistance1.

“Even after a few days exposure, bacteria develop drug resistance, not only against one but multiple antibiotics,” says senior author Jianhua Guo, who works at the Australian Centre for Water and Environmental Biotechnology at the University of Queensland in Brisbane. This is both interesting and scary, he says.

Globally, antibiotic resistance is a significant public-health threat. An estimated 1.2 million people died as a direct result of it in 20192, and that number is predicted to climb.

Early clues

Guo became interested in the possible contributions of non-antibiotic drugs to antibiotic resistance in 2014, after work by his lab found more antibiotic-resistance genes circulating in domestic wastewater samples than in samples of wastewater from hospitals, where antibiotic use is higher.

Guo’s group and other teams also observed that antidepressants — which are among the most widely prescribed medicines in the world — killed or stunted the growth of certain bacteria. They provoke “an SOS response”, Guo explains, triggering cellular defence mechanisms that, in turn, make the bacteria better able to survive subsequent antibiotic treatment.

In a 2018 paper, the group reported that Escherichia coli became resistant to multiple antibiotics after being exposed to fluoxetine3, which is commonly sold as Prozac. The latest study examined 5 other antidepressants and 13 antibiotics from 6 classes of such drugs and investigated how resistance in E. coli developed.

In bacteria grown in well-oxygenated laboratory conditions, the antidepressants caused the cells to generate reactive oxygen species: toxic molecules that activated the microbe’s defence mechanisms. Most prominently, this activated the bacteria’s efflux pumps, a general expulsion system that many bacteria use to eliminate various molecules, including antibiotics. This probably explains how the bacteria could withstand the antibiotics without having specific resistance genes.

But exposure of E. coli to antidepressants also led to an increase in the microbe’s mutation rate, and the subsequent selection of various resistance genes. However, in bacteria grown in anaerobic conditions, levels of reactive oxygen species were much lower and antibiotic resistance developed much more slowly.

Moreover, at least one antidepressant, sertraline, promoted the transfer of genes between bacterial cells, a process that can speed up the spread of resistance through a population. Such transfer can occur between different types of bacterium, allowing resistance to hop between species — including from harmless bacteria to pathogenic ones.

Growing recognition

Kiran Patil, who studies microbiome–chemical interactions at the University of Cambridge, UK, says that in the past five years there has been a growing appreciation that many non-antibiotic medicines that target human cells can also affect bacteria and contribute to antibiotic resistance. “The strength of the study is the mechanistic details,” says Patil.

Lisa Maier, who is based at the University of Tübingen in Germany and studies interactions between drugs and the microbiome, says that to understand how antidepressants can drive antibiotic resistance, researchers need to determine what molecules the drugs are targeting in the bacteria and to assess the effects of the medications on a wider variety of clinically relevant bacterial species. In 2018, Maier and her colleagues surveyed 835 medicines that did not target microbes and found that 24% inhibited the growth of at least one strain of human gut bacteria4.

Patil and Maier say it is important to gather evidence to assess the real-world impact of antidepressants on resistance, such as whether antidepressants are driving the accumulation of antibiotic-resistant bacteria, particularly disease-causing ones, in people, animals or the environment.

Although significant amounts of antidepressants have been found in wastewater, reported levels tend to fall below the concentrations at which Guo’s group saw significant effects in E. coli. But concentrations of some of the antidepressants that had strong effects in this study are expected to be reached in the large intestines of people taking the drugs.

Follow-up studies

Maier says that several studies now link antidepressants and other non-antibiotic pharmaceuticals to changes in bacteria, and that preliminary studies have given the “first hints” regarding how such drugs can affect the microbiomes of people taking them.

But in healthy humans, E. coli is found mainly in the large intestine, where conditions are anaerobic, meaning that the process described in the paper might not occur at the same rate in people, says Maier. Future studies should use bacterial growing conditions that model sites at which antidepressants might be acting, says Patil.

Guo says his lab is now looking at the microbiomes of mice given antidepressants. Early, unpublished data suggest that the drugs can change the animals’ gut microbiota and promote gene transfer.

But Guo and Maier caution people against stopping taking antidepressants on the basis of this research. “If you have depression, that needs to be treated in the best possible way. Then, bacteria second,” says Maier.

Researchers and pharmaceutical companies need to quantify the contribution of non-antibiotic pharmaceuticals to antibiotic resistance, says Guo. “Non-antibiotic pharmaceuticals are a big concern that we shouldn’t overlook,” he says.

Don’t ask if artificial intelligence is good or fair, ask how it shifts power


Those who could be exploited by AI should be shaping its projects

Law enforcement, marketers, hospitals and other bodies apply artificial intelligence (AI) to decide on matters such as who is profiled as a criminal, who is likely to buy what product at what price, who gets medical treatment and who gets hired. These entities increasingly monitor and predict our behaviour, often motivated by power and profits.

It is not uncommon now for AI experts to ask whether an AI is ‘fair’ and ‘for good’. But ‘fair’ and ‘good’ are infinitely spacious words that any AI system can be squeezed into. The question to pose is a deeper one: how is AI shifting power?

From 12 July, thousands of researchers will meet virtually at the week-long International Conference on Machine Learning, one of the largest AI meetings in the world. Many researchers think that AI is neutral and often beneficial, marred only by biased data drawn from an unfair society. In reality, an indifferent field serves the powerful.

In my view, those who work in AI need to elevate those who have been excluded from shaping it, and doing so will require them to restrict relationships with powerful institutions that benefit from monitoring people. Researchers should listen to, amplify, cite and collaborate with communities that have borne the brunt of surveillance: often women, people who are Black, Indigenous, LGBT+, poor or disabled. Conferences and research institutions should cede prominent time slots, spaces, funding and leadership roles to members of these communities. In addition, discussions of how research shifts power should be required and assessed in grant applications and publications.

A year ago, my colleagues and I created the Radical AI Network, building on the work of those who came before us. The group is inspired by Black feminist scholar Angela Davis’s observation that “radical simply means ‘grasping things at the root’”, and that the root problem is that power is distributed unevenly. Our network emphasizes listening to those who are marginalized and impacted by AI, and advocating for anti-oppressive technologies.

Consider an AI that is used to classify images. Experts train the system to find patterns in photographs, perhaps to identify someone’s gender or actions, or to find a matching face in a database of people. ‘Data subjects’ — by which I mean the people who are tracked, often without consent, as well as those who manually classify photographs to train the AI system, usually for meagre pay — are often both exploited and evaluated by the AI system.

Researchers in AI overwhelmingly focus on providing highly accurate information to decision makers. Remarkably little research focuses on serving data subjects. What’s needed are ways for these people to investigate AI, to contest it, to influence it or to even dismantle it. For example, the advocacy group Our Data Bodies is putting forward ways to protect personal data when interacting with US fair-housing and child-protection services. Such work gets little attention. Meanwhile, mainstream research is creating systems that are extraordinarily expensive to train, further empowering already powerful institutions, from Amazon, Google and Facebook to domestic surveillance and military programmes.

Many researchers have trouble seeing their intellectual work with AI as furthering inequity. Researchers such as me spend our days working on what are, to us, mathematically beautiful and useful systems, and hearing of AI success stories, such as winning Go championships or showing promise in detecting cancer. It is our responsibility to recognize our skewed perspective and listen to those impacted by AI.

Through the lens of power, it’s possible to see why accurate, generalizable and efficient AI systems are not good for everyone. In the hands of exploitative companies or oppressive law enforcement, a more accurate facial recognition system is harmful. Organizations have responded with pledges to design ‘fair’ and ‘transparent’ systems, but fair and transparent according to whom? These systems sometimes mitigate harm, but are controlled by powerful institutions with their own agendas. At best, they are unreliable; at worst, they masquerade as ‘ethics-washing’ technologies that still perpetuate inequity.

Already, some researchers are exposing hidden limitations and failures of systems. They braid their research findings with advocacy for AI regulation. Their work includes critiquing inadequate technological ‘fixes’. Other researchers are explaining to the public how natural resources, data and human labour are extracted to create AI.

Race-and-technology scholar Ruha Benjamin at Princeton University in New Jersey has encouraged us to “remember to imagine and craft the worlds you cannot live without, just as you dismantle the ones you cannot live within”. In this vein, it is time to put marginalized and impacted communities at the centre of AI research — their needs, knowledge and dreams should guide development. This year, for example, my colleagues and I held a workshop for diverse attendees to share dreams for the AI future we desire. We described AI that is faithful to the needs of data subjects and allows them to opt out freely.

When the field of AI believes it is neutral, it both fails to notice biased data and builds systems that sanctify the status quo and advance the interests of the powerful. What is needed is a field that exposes and critiques systems that concentrate power, while co-creating new systems with impacted communities: AI by and for the people.

Source: Nature