Why do we need sleep? It’s all about the brain’s reset button


What is the purpose of sleeping? Researchers at Washington University in St. Louis believe they have unraveled the mystery, proposing that sleep acts as a reset button for the brain’s “operating system.” This novel theory, integrating concepts from physics and biology, suggests that sleep is essential for maintaining the brain’s optimal state for processing and thinking.

“The brain is like a biological computer,” says study author Keith Hengen, an assistant professor of biology at Washington University in St. Louis, in a university release. “Memory and experience during waking change the code bit by bit, slowly pulling the larger system away from an ideal state. The central purpose of sleep is to restore an optimal computational state.”


This study closely observed the brain activity of sleeping rats to support this hypothesis. The theory hinges on a concept known as “criticality,” a state that balances order and chaos, maximizing information processing. Criticality, a term from physics, describes a complex system at the tipping point between complete regularity and randomness.

“At one extreme, everything is completely regular. At the other extreme, everything is random,” notes study co-author Ralf Wessel, a professor of physics at Washington University in St. Louis.

Researchers observed the brain activity of young rats during sleep and wakefulness. They tracked neural avalanches — cascades of brain activity — which demonstrated how information flows through the brain. They found that after restorative sleep, the brain exhibited avalanches of all sizes, indicating a return to criticality. As the rats stayed awake, these cascades shifted towards smaller sizes.

The study marks a departure from the long-held belief that sleep replenishes depleted chemicals. Instead, it posits that sleep is a systemic solution to a systemic problem, resetting the brain away from the extremes of too much order (rigidity) or chaos (randomness).


The concept of criticality was first developed in the late 1980s by physicists studying sand piles on a grid. These sand piles self-organized into a complex system, a metaphor for the neural avalanches in the brain. Hengen notes that every neuron, like an individual grain of sand, follows basic rules. When billions of neurons reach criticality, they create a complex and efficient system.
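
The sandpile picture can be made concrete in a few lines of code. The sketch below is not the study’s model or code; it is a minimal Python version of the classic Bak–Tang–Wiesenfeld sandpile, in which simple local toppling rules self-organize the grid so that “avalanches” of all sizes occur, the same statistical signature the researchers looked for in neural activity.

```python
import numpy as np

# Minimal sketch (not the study's code) of the Bak-Tang-Wiesenfeld sandpile:
# grains are dropped on a grid, and any site holding 4 or more grains "topples",
# passing one grain to each neighbour. The number of topplings triggered by a
# single dropped grain is the avalanche size.
def sandpile_avalanches(n=50, drops=20000, seed=0):
    rng = np.random.default_rng(seed)
    grid = np.zeros((n, n), dtype=int)
    sizes = []
    for _ in range(drops):
        i, j = rng.integers(0, n, size=2)
        grid[i, j] += 1
        size = 0
        stack = [(i, j)]
        while stack:
            x, y = stack.pop()
            if grid[x, y] < 4:
                continue
            grid[x, y] -= 4
            size += 1
            if grid[x, y] >= 4:          # still unstable after one toppling
                stack.append((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < n and 0 <= ny < n:   # grains leaving the grid are lost
                    grid[nx, ny] += 1
                    if grid[nx, ny] >= 4:
                        stack.append((nx, ny))
        sizes.append(size)
    return np.array(sizes)

sizes = sandpile_avalanches()
print("largest avalanche:", sizes.max())
print("fraction of drops causing no avalanche:", np.mean(sizes == 0))
```

Plotting a histogram of these sizes on log-log axes gives the roughly straight line, spanning many scales, that physicists read as the signature of criticality; in the rat experiments, that signature was strongest just after sleep and drifted toward smaller avalanches as the animals stayed awake.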

“Criticality maximizes a bunch of features that sound very desirable for a brain,” explains Hengen.

This multidisciplinary effort combines experimental data from biology with mathematical equations from physics, providing a novel perspective on the purpose of sleep.

“It’s a beautiful collaboration between physics and biology,” says Wessel, highlighting the unique blend of disciplines that led to this discovery.

Mapping brain repair and remodeling after stroke


Researchers at Weill Cornell Medicine have catalogued the cellular response to stroke in a preclinical model, identifying the immune cells involved and the roles they may play in the days and weeks following a stroke.

During a stroke, loss of oxygen leads to brain damage and cell death. It also triggers a powerful inflammatory response in which the brain’s resident immune cells, along with cells recruited from the blood, infiltrate the injured tissue.

The findings, published Jan. 4 in Nature Immunology, could point toward novel approaches to fostering stroke recovery and provide insight into why therapies to control inflammation after a stroke haven’t been successful.

Image of a mouse brain section 14 days after stroke. Immune cells called leukocytes (yellow) infiltrate the core of the injury (right edge of brain), surrounded by enlarged blood vessels (magenta). Cell nuclei were stained cyan blue.

“Nearly every one of us knows someone who’s had a stroke. It’s a huge problem,” said senior author Dr. Josef Anrather, a professor of neuroscience and vice chair for research in the Feil Family Brain and Mind Research Institute at Weill Cornell Medicine. “But in terms of treatment, there is little a physician can do.”

Interventions that restore blood flow to the affected brain region must be administered within hours to be effective. “So most people, more than 80%, receive no therapy at all,” he said.

Understanding how immune cells contribute to repairing and remodeling the brain in the later, chronic phase after a stroke could help doctors minimize the long-term neurological consequences, including dementia and even seizures.

In 2016, Anrather and his colleagues observed that immune cells called monocytes, which are made in the bone marrow, accumulate in the brain following a stroke. Once there, they appeared to undergo a physical transformation: Some sprouted spindly arms, adopting the appearance of the brain’s resident immune cells, the microglia; others grew more amorphous and amoeba-like.

But what, if anything, did this shapeshifting have to do with their behavior?

“We became interested in knowing the function of these different structural characteristics,” said lead study author Lidia Garcia-Bonilla, the Finbar and Marianne Kenny Research Scholar in Neurology and an assistant professor of research in neuroscience at the Brain and Mind Research Institute, Weill Cornell Medicine.

They also wondered whether these cells were contributing to recovery or compounding the damage.

“There are always two sides to the coin,” Anrather said. The same cell type might be harmful in some circumstances but helpful in others. “That might be why the clinical trials of drugs that reduce immune cell infiltration into the brain and inflammation have shown no benefit for stroke.”

The most direct way to assess what a particular cell is doing is to determine which of its many genes are turned on. Working with a preclinical model, they collected immune cells at two days and 14 days after an induced stroke – the blockage of an artery in the brain. They then sequenced the protein-encoding RNA molecules produced by each cell. This approach allowed the researchers to identify exactly which type of cell they had isolated, and it provided a readout of which genes each cell had switched on, an indication of their roles after the stroke.
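
The study does not present its analysis as code here, but the general shape of such a single-cell RNA-sequencing workflow is standard: cluster cells by their expression profiles, then rank the genes each cluster has switched on. The sketch below assumes a scanpy-style pipeline and a hypothetical input file name; it illustrates the generic approach, not the authors’ actual pipeline.

```python
import scanpy as sc  # standard single-cell analysis toolkit

# Hypothetical file of immune cells collected 2 and 14 days after the induced stroke.
adata = sc.read_h5ad("stroke_immune_cells.h5ad")

# Standard preprocessing: normalize counts per cell, log-transform,
# and keep the most variable genes.
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000, subset=True)

# Reduce dimensionality, build a neighbour graph, and group cells
# with similar expression profiles into clusters.
sc.pp.pca(adata, n_comps=50)
sc.pp.neighbors(adata, n_neighbors=15)
sc.tl.leiden(adata, key_added="cluster")

# Genes enriched in each cluster hint at its identity (microglia,
# monocyte-derived cells, T cells, ...) and at the programmes it has
# switched on, such as debris clearance or tissue remodeling.
sc.tl.rank_genes_groups(adata, groupby="cluster", method="wilcoxon")
print(adata.obs["cluster"].value_counts())
```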

The researchers first noticed that a population of microglia were rapidly proliferating. That made sense, Anrather said, “because microglia cover the territory of the brain.” When their numbers are depleted by an injury, such as stroke, the cells multiply to blanket the damaged tissue.

Then they “take out the trash,” Anrather said.

“For the brain to rebuild itself, you have to clean up, remove dead cells,” he said. Indeed, two days after the experimental stroke, the researchers detected a cadre of microglia that switched on genes involved in clearing away cellular debris.

Joining the microglia in this effort were monocytes – white blood cells that responded to the injury. “These cells circulate continuously and don’t really have a job until there is a problem, like an infection, trauma or any kind of tissue death,” Anrather said. “Then they are called in to help clean up.”

Once there, the researchers found, these monocytes transformed themselves into the type of cell that’s needed to get the job done. “They’re like little kids that get educated in the tissue,” Anrather said.

After the acute clean-up phase, the immune response was restructured toward tissue remodeling. Some cellular recruits produced growth factors triggering repair while immunological “professionals” such as T cells were called in to play a neuroprotective role.

By identifying which immune cells heed the stroke-induced distress call, the researchers have pointed to a novel vehicle for intervention. “Because these cells know how to get to the brain,” Anrather said, “you could use them as a shuttle and engineer them to deliver a therapeutic.”

Furthermore, understanding precisely what these cells do when they get to the brain could be key to developing treatments that can be administered weeks or months after a stroke. “Finding a way to activate the brain’s natural repair mechanism could improve the outcome for stroke patients,” Garcia-Bonilla said.

Research gives new insight into how signals are transmitted between cells


The image shows an in vivo super-resolution analysis taken with a lattice-structured illumination microscope

The quest to gain a greater and more in-depth understanding about how signals are transmitted between cells has taken a remarkable step forward.

New research, led by experts from the University of Exeter’s Living Systems Institute, has revealed for the first time a synergy between transport and signalling within cells.

In all multicellular organisms, cells ‘communicate’ with each other in order to be able to grow and function properly.

To communicate, cells produce signal molecules and hand them over to neighbouring cells, which carry the matching docking molecules – similar to a key being placed into a lock – and this binding in turn activates a cascade of responses in the receiving cell.

One of the most essential messaging systems is the Wnt signalling network, in which a Wnt signal molecule acts as a key that fits a lock called Ror2 to change how cells move.

In the new research, scientists used special techniques to mark Wnt5b and Ror2 fluorescently in developing zebrafish embryos.

Then, using state-of-the-art high-resolution microscopy from the Bioimaging Centre, they were able to see how these components move around in real time.

They found that Wnt5b and Ror2 are made by the very same cells and travel together from one cell to another using tiny cellular extensions called cytonemes. The research team discovered that the ‘key’ and ‘lock’ remain connected throughout the journey to the new cell, rather than travelling separately.

The research challenges conventional understandings of how cells respond to signals, and suggests that even cells without the right lock can react if they get a functional key-lock duo through these thin cytonemes.

The experts now want to explore how these tiny threads form, how the key-lock pairs are carried along these cell threads to the correct cells, and if this also happens in other critical signalling systems in the body.

The next phase of the research could help advance understanding of illnesses like cancer, where this signalling process goes wrong, and develop new strategies to combat these devastating disorders.

Professor Steffen Scholpp, lead author of the study and an expert in Cell and Developmental Biology at the LSI, said: “As cell signalling underpins multicellular life, our research findings will be transformative for the biosciences. Based on these results, we must redefine our strategies for controlling signalling. Cytonemes can now be targeted for designing new therapeutics for many diseases caused by defective cell communication.”

Efanesoctocog Alfa Shows Promise for Severe Hemophilia A Therapy


Among the benefits of treatment with this recombinant factor VIII product were fewer bleeding episodes, improved joint health, and reduced treatment burden.

Replacement with factor VIII concentrates remains the standard of care for hemophilia A but requires frequent infusions and imposes a significant therapy burden. Efforts to extend factor VIII half-life through factor VIII modification are under way to reduce the therapy burden.

Researchers evaluated the safety, efficacy, and pharmacokinetics of efanesoctocog alfa, an investigational molecule composed of recombinant factor VIII protein with additional modifications — a von Willebrand D’D3 fragment and Fc portion of IgG — that extend its half-life up to 5 days. In the multinational, open-label, phase 3 trial, previously treated patients with severe hemophilia A (≥150 factor VIII exposure days) ages 12 years and older received either once-weekly prophylaxis with intravenous efanesoctocog alfa (50 IU per kg) for 52 weeks (group A, n=133) or on-demand efanesoctocog alfa for 26 weeks followed by once-weekly prophylaxis for 26 weeks (group B, n=26).

The mean annualized bleeding rate (ABR) in group A — the primary endpoint — was 0.71. Patients in group A had a significant reduction in mean ABR, from 2.96 before the study to 0.69 during the study. The median ABR was 0 (interquartile range, 0 to 1.04). There were 362 bleeding episodes, most in group B, and 97% resolved with one dose of efanesoctocog alfa. With weekly prophylaxis, mean factor VIII activity was >40 IU/dL for roughly 4 days and was 15 IU/dL at day 7.
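
For context, the annualized bleeding rate is a simple per-patient rate; the formula below is the standard definition rather than one quoted from the trial report, and the group-level figures above are summaries (mean or median) of these per-patient values.

$$\mathrm{ABR}_i = \frac{\text{number of bleeding episodes in patient } i}{\text{years of observation for patient } i}$$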

Patients in group A also experienced significantly improved physical health, pain intensity, and joint health. In the overall study population, efanesoctocog alfa had an acceptable side-effect profile, and no patients developed factor VIII inhibitors.

Comment

This comprehensive study shows that once-weekly efanesoctocog alfa offers a promising alternative to currently available factor VIII products for prophylaxis and treatment of severe hemophilia A. Major weaknesses of the study are (1) it was not a randomized clinical trial and (2) the incidence of alloantibodies against factor VIII — a serious complication of factor VIII therapy — could not be evaluated.

Sublobar Resection Is Now Standard of Care for Some Patients with Early-Stage NSCLC


Sublobar resection is noninferior to lobectomy in patients with carefully staged, peripheral cT1aN0 NSCLC and should be offered to them.

To determine whether sublobar resection (wedge resection or segmentectomy) is noninferior to lobar resection in early-stage non–small-cell lung cancer (NSCLC), researchers conducted an international phase 3 trial. Nearly 700 patients with histologically confirmed stage IA node-negative NSCLC were randomized to lobar resection (357 patients) or sublobar resection (340 patients). Node status was confirmed by frozen section examination of level 10 lymph nodes and at least 2 mediastinal stations.

At a median follow-up of 7 years, the primary endpoint — disease-free survival (DFS) — did not differ significantly between groups (hazard ratio, 1.01; 90% CI, 0.83–1.24). The 5-year DFS was 63.6% in the sublobar-resection arm and 64.1% in the lobar-resection arm. There were no significant differences in DFS between the arms in subgroup analyses by tumor size or sites of recurrence. Overall survival (OS) also did not differ significantly between arms (HR, 0.95; 95% CI, 0.72–1.26). The 5-year OS was 80.3% with sublobar resection and 78.9% with lobar resection. There were no significant differences between the arms in lung cancer–related deaths (HR, 0.99) or deaths from other causes (HR, 1.12).
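
A brief statistical note for readers less familiar with noninferiority designs (the trial’s prespecified noninferiority margin is not stated in this summary, so it appears below only as a symbolic bound Δ): sublobar resection is judged noninferior when the upper bound of the confidence interval for the hazard ratio stays below that margin.

$$\widehat{\mathrm{HR}} = 1.01, \qquad 90\%\ \text{CI: } 0.83\ \text{to}\ 1.24, \qquad \text{noninferiority requires}\ 1.24 < \Delta.$$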

At 6 months, there was a greater reduction from baseline in the median percentage of predicted forced expiratory volume in 1 second (FEV1) in the lobar-resection arm (−6.0; 95% CI, −8.0 to −5.0) than in the sublobar-resection arm (−4.0; 95% CI, −5.0 to −2.0). The reduction in the median percentage of predicted forced vital capacity (FVC) was also greater in the lobar-resection arm (−5.0; 95% CI, −7.0 to −3.0) than in the sublobar-resection arm (−3.0; 95% CI, −4.0 to −1.0).

Comment

A recent trial from Japan showed that segmentectomy was noninferior to lobectomy in patients with T1a/bN0 NSCLC (NEJM JW Oncol Hematol May 9 2022 and Lancet 2022; 399:1607). Together, these trials provide conclusive evidence that sublobar resection is the standard of care for patients with small, peripheral, node-negative NSCLC. As computed tomography screening for lung cancer becomes more widespread, the proportion of patients who meet these criteria will continue to increase. As noted by an editorialist, although many patients are cured of their first NSCLC, the risk for metachronous primary tumors remains quite high. Sublobar resection allows more treatment options for these patients.

Bevacizumab Added to FTD-TPI in Third-Line Therapy for Metastatic Colorectal Cancer


A phase 3, randomized, placebo-controlled trial shows benefits, including in overall survival and progression-free survival.

Trifluridine-tipiracil (FTD-TPI) is an approved late-line therapy in chemotherapy-refractory metastatic colorectal cancer. Bevacizumab is combined with first- and second-line chemotherapy, but its continued use in later-line therapy has not been clearly supported in clinical trials. Investigators now report results of the SUNLIGHT trial, a global, open-label, industry-sponsored, randomized, phase 3 trial evaluating the use of standard-regimen FTD-TPI, given with or without bevacizumab (5 mg/kg) every 2 weeks, in chemotherapy-refractory colorectal cancer.

Of the 492 patients treated, 72% had left-sided primary tumors, 92% had received two prior chemotherapy regimens, 72% had received prior anti–vascular endothelial growth factor (VEGF) therapy, 20% received bevacizumab as part of first- and second-line chemotherapy, and of the 31% with RAS wild-type cancers, 94% had received prior epidermal growth factor receptor (EGFR)–targeted therapy.

At a median follow-up of 14 months, the primary endpoint of median overall survival was significantly longer with the addition of bevacizumab (10.8 vs. 7.5 months; hazard ratio, 0.61). Overall survival rates at 6 and 12 months were also better with bevacizumab (77% vs. 61% and 43% vs. 30%, respectively). Progression-free survival was significantly longer with bevacizumab (5.6 vs. 2.4 months; HR for disease progression or death, 0.44), and the response rate was also better (6.1% vs. 1.2%). No new safety signals were observed; hypertension and neutropenia were more common with bevacizumab.

Comment

The SUNLIGHT trial is practice changing and indicates that bevacizumab should be continued into late-line treatment with FTD-TPI, as the addition of bevacizumab was associated with clinically meaningful improvements in all treatment endpoints. Although a cleaner trial comparison would have included only patients with prior first- and second-line bevacizumab therapy, most patients received at least two lines of prior chemotherapy, and nearly all had prior treatment with anti-EGFR or anti-VEGF agents.

Capivasertib for Advanced ER+/HER2− Breast Cancer


Adding the investigational AKT inhibitor capivasertib to fulvestrant improved progression-free survival overall and in patients with AKT pathway alterations.

Aberrant AKT signaling, through the PI3K–AKT–PTEN pathway, is often present in tumor cells that develop endocrine resistance. Capivasertib, an investigational oral, small-molecule inhibitor of AKT, has antiproliferative properties and demonstrated synergy with endocrine therapy. A phase 2 trial (FAKTION) previously showed fulvestrant plus capivasertib improved progression-free survival (PFS) and overall survival (OS) over fulvestrant alone in postmenopausal patients with ER-positive/HER2-negative breast cancer who previously received endocrine therapy (Lancet Oncol 2020; 21:345).

Now, the industry-sponsored phase 3 CAPItello-291 trial evaluated fulvestrant plus capivasertib in pre-, peri-, and postmenopausal patients with ER-positive/HER2-negative metastatic breast cancer who had relapse or progression following treatment with an aromatase inhibitor with or without a CDK4/6 inhibitor. A total of 708 patients were randomized to fulvestrant (500 mg intramuscularly every 14 days for three injections and every 28 days thereafter) plus either capivasertib (400 mg orally twice daily for 4 days each week) or matching placebo; 41% had AKT pathway alterations and 69% had previously received a CDK4/6 inhibitor.

Median PFS was 7.2 months in the capivasertib group compared with 3.6 months in the placebo group (hazard ratio for progression or death, 0.60; P<0.001). In patients with AKT pathway alterations, the median PFS was 7.3 versus 3.1 months, respectively (HR, 0.50; P<0.001). The most frequent grade 3 or higher adverse events with capivasertib were rash (12.1% vs. 0.3% with placebo) and diarrhea (9.3% vs. 0.3%, respectively). Baseline global health status and quality of life were maintained longer in the capivasertib arm than the placebo arm.

Comment

Capivasertib may offer yet another partner for endocrine therapy following treatment with a CDK4/6 inhibitor. The FDA is considering capivasertib for approval in this setting. Alpelisib combined with fulvestrant is available for treating patients with PIK3CA mutations; however, capivasertib appears to have a better toxicity profile than alpelisib and is effective even in patients without a mutation in the AKT pathway.

Perioperative Nivolumab and Chemotherapy in Stage III NSCLC


The addition of nivolumab to platinum-based chemotherapy improved pathologic complete response rates in patients with resectable stage IIIA or IIIB NSCLC.

Approximately 20% of patients with non–small-cell lung cancer (NSCLC) have stage III disease. Although therapeutic intent is curative for patients with locally advanced disease, historically, treatment outcomes have been poor, and there is a lack of consensus on the most appropriate management.

In the industry-sponsored, open-label, multicenter, phase 2 NADIM II trial, 86 treatment-naive patients with resectable stage IIIA or IIIB NSCLC were randomized 2:1 to receive either neoadjuvant nivolumab and paclitaxel plus carboplatin (experimental group) or paclitaxel plus carboplatin alone (control group), followed by surgery. Patients in the experimental group who had R0 resections received adjuvant nivolumab for 6 months.

Pathologic complete response (pCR), the primary endpoint, occurred in 37% of patients in the experimental group compared with 7% in the control group (relative risk, 5.34; 95% CI, 1.34–21.23; P=0.02). Progression-free survival at 24 months was 67.2% in the experimental group and 40.9% in the control group (hazard ratio for disease progression, disease recurrence, or death, 0.47; 95% CI, 0.25–0.88).
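
As a quick arithmetic check using the rounded percentages reported above (the published relative risk of 5.34 is calculated from exact patient counts, so the figures differ slightly):

$$\mathrm{RR} \approx \frac{0.37}{0.07} \approx 5.3.$$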

During neoadjuvant treatment, grade 3 or 4 adverse events occurred in 19% of patients in the experimental group compared with 10% in the control group, most commonly febrile neutropenia (5%) and diarrhea (4%). A higher percentage of patients in the experimental group underwent surgery (93% vs. 69%); there were no delays in surgery due to adverse events. All patients who attained pCR were free from progression at the time of data cutoff.

Comment

In the NADIM II trial, patients with stage IIIA or stage IIIB NSCLC who were treated with neoadjuvant nivolumab and paclitaxel plus carboplatin achieved a higher rate of pCR and longer survival than those treated with chemotherapy alone. These findings add further support for a neoadjuvant chemo-immunotherapy strategy as demonstrated in CheckMate 816 (NEJM JW Oncol Hematol Apr 14 2022 and N Engl J Med 2022; 386:1973) and in KEYNOTE-671 (NEJM JW Oncol Hematol Jun 20 2023 and N Engl J Med 2023 Jun 3; [e-pub]) in a restricted population of patients with stage III NSCLC.

How CRISPR could yield the next blockbuster crop


Scientists are attempting to rapidly domesticate wild plant species by editing specific genes, but they face major technical challenges — and concerns about exploitation of Indigenous knowledge.

Plant geneticists in China are targeting genes in the wild rice Oryza alta to make it easier to farm. Credit: Hong Yu and Jiayang Li

In the space of just a few years, Jiayang Li is trying to achieve something that once took people centuries. He wants to turn a wild rice species into a domesticated crop by hacking its genome. And he is already part of the way there.

Li, a plant geneticist at the Institute of Genetics and Developmental Biology in Beijing, is working on a wild rice species from South America called Oryza alta. It produces edible, nutritious grains, but they cannot be harvested because the seeds drop to the ground as soon as they ripen. To tame the plant, Li and his colleagues need to remove this trait, known as seed shattering, and alter a few others.

Li and his co-workers sequenced the O. alta genome and compared it with that of domestic rice, searching for genes similar to those that control important traits in the conventional crop, such as stem diameter, grain size and seed shattering. They then targeted these genes with customized gene-editing tools, trying to recapitulate some of the genetic changes that make domesticated rice easy to farm1. All the traits improved to some degree, says Li, although the plants still drop their grains too soon. “We are working on that,” he says.

The modification of this rice is one of a growing number of efforts to rapidly domesticate new crops using genome editing. Through this process, known as de novo domestication, transformations that took the world’s early farmers millennia could be achieved in just a handful of years. The work might improve the resilience of the global food supply: many wild relatives of staple crops have useful traits that could prove valuable when climate change puts stress on global agriculture. O. alta, for example, has “very sharp resistance to salt and to drought and to some very severe or very dangerous diseases”, says Li.

But the technical challenges of de novo domestication are immense. Most wild plants are understudied, and without an understanding of their fundamental biology it is impossible to domesticate them by rewriting their genomes. Targeted gene editing, using tools such as CRISPR–Cas9, is a powerful approach, but it cannot fully replicate the thousands of mutations that have fine-tuned modern domestic crops for growing and harvest.

“It seems like a very simple idea, but the more you start unpacking, the more complex it becomes conceptually,” says plant physiologist Agustin Zsögön at the Federal University of Viçosa in Minas Gerais, Brazil. As a result, although commercial producers are interested in the concept, no companies are publicly pursuing it.

There are also concerns that de novo domestication could be misused. Many wild plants are well known only to Indigenous peoples, who have cared for them over many generations. Throughout history, colonial powers have stolen or exploited the knowledge of Indigenous peoples — as happened with the tea plant rooibos (Aspalathus linearis) in South Africa. “I am very conscious of not repeating the mistakes of the past,” says botanist Madelaine Bartlett at the University of Massachusetts Amherst.

There are proposals for how researchers could work ethically with Indigenous peoples and their knowledge, but so far these have not been widely adopted or codified into laws. “In terms of food crops, we probably have largely ignored Indigenous communities,” says botanist Nokwanda Makunga at Stellenbosch University in South Africa. “People that are doing de novo domestication need to be more aware.”

Taming tomatoes

People have been domesticating plants for around 10,000 years. But domestication is a fuzzy concept, says Zsögön. Many plants can be grown to produce food, but they don’t match the predictability and yields of commonly cultivated crops, such as maize (corn) or potatoes, and they are not as easy to harvest. A useful rule of thumb is that domesticated species have developed a permanent relationship with humans. If they are left to their own devices they might wither, fail to propagate or simply lose the traits that humans value over a few generations.

Although there is no written record of the first domesticated plant species, it is clear that they were generated — intentionally or not — through breeding that selected for desirable traits, such as large fruits or a lack of toxins. Over many generations, the mutations that control these traits accumulated, resulting in crops that were very different from the ancestral line. For instance, the large, soft kernels of modern maize look almost nothing like the small, hard seeds of its wild ancestor, teosinte.

Wild (left) and domesticated South American tomatoes, Solanum pimpinellifolium. Credit: Agustin Zsögön

Selective breeding is still a mainstay of agriculture. But breeders now target specific traits and often use mutation-causing radiation or chemicals to speed up the process of creating genetic variants.

Despite these advances, many of the methods for introducing traits to crops or producing entirely new crops rely to some extent on chance. Breeders have no way to control what mutations arise. Instead, they must create large numbers of mutants and carefully screen them, in the hope of finding the few useful mutations among thousands of harmful ones.

Gene editing promises to change that, by allowing researchers to edit the genomes of organisms in a targeted way. Geneticists have been doing this for decades by using established methods for adding entire genes to organisms to create ‘transgenic’ crops such as insect-resistant or herbicide-tolerant maize or soya bean plants. But new gene-editing tools provide much more control, allowing researchers to precisely edit the existing genome at chosen sites. The most prominent technique uses CRISPR–Cas9, which was originally part of the ‘immune system’ of bacteria and can be reprogrammed to edit genomes2.
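
To make the idea of “chosen sites” concrete: SpCas9, the workhorse Cas9 enzyme, cuts next to an ‘NGG’ motif called the PAM, and a roughly 20-nucleotide guide sequence immediately upstream of the PAM tells it where to cut. The toy sketch below, which is not drawn from any of the studies described here, scans a made-up DNA fragment for forward-strand PAMs and lists candidate guides; real guide design also scores each candidate against the whole genome to avoid off-target cuts.

```python
import re

def candidate_guides(seq: str, guide_len: int = 20):
    """List (protospacer, PAM) pairs for SpCas9 on the forward strand."""
    seq = seq.upper()
    guides = []
    # Overlapping search for any NGG PAM with at least guide_len bases upstream.
    for m in re.finditer(r"(?=([ACGT]GG))", seq):
        pam_start = m.start(1)
        if pam_start >= guide_len:
            guides.append((seq[pam_start - guide_len:pam_start],
                           seq[pam_start:pam_start + 3]))
    return guides

# Hypothetical fragment standing in for part of a seed-shattering gene;
# off-target checking across the genome is deliberately omitted here.
fragment = "ATGGCTTCACGATCCTGAAGGTTCAGCTAGGACCTTGGAATTCCGGTACGATTGGCATGCAAAGG"
for protospacer, pam in candidate_guides(fragment):
    print(protospacer, pam)
```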

The first demonstrations of de novo domestication through genome editing happened in 2018. In one, Zsögön and his colleagues domesticated wild South American tomatoes called Solanum pimpinellifolium. They are the closest wild relatives of domesticated tomatoes (Solanum lycopersicum). The fruits of S. pimpinellifolium are small, even compared with cherry tomato variants, but edible. “They are sweet and sour with a hint of spiciness,” says Zsögön. His team edited six key regions of the plant’s genome to produce a version that resembled a domestic tomato. The new plants produced ten times as many fruits as the wild plants did, and the fruits were three times the size3.

In another study4, a team led by Zachary Lippman at Cold Spring Harbor Laboratory in New York and Joyce Van Eck at Cornell University in Ithaca, New York, took a wild groundcherry (Physalis pruinosa) a few steps closer to domestication. Groundcherry belongs to the same family of plants as tomatoes, potatoes and peppers. It is grown in parts of Central and South America for its sweet, golden berries. But harvesting it is difficult because of the plant’s sprawling growth and because the fruits are small and drop to the ground quickly once they ripen. The team modified one gene called Ppr-SP5G to make the plants more compact, and tweaked another, Ppr-CLV1, to make the fruits 24% heavier.

These were dramatic breakthroughs, but the new plants are not yet being grown on a large scale, let alone being sold to consumers. Although that is the ultimate goal, these first studies were “a proof of concept”, says Zsögön. “We just showed that it could be done.”

He says that de novo domestication should be particularly useful for creating crops that can resist non-biological stressors such as drought, because the relevant traits often involve multiple genes; breeding each one into domestic species would be enormously time-consuming. With de novo domestication, researchers could, theoretically, take the wild plant and quickly domesticate it by tweaking a handful of genes.

Some wild species also use nutrients such as nitrogen more efficiently than do domesticated varieties, says Li. The domestication of wild plants should allow farmers to use less fertilizer, reducing costs as well as harmful run-off into rivers.

These potential benefits have spurred multiple groups to attempt domestication projects.

In 2018, molecular geneticist Sophia Gerasimova started trying to domesticate wild potatoes while at the Siberian Branch of the Russian Academy of Sciences in Novosibirsk. Her efforts were disrupted by Russia’s invasion of Ukraine in 2022: she protested against the war and moved to the Genomics for Climate Change Research Center in Campinas, Brazil.

Gerasimova and her colleagues screened wild potato genomes looking for a good candidate species. To be suitable for domestication, a plant had to be amenable to CRISPR and have potentially useful traits. If the plant had ‘bad’ traits, these needed to be controlled by a small number of genes. The wild potato they eventually settled on, Solanum chacoense, had many appealing properties: it produced round tubers that looked like domestic potatoes, was resistant to viruses and pests, and the plants were easy to work with because they were neat and compact. It was also resistant to ‘cold sweetening’ — the tendency of some potatoes to become rich in glucose and fructose when stored in the cold, leading to an unpleasant taste when cooked. However, the tubers were “small and bitter”, says Gerasimova. They needed to fix that.

Gerasimova and her colleagues identified five target genes for CRISPR editing, which they think are involved in crucial traits such as the timing of tuber formation and the accumulation of toxic steroidal glycoalkaloids5. However, the researchers have struggled to make the necessary edits to the plants. Gerasimova says that they have succeeded in editing the genome in plant cells, but have not yet managed to get these mutations to propagate to an entire plant. She is optimistic that they will overcome this hurdle.

Researchers are editing the genome of a wild groundcherry to aid harvesting. Credit: Getty

There are a host of reasons why de novo domesticated crops are not yet being grown commercially. One is that, as Gerasimova’s experience illustrates, applying CRISPR to a new species is a challenge in itself.

Equally important is the complexity of domestication. Although it’s true that a handful of genes can cause marked changes, domesticated crops differ from their wild relatives in many regions of their genomes, and each difference can have a small but important effect. “There are many thousands of genes that contribute to making corn different to teosinte,” says Bartlett. It’s not practical to use CRISPR to reproduce all these changes.

So, conventional breeding techniques will continue to have a large role. Developmental biologist David Marks at the University of Minnesota in St. Paul is part of a team working to domesticate field pennycress (Thlaspi arvense) as part of his institution’s Forever Green initiative. Pennycress has a single vertical stem, with small cabbage-like leaves and white flowers. Its seeds contain a useful oil, “extremely similar to canola oil”, Marks says.

The entire domestication project has relied on mutagenesis and selective breeding — conventional methods that Marks notes are still being improved and are now much faster than in previous decades6. By the time CRISPR took off, the project was already at an advanced stage.

“Don’t get me wrong,” says Marks. “The CRISPR technique is elegant, beautiful and simple. I wish like hell it was available back in my early days.” However, it is practical only in certain circumstances. “In the case of pennycress, we’re starting off with a plant that already has desirable characteristics,” he says. The single-gene changes achievable with CRISPR were not needed. But many other potentially useful wild plants, such as O. alta, need these kinds of targeted changes in a small number of genes.

Fundamental gap

There is one further obstacle to de novo domestication by gene editing, and that is botanists’ limited knowledge of wild-plant biology. Much of what is known about plants comes from a handful of model species, such as thale cress (Arabidopsis thaliana). Most wild plants have not even had their genomes sequenced, let alone been subject to the intensive study required to learn what the DNA sequences do, which is necessary before de novo domestication can be attempted. “You have to have basic information and the basic building blocks in order to be able to do this manipulation,” says Makunga.

“The technologies have far outpaced our knowledge of the fundamental biology,” says Bartlett.

Another complication is finding ways to account for the rights of Indigenous groups. Bartlett and Makunga argue that these communities need to be included in any de novo domestication programme from the start7. “We need to be much more ethical in our practice,” says Makunga.

When Indigenous people have a claim on a wild plant, “they should be involved in those projects and benefit from any sorts of innovations that emerge from them,” says Maui Hudson at the University of Waikato in Hamilton, New Zealand (see also ref. 8).

South Africa has taken steps in this direction. Makunga and her colleagues have met with representatives of the San people to discuss the benefits of a new project — something that they were required to do under South Africa’s National Environmental Management: Biodiversity Act 10 of 2004. The 2010 Nagoya Protocol, part of the Convention on Biological Diversity, also requires that benefits from the use of genetic resources are shared with Indigenous groups. Likewise, Brazil has created a repository for all research that involves native species, and a mechanism to compensate Indigenous communities if their knowledge leads to a profit. Zsögön does not expect his projects to trigger this mechanism because the plants he works with grow widely. Similarly, the rice Li works on is “widespread in South America” and “is not tied to any particular Indigenous group and has not been cultivated by anyone anywhere in our knowledge”.

However, arrangements such as those in Brazil remain rare. For example, South Africa’s commercial rooibos tea industry has existed for more than a century. The plant is only weakly domesticated, so the industry is only possible thanks to Traditional Knowledge preserved by the Khoi and San peoples. Yet it took until 2019 for the industry to sign an agreement that requires it to pay Khoi and San communities.

Despite the challenges, both technical and political, researchers are enthused about the potential of de novo domestication. “I’m excited by a future where we have customizable and modifiable plant development,” says Bartlett. “I think that that is actually a prospect that we might see in my lifetime.”

Chimpanzees are first animals shown to develop telltale markers of Alzheimer’s disease


Analysis of chimp brains reveals protein plaques and tangles that signal brain disease in humans, but whether the animals can develop dementia is unclear.

Chimpanzees are among humans’ closest genetic relatives.

Chimpanzees develop brain characteristics that are similar — but not identical — to those seen in early Alzheimer’s disease in humans, researchers report on 1 August in Neurobiology of Aging1. The findings from humanity’s closest relatives could help researchers to understand why people develop dementia, and they suggest that caretakers of aging, captive chimpanzees should watch the animals closely for behavioural changes.

Although most animals’ cognitive abilities decline late in life, only people seem to develop Alzheimer’s disease, which can result in severe dementia symptoms. The brains of people with Alzheimer’s show several signs of the disease: plaques made of a protein called amyloid-β, tangles of a protein called tau and the loss of neurons. Humans were thought to be the only primates with brains that can contain plaques and tangles simultaneously — although one study did find both markers in a single chimp brain2.

Using a collection of chimpanzee brains that has been compiled over several decades, a team led by Mary Ann Raghanti, a biological anthropologist at Kent State University in Ohio, analysed the brains of 20 aged chimps that had died between 37 and 62 years of age. The team examined the brain regions that are damaged in people with Alzheimer’s — such as the memory-forming hippocampus — and found that four of the preserved chimp brains contained both plaques and tangles. All 20 of the specimens contained ‘pre-tangles’, and blood vessels in several of the chimp brains contained amyloid-β. Because the protein is normally found outside of blood vessels in the human brain, this suggests that plaques may form in a different way in chimps.

“This is a really cool paper,” says Elizabeth Head, a neuroscientist at the University of Kentucky in Lexington. Even if chimps never develop the symptoms of Alzheimer’s, knowing that they spontaneously develop biological signs of the disease could yield useful information about its early stages and potentially how to prevent it, she says.

Lary Walker, an experimental neuropathologist at Emory University in Atlanta, Georgia, is also impressed with the study. The strength of the paper, he says, is that the large number of animals involved provides a good sample of the different ways in which chimp brains age.

The researchers were not able to link the biological changes in the chimps’ brains to shifts in their behaviour later in life. The animals had lived in zoos, research labs and sanctuaries, and had, therefore, been exposed to different stimuli and undergone different cognitive tests.

Although severe dementia has never been observed in chimps, the presence of both plaques and tangles suggests that it could develop, says study co-author William Hopkins, a psychologist at Georgia State University in Atlanta. Given these findings, he says, those who look after chimps should actively monitor the cognitive health of ageing animals. In 2015, the United States effectively ended biomedical research on chimps, including magnetic resonance imaging scans, but Hopkins says that such scans could be useful for the aging population of animals retired from research facilities.

The sequences of amyloid-β and tau proteins are identical in humans and chimps. But it is possible that there is a factor protecting chimp brains from severe dementia. “They’re missing something,” Walker says of the chimps. One possibility, he adds, is that the amyloid-β and tau proteins may fold differently in chimps than in people.

Another difference could be the behaviour of a protein called APOE, which controls how amyloid-β aggregates into plaques. Humans have evolved several versions of the APOE gene, one of which — APOE4 — makes a person more likely to develop Alzheimer’s. It is possible that evolution selected for the ‘bad’ version of APOE in people because it protects them from something else, such as a parasite3, says Caleb Finch, who studies ageing at the University of Southern California in Los Angeles.

Raghanti says that the researchers are now counting the neurons in the chimp brains they studied to determine whether the cells are lost with age, and studying inflammation in the brains. Both neuron loss and inflammation seem to contribute to Alzheimer’s in humans. “If we could identify the things that are similar and different in chimpanzees and humans, we can start to unlock why humans are so uniquely susceptible to this pathology,” she says.