US scientist recommends adding salt to make perfect cup of tea.


How you make the perfect cuppa can be an intensely individual experience

The British claim to know a thing or two when it comes to making a good cup of tea.

The beverage is a cultural institution in the UK, where an estimated 100 million cups are drunk every day.

But now a scientist based more than 3,000 miles (4,800km) away in the US claims to have found the secret to a perfect cuppa that many Brits would initially find absolutely absurd – adding salt.

Prof Michelle Francl’s research has caused quite the stir in the UK, and has even drawn a diplomatic intervention from the US embassy.

“We want to ensure the good people of the UK that the unthinkable notion of adding salt to Britain’s national drink is not official United States policy. And never will be,” the embassy said on X, formerly known as Twitter.

It is not the first time the drink has caused controversy on both sides of the Atlantic.

Back in 1773, demonstrators in Boston, colonial Massachusetts, threw 342 chests full of tea into the harbour in protest at British taxes – a key moment which sparked the American Revolution.

“I certainly did not mean to cause a diplomatic incident,” Prof Francl, a professor of chemistry at Bryn Mawr College in Pennsylvania, tells the BBC.

“My emails have been going crazy today. I did not anticipate waking up this morning to see loads of people talking about salt in their tea.”

So why add salt?

It turns out that it is not a new idea – the ingredient is even mentioned in Eighth Century Chinese manuscripts, which Prof Francl analysed to perfect her recipe.

“What is new is our understanding of it as chemists,” Prof Francl said.

She explains that salt acts as a blocker to the receptor which makes tea taste bitter, especially when it has been stewed.

By adding a pinch of table salt – an undetectable amount – you will counteract the bitterness of the drink.

“It is not like adding sugar. I think people are afraid they will be able to taste the salt.”

She urges tea-loving Brits to have an open mind before prejudging her research, which she has documented in her new book Steeped: The Chemistry of Tea, published by the Royal Society of Chemistry.

“It is OK to experiment,” she says. “I did experiments in my kitchen for this – channel your inner scientist.”

Prof Francl has loved tea ever since her mother made her first brew for her when she was 10 years old.

Everyone has their own opinion on what makes the perfect cuppa, but Prof Francl recommends using loose leaves instead of tea bags and giving the drink a constant stir so the tea gets a good exposure to the water and milk.

Adding a small squeeze of lemon juice can also remove the “scum” that sometimes appears on the surface of the drink, she adds.

Other suggestions she makes include using short, stout mugs to keep the tea hotter, and warming up the mug and milk, with the latter added in only after pouring the tea.

But chief among her advice is to never, ever heat up the water in a microwave: “It’s less healthy and it does not taste as good,” Prof Francl says.

“You end up getting tea scum forming on the surface, and that scum contains some of the antioxidants and taste compounds.”

While the concept of microwaving tea might sound a bit alien in the UK, it is “totally common” in the US.

“Americans have some truly awful tea-making habits,” Prof Francl says.

“I have had better cups of tea at service stations in Ireland than I have had at fancy restaurants in the US.

“I think it is just that people do not know [how to make a good cuppa]. If you do not drink tea, you do not know you are making a horrible cup of tea for someone and giving them a miserable experience.”

She says she loves coming to the UK, where she knows she will be able to locate a decent brew.

“I know when I land I can get a great cup of tea. It is good to have that common ground,” she says.

So, what next for British-American tea relations?

The US embassy is not heeding Prof Francl’s advice and says it will stick to what it calls the “proper way” of making tea – by microwaving it – while the UK Cabinet Office is adamant it can only be made using a kettle.

Leukemia Breakthrough: Phase III Trial Achieves “Exceptional” Results


A new clinical trial showed that personalized treatment durations based on blood tests greatly enhance survival and remission in CLL, marking a major advancement in leukemia therapy.

Personalized therapy improves survival for patients with chronic lymphocytic leukemia (CLL).

A phase III trial conducted by the University of Leeds discovered that personalized treatment for the most prevalent type of adult leukemia extends patient survival and maintains remission.

This significant research was recently published in the New England Journal of Medicine and highlighted at the 65th American Society of Hematology (ASH) Annual Meeting and Exposition in San Diego.

The data shows that the duration of therapy can be individualized for each patient by using regular blood tests to monitor their response. In the trial, this approach resulted in significant improvements in both progression-free and overall survival in patients with previously untreated chronic lymphocytic leukemia (CLL). The effect was even stronger among patients who typically respond less well to standard treatment, such as those with certain genetic mutations.

Personalized Treatment Approach and Its Efficacy

Adult patients were given a combination of cancer growth-blocking drugs over varied durations depending on how rapidly their disease responded.  

The trial found that this approach significantly improved progression-free and overall survival compared to the standard treatment for CLL, with more than 19 in 20 patients in remission three years after starting treatment.

The study, named FLAIR, is a phase III randomized controlled trial for untreated CLL, taking place in more than 100 hospitals across the UK. It was funded by Cancer Research UK, Janssen Research & Development, LLC, and AbbVie Pharmaceutical Research and Development.

Lead author Peter Hillmen, Professor of Experimental Haematology in the University of Leeds’ School of Medicine, and Honorary Consultant Haematologist at Leeds Teaching Hospitals NHS Trust, said: “Our findings show that, for this group of patients, the treatment is very effective at tackling their disease and is well tolerated by them. This means that patients on our trial had better outcomes while also enjoying a better quality of life during their treatment. Most patients treated with the new combination have no detectable leukemia in their blood or bone marrow by the end of treatment which is better than with previous treatments and is very encouraging.”

Research Findings and Future Implications

Dr Iain Foulkes, Executive Director of Research and Innovation at Cancer Research UK, said: “We are delighted to see these results from the FLAIR trial which show the importance and effectiveness of tailoring cancer treatment to the individual patient. Not only this, but the trial has found a way to do so without requiring frequent bone marrow tests which are more invasive and can be painful.

“The collaborative effort that went into this trial – involving researchers, healthcare professionals, funders and dedicated patients and their families – points to a new standard of care which could see real progress made against leukemia.”

Chronic lymphocytic leukemia is a type of cancer that affects the blood and bone marrow. It cannot usually be cured but can be managed with treatment. More than nine in 10 people are aged 55 and over when they are diagnosed. 

Current treatments include chemotherapy, immunotherapy, or cancer growth blockers.

The FLAIR trial tested cancer growth blockers called ibrutinib and venetoclax (I+V). Also known by the brand names Imbruvica and Venclexta, these are usually administered either continuously or for the same fixed duration, rather than tailored to each patient’s response. As a result, many patients may stop treatment too early and miss out on its full potential benefit, or continue therapy for longer than necessary, raising the chance of relapse of their leukemia and/or of treatment side effects.

FLAIR researchers aimed to discover whether it was possible to personalize I+V treatment duration for patients based on regular blood samples and/or bone marrow tests, and whether this was as effective as, or better than, standard treatment (FCR).

This regular blood and bone marrow monitoring gave researchers a more up-to-date picture of how patients were responding to I+V, and meant that the duration of I+V treatment could be tailored to each patient. In addition, it was found that basing the duration of treatment on less invasive, quicker blood samples was just as effective as using bone marrow tests, which can be painful and sometimes require sedation.
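
The article does not spell out the trial's actual stopping rules, but the general idea of letting serial MRD blood results set each patient's treatment duration can be sketched in code. The rule below (stop once a patient has had a run of consecutive MRD-negative blood tests, otherwise continue up to a maximum) is purely hypothetical and illustrative, not the FLAIR algorithm:

```python
def treatment_duration(mrd_results: list[bool],
                       confirmations_needed: int = 2,
                       max_cycles: int = 72) -> int:
    """Hypothetical MRD-guided stopping rule (illustrative only, not FLAIR's).

    mrd_results: outcome of each scheduled blood MRD test, in order;
    True means no detectable leukemia cells (MRD negative) at that test.
    Treatment stops after `confirmations_needed` consecutive negative tests,
    otherwise it continues up to `max_cycles` treatment cycles.
    """
    consecutive_negative = 0
    for cycle, negative in enumerate(mrd_results, start=1):
        consecutive_negative = consecutive_negative + 1 if negative else 0
        if consecutive_negative >= confirmations_needed:
            return cycle  # stop treatment here
    return max_cycles     # never confirmed negative: treat to the maximum

# Two illustrative patients: one clears detectable disease quickly, one slowly.
fast_responder = [False, False, True, True]
slow_responder = [False] * 10 + [True, False, True, True]
print(treatment_duration(fast_responder))  # 4 cycles
print(treatment_duration(slow_responder))  # 14 cycles
```

The point of the sketch is only that the stopping decision is driven by each patient's own monitoring data rather than by a fixed calendar.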

Trial Design and Results

FLAIR was launched in 2014, recruiting 1,509 patients with CLL. They were randomized to four treatment groups, each receiving a different treatment.

This part of the FLAIR trial compared two of the groups, placing 260 patients on I+V and 263 on the standard treatment, known as FCR. Almost three-quarters were male, which was to be expected as CLL occurs more frequently in males. The average age was 62, and just over a third had advanced disease. 

At the end of this stage of the trial, 87 patients had seen their disease progress, 75 of whom were on FCR, and 12 on I+V.

To date, 34 of these patients have died during the trial. Of these, 25 were treated with FCR and only nine with I+V.

The patients on I+V underwent blood tests and bone marrow tests to monitor their response to treatment. The technique used, known as measurable residual disease (MRD) testing, allows clinicians to see the number of remaining cancer cells. The number of cells may be so small that the patient is asymptomatic. An MRD-positive test result means that there are remaining cancer cells.

The research team now hopes that this more personalized therapy approach, guided by blood test monitoring, will be adopted as a new standard of care for patients needing first-line CLL treatment.

Professor Hillmen said: “The results of the FLAIR Trial, led by the Leeds Cancer Research UK Clinical Trials Unit at the University of Leeds, are exceptional and herald a change in the way chronic lymphocytic leukemia will be treated. FLAIR has been a huge collaborative effort over the last decade by the UK’s leading CLL specialists and by the hematology teams in over 100 hospitals throughout the UK. The participation of patient groups, individual patients, and their families was critical to delivering such progress, particularly through the challenges of the pandemic.”

The trial was co-ordinated by the Leeds Cancer Research UK Clinical Trials Unit at the University of Leeds. Deputy Director Professor David Cairns said: “The vision of the Leeds Cancer Research UK CTU is to improve the length and quality of survival for cancer patients on a worldwide scale. Our strategy to do this is to ensure that we build evidence to identify the correct treatment, for the correct duration, for the correct patient. FLAIR is a trial well aligned with our strategy, and reflects team science including clinicians, laboratory scientists, methodologists, and operational experts working together to deliver important trial results. None of this would be achieved without the selfless commitment of trial participants who contribute their time and data.”

Unveiling Peptide YY: A Molecular Maestro of Appetite and Immunity


Candida albicans usually co-exists peacefully in the body, but under the right conditions it transforms into hyphae, the dark red filaments pictured above, which can form harmful biofilms.

Research shows that a gut hormone called peptide YY also plays a vital role in maintaining the health of the gut microbiome by preventing helpful fungi from turning into more dangerous, disease-causing forms.

Peptide YY (PYY), a hormone produced by gut endocrine cells that was already known to control appetite, also plays an important role in maintaining the balance of fungi in the digestive system of mammals, according to new research from the University of Chicago.

In a study published in the journal Science, researchers found that specialized immune cells in the small intestine called Paneth cells express a form of PYY that prevents the fungus Candida albicans from turning into its more virulent form. PYY was already known to be produced by endocrine cells in the gut as a hormone that signals satiety, or when an animal has had enough to eat. The new research shows that it also functions as an antimicrobial peptide that selectively allows commensal yeast forms of C. albicans to flourish while keeping its more dangerous forms in check.

“So little is known about what regulates these fungi in our microbiome. We know that they’re there, but we have no idea what keeps them in a state that provides health benefit to us,” said Eugene B. Chang, MD, Martin Boyer Professor of Medicine at UChicago and senior author of the study. “We now think that this peptide we discovered is actually important for maintaining fungal commensalism in the gut.”

Regulating the ‘Mycobiome’

Chang and his team didn’t set out to explore the fungal side of the gut microbiome, or “mycobiome” as he calls it. Joseph Pierre, PhD, a former postdoctoral scholar in Chang’s lab who is now an Assistant Professor of Nutritional Sciences at the University of Wisconsin-Madison, was studying the enteroendocrine cells in mice that produce PYY when he noticed that it was also present in Paneth cells. These are important immune system defenders in the gut of mammals, secreting several antimicrobial compounds to prevent dangerous bacteria from flourishing.

At first this didn’t make sense, because until then, PYY was only recognized as an appetite hormone. When they tested it against a variety of bacteria, it wasn’t very good at killing them either. But when they ran a computer search for other classes of peptides with a similar structure, they discovered one similar to PYY called magainin 2, which is found on the skin of the African clawed frog. This peptide protects the frogs from infection by both bacteria and fungi, so Chang’s team thought to test PYY’s antifungal properties too. As it turns out, it is not only an effective antifungal agent, but a very specific one as well.

C. albicans is a yeast that typically grows in small amounts in the mouth, on the skin, and in the intestines. The basic yeast form is commensal, or coexists peacefully in the body, but given the right conditions it transforms into what are called hyphae that branch out to form biofilms. When too much grows, it causes thrush, an infection in the mouth and throat, vaginal yeast infections, or more serious generalized infections in the body. When Chang’s team tested PYY against both forms of the fungus, it effectively prevented growth and killed the more dangerous hyphae while sparing the commensal Candida yeast.

“This is a unique example of an ‘innate’ antimicrobial peptide secreted by Paneth cells that specifically kills the virulent form of this fungi and has no effect on the commensal form,” Chang said.

Making the Most Out of Your Molecules

While PYY could be useful as a tool to combat fungal infections, its newly discovered function may play a role in digestive diseases as well. Patients with Crohn’s disease of the ileum, the last portion of the small intestine, often have dysfunctional Paneth cells. Chang said it’s possible that this dysfunction, and lack of PYY, could create an environment for fungi to overgrow and trigger the onset of disease.

The full, unmodified version of PYY has 36 amino acids, and when Paneth cells secrete it into the gut it’s an effective antifungal peptide. But when endocrine cells produce PYY, an enzyme clips off two amino acids to turn it into a hormone that can travel through the bloodstream and tell the brain you’re not hungry. Just like discovering its function from a frog, Chang hopes more research on this peptide will turn up more surprises.

“This is an example of the wisdom and beauty of nature that has repurposed a molecule, so it has two different functions,” he said. “That’s really cool, because this is an efficient way of making the most out of things you already have.”

Breakthrough in Food Allergy Treatment: Omalizumab Trial Shows Promising Results


The NIAID-funded OUtMATCH trial shows that omalizumab, a monoclonal antibody treatment, significantly improves food tolerance in children and adolescents with food allergies, leading to a potential new treatment option.

A treatment using monoclonal antibodies substantially improved the tolerance of various everyday foods in children and adolescents with food allergies, according to a planned interim analysis of an advanced clinical trial. The trial is sponsored and funded by the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health.

The laboratory-made antibody, omalizumab, is approved by the Food and Drug Administration for three indications other than food allergy. FDA is reviewing a supplemental biologics license application for omalizumab for food allergy based on this interim analysis of the NIAID trial.

In addition to NIAID funding, the trial has support from Genentech, a member of the Roche Group, and Novartis Pharmaceuticals Corporation. The two companies collaborate to develop and promote omalizumab, marketed as Xolair, and are supplying it for the trial.

Overview of OUtMATCH Trial

The multi-stage trial is called Omalizumab as Monotherapy and as Adjunct Therapy to Multi-Allergen OIT in Food Allergic Children and Adults, or OUtMATCH. The NIAID-funded Consortium for Food Allergy Research (CoFAR) is conducting OUtMATCH at 10 locations across the United States.

The first stage of the study was designed to assess the efficacy of omalizumab in increasing the amount of food it takes to cause an allergic reaction, thereby reducing the likelihood of reactions to small amounts of food allergens in the event of accidental exposure. The study team enrolled children and adolescents ages 1 to 17 years and three adults ages 18 to 55 years, all with confirmed allergies to peanut and at least two other common foods.

In the planned interim analysis, the study’s independent Data and Safety Monitoring Board (DSMB) examined data on the first 165 children and adolescents who participated in the first stage of the trial. Using strict criteria, the DSMB found that study participants who received omalizumab injections could consume higher doses of peanut, egg, milk, and cashew without allergic reactions than participants who received placebo injections.

Based on these favorable results, the DSMB recommended halting enrollment into the first stage of the trial. NIAID accepted the board’s recommendation. More detailed information about the findings will become available when they are published in a peer-reviewed journal.

Scientists Discover Previously Unknown Structure of a Cancer-Associated Protein


The p38a protein is a key enzyme involved in regulating various cellular functions and has been linked to the progression of several diseases, including cancer, chronic inflammation, and neurodegenerative disorders. Its role in these diseases is often associated with its ability to control cell growth, death, and response to stress. The recent discovery of its oxidized form, which alters its functional state, provides a deeper understanding of its mechanisms in disease and could lead to more effective treatments targeting p38a.

The p38a protein, an important enzyme involved in regulating a wide range of cell functions, is significantly implicated in several diseases such as cancer, chronic inflammation, and neurodegenerative disorders. Since its identification, numerous pharmaceutical companies and research teams have invested substantial resources in creating inhibitors targeting this protein. Despite these efforts, the outcomes have yet to reach the anticipated level necessary for successful drug development.

A team of researchers led by Dr. Maria Macias and Dr. Angel R. Nebreda, both ICREA researchers at IRB Barcelona, has discovered that p38a adopts a conformation not previously described. In brief, they have revealed a new “oxidized” form, in which a disulfide bridge is established. The protein is thought to adopt this form temporarily, depending on the redox state of the cell. This new form of p38a, which has been described in the journal Nature Communications, does not allow binding with activators or substrates and is therefore unable to perform its characteristic functions. However, this process is reversible, and protein function is recovered under reducing conditions.

Animation showing the transition between the reduced (PDB: 3OBG) and the oxidized (PDB: 8ACM) p38a structures. The αD/LD region is shown in gold and the A-loop in purple. Credit: IRB Barcelona

“The identification of a new form of p38a could explain previous difficulties in designing effective p38a inhibitors as studies have so far focused on reduced conformations. Our results open up new avenues for the development of therapeutic compounds that modulate the activity of p38a more precisely,” explains Dr. Macías, ICREA researcher and head of the Structural Characterization of Macromolecular Assemblies laboratory at IRB Barcelona.

An oxidized form and a reduced form

The Protein Data Bank holds 357 structures of p38a protein, but they all correspond to its reduced form—the only one known so far. The predominance of this form is possibly due to the prevalence of experimental conditions that include reducing agents in the structural studies carried out. In the oxidized form described in this study, a disulfide bridge is established, which forces a conformational change and blocks access to the binding site of activators and substrates. Thus, this is a new inactive form of p38a, which would be present in certain cellular conditions.

“The study of kinases in their oxidized forms is complex due to the influence of oxidative stress conditions and the transience of these forms in the cellular environment,” explained Drs. Joan Pous and Pau Martin Malpartida and doctoral student Blazej Baginski, first authors of the study. “However, the key to addressing them effectively from a pharmacological perspective may lie in these forms,” they conclude.

A promising approach

This new form illustrates a mechanism of action of p38a regulated by the cellular redox state, thereby explaining biochemical observations described to date but with no structural molecular basis.

In future work, the researchers will focus on exploring new interaction cavities that appear in the oxidized form as these may help to inactivate the protein without interfering with the catalytic center, thereby gaining specificity.

New Research Shatters Vitamin D Supplementation Myth


The largest-ever study on vitamin D supplementation in children, led by Queen Mary University of London and Harvard T.H. Chan School of Public Health, reveals that vitamin D supplements do not prevent fractures or improve bone strength in vitamin D deficient children, contradicting previous assumptions about the benefits of vitamin D on bone health.

A major clinical trial conducted by Queen Mary University of London in collaboration with the Harvard T.H. Chan School of Public Health has discovered that vitamin D supplements do not enhance bone strength or reduce the risk of bone fractures in children who are deficient in vitamin D. This research contradicts common beliefs about the impact of vitamin D on bone health.

Around one-third of children have at least one fracture before the age of 18. This is a major global health issue, as childhood fractures can lead to lifelong disability and/or poor quality of life. The potential for vitamin D supplements to improve bone strength has attracted growing interest in recent years, based on vitamin D’s role in promoting bone mineralization. However, clinical trials designed to test whether vitamin D supplements can prevent bone fractures in children have not previously been conducted.

Study Methodology and Results

Working with partners in Mongolia, a setting with a particularly high fracture burden and where vitamin D deficiency is highly prevalent, researchers from Queen Mary and Harvard conducted a clinical trial to determine if vitamin D supplementation would decrease risk of bone fractures or increase bone strength in schoolchildren. The study, recently published in Lancet Diabetes & Endocrinology, is the largest randomized controlled trial of vitamin D supplementation ever conducted in children.

Over the course of three years, 8,851 schoolchildren aged 6-13 living in Mongolia received a weekly oral dose of vitamin D supplementation. 95.5% of participants had vitamin D deficiency at baseline, and study supplements were highly effective in boosting vitamin D levels into the normal range. However, they had no effect on fracture risk or on bone strength, measured in a subset of 1,438 participants using quantitative ultrasound.

Implications

The trial findings are likely to prompt scientists, doctors, and public health specialists to re-consider the effects of vitamin D supplements on bone health.

Dr Ganmaa Davaasambuu, Associate Professor at the Harvard T.H. Chan School of Public Health, said:

“The absence of any effect of sustained, generous vitamin D supplementation on fracture risk or bone strength in vitamin D deficient children is striking. In adults, vitamin D supplementation works best for fracture prevention when calcium is given at the same time – so the fact that we did not offer calcium alongside vitamin D to trial participants may explain the null findings from this study.”

Professor Adrian Martineau, Lead of the Centre for Immunobiology at Queen Mary University of London, added:

“It is also important to note that children who were found to have rickets during screening for the trial were excluded from participation, as it would not have been ethical to offer them a placebo (dummy medication). Thus, our findings only have relevance for children with low vitamin D status who have not developed bone complications. The importance of adequate vitamin D intake for prevention of rickets should not be ignored, and UK government guidance recommending a daily intake of 400 IU vitamin D remains important and should still be followed.”

Face Recognition Technology Becoming More Sophisticated — And Better at Tracking You



American Amara Majeed was accused of terrorism by the Sri Lankan police in 2019.

Robert Williams was arrested outside his house in Detroit and detained in jail for 18 hours for allegedly stealing watches in 2020.

Randal Reid spent six days in jail in 2022 for supposedly using stolen credit cards in a state he’d never even visited.

In all three cases, the authorities had the wrong people. In all three, it was face recognition technology that told them they were right.

Law enforcement officers in many U.S. states are not required to reveal that they used face recognition technology to identify suspects.

Face recognition technology is the latest and most sophisticated version of biometric surveillance: using unique physical characteristics to identify individual people.

It stands in a long line of technologies — from the fingerprint to the passport photo to iris scans — designed to monitor people and determine who has the right to move freely within and across borders and boundaries.


In my book, “Do I Know You?: From Face Blindness to Super Recognition,” I explore how the story of face surveillance lies not just in the history of computing but in the history of medicine, race, psychology, neuroscience and the health humanities and politics.

Viewed as a part of the long history of people-tracking, face recognition technology’s incursions into privacy and limitations on free movement are carrying out exactly what biometric surveillance was always meant to do.

The system works by converting captured faces — either static from photographs or moving from video — into a series of unique data points, which it then compares against the data points drawn from images of faces already in the system.
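
In software terms, each captured face is reduced to a numeric feature vector (an "embedding"), and identification becomes a nearest-neighbour search over the vectors already enrolled. Below is a minimal Python sketch of that matching step, assuming the embeddings have already been produced by some face model; the vectors, names, and threshold are invented for illustration:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (closer to 1.0 means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe: np.ndarray, gallery: dict[str, np.ndarray], threshold: float = 0.6):
    """Compare a captured face's embedding against every enrolled embedding.

    Returns the closest identity and its score, or (None, score) when nothing
    clears the threshold. Real systems tune the threshold to trade false
    positives against false negatives.
    """
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    name, score = max(scores.items(), key=lambda kv: kv[1])
    return (name, score) if score >= threshold else (None, score)

# Hypothetical enrolled database: random 128-dimensional embeddings.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(5)}
probe = gallery["person_3"] + rng.normal(scale=0.05, size=128)  # a new capture of person_3

print(best_match(probe, gallery))
```

A "false positive" in this framing is simply the wrong gallery entry clearing the threshold, which is why both the threshold and the makeup of the enrolled database matter so much for the misidentification patterns described below.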

As face recognition technology improves in accuracy and speed, its effectiveness as a means of surveillance becomes ever more pronounced.

Accuracy improves, but biases persist

Surveillance is predicated on the idea that people need to be tracked and their movements limited and controlled in a trade-off between privacy and security. The assumption that less privacy leads to more security is built in.

That may be the case for some, but not for the people disproportionately targeted by face recognition technology.

Surveillance has always been designed to identify the people whom those in power wish to most closely track.

On a global scale, there are caste cameras in India, face surveillance of Uyghurs in China and even attendance surveillance in U.S. schools, often with low-income and majority-Black populations.


Some people are tracked more closely than others. In addition, the cases of Amara Majeed, Robert Williams and Randal Reid aren’t anomalies.

As of 2019, face recognition technology misidentified Black and Asian people at up to 100 times the rate of white people. In 2018, Amazon’s Rekognition tool falsely matched 28 members of the U.S. Congress with mug shots on file, and those false matches fell disproportionately on members of color.

When the database against which captured images were compared had only a limited number of mostly white faces upon which to draw, face recognition technology would offer matches based on the closest alignment available, leading to a pattern of highly racialized — and racist — false positives.

With the expansion of images in the database and increased sophistication of the software, the number of false positives — incorrect matches between specific individuals and images of wanted people on file — has declined dramatically.

Improvements in pixelation and mapping static images into moving ones, along with increased social media tagging and ever more sophisticated scraping tools like those developed by Clearview AI, have helped decrease the error rates.

The biases, however, remain deeply embedded into the systems and their purpose, explicitly or implicitly targeting already targeted communities. The technology is not neutral, nor is the surveillance it is used to carry out.

Latest technique in a long history

Face recognition software is only the most recent manifestation of global systems of tracking and sorting. Precursors are rooted in the now-debunked belief that bodily features offer a unique index to character and identity.

This pseudoscience was formalized in the late 18th century under the rubric of the ancient practice of physiognomy.

Early systemic applications included anthropometry (body measurement), fingerprinting and iris or retinal scans. They all offered unique identifiers. None of these could be done without the participation — willing or otherwise — of the person being tracked.

The framework of bodily identification was adopted in the 19th century for use in criminal justice detection, prosecution and record-keeping to allow governmental control of its populace.

The intimate relationship between face recognition and border patrol was galvanized by the introduction of photos into passports in some countries including Great Britain and the U.S. in 1914, a practice that became widespread by 1920.

Face recognition technology offered a way to carry out biometric surveillance covertly, without the subject’s participation. Much of the early research into face recognition software was funded by the CIA for border surveillance.

That early research tried to develop a standardized framework for face segmentation: mapping the distances between a person’s facial features, including eyes, nose, mouth and hairline.


Inputting that data into computers lets a user search stored photographs for a match. These early scans and maps were limited, and the attempts to match them were not successful.
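
The face-segmentation approach described above amounts to storing, for each photograph, a small vector of distances between facial landmarks and then looking for the stored vector closest to a new capture. Here is a minimal sketch with made-up landmark coordinates and records; the landmark names, values, and photo IDs are purely illustrative:

```python
import math

def feature_vector(landmarks: dict[str, tuple[float, float]]) -> list[float]:
    """Distances between a few landmark pairs, the kind of measurements
    early face-segmentation systems stored for each photograph."""
    def dist(a: str, b: str) -> float:
        (x1, y1), (x2, y2) = landmarks[a], landmarks[b]
        return math.hypot(x1 - x2, y1 - y2)
    pairs = [("left_eye", "right_eye"), ("nose", "mouth"),
             ("left_eye", "nose"), ("hairline", "nose")]
    return [dist(a, b) for a, b in pairs]

def closest_match(probe: list[float], database: dict[str, list[float]]):
    """Return the stored record whose feature vector is nearest to the probe's."""
    return min(database.items(),
               key=lambda item: sum((p - q) ** 2 for p, q in zip(probe, item[1])))

# Hypothetical landmark positions (x, y) for a newly captured face.
probe = feature_vector({"left_eye": (30, 40), "right_eye": (70, 40),
                        "nose": (50, 60), "mouth": (50, 80), "hairline": (50, 10)})
# Hypothetical stored measurements for two photographs on file.
database = {"photo_001": [40.1, 20.3, 28.0, 50.2], "photo_002": [35.0, 25.0, 30.0, 45.0]}
print(closest_match(probe, database))  # photo_001 is the nearer record
```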

More recently, private companies have adopted data harvesting techniques, including face recognition, as part of a long practice of leveraging personal data for profit.

Face recognition technology works not only to unlock your phone or help you board your plane more quickly, but also in promotional store kiosks and, essentially, in any photo taken and shared by anyone, with anyone, anywhere around the world.

These photos are stored in a database, creating ever more comprehensive systems of surveillance and tracking.

And while that means that today it is unlikely that Majeed, Williams, Reid and Black members of Congress would be ensnared by a false positive, face recognition technology has invaded everyone’s privacy.

It — and the governmental and private systems that design, run, use and capitalize upon it — is watching, and paying particular attention to those whom society and its structural biases deem to be the greatest risk.

Are anxiety and depression social problems or chemical disorders?


Two anthropologists question the chemical imbalance theory of mental health disorders.

Twentieth-century science was supposed to change everything. Indeed, thanks to vaccinations, antibiotics, and improved sanitation, humans thrived like never before. Yet into that mix were thrown pharmacological treatments for mental health disorders. On that front, little progress has been made.

It can be argued—it is being argued, in a new paper in the American Journal of Physical Anthropology—that we’re regressing in our fight against mental health problems. As Kristen Syme, a PhD student in Evolutionary Anthropology, and Washington State University anthropology professor Edward Hagen argue, psychopharmacological treatments are increasing alongside mental health disorder diagnoses. If the former worked, the latter would decrease.

There are numerous problems with the current psychiatric model. Journalist Robert Whitaker has laid out the case that antidepressants, antipsychotics, and other pharmacological interventions are the real culprit behind chemical imbalances in the brain—a psychiatric talking point that’s been challenged for over a half-century. Patients suffering from minor anxiety and depression are placed on ineffective drugs, often a cocktail of pills. With many consumer advocacy groups being funded by pharmaceutical companies, we’ve reached a tipping point in mental health protocols.

As Syme and Hagen write, consumer advocacy groups are not the only compromised organizations. One review of 397 clinical trials discovered that 47 percent of these studies reported at least one conflict of interest. As Whitaker has written about before, when pharmaceutical companies don’t like the results of their trials, they’re scrapped until more suitable results are recorded.

Humans proliferated as never before during the last century, yet technological innovation does not always equate to better outcomes. We live by a perpetual illusion of progress. Severing ties with nature has had profound consequences for our health. This gets to the heart of Syme and Hagen’s paper: brain chemistry is heavily influenced by society. Sure, some people are born with genetic-based developmental dysfunctions. But this doesn’t account for the increasing numbers of people on Zoloft and Xanax and dozens of other medications today.

Physical anthropology and evolutionary biology are essential fields of study when contemplating all facets of health. Historical perspective is important. The authors point to a previous battle: in 1900, roughly half of all deaths in the U.S. were attributed to infectious diseases. A century later, the number of deaths due to such diseases was negligible.

That’s because the etiologies of a number of infectious diseases were discovered thanks to germ theory. There has never been a holistic etiology of anxiety or depression, however. Psychiatrists, in coordination with pharmaceutical companies, exploited that fact by creating and marketing a singular etiology—the chemical imbalance theory—and selling the world on pharmacology.

Think about the basic framework of this proposition: an animal that has evolved for millions of years, roughly 350,000 years in its present form, experiences its greatest century to date in terms of population expansion, while simultaneously billions of our brains are suddenly chemically compromised. This narrative boggles the mind, yet it’s exactly what’s being sold by psychiatrists and medical doctors around the world.

As the authors write, the chemical imbalance theory, first widely discussed in the late forties, became part of a public health campaign designed to destigmatize mental health issues in the aughts. In reality, the campaign accomplished the opposite.

“First, a systematic review found that an endorsement of biogenetic causes of mental disorders does not reduce stigma and, in fact, might even increase stigmatizing attitudes among mental health professionals and the mentally ill themselves. Second, there is little evidence that psychopharmaceuticals correct specific chemical imbalances or neurobiological deficits.”

While mental health is a broad term with numerous categories, the authors divide disorders into four subsets:

  • Disorders which are genetic-based developmental dysfunctions
  • Disorders associated with senescence/aging
  • Disorders caused by a mismatch between modern and ancestral environments
  • Disorders which are adaptive responses to adversity, however undesirable

The first two account for many common diseases, such as dementia, autism, and schizophrenia. The second pair represent disorders that modern psychiatry has exploited. By failing to consider environmental, racial, economic, familial, and societal forces, we’ve been sold a story that we’re broken from birth.

This story serves a purpose: the global antidepressant industry is expected to reach $16B by 2023. Thanks to concerted marketing and lobbying efforts, an uptick in prescriptions coincides with an increasing number of disorders—and increasing numbers of children on these drugs. When one market is exploited, create another.

Pharmacological interventions for Parkinson’s, Alzheimer’s, and autism might be valuable to patients of these disorders. The problem isn’t with drug development, which is a necessary field of research for combating such confounding diseases. As has long been known—since at least the 19th century, though likely much longer—most anxiety and depression eases with time, especially when interventions such as proper diet, exercise, and improved economic conditions are put into place. As Syme and Hagen conclude,

“A final group of disorders, such as anxiety, depression, and PTSD, have low heritability, are caused by adversity, and involve symptoms that seem to be adaptive responses to adversity. Because they are relatively common throughout adult life, they account for a substantial fraction of disease burden attributable to mental illness. These might not be disorders at all, however, but instead aversive yet adaptive responses to adversity.”

That is, anxiety and depression are largely social problems, not medical disorders. The authors write that it would be unethical to prescribe pain medication for a broken bone without first setting the bone. Why then do psychiatrists and doctors churn out scripts without identifying the source of suffering that brought the patient into the office in the first place?

Though we don’t yet have reliable etiologies of most mental health disorders, the authors conclude that they could be within reach. Their discovery relies not on brain chemistry alone, but on epigenetics, behavioral observation, cross-population comparisons, cultural transmission, evolutionary theory, and much more.

Humans are complex animals. Perhaps Occam’s razor isn’t as sharp as we believe.

The Illusion of Antibiotic-Free Meat


The grocery aisle makes a lot of promises, but are industrial meat farms living up to their claims?

White Oak Pastures is a sixth-generation, 156-year-old farm in Bluffton, Georgia, that made the bold decision nearly 30 years ago to transition away from industrial agriculture. After decades of crowded barns and antibiotic-laden feed, fourth-generation farmer Will Harris was ready to undertake some radical revisions.

Today, the farm’s animals roam free, grazing on grassy fields and rooting their stout noses in the dirt. All White Oak livestock is raised without antibiotics or hormones. Their products sport labels like Non-GMO, Certified Humane, and Raised Without Antibiotics.

Harris’ ethics reflect a deeper shift in industrial farming practices that began taking shape late in the 20th century. While antibiotics are often associated with illness prevention, their use became popular in the 1940s as a growth enhancer in livestock feed. Farmers could raise animals twice the size in less time, while using a fraction of the resources.

This “non-therapeutic” antibiotic use was economically rewarding, but concerns about drug residues and antimicrobial resistance began to rise in the 1960s. Antibiotics in food, water, and soil can make their way into our bodies, leading to resistance in human medicine. Antibiotic resistance has been identified by the WHO as “one of the biggest threats to global health, food security, and development today.”

Calls for the gradual phasing out of antibiotics in livestock started back in the ‘60s, though it took some time to see regulatory movement. Today, the use of these drugs in American livestock is managed by both the Food and Drug Administration (FDA) and the U.S. Department of Agriculture (USDA). The USDA is in charge of testing meat and managing on-package labeling, while the FDA regulates non-medical antibiotic use.

These regulations led to the 1996 establishment of the National Antimicrobial Resistance Monitoring System (NARMS), which tracks antimicrobial resistance in foodborne and enteric bacteria. Over the past two decades, the FDA has also banned unapproved uses of cephalosporins, used to treat diseases like pneumonia, and fluoroquinolones, used in respiratory and urinary tract infections, as well as other medically important antibiotics. Additionally, antibiotics are now forbidden to be used for growth promotion (though the jury is out on how effective the rule has been). Today, thanks to FDA intervention, many over-the-counter, medically important antibiotics now require prescriptions or veterinary authorization for use.

Through the USDA, adding labels like “Raised Without Antibiotics” or “No Antibiotics Ever” to meat is a voluntary process. Producers may submit a one-time application to be reviewed by USDA inspectors, who may conduct further testing in the future.

Yet, a 2022 study by the Milken Institute found that 15 percent of “Raised Without Antibiotics” beef samples came from feedlots that tested positive for antibiotics — critics argue USDA’s oversight is lacking in this area.

This June, USDA revealed it would be taking steps to ensure greater accuracy of antibiotic-related labels on meat products. The agency now plans to conduct a sampling project in the “Raised Without Antibiotics” market. It also said its Food Safety and Inspection Service (FSIS) will strengthen the guidelines related to the documentation companies submit when verifying animal-raising claims.

Meanwhile, Tyson Foods, America’s largest beef, chicken, and pork producer, recently reported it would begin reintroducing certain antibiotics into its chicken supply chain, dropping its long-held “No Antibiotics Ever” tagline. The Wall Street Journal reported Tyson’s decision is based on data indicating that the drugs they use, called ionophores, are not considered by the World Health Organization (WHO) to be medically important for treating human illnesses. Ionophores are mainly used in poultry production to control a disease called coccidiosis.

Yet, Michael Hansen, a biologist and senior staff scientist at Consumer Reports, said in an interview that new data is becoming available that links ionophores to medically important antibiotics. “Even if [an antibiotic] is not used in human medicine, if it’s in the same family, then [scientists] are concerned about it,” he said. “Because antibiotics in the same family all work fundamentally the same way.” He continued, “When one becomes resistant, the others do too.”

Hansen said the USDA’s improvement goals are positive, spurred by consumer sentiment and increased scientific data. “People had a suspicion things were just being said, that weren’t necessarily true,” he said. Now, with increased testing and inspections, Hansen believes we’ll see greater verification of label claims.

Grocery giant Whole Foods is also facing a class-action lawsuit after an independent laboratory found antibiotic residue in “antibiotic-free” meat purchased from one of its stores. Since 1981, Whole Foods has claimed that every animal within its supply chain is raised without antibiotics.

Andrew deCoriolis, executive director of Farm Forward, an animal advocacy non-profit that first brought the lawsuit against Whole Foods, said it’s no coincidence that Tyson’s backpedaling on its antibiotic-free claims comes when the USDA is taking steps to strengthen its labeling policy. He noted that “Tyson’s change is not some high-minded reevaluation about what it is to have medically responsible use. [It’s] purely recognition that they knew antibiotics were in their supply chain, and they weren’t going to pass a rigorous testing program.”

DeCoriolis added that responses like Tyson’s are a step forward, insofar as the antibiotic-free label will be meaningful to consumers. That said, he believes it’s a step back for public health and animal welfare.

The real problem, said deCoriolis, is that companies like Tyson are unwilling to change their farming practices in the ways necessary to stop using antibiotics. This means that a significant amount of antibiotics will continue to be used, and the underlying conditions leading to the need for antibiotics — such as overcrowding, poor sanitation, and bad genetic health — will remain unaddressed.

Hansen agrees that while there have been calls for the elimination of antibiotics in meat products for decades, the fundamental problem rests within industrial meat production. “If you wanted to raise the animals in a way that minimizes the use of antibiotics, you wouldn’t be packing them together in such large numbers, or having genotypes that go from egg to harvest in 35 days.”

He continued, “Some of these chickens grow so big, and their breasts are so large, they can’t even stand up. They have leg defects. They’re packed together, so the high ammonia levels in the air cause respiratory problems, and then antibiotics are used to treat that bacteria.”

“[If] you can give the animals adequate space … that minimizes the chance they’ll be stressed out, or living in conditions that increase their risk of illness,” he said.

For decades, scientists have warned that current industrial farming practices run the risk of exposing new superbugs and sparking a long-term zoonotic pandemic. Scientists believe that 60 percent of known infectious diseases, as well as 75 percent of all emerging infectious diseases, are transmitted between animals and humans.

White Oak’s Will Harris said free-range feeding allowed his animals to stay healthy without antibiotics. “When I made the first decision [to go free-range], the second decision was made for me,” said Harris, speaking on his shift to organic, open-pasture farming. “I didn’t need [antibiotics] anymore. I just turned those cows out and let them start eating grass. The freedom from antibiotics was a bonus.”

For Harris, antibiotic labeling is just one part of a larger, corrupt food cycle. As industrial farms get bigger, and food prices fall, we push costs over into other areas, said Harris — the environment, the welfare of animals, the livelihood of farmers, and the economic productivity of rural America.

DeCoriolis echoes this sentiment, saying there should be more to consumer and producer values than just antibiotics. Despite their fallibility, he said, “Catch-all claims like ‘All-Natural’ and ‘Antibiotic-Free,’ … invoke something for consumers about the wholesomeness of the product. Consumers have an imagination for what good animal farming should look like.”

“Unfortunately,” said deCoriolis, “antibiotic-free labels are not going to ensure [good animal farming] alone. Even if the verification goes into place, and [antibiotic-free] becomes a meaningful label, that may just mean that you’ve got a chicken company that’s willing to let 30 percent of its birds die so they can market you a premium product.”

ARTEMIS – UCLA’s most advanced humanoid robot – gets ready for action


Robotics lab, already a 5-time RoboCup champ, will bring its newest invention to international competition in July

ARTEMIS kicking a soccer ball. Credit: RoMeLa at UCLA

Mechanical engineers at the UCLA Samueli School of Engineering have developed a full-sized humanoid robot with first-of-its-kind technology.

Named ARTEMIS, for Advanced Robotic Technology for Enhanced Mobility and Improved Stability, the robot is scheduled to travel in July to Bordeaux, France, where it will take part in the soccer competition of the 2023 RoboCup, an international scientific meeting where robots demonstrate capabilities across a range of categories.

The robot was designed by researchers at the Robotics and Mechanisms Laboratory at UCLA, or RoMeLa, as a general-purpose humanoid robot, with a particular focus on bipedal locomotion over uneven terrain. Standing 4 feet, 8 inches tall and weighing 85 pounds, it’s capable of walking on rough and unstable surfaces, as well as running and jumping. ARTEMIS is able to remain steady even when strongly shoved or otherwise disturbed.

During tests in the lab, ARTEMIS has been clocked walking 2.1 meters per second, which would make it the world’s fastest walking humanoid robot, according to the UCLA researchers. It is also believed to be the first humanoid robot designed in an academic setting that is capable of running, and only the third overall.

The robot’s major innovation is that its actuators — devices that generate motion from energy — were custom-designed to behave like biological muscles. They’re springy and force-controlled, as opposed to the rigid, position-controlled actuators that most robots have.

“That is the key behind its excellent balance while walking on uneven terrain and its ability to run — getting both feet off the ground while in motion,” said Dennis Hong, a UCLA professor of mechanical and aerospace engineering and the director of RoMeLa. “This is a first-of-its-kind robot.”

Another major advance is that ARTEMIS’ actuators are electrically driven, rather than controlled by hydraulics, which uses differences in fluid pressure to drive movement. As a result, it makes less noise and operates more efficiently than robots with hydraulic actuators — and it’s cleaner, because hydraulic systems are notorious for leaking fluids.
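
RoMeLa's control code is not published here, but the difference between a rigid position-controlled joint and the springy, force-controlled behaviour described above can be illustrated with a generic impedance-control toy model: the controller commands a torque like a virtual spring and damper pulling toward the target angle, so a softly tuned joint yields when shoved instead of fighting to hold position. All gains, masses, and values below are invented for illustration.

```python
# Toy single-joint model contrasting compliant (muscle-like) and stiff control.
# Gains, inertia, and the disturbance are arbitrary illustrative values.

def impedance_torque(q, qd, q_des, k, d):
    """Torque from a virtual spring-damper pulling the joint toward q_des.
    Low k and d give a soft, yielding joint; high k and d mimic stiff
    position control."""
    return k * (q_des - q) - d * qd

def peak_deflection_when_shoved(k, d, steps=2000, dt=0.001, inertia=0.05):
    """Integrate the joint with explicit Euler, apply a brief external shove,
    and report how far the joint is pushed away from its target angle."""
    q, qd, q_des = 1.0, 0.0, 1.0      # joint resting at the 1 rad target
    peak = 0.0
    for i in range(steps):
        tau = impedance_torque(q, qd, q_des, k, d)
        if 500 <= i < 600:            # 0.1 s external shove
            tau += 3.0
        qdd = tau / inertia           # simple rigid-body joint model
        qd += qdd * dt
        q += qd * dt
        peak = max(peak, abs(q - q_des))
    return peak

print("compliant joint yields about", round(peak_deflection_when_shoved(20.0, 2.0), 3), "rad")
print("stiff joint yields about    ", round(peak_deflection_when_shoved(400.0, 20.0), 3), "rad")
```

The compliant setting deflects and then recovers, which is roughly what is meant by actuators that behave like muscles; real legged-robot controllers are far more sophisticated, but the force-versus-position distinction is the same.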

ARTEMIS’ ability to respond and adapt to what it senses comes from its system of sensors and actuators. It has custom-designed force sensors on each foot, which help the machine keep its balance as it moves. It also has an orientation unit and cameras in its head to help it perceive its surroundings.

To prepare ARTEMIS for RoboCup, student researchers have been testing the robot on regular walks around the UCLA campus. In the coming weeks, they will fully test the robot’s running and soccer-playing skills at the UCLA Intramural Field. Researchers also will evaluate how well it can traverse uneven terrain and stairs, its capacity for falling and getting back up, and its ability to carry objects. RoMeLa’s Twitter account is regularly sharing information about the robot’s testing results and posting the routes for its campus walks, giving Bruins the chance to catch ARTEMIS in action and chat with researchers.

“We’re very excited to take ARTEMIS out for field testing here at UCLA and we see this as an opportunity to promote science, technology, engineering and mathematics to a much wider audience,” Hong said.

Taoyuanmin Zhu and Min Sung Ahn, both of whom recently earned doctorates in mechanical engineering at UCLA, developed ARTEMIS’ hardware and software systems, respectively.

RoMeLa, which has been making humanoid robots for more than two decades, has had earlier robots win the RoboCup competition five times already; the engineers are hoping ARTEMIS brings home trophy number six.

ARTEMIS’ development was funded in part by 232 donors who contributed more than $118,000 through a UCLA Spark crowdfunding campaign. Additional support came from an Office of Naval Research grant.

ARTEMIS Fun Facts

  • ARTEMIS is named in honor of the Greek goddess of the hunt, wild animals, chastity and childbirth, and RoMeLa members refer to the robot as “she.”
  • RoMeLa researchers have joked that ARTEMIS could also stand for “A Robot That Exceeds Messi In Soccer.”