6 Ways to Reverse Hair Loss


A condition plaguing over 80 million men and women in the United States alone, hair loss not only indicates possible health issues but can also profoundly diminish quality of life. Here we provide in-depth, actionable information that can give hair loss sufferers the tools they need to reverse it.

Hair loss, although not a life-threatening condition, can be a significant source of social stigma and pose a profound detriment to quality of life. However, it is readily reversible in many cases once the underlying cause is identified and remediated.

In many cases, it can be rectified by optimizing vitamin and mineral status, since vitamins and minerals play a critical role in the hair cycle and in particular in the rapid turnover of the matrix cells residing within the follicle bulb (1). In fact, many of the overt nutritional deficiency diseases are associated with prominent hair loss, showcasing how crucial our diets are not only for prevention of pathology, but also for visible signs of health such as healthy hair.

For example, in the eighteenth century, James Lind documented that scurvy, the vitamin C deficiency disease that historically afflicted sailors on long voyages, was linked to dermatological symptoms including skin hemorrhage and hair loss (1). Other conditions of protein-energy malnutrition, including kwashiorkor and marasmus, can also be associated with profound hair loss, underscoring how important diet is to hair quality (1).

Although hair loss is multifactorial, there are several common-sense steps that can be systematically examined and addressed in order to fix it, as outlined below. Many of the approaches elucidated are diet-related, as hair is a rapidly proliferating organ, which therefore requires much of the blood supply and the provision of adequate nutrition.

1. Eat a nutrient-dense diet replete in B vitamins

Human hair, which comprises approximately 100,000 hair follicles, grows in three phases: the active growing or “anagen” phase, which represents 90% of hairs; the degeneration or “catagen” phase, representing less than 10% of hairs; and the resting or “telogen” phase, in which hair is shed, representing 5% to 10% of hairs (2). During the anagen phase in particular, essential dietary elements including vitamins, minerals, and protein are required.

Deficiencies in the water-soluble vitamin B complex are particularly implicated, since B vitamins play a foundational role in cell metabolism. Because folate and vitamin B12, for example, are active in the production of nucleic acids, they may play a role in the highly proliferative hair follicle (1). Although research is inconsistent, some findings do confirm these deficiencies in conditions of hair loss, such as one study which showed that mean red blood cell folate was significantly suppressed in patients with alopecia areata (AA) (3). Folate is present in a variety of foods, with some of the highest food sources including beef liver, spinach, black-eyed peas, asparagus, and Brussels sprouts.

Vitamin B12 sufficiency is also essential to hair growth, since B12 affects the synthesis of almost 100 different substrates including RNA, DNA, and proteins (1). In some investigations, as many as half of the individuals eating a vegan diet were categorized as vitamin B12 deficient (4), since B12 is found exclusively in animal products. Foods naturally rich in vitamin B12 include clams, beef liver, trout, and sockeye salmon; fortified nutritional yeasts also contain B12 (6).

Deficiencies in riboflavin (vitamin B2) may likewise contribute to hair loss. A precursor of the flavin cofactors of the electron transport chain known as flavin mononucleotide (FMN; also known as riboflavin-5′-phosphate) and flavin adenine dinucleotide (FAD), riboflavin is instrumental to the series of redox reactions within the inner mitochondrial membrane that serve to generate cellular energy. Riboflavin is important not only for macronutrient metabolism but also for cell function, growth, and development, and it serves as an antioxidant supporting the immune system and healthy skin and hair (5).

Pregnant and lactating women, vegans, older adults, alcoholics, people with certain disorders such as Brown-Vialetto-Van Laere syndrome (BVVL), and women on oral contraceptive pills are at risk for riboflavin deficiency (5). The best non-fortified sources of riboflavin are organ meats, with beef liver topping the list. Lean meats, clams, mushrooms, almonds, eggs, quinoa, and salmon also contain moderate amounts.

Lastly, biotin (vitamin B7) is one of the most recommended supplements for reversing hair loss, due to its role as a coenzyme for carboxylases involved in metabolic reactions critical to the maintenance of healthy hair, such as fatty acid synthesis, gluconeogenesis, and branched-chain amino acid catabolism (6). Although skin rashes, brittle nails, and hair loss are all signs of biotin deficiency, researchers state that large-scale studies do not support the efficacy of biotin supplementation (1).

Even so, biotin deficiency warrants exploration in cases of hair loss, especially where risk factors for biotin deficiency, such as alcoholism, pregnancy, malabsorption, and use of medications including valproic acid and isotretinoin, which disrupt activity of the enzyme biotinidase, are present (1). Excessive consumption of raw eggs can also deplete biotin, because avidin in the egg whites attaches to biotin and interferes with its absorption (1). Cooking eggs circumvents this problem, since heat denatures the avidin protein.

A foods-first approach is the safest bet, as long as absorptive processes are intact, because foods contain synergistic nutrients that prevent the imbalances which can occur with supplementation, especially the megadose supplementation that is oftentimes recommended. Beef liver again tops the list as the richest source of biotin, followed by whole eggs, with pink salmon, pork chops, hamburger patties, sunflower seeds, and sweet potato also containing fair but comparatively lower amounts.

Beyond the B vitamins, there are many other nutrients essential for hair growth that go well beyond the scope of this article. However, optimizing digestion and eating a phytonutrient-rich diet consistent with evolutionarily compatible eating principles, with a plate predominated by organic fruits and vegetables, roots and tubers, nuts and seeds, and pseudograins such as amaranth, buckwheat, and quinoa where tolerated, alongside sustainably and humanely raised animals and seafood for those who eat from an omnivore template, will help buffer against nutrient deficiencies.

2. Address Thyroid Dysfunction

It has been well-elucidated within the scientific literature that various endocrine disorders, including hypothyroidism, hyperthyroidism, and parathyroid disorders, can contribute to hair loss (7). Disorders of the thyroid gland, the butterfly-shaped gland that sits at the base of the neck, are implicated in hair loss since thyroid hormone is critical to development and maintenance of the hair follicle (7).

Hashimoto’s thyroiditis, an autoimmune thyroid condition, is the most common cause of hypothyroidism, and should always be investigated in cases of hair loss. To assess thyroid status, a comprehensive thyroid panel including the following biomarkers should be ordered by a holistic physician well-versed in thyroid lab interpretation and optimization:

  • thyroid-stimulating hormone (TSH)
  • free T3 (fT3)
  • free T4 (fT4)
  • reverse T3
  • thyroid peroxidase (TPO) antibodies
  • thyroglobulin (TG) antibodies

Although a multifaceted, holistic approach is required to treat Hashimoto’s thyroiditis, foundational first steps include removal of pro-inflammatory food proteins such as gluten. Not only does celiac disease occur twelve times more frequently in individuals with Hashimoto’s relative to the general population (8), but antibodies produced against gluten can cross-react with the thyroid gland.

Gluten is also known to produce leaky gut syndrome, wherein the selective gates between intestinal cells become excessively permeable, allowing for the trafficking of undigested food proteins, microbial byproducts, endotoxins, and foreign agents into systemic circulation (9). This influx of inflammatory components provokes an immune response that can culminate in autoimmune response against the thyroid and other tissues.

The approach to treating Hashimoto’s thyroiditis is complex, and is best done in conjunction with a naturopathic doctor or functional medicine practitioner. However, as is the approach for all autoimmune diseases and chronic illnesses more broadly, addressing the pillars of health and restoring an evolutionarily compatible lifestyle with sun exposure, nature, grounding, community, restorative sleep, and an anti-inflammatory, low-antigen diet free of immunoreactive foods such as gluten, dairy, and soy–as well as replete in thyroid-supportive micronutrients–should top the list of interventions.

3. Check Your Vitamin D Levels

Vitamin D is a fat-soluble vitamin necessary for the balance of the three distinct arms of the immune system: Th1, Th2, and Th17. Generally, polarization to the Th1 or Th17 immune responses away from the Th2 response results in excessive inflammation, tissue damage, and possible autoimmunity (10). In other words, without adequate vitamin D, the scales may be tipped in the direction of an inflammatory cascade that leads to immune dysfunction. Thus, vitamin D’s immunomodulatory effects are undeniably essential for healthy immune responses (11).

A robust body of literature not only informs us of the presence of vitamin D receptors in the hair follicle but also demonstrates a clear connection between vitamin D deficiency and the development of alopecia areata (AA), a form of hair loss with autoimmune implications (12). One hallmark feature of AA is an increased production of Th1-type signaling molecules like interleukin-2 (IL-2) and interferon γ (IFNγ) in and around the hair follicle. Specifically, IFNγ is known to be a potent inhibitor of hair growth (13). These findings give us a deeper understanding as to how vitamin D deficiency and the corresponding immune dysfunction could be involved with the pathogenesis of AA.

However, conflicting findings in the research suggest that AA may not always boil down to a vitamin D deficiency itself. In some cases of AA, a decrease in vitamin D receptor expression and a resultant decrease in vitamin D activity might also contribute (12).

Treating a vitamin D deficiency is usually quite simple. Sun exposure provides a cost-free abundance of vitamin D. However, in winter months or in northern latitudes, supplementation and proactive food consumption might be needed to bolster vitamin D levels. In this case, supplementation with the bioavailable vitamin D3 as opposed to vitamin D2 is prudent. Fatty cold-water fish such as wild-caught salmon, sardines, and mackerel and organic, pasture-raised ghee provide a moderate dose of vitamin D3 in each serving while minimizing the possibility of food intolerances (ghee excludes most immunoreactive dairy proteins).

4. Include Antioxidant-Rich and Anti-Inflammatory Foods In Your Diet

Antioxidant-rich and anti-inflammatory foods are revered as pillars of health for a myriad of reasons. They may also have a further, admittedly more vanity-driven, yet vital application: as treatments for hair loss.

Dermal papilla cells (DPCs), cells located at the base of the hair follicle, are responsible for signaling the division of hair follicle cells that leads to hair growth (14). Supporting the integrity and signaling capacity of these cells, then, is of the utmost importance in maintaining hair growth and preventing hair loss.

In cell culture studies, loss of proliferative capacity of DPCs (that is, the ability of these cells to multiply) is associated with increased markers of oxidative stress (6). Additionally, topical application of lipid peroxides (fats damaged by free radicals) causes death of hair follicle cells, triggering the onset of the catagen, or regression, phase of the hair cycle. These findings are mirrored by elevated biomarkers of oxidative stress in AA and androgenetic alopecia (AGA), also known as pattern hair loss (15, 16).

What this all means, then, is that inflammation is a potent mediator of hair loss. In addition to the aforementioned Th1 polarization in AA, specific inflammatory molecules are also observed in AGA scalps. In AGA-affected hair follicles, there also seem to be elevated levels of the bacterial strain P. acnes. This particular strain incites the immune system through the release of byproducts called porphyrins (17).

The ensuing response is an activation of IL-1α, a pro-inflammatory cytokine known to inhibit hair growth, and TNF-α, a potent inducer of the inflammatory transcription factor NF-κB which is the gateway to other mediators of inflammation such as prostaglandins, thromboxanes, and leukotrienes. All of this culminates in the production of inflammation-spawning signaling molecules that wreak havoc on hair growth.

By including antioxidant-rich and anti-inflammatory foods as a focus in your diet, you can assist your body in lowering inflammation and mitigating oxidative stress. Try “eating the rainbow” to ensure a diversity of phytonutrients and incorporating fatty fish, turmeric, ginger, and even herbal medicines like cannabinoid-rich full-spectrum hemp oil into your daily routine. 

Some of the plant-based foods with the highest antioxidant potential include berries, such as dried varieties of amla (Indian gooseberry), dog rose, and bilberries, as well as fresh black currants, blackberries, cranberries, crowberries, goji berries, strawberries, and zereshk (red sour berries) (18). An analysis of 581 fruits and vegetables found the following plant foods to contain high levels of antioxidants (18):

  • Artichokes
  • Green and red chili peppers
  • Lemon skin
  • Curly kale
  • Okra flour
  • Apples
  • Plums 
  • Apricots

5. Rule out Anemia

Iron deficiency is the leading global cause of anemia, responsible for 30%-50% of anemia in children and other groups. It is more common in women, and studies estimate that two billion people worldwide are iron deficient (19). Iron deficiency anemia occurs when the balance of iron intake, iron stores, and iron losses becomes disturbed and can no longer support production of the body’s red blood cells, which carry the iron-rich protein known as hemoglobin. Hemoglobin is important because of its role in tissue oxygenation, the delivery of oxygen to tissue.

Iron deficiency anemia is more common in multiparous women as well as in minority and low-income populations, but other risk factors include heavy menstrual bleeding, malabsorptive disorders, overexercise, pregnancy, presence of microorganisms that use up iron stores or inhibit its absorption, and consumption of substances which impede iron absorption, such as caffeine, antacid medication, calcium supplements, and dairy⁣.

Although low iron is frequently touted as a cause of hair loss and findings vary, some research does document a relationship between iron deficiency and conditions of hair loss including female pattern hair loss (FPHL), alopecia areata, alopecia universalis or totalis, and telogen effluvium (20). One study found a significant decrease in hair loss and improvement in serum ferritin concentration in subjects with telogen effluvium who received oral iron therapy (21). Another analytical case-control study found that serum ferritin levels at or below 30 ng/mL are strongly associated with telogen hair loss in women of childbearing age without systemic inflammation or other underlying disorders (22).

To rule out iron deficiency, the following tests should be ordered. Serum ferritin concentration is especially important, as it is reflective of total body iron stores and can signify the very early stages of iron deficiency (20).

  • Serum Iron⁣
  • Ferritin⁣
  • Total iron binding capacity (TIBC)⁣
  • Unsaturated iron binding capacity (UIBC)⁣
  • Transferrin saturation percentage⁣

Although iron supplementation or even intravenous iron infusions may be required in extreme cases, the reason iron levels are low in the first place merits examination. Barring absorptive disorders, restoring iron levels through dietary strategies is oftentimes attainable.

Bioavailability of iron is dependent upon dietary context, as certain components in food can either inhibit or enhance iron absorption. For example, pairing iron with vitamin C-rich foods such as bell peppers, broccoli, grapefruit, kiwi, oranges, and strawberries can enhance iron absorption. Consuming iron away from substances which inhibit its absorption, such as tannins in coffee and tea, polyphenols in cocoa, and phytate in soy, will also render it more bioavailable. The “heme iron” present in animal foods, such as organ meats, beef, lamb, turkey, oysters, and clams, is generally more usable than the “non-heme iron” in plant foods, including leafy greens, olives, quinoa, pumpkin seeds, legumes, and dark chocolate.

6. Enhance Your Insulin Sensitivity

The metabolic dysfunction known as insulin resistance is a contributor to elevated androgen levels, which is a mediating factor in the pattern hair loss known, again, as androgenetic alopecia (AGA) (17). A loss of cellular sensitivity to insulin forces an elevation in blood insulin levels, also known as hyperinsulinemia. In ovarian cell culture studies, hyperinsulinemia has been shown to increase the expression and enhance the activity of the 5α-reductase enzyme responsible for converting testosterone into the potent androgen dihydrotestosterone (DHT) (23).

DHT can stimulate the proliferation of sebaceous gland cells, the microscopic exocrine glands in the skin that secrete an oily, waxy substance known as sebum, which lubricates the skin and hair. The result is increased sebum production (17), allowing pro-inflammatory P. acnes bacteria to readily colonize the hair follicle, leading to inflammatory responses, free radical production, and the need for tissue repair.

This tissue repair is initiated by a growth factor called TGF-β1 responsible for creating fibrous scar tissue. Continual stimulation of this process, as in chronic inflammation and oxidative stress, can lead to a process called perifollicular fibrosis. This can cause restricted growth space inside the hair follicle and choked-off blood supply, eventually leading to hair loss.

To prevent this deleterious sequence of events, restoring insulin sensitivity should be the priority. Inhibiting the insulin resistance cascade prevents insulin-associated DHT spikes, which may help reverse and prevent AGA hair loss. The cause of insulin resistance is still elusive, but there is an emerging trend within the literature that suggests inflammation and oxidative stress are inextricably linked to reduced insulin sensitivity (24). 

Some natural agents with insulin-sensitizing properties include berberine from goldenseal, bitter melon, cinnamon, and alpha lipoic acid. Curcumin, an active constituent within the golden-hued Indian culinary spice turmeric, is particularly effective, with one study showing that it is 400 to 100,000 times more effective than the prescription drug metformin at activating the mechanism behind glucose uptake (25).

Omega-3s, the polyunsaturated fatty acids within cold-water fatty fish, also possess potent anti-inflammatory properties that may warrant their use for recovering insulin sensitivity. In one systematic review, researchers concluded: “Short-term fish oil supplementation is associated with increasing the insulin sensitivity among those people with metabolic disorders” (26).

Other Variables in Hair Loss

As illustrated, hair loss is a multi-factorial condition that requires a multi-pronged approach. In addition to the aforementioned factors, it is important to manage stress, since stressors can disrupt the finely-orchestrated hormonal symphony and change hair follicle biochemistry. Equally significant is minimizing exposure to environmental toxicants. 

Phthalate-laden, endocrine-disrupting shampoos, conditioners, and styling products can irritate the hair follicles and precipitate shedding, while chlorinated and fluoridated water can displace iodine in the thyroid gland and suppress thyroid function, creating a vicious cycle of hair loss.

Hair is merely an outward manifestation of health, such that we can glean insights into an individual’s underlying state of health by assessing hair quality. Luckily, because the evidence-based interventions presented here center on removing the impediments to health and restoring its contributors, the approaches that bolster hair quality are oftentimes accompanied by the fortunate byproduct of improving your overall quality of life and wellness.

Is black coffee healthy?


Coffee, a beloved beverage enjoyed by millions worldwide, often sparks discussions about its impact on health. While preferences for coffee vary, many individuals wonder if consuming black coffee is a healthy choice. In this comprehensive guide, we’ll explore the potential health benefits associated with black coffee, offering insights into its nutritional value and impact on well-being.

  • Minimal Calories and Nutrient-Rich:

Black coffee is a low-calorie beverage that packs a punch in terms of flavor, making it an excellent option for those watching their calorie intake. Additionally, black coffee is rich in antioxidants, such as chlorogenic acid, which contribute to its potential health-promoting effects.

  • Antioxidant Properties:

Antioxidants play a crucial role in combating oxidative stress in the body, which is linked to aging and various chronic diseases. The antioxidants found in black coffee may help neutralize free radicals, supporting overall health. Regular consumption of antioxidants is associated with potential benefits for heart health and disease prevention.

  • Improved Mental Alertness:

One of the well-known benefits of black coffee is its ability to enhance mental alertness. The caffeine content in coffee acts as a natural stimulant, temporarily warding off fatigue and improving focus. This can be particularly beneficial for individuals who need a cognitive boost, such as during work or study sessions.

  • Physical Performance Enhancement:

Caffeine, a natural component of coffee, is recognized for its performance-enhancing properties. It stimulates the release of adrenaline, which can increase physical performance. Athletes often consume black coffee before workouts to experience improved endurance and heightened alertness.

  • Metabolism Boost:

Black coffee has been linked to a temporary boost in metabolism. Caffeine can increase the metabolic rate, encouraging the body to burn calories more efficiently. This effect may contribute to weight management efforts when combined with a healthy diet and regular physical activity.

  • Potential Cardiovascular Benefits:

Emerging research suggests that moderate coffee consumption, including black coffee, may be associated with certain cardiovascular benefits. These include a potential reduction in the risk of heart disease and stroke. However, individual responses to coffee can vary, and excessive caffeine intake should be avoided.

  • Type 2 Diabetes Risk Reduction:

Some studies have indicated that regular black coffee consumption may be associated with a reduced risk of developing type 2 diabetes. The antioxidants in coffee, along with other bioactive compounds, may contribute to improved insulin sensitivity and glucose metabolism.

  • Bone Health Considerations:

While black coffee offers various health benefits, it’s essential to consider its potential impact on bone health. Excessive caffeine intake may interfere with calcium absorption, leading to a potential risk for bone thinning over time. It’s advisable to maintain a balanced diet rich in calcium and vitamin D to support bone health.

  • Moderation is Key:

While black coffee can be a healthy addition to your routine, moderation is key. Excessive caffeine intake can lead to side effects such as insomnia, jitteriness, and increased heart rate. Individual tolerance to caffeine varies, so it’s crucial to listen to your body and adjust your consumption accordingly.

In conclusion, black coffee can be a health-conscious choice when consumed in moderation. Its antioxidant-rich nature, potential mental and physical performance benefits, and associations with reduced risks of certain diseases make it a beverage worth considering as part of a balanced lifestyle. As with any dietary choice, it’s advisable to consult with healthcare professionals, especially for individuals with specific health conditions or concerns. Enjoying your cup of black coffee mindfully can be a flavorful and potentially beneficial addition to your daily routine.

What To Know About Vitamin K2 and Its Health Benefits


Vitamin K2 is gaining recognition for its effects on blood clotting, heart health and bone health.


You may have heard about vitamin K. It plays a big role in blood clotting, bone health and heart health.

What you may not know is that vitamin K is actually a name given to a class of vitamins. What we commonly think of as vitamin K includes vitamin K1 (also called phylloquinone), as well as vitamin K2 (menaquinone). They work differently in your body and come from different food sources.

Vitamin K1 comes from plant sources, like leafy greens and blueberries, while vitamin K2 is more common in animal products, fermented foods and some kinds of cheese. Vitamin K2 stays in your body longer than vitamin K1 and holds the potential for some serious health benefits that are just now starting to come to light.

“I think we’ve always known that there’s a vitamin K2, I just don’t think we’ve really ever given it enough credit for how much work it does in the body,” says registered dietitian Julia Zumpano, RD, LD.

Zumpano helps us understand the health benefits of vitamin K2 and how to get more of it in your diet.

What is vitamin K2?

Think of vitamin K as a collection of vitamins that play a similar role in your health. That class is made up of vitamin K1, as well as vitamin K2 and vitamin K2’s 10 subtypes, known as MK-4 to MK-13.

“Vitamin K is a class of vitamins, like citrus fruits are a class of fruits,” Zumpano says. “Think of oranges, grapefruits, limes and lemons. They may grow on different trees and have different tastes, but at the end of the day, they’re all citrus fruits. It makes sense for them to be grouped together because they share a lot of common traits.”

The same is true of vitamin K. K vitamins are fat-soluble, meaning they dissolve in fats and oils. And they play important roles in blood clotting, bone health and heart health.

Vitamin K isn’t unique in this regard. Lots of vitamins are named by a single letter but have various subtypes. B vitamins, for example, are broken down into eight subtypes.

What are the benefits of vitamin K2?

Zumpano says the exciting thing about vitamin K2 is that it’s absorbed by your body more slowly than vitamin K1. Whereas vitamin K1 is quickly filtered out of your blood, often within a matter of hours, vitamin K2 stays in your body longer, several days even, and has more time to do its good work.

“The absorption of vitamin K1 is pretty low because of its structure. It’s a shorter chain, so it gets filtered through your liver more quickly,” Zumpano explains. “The thinking is that vitamin K2 has the potential to have more influence on your body because it’s a longer chain, so your body is slower to absorb and digest it.”

There’s still much to be understood about the differences between vitamin K1 and vitamin K2, and research is ongoing. Zumpano details what we do know about K vitamins, and vitamin K2 specifically, including some of its most important benefits.

Aids in blood clotting

One of the main functions of K vitamins is to allow your blood to clot. In fact, the “K” in vitamin K is in reference to the German word “koagulation,” which translates to “coagulation” or the ability to clot (or thicken) blood.

Blood clots may sound like a bad thing — and they can be. After all, blood clots can travel to your brain and cause strokes. And clots in your arteries cause heart attacks.

But a certain amount of clotting-ability in your blood is important for your health. The ability for your blood to clot is what keeps you from bleeding out after an injury. Blood that’s too thin can make you bruise more easily and even leave you at risk for dangerous internal bleeding.

Vitamin K can help keep your blood not too thick and not too thin. In the words of Goldilocks, it keeps your blood just right.

At this point, researchers have yet to determine whether vitamin K1 and vitamin K2 are equally responsible for clotting, or whether one is more effective than the other.

Builds healthy bones

When you think of strong and healthy bones, you may be tempted to think of calcium as the main nutrient to prevent fractures and osteoporosis.

And it’s true that calcium is an important part of bone health. But research is showing that calcium doesn’t act alone.

“Having low levels of vitamin K is associated with a higher risk of bone fractures,” Zumpano notes. “We’ve always put so much emphasis on calcium for bone health. But in reality, vitamin D, vitamin K and calcium all actually work together.”

Vitamin K helps activate a protein called osteocalcin, which binds to calcium to build bones. That makes vitamin K an essential component of bone health.

Some early studies are showing that vitamin K2 supplements may reduce fractures and improve bone quality in people with osteoporosis.

In Japan and other parts of Asia, one kind of vitamin K2 (MK-4), is used as a treatment for osteoporosis.

Improves heart health

In addition to its positive effects on blood clotting and strong bones, vitamin K helps keep your heart healthy. That’s because of the way it acts to clear out calcium from your blood vessels.

When calcium builds up in your body, it can lead to hardening (or calcification) of your tissues, organs and blood vessels. Calcium deposits in your arteries can lead to high blood pressure, kidney disease and more.

“Vitamin K has been shown to help activate a protein that helps prevent calcium from depositing in your arteries,” Zumpano explains. “Calcium deposits contribute to the development of plaque, so vitamin K does a lot of good for your heart health.”

Some early research has shown that vitamin K2 may be more effective at clearing out calcium than vitamin K1.

One study found that people who took in at least 32 micrograms per day of vitamin K2 in their diet were 50% less likely to die from heart disease related to hardened arteries. People in that study didn’t consume any vitamin K1.

Other research showed that women and people assigned female at birth (AFAB) who had a high intake of vitamin K2-rich foods (but not vitamin K1) were less likely to experience cardiovascular events, like heart attacks and strokes. For every 10 micrograms of vitamin K2 they consumed per day, their risk of heart disease decreased by 9%.

The European Food Safety Authority has approved a health claim for vitamin K, noting that “a cause-and-effect relationship has been established between the dietary intake of vitamin K and the maintenance of normal bone.”

Getting your fill of vitamin K2

While food safety organizations in parts of Asia and Europe officially acknowledge some health benefits of vitamin K, the U.S. Food and Drug Administration (FDA) hasn’t authorized a health claim for vitamin K. Essentially, that means more research needs to be done for the FDA to back vitamin K as a significant contributor to health.

But if you’re looking to add more vitamin K2 to your diet, know that some of the top sources aren’t foods we typically see as part of a healthy diet. Whereas vitamin K1 is abundant in leafy greens and other “health foods,” vitamin K2 is found in a lot of foods that aren’t typically recommended as part of a heart-healthy diet.

“Vitamin K2 is often found in animal products and fermented foods, as opposed to natural, plant food sources,” Zumpano notes. “But that’s not a hard-and-fast rule. There are some foods that are both naturally healthy and rich in vitamin K2.”

Some of the foods highest in vitamin K2 include:

  • Nattō (fermented soy).
  • Gouda cheese.
  • Blue cheese.
  • Egg yolks.

People who take blood thinners should talk with their healthcare provider before increasing their intake of vitamin K1 or vitamin K2.

Additionally, some research suggests that vitamin K2 supplements may be beneficial for some people. Vitamin K1 is most useful for your body when it’s eaten in its natural form. That’s because of its shorter absorption time, Zumpano says.

Vitamin K2, however, may be an effective supplement because it isn’t used up by your body as quickly. That means that the supplements could have a chance to work, rather than simply pass through your system as waste.

While much is still to be learned about vitamin K2, the signs so far point to an underutilized and underappreciated powerhouse for our bodies.

“It’s exciting to see that we’re learning more all the time about vitamin K2 and its potential,” Zumpano says. “It really does seem to be something that will get more attention and will make a difference for a lot of people’s health as we learn more.”

Are there foods that improve sleep?


Yes, there are certain foods that can help improve sleep quality. Sleep is essential for overall health and well-being, and a balanced diet plays a significant role in promoting good sleep. In this response, we will explore some foods that are known to aid in better sleep.

1. Kiwi: Kiwi is a fruit rich in antioxidants, vitamins, and minerals. It also contains serotonin, a hormone that helps regulate sleep. Consuming kiwi before bed has been shown to improve sleep quality and duration, making it a great option for a bedtime snack.

2. Cherries: Cherries are a natural source of melatonin, a hormone that regulates sleep-wake cycles. Eating cherries or drinking cherry juice can increase melatonin levels in the body, promoting better sleep. Tart cherry juice, in particular, has been found to improve sleep duration and quality.

3. Bananas: Bananas are high in potassium and magnesium, which are known to relax muscles and promote sleep. They also contain tryptophan, an amino acid that the body converts into serotonin and melatonin. Consuming a banana before bed can help relax the body and prepare it for sleep.

4. Almonds: Almonds are a good source of magnesium, which can help improve sleep quality. Magnesium has been shown to reduce cortisol, a stress hormone that can interfere with sleep. Snacking on a handful of almonds before bed may help promote relaxation and better sleep.

5. Warm Milk: Warm milk has long been associated with promoting sleep. Milk contains tryptophan, an amino acid that aids in the production of serotonin and melatonin. Additionally, the warmth of the milk can have a soothing effect, making it a comforting bedtime beverage.

6. Herbal Teas: Certain herbal teas, such as chamomile, valerian root, and lavender, have calming properties that can help induce sleep. These teas contain compounds that promote relaxation and reduce anxiety, making them a popular choice for those struggling with sleep issues.

7. Fatty Fish: Fatty fish, such as salmon, tuna, and mackerel, are rich in omega-3 fatty acids. These healthy fats have been linked to improved sleep quality by increasing the production of serotonin, reducing inflammation, and regulating the sleep-wake cycle.

8. Whole Grains: Whole grains, such as oats and brown rice, have a low glycemic index, which means they are digested slowly and provide a steady release of energy. Consuming whole grains in the evening can help stabilize blood sugar levels, preventing spikes and crashes that can disturb sleep.

9. Herbal Supplements: Some herbal supplements like melatonin and valerian root extract are commonly used to promote sleep. Melatonin is a hormone that regulates the sleep-wake cycle, while valerian root has sedative properties. It is important to consult with a healthcare professional before taking any supplements.

10. Dark Chocolate: Dark chocolate contains small amounts of caffeine, but it is also rich in magnesium. Consuming a small piece of dark chocolate a few hours before bed may help relax the body and mind, promoting better sleep.

While these foods can aid in better sleep, it is essential to note that individual responses may vary. Additionally, maintaining a consistent sleep schedule, creating a sleep-friendly environment, and practicing good sleep hygiene are equally important for optimal sleep. If you are experiencing chronic sleep issues, it is always recommended to consult with a healthcare professional for a proper evaluation and guidance.

What foods contain Vitamin K?


Vitamin K is an essential nutrient that plays a critical role in various bodily functions, including blood clotting, bone health, and heart health. Incorporating foods rich in vitamin K into your diet is crucial for maintaining optimal health. In this comprehensive guide, we will explore the importance of vitamin K, its different forms, the recommended daily intake, and a diverse range of foods that are abundant sources of this essential nutrient.

  1. Understanding Vitamin K

Vitamin K is a fat-soluble vitamin that exists in two primary forms: K1 (phylloquinone) and K2 (menaquinone). Vitamin K1 is commonly found in plant-based foods, while vitamin K2 is mainly synthesized by bacteria in the gut and is also present in certain animal-based and fermented foods.

Roles of Vitamin K:

  • Blood Clotting: Vitamin K is essential for the synthesis of clotting factors that help prevent excessive bleeding.
  • Bone Health: It plays a role in bone metabolism by assisting in the synthesis of proteins that regulate bone mineralization.
  • Heart Health: Some research suggests that vitamin K may contribute to heart health by preventing the calcification of arteries.
  2. Recommended Daily Intake

The recommended daily intake of vitamin K varies depending on factors such as age, gender, and specific health conditions. The recommended dietary allowances (RDAs) for vitamin K are as follows:

  • Adult men: 120 micrograms (mcg) per day.
  • Adult women: 90 mcg per day.
  • Pregnant women: 90 mcg per day.
  • Breastfeeding women: 90 mcg per day.
  3. Foods High in Vitamin K

There is a diverse range of foods that are excellent sources of vitamin K. Incorporating these foods into your diet can help ensure that you meet your daily vitamin K requirements.

1. Leafy Green Vegetables:
  • Kale: This nutrient-dense green is one of the top sources of vitamin K1, offering around 547 mcg per cup.
  • Spinach: A versatile leafy green that provides approximately 145 mcg of vitamin K1 per cooked cup.
  • Swiss Chard: Rich in vitamins and minerals, Swiss chard contains about 299 mcg of vitamin K1 per cooked cup.
  • Collard Greens: A staple in Southern cuisine, collard greens offer roughly 360 mcg of vitamin K1 per cooked cup.

2. Cruciferous Vegetables:
  • Broccoli: This popular vegetable contains around 89 mcg of vitamin K1 per cooked cup.
  • Brussels Sprouts: These small, nutrient-packed vegetables provide approximately 156 mcg of vitamin K1 per cooked cup.
  • Cabbage: Both green and red cabbage are sources of vitamin K1, contributing about 72 mcg per cooked cup.

3. Herbs:
  • Parsley: A versatile herb that adds flavor to dishes, parsley offers approximately 246 mcg of vitamin K1 per half cup.
  • Basil: Widely used in Mediterranean cuisine, basil provides around 106 mcg of vitamin K1 per half cup.

4. Green Vegetables:
  • Green Beans: These legumes offer approximately 32 mcg of vitamin K1 per cooked cup.
  • Asparagus: Rich in vitamins and minerals, asparagus contains about 55 mcg of vitamin K1 per cooked cup.

5. Animal-Based Sources:
  • Dairy Products: Dairy foods like cheese and milk contain small amounts of vitamin K2, contributing to bone health.
  • Egg Yolks: Egg yolks are sources of both vitamin K1 and vitamin K2.

6. Fermented Foods:
  • Natto: A traditional Japanese dish made from fermented soybeans, natto is a potent source of vitamin K2.
  • Fermented Cheeses: Certain aged cheeses, such as gouda and brie, contain vitamin K2 due to the fermentation process.

7. Meats and Poultry:
  • Chicken: Skinless chicken breast provides a moderate amount of vitamin K2.
  • Beef: Certain cuts of beef, particularly liver, offer vitamin K2.

8. Fish:
  • Salmon: Fatty fish like salmon are sources of vitamin K2.
  4. Maximizing Vitamin K Absorption

While incorporating vitamin K-rich foods into your diet is essential, there are factors that can affect the absorption of this nutrient.

Enhancing Absorption:
  • Healthy Fats: Consuming healthy fats along with vitamin K-rich foods can enhance absorption. Consider adding olive oil or avocado to your salads.
  • Balanced Diet: Maintaining a balanced diet rich in a variety of nutrients supports overall health and nutrient absorption.
  5. Consulting a Healthcare Professional

If you have specific health concerns, medical conditions, or are taking medications, it’s advisable to consult a healthcare professional before making significant dietary changes or adding supplements to your routine.

Vitamin K plays a vital role in various physiological functions, making it essential for maintaining overall health. Incorporating a diverse range of vitamin K-rich foods into your diet can help you meet your daily nutritional requirements and support blood clotting, bone health, and heart health. From leafy greens and cruciferous vegetables to animal-based sources and fermented foods, there is a wide array of options to choose from. By making informed dietary choices and ensuring a balanced intake of vitamin K, you can contribute to your overall well-being and enjoy the numerous benefits this essential nutrient offers.

Exercise for the treatment of depression


According to the World Health Organization, more than 300 million people worldwide have depression.1 When individuals recover from a major depressive episode, they have a high probability of relapse, and in some cases a tendency towards chronicity.2 Depression results in a considerable deterioration in quality of life for affected individuals and their families.3 Globally, more than 700 000 people die by suicide each year,1 and mortality from other physical illnesses such as diabetes, heart diseases, and cancer increases by 50% when those affected have depression.4 Individuals with depression can face difficulties finding employment, and among those who are employed, depression is associated with reduced productivity, higher rates of absenteeism, and an increased risk of job loss.5 All this emotional, quality of life, work related, and economic impact affects individuals and their families, as well as the efficiency of health services, businesses, and society in general. Moreover, this effect increased from 1990 to 2019,3 and during the covid-19 pandemic the prevalence of depressive disorders increased by almost 28%.6

Reasonably effective psychological and drug treatments are available,7 and in recent years, research has shown that exercise is also effective.8 Important questions remain, however, about the role of exercise in the treatment of depression, including what type of exercise works best, at what intensity and frequency, in what format (individual or group), and for which patient.

In a linked paper, Noetel and colleagues (doi:10.1136/bmj-2023-075847) report a network meta-analysis of randomised controlled trials that answers some of these questions.9 Walking or jogging, yoga, and strength training appeared to be more effective than other types of exercises. Overall, a dose-response association was found between exercise intensity and greater effectiveness, but even low intensity exercises such as walking and yoga conferred meaningful benefit. Conversely, the authors found no association between the effectiveness of exercise and the severity of depression at baseline. In general, group exercise was not more effective than individual exercise, except for yoga. Strength training and the combination of aerobic and strength exercises were more effective for individuals than groups. The effect size of exercise was comparable to that of cognitive behavioural therapy, but the quality of evidence supporting such therapy was higher. The effect of exercise appeared superior to antidepressants, although when exercise was combined with antidepressants, the effect of the drugs improved.

Noetel and colleagues’ network meta-analysis included 218 randomised controlled trials in a total of 14 170 participants from across multiple countries—although African countries were underrepresented. Other limitations were the low quality of evidence and the almost total absence of randomised controlled trials with long term follow-up (one year or more).

Primary care clinicians can now recommend exercise, psychotherapy, or antidepressants as standalone alternatives for adults with mild or moderate depression. The final choice depends on patient preference and other considerations, including any barriers to access. Clinicians and patients should also take into account the benefits of exercise in preventing or treating chronic conditions such as type 2 diabetes, overweight and obesity, cardiovascular disease, cancer, and cognitive impairment. Notably, physical exercise has also been shown to help prevent depression.10 For adults with severe or treatment resistant depression, the available evidence currently favours combined psychological and drug treatment.7

Taking regular exercise can be challenging for people with depression, as they often experience symptoms of fatigue, low energy, and poor motivation. Most of the randomised controlled trials included in this new network meta-analysis were conducted in a highly simulated and standardised context. Therefore, implementation studies (pragmatic randomised controlled trials and observational studies) are needed to evaluate physical activity programmes for people with depression using real world data. Many people have no access to exercise facilities, or they live in neighbourhoods where it is unsafe to walk or jog.

Health services and local and national administrations should provide enough resources to make individualised and supervised exercise programmes accessible to the entire population. For example, the European Union, through the NextGenerationEU funds, has committed to promoting exercise across member states.11 Using these funds and their own funds, the Spanish government and the regional government of Andalusia in Spain have recently launched a programme prescribing personalised physical exercise programmes supervised by sports professionals in coordination with primary care. This programme encourages the participation of citizens and promotes exercise by recommending health assets in the neighbourhoods where citizens live.

Reintroduction of Radiotherapy-Delaying Chemotherapy Followed by Craniospinal Radiotherapy for Infants With Medulloblastoma


Medulloblastoma, the most common brain cancer in childhood, has a median age at diagnosis of 6 years, and approximately 40% of cases occur in children <5 years of age. In this issue, Bagchi et al1 eloquently outline a comprehensive review of medulloblastoma in infants and young children (<6 years of age). Their review includes studies spanning different therapeutic eras, from the 1970s to recently reported completed clinical trials. The authors also performed a large retrospective cohort study and pooled data from 5 international genome-wide and epigenome-wide studies with 329 infants and young children with medulloblastoma.

Medulloblastoma has been shown to be a highly heterogeneous disease, consisting of a collection of molecularly distinct diseases with 4 main molecular subgroups: Wingless type (WNT), Sonic hedgehog (SHH), Group 3 (G3), and Group 4 (G4). DNA methylation profiling has revealed further molecular heterogeneity within these 4 subgroups, resulting in a total of 13 subgroups. The SHH group is further divided into 4 subgroups (SHH-1, SHH-2, SHH-3, and SHH-4), and the non-WNT/non-SHH group is separated into 8 molecular subgroups (denoted G3/4-I to G3/4-VIII). Notably, each group has unique clinical and prognostic characteristics. In infants (<3 years of age), 2 molecular subgroups predominate—SHH (specifically SHH-1 and SHH-2) and G3 (specifically G3/4-III and G3/4-IV)—with a small proportion (∼5%) of cases in the G4 group and none in the WNT group.1

Postoperative craniospinal irradiation (CSI) was introduced in the 1950s to prevent the inevitable metastatic relapses throughout the central nervous system (CNS) that many patients sustained, cementing this modality as the backbone of medulloblastoma therapy. For children aged >3 years, major strides in survival have been made using risk-stratified CSI. For patients classified as average-risk (children aged ≥3 years with <1.5 cm2 of residual tumor and no metastatic disease), the 5-year overall survival is now 80% to 85% using 23.4 Gy CSI and adjuvant chemotherapy,2,3 and approximately 70% for those with high-risk disease (children aged ≥3 years with ≥1.5 cm2 of residual tumor and/or with metastatic disease) who receive a CSI dose of 36 Gy and adjuvant chemotherapy.3,4

To permit more precise risk-stratified therapy, medulloblastoma clinical trials are now embedding a molecular risk-adapted approach integrating traditional clinical and histologic criteria with molecular information (ClinicalTrials.gov identifiers: NCT01878617, NCT05535166, NCT02724579, NCT02066220). Given the ongoing concerns around the negative long-term consequences of CSI, to decrease morbidity, therapy reduction strategies are applied for patients with the lowest risk of relapse, such as those with average-risk WNT medulloblastoma (NCT02066220, NCT02724579, NCT01878617). In contrast, intensified and experimental therapies are used for high-risk groups, such as those with G3 MYCC-amplified medulloblastoma, to increase survival (NCT01878617).

Bagchi et al1 conclude that after 4 decades of clinical trials applying a multitude of strategies trying to avoid CSI, a cure is achievable in the vast majority of infants and young children with the SHH group of medulloblastoma using chemotherapy alone. Notably, patients with the SHH-1 subtype of medulloblastoma have a worse outcome using lower-intensity chemotherapy than those with SHH-2, an effect that can be nullified using more-intensive chemotherapy strategies.1 This highlights that even in the age of molecular characterization with integrated clinical and molecular risk stratification, the most important predictor of survival is therapy. Conversely and perturbingly, for infants with G3 medulloblastoma, a realistic chance of cure can only be obtained with the use of CSI. The authors cogently contend that infants with G3 medulloblastoma currently effectively undergo a “double whammy” of therapy, because high-intensity chemotherapy treatment approaches fail to prevent relapse and the inevitable subsequent use of salvage CSI, thereby multiplying toxicity and long-term morbidity.

These conclusions form the basis for their recently opened medulloblastoma study called SJiMB21 (ClinicalTrials.gov identifier: NCT05535166). This trial, exclusively for infants and young children with medulloblastoma, adopts a sophisticated stratification system combining established clinical risk factors with molecular characteristics to more precisely tailor therapy. Thus, patients with SHH medulloblastoma receive a chemotherapy-only strategy, risk stratified according to the underlying SHH subtype. Patients with SHH-1 disease receive more intensive intraventricular methotrexate–based chemotherapy. In contrast, patients with SHH-2 disease receive less-intense chemotherapy1 in an attempt to minimize the long-term neurocognitive sequelae associated with intraventricular methotrexate. In stark contrast, for infants with G3 medulloblastoma, the study will reintroduce radiotherapy-delaying chemotherapy followed by risk-adapted CSI, with the dose of CSI varying from 18 to 36 Gy, in patients reaching 3 years of age. Some patients will also receive concurrent carboplatin during radiotherapy, because this therapy has recently been shown to significantly improve survival exclusively in G3 disease.4 This approach is based on the reasonable but unproven premise that improved survival will be seen if CSI is given up-front rather than as salvage therapy after relapse for this population.

The treatment of infants with medulloblastoma has remained a major challenge. The initial application of CSI-based therapy in infants led to the confronting realization of the impacts of this approach on the especially vulnerable developing CNS of infants, with the resultant dire consequences on cognition, growth, and development. Fittingly, the “price for cure” was regarded as unacceptable, which led investigators to explore alternative therapeutic strategies. This highlights how finely balanced the scales are between life and death, and between treatment and lifetime harm. Consequently, in the mid-1970s, a team at The University of Texas MD Anderson Cancer Center demonstrated that chemotherapy alone could effect cure in some infants with medulloblastoma and that these patients retained cognitive function.5 This observation paved the way for a paradigm shift in the management of infants with medulloblastoma and heralded the era of radiotherapy-delaying studies in the mid-1980s. A notable example was the seminal Baby POG-1 study conducted by the Pediatric Oncology Group,6 which aimed to delay radiotherapy to age 3 years with the administration of multidrug chemotherapy. For patients with no evidence of disease, the study planned to deliver reduced-dose radiotherapy on completion of planned chemotherapy. The trial experienced 2 significant issues. First, many children experienced progressive disease during therapy. Second, a significant proportion of patients who completed chemotherapy without evidence of disease did not receive the planned radiotherapy as intended due to parental concerns. These observations led to the generation of radiotherapy avoidance approaches with strategies centered on intensification of chemotherapy using high-dose chemotherapy with autologous stem cell rescue or intensive intraventricular methotrexate–based therapies.1

The review by Bagchi et al1 and their recently opened SJiMB21 study illustrate that although our knowledge of the underlying biology has dramatically increased and now permits a much more refined stratification system, the unravelling of the medulloblastoma genome has not yet yielded the anticipated novel targeted therapies for the vast majority of patients with medulloblastoma. Inhibitors targeting upstream signalling pathway mutations in SHH-driven medulloblastoma initially generated optimism for replacing conventional therapies. However, short-lived effectiveness combined with the major complication of growth impairment has significantly restricted use to skeletally mature patients. Thus, with the exception of SHH-1 and SHH-2 medulloblastoma in infants and young children, the sobering reality is that radiotherapy remains the most potent therapy against medulloblastoma. This reality re-emphasizes that, ultimately, adequate treatment is the most powerful determinant of survival. This was recently highlighted in a pilot trial that attempted to completely omit radiotherapy for patients with WNT medulloblastoma, the most favorable medulloblastoma subgroup. The trial was terminated early because all patients (n=3) experienced rapid relapse.7

Bagchi et al’s pragmatic but potentially controversial approach reveals how, given the absence of effective novel therapies, we have come full circle for some infants with medulloblastoma, with the reintroduction of radiotherapy-delaying strategies, as was done in first-generation infant medulloblastoma trials in the 1980s.6 The success of this strategy will be predicated on 2 main factors: first, that the chosen preradiotherapy chemotherapy is adequate to prevent relapse/progression before patients reach 3 years of age; and second, that history does not repeat itself and parents (and/or physicians) accept the planned CSI when the time comes, especially for children stratified to receive high-dose CSI (36 Gy). Despite significant advancements in radiotherapy techniques, such as proton beam therapy, with early reports showing reduced CNS toxicity,8 many parents and physicians may still see this approach as unpalatable.

Given the controversy around delivering CSI to young children, several radiotherapy-sparing medulloblastoma studies that use intensive high-dose chemotherapy strategies leave the decision to use CSI to the treating physician’s (and ergo also the family’s) discretion. This has inadvertently led to inadequately collected radiotherapy-free survival data in some trials, hampering evaluation of the effectiveness of the strategies. This issue, combined with the small numbers of patients included in these studies, can limit interpretation. However, although infants with G3 medulloblastoma have dismal survival rates, a proportion of patients do survive without radiotherapy. In particular, 2 recent reports (albeit one in abstract form only9) suggest more promising survival for this group of patients.9,10 Both trials adopted the same high-dose chemotherapy backbone, but one trial also included high-dose methotrexate during induction.9 Importantly, these studies have also reported the use of radiotherapy. The Pediatric Brain Tumor Consortium PBTC-026 trial on the feasibility of incorporating noncytotoxic therapy demonstrated a 5-year progression-free survival of 43%,10 and the Children’s Oncology Group ACNS0334 study reported a 5-year overall survival of 80% for the 10 infants with G3 disease who were randomized to high-dose methotrexate during induction.9 Based on these encouraging results, the Children’s Oncology Group will continue to adopt high-dose chemotherapy as the backbone for building new therapeutic strategies.
Additionally, to significantly advance risk stratification, they will also adopt an integrated histologic, clinical, and molecular characterization in future studies in infants.11 A pooled analysis of infants with G3 disease who experienced relapse after radiotherapy-sparing treatment identified CSI as an effective salvage therapy.12 An analogous analysis of the G3 infants who were cured without radiotherapy could help develop predictors for this group, yielding important novel therapeutic insights as well as additional molecular refinement.

Finally, this review and the SJiMB21 trial highlight the importance of ongoing preclinical research to identify novel therapies for these patients. The currently applied medulloblastoma therapy evolved from the empirical refinement of CSI in combination with multiagent chemotherapy. The extraordinary progress in unravelling the molecular pathogenesis of medulloblastoma achieved in the past decade provides an opportunity to develop novel therapeutic approaches tailored to each molecular subtype of medulloblastoma to improve survival while minimizing toxicities. Indeed, many more potential novel anti-cancer therapies exist now, which more precisely target molecular abnormalities in cancer cells that drive tumor growth, as well as immunotherapies. Consequently, preclinical modelling is a critical step that can direct the field to agents active in the CNS against specific medulloblastoma subgroups, and preclude the investigation of ineffective or minimally active agents in the clinic. We now have high-throughput drug screening platforms, tumor organoids, and an array of sophisticated medulloblastoma animal models, which more closely mimic the clinical characteristics of the disease in children than historic models. In addition, advanced preclinical radiotherapy platforms that precisely target tissues of interest while sparing normal healthy tissue permit the assessment of potential radiosensitizers. To assist the prioritization of therapies for clinical translation with the best chance of success, international clinical and preclinical consortia have developed consensus pediatric brain cancer preclinical testing guidelines.13 These guidelines provide a collaborative framework for preclinical testing that supports validation in multiple different institutions to increase rigor and reproducibility. The hope is that novel more-targeted/subgroup-specific therapies will enable the reduction, or preferably omission, of CSI.

Reduction in Breast Cancer Death With Adjuvant Chemotherapy Among US Women According to Race, Ethnicity, and the 21-Gene Recurrence Score


Background

The Oncotype 21-gene breast recurrence score (RS) is the most commonly ordered multigene breast cancer biomarker in the United States,1 and the NCCN Clinical Practice Guidelines in Oncology (NCCN Guidelines) for Breast Cancer base its recommendations on the RS for patients with estrogen receptor (ER)–positive, HER2-negative tumors (henceforth referred to as ER-positive) that do not have axillary lymph node metastases.2 The 21-gene RS was developed as a predictive biomarker because it is intended to provide information on which patients are (or are not) likely to derive benefit from adjuvant chemotherapy.3,4 An important step in evaluating a cancer biomarker is to determine the clinical validity of the test, which is its ability to predict the clinically defined condition of interest.5 The 21-gene RS is considered clinically validated as a predictive biomarker if it identifies a group of patients with ER-positive breast cancer who do not benefit from adjuvant chemotherapy and another group who do.6–10

Differences in the underlying hazard rate of breast cancer death among patients with breast cancer according to race/ethnicity11 could influence the performance of the 21-gene RS in diverse populations, which may ultimately impact the clinical validity of the RS for racial and ethnic minority women. This concern prompted us to examine the prognostic accuracy of the RS12 using the SEER Oncotype DX database.13,14 We found that the RS has lower discriminatory accuracy for determining breast cancer–specific mortality in non-Hispanic Black (NHB) compared with non-Hispanic White (NHW) women with ER-positive, axillary node–negative tumors. This raises an important question regarding whether the 21-gene RS has been adequately validated as a predictive biomarker in racial and ethnic minority patients. Fewer than 10% of participants in the validation studies of the RS were NHB,4,15,16 underscoring this concern. A predictive biomarker that performs poorly in NHB women could perpetuate racial disparities in ER-positive breast cancer survival17 by misguiding treatment recommendations. This study examined the clinical validity of the RS as a predictive biomarker among racially and ethnically diverse women with ER-positive, axillary node–negative breast cancer. We conceptualized race and ethnicity as a social construct for this study.

Discussion

This study of ER-positive, axillary node–negative breast cancer validated the 21-gene RS as a predictive biomarker for NHB, Hispanic, and NHW women but not for API women. This is the first study we are aware of to validate a genomic predictive biomarker specifically for racial/ethnic minority women with breast cancer. Notably, we found significant racial differences in the association between chemotherapy treatment and breast cancer death; NHB women aged ≤50 years had a greater reduction in breast cancer death with chemotherapy compared with their NHW counterparts. An exploratory subgroup analysis among women aged ≤50 years suggested that there may be a reduction in breast cancer death with chemotherapy at a lower RS cutoff for NHB compared with NHW women. This finding is exploratory and needs to be confirmed in an adequately powered prospective study. If replicated, it would indicate that the RS threshold for recommending adjuvant chemotherapy may need to be lower for young NHB women than for women from other racial/ethnic groups in future practice guidelines.

Our results are broadly consistent with the main findings of the TAILORx randomized trial.8 A secondary analysis of that trial examined outcomes according to race and ethnicity for women with an intermediate-risk RS (11–25) and also found no improvement in survival with chemotherapy for any racial/ethnic group.16 Current NCCN Guideline recommendations offer chemotherapy as an option for premenopausal women with an RS ≥16, based on a lower rate of distant recurrence with chemotherapy for that subgroup in TAILORx.2,8 However, the TAILORx investigators did not report results according to race for women aged ≤50 years with an intermediate-risk RS, presumably due to the small sample size of young NHB women in that risk category (471 NHB women of all ages in the RS 11–25 category). We found a nonsignificant numeric reduction in breast cancer death with chemotherapy for NHB women aged ≤50 years with an RS of 16–25 (HR, 0.43; 95% CI, 0.11–1.64) but no apparent benefit for NHW women (HR, 0.92; 95% CI, 0.58–1.47). Despite a 7-fold larger sample size of NHB women with an intermediate-risk RS in our study compared with TAILORx16 (3,498 vs 471), the number of events was still small among young NHB women, and confidence intervals are wide, so we cannot draw any firm conclusions regarding racial differences in treatment effect for young women with an RS of 16–25.
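The link between small event counts and wide confidence intervals can be made concrete with a standard back-of-envelope calculation: the standard error of a log hazard ratio is approximately sqrt(1/d1 + 1/d2), where d1 and d2 are the event counts in the two arms. A minimal sketch (the event counts used here are purely illustrative and are not taken from this study):

```python
import math

def hr_ci(hr, events_a, events_b, z=1.96):
    """Approximate 95% CI for a hazard ratio from per-arm event counts.

    Uses the common approximation SE(log HR) ~= sqrt(1/d_a + 1/d_b).
    """
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)
    log_hr = math.log(hr)
    return (math.exp(log_hr - z * se), math.exp(log_hr + z * se))

# Hypothetical event counts: with few events the interval is wide
lo_few, hi_few = hr_ci(0.43, events_a=4, events_b=8)
# With ten times as many events, the same point estimate gives a
# much narrower interval
lo_many, hi_many = hr_ci(0.43, events_a=40, events_b=80)
```

With few events the interval spans values above and below 1, so a numerically favorable point estimate such as 0.43 remains compatible with no benefit, which is why no firm conclusion can be drawn from the small subgroup.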

Strengths of this study include the large number of women from racial and ethnic minority groups, the availability of data on breast cancer–specific death, the use of propensity score weighting to reduce confounding, and a population-based sample. However, this study has limitations. We were unable to determine whether differences in the use of adjuvant endocrine therapy across racial and ethnic groups influenced the results, because the SEER registry does not include data on the use of adjuvant endocrine therapy. However, TAILORx and a cooperative group trial reported that racial differences in the use of endocrine therapy do not explain the survival disparity among NHB women with ER-positive breast cancer.16,26 A limitation of every observational study is the possibility that variables other than the predictor variable affect the outcome and are unevenly distributed, leading to confounding.27 Propensity score weighting adjusts for confounding due to measured variables,20,27 but there is no analytic technique to adjust for confounding due to unmeasured variables.27 Only an adequately powered randomized trial can definitively determine the effect of chemotherapy on breast cancer survival.28 Chemotherapy data are incomplete in SEER.23,24 Our sensitivity analysis indicated that any bias introduced by misclassification of chemotherapy status would likely skew the results toward the null and should therefore not change the main findings. However, this is an important limitation of the analysis, and comparisons between racial and ethnic groups must be interpreted with caution. Confirmation is needed using datasets with more complete data on chemotherapy administration. HER2 status is unknown for patients included in SEER prior to 2010. The Oncotype test is not indicated for patients with HER2-positive tumors, so the rate of contamination of the analytic cohort by HER2-positive tumors is expected to be extremely low.
Based on the number of HER2-positive/borderline tumors among ER-positive patients with an Oncotype score available in SEER diagnosed in 2010 through 2015 (excluded from this analysis), we estimate that only 1.2% of the entire analytic cohort had HER2-positive tumors, which is highly unlikely to introduce any meaningful bias. Finally, a median follow-up time of 56 months is relatively short for a study of ER-positive breast cancer. Unfortunately, SEER has not updated the survival data for this specialty database.14

Conclusions

This population-based study clinically validated the RS as a predictive biomarker for NHB, Hispanic, and NHW women with ER-positive, axillary node–negative breast cancer, but it also raises the possibility that the RS may underestimate the benefit of chemotherapy for NHB women. If this finding is confirmed, the RS cutoff for recommending adjuvant chemotherapy for young NHB women with ER-positive, axillary node–negative breast cancer may need to be lower than that for other women. This study also underscores the need to account for the racial and ethnic diversity of the target population in the development and validation of cancer biomarkers.