Ancient Power Unlocked: Scientists Discover 2.5 Billion-Year-Old Bacterial Energy Source


In the late 1980s, scientist Bernhard Schink predicted that a microorganism could produce energy from phosphite. Decades later, a new species was discovered in a sewage plant, which proved his theory. This organism, which forms a new genus of bacteria, uses phosphite oxidation for energy, a process that could date back 2.5 billion years, providing insights into early biochemical evolution and potential life in extreme environments (Artist’s concept). Credit: SciTechDaily.com

Biologists from Konstanz have unveiled a unique and ancient phosphorus-based bacterial metabolism. Central to this discovery are four elements: an analytical calculation dating back to the 1980s, a modern sewage treatment facility, the identification of a novel bacterial species, and a remnant from around 2.5 billion years ago.

Our story begins at the end of the 1980s, with a sheet of paper. On this sheet, a scientist calculated that the conversion of the chemical compound phosphite to phosphate would release enough energy to produce the cell’s energy carrier – the ATP molecule. In this way, it should therefore be possible for a microorganism to supply itself with energy. Unlike most living organisms on our planet, this organism would not be dependent on energy supply from light or from the decomposition of organic matter.

The scientist actually succeeded in isolating such a microorganism from the environment. Its energy metabolism is based on the oxidation of phosphite to phosphate, just as predicted by the calculation. But how exactly does the biochemical mechanism work? Regrettably, the key enzyme needed to understand the biochemistry behind the process remained hidden – and thus the mystery remained unsolved for many years. For the following three decades, the sheet stayed in the drawer and the research approach was put on the back burner. Yet the scientist couldn’t get the thought out of his head.

The scientist is Bernhard Schink, a professor at the Limnological Institute of the University of Konstanz. Three decades after he made the calculation on paper, an unexpected discovery set the ball rolling again …

A sewage plant, an unexpected find, and a new species

What had been in the back of his mind for many years was finally found: of all places, in a sewage plant in Konstanz, only a few kilometers from Bernhard Schink’s laboratory. Zhuqing Mao, a biology doctoral researcher from Konstanz, examined a sewage sludge sample and discovered a second microorganism that also gets its energy from phosphite. The Konstanz biologists led by Bernhard Schink placed this bacterium in an environment in which it had only phosphite as a food source. And indeed: the bacterial population grew.

“This bacterium subsists on phosphite oxidation, and as far as we know, exclusively on this reaction. It covers its energy metabolism this way, and can build up its cell substance from CO2 at the same time,” explains Schink. “This bacterium is an autotrophic organism, like a plant. It does, however, not need light like a plant, as it draws its energy from phosphite oxidation”. Surprisingly, it turned out that the bacterium is not only a new species, but actually forms an entirely new genus of bacteria.

Tracking down the molecular mechanism

From that point on, things happened very quickly. A whole network of Konstanz researchers dedicated themselves to unraveling the mystery, including Bernhard Schink, Nicolai Müller, David Schleheck, Jennifer Fleming, and Olga Mayans. They produced a pure culture of this new bacterial strain, in which they were finally able to identify the key enzyme that triggers the oxidation of phosphite to phosphate.

“The breakthrough came with Nicolai Müller and his enzyme experiments”, says David Schleheck. Nicolai Müller succeeded in clearly demonstrating the enzyme’s activity, thereby uncovering the biochemical mechanism behind the key enzyme. Olga Mayans and Jennifer Fleming created a three-dimensional model of its enzyme structure and active center to understand the reaction pathway.

“What was very surprising was that during its oxidation, phosphite is apparently coupled directly to the energy-carrier precursor AMP, whereby the energy carrier ADP is created. In a subsequent reaction, two of the generated ADPs are converted to one ATP, on which the organism ultimately lives,” Nicolai Müller outlines the reaction pathway.
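
Read as chemistry, the pathway Müller describes amounts to two steps. The following is a simplified sketch of that description (electron acceptors and proton/water bookkeeping are omitted, so the equations are not balanced), not the exact scheme from the paper:

$$\text{HPO}_3^{2-}\ (\text{phosphite}) + \text{AMP} \longrightarrow \text{ADP} + 2\,e^{-}$$

$$2\,\text{ADP} \longrightarrow \text{ATP} + \text{AMP}$$

Note that the AMP consumed in the first step is regenerated in the second, so two rounds of phosphite oxidation net the cell one ATP.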

Finally, everything came together: The original sheet became a whole pile of papers, resulting in a publication in the scientific journal PNAS.

A remnant from 2.5 billion years ago

The discovery of a new type of energy metabolism is in itself a great scientific success. However, the research team thinks that this type of metabolism is by no means new, but very old, even ancient: around 2.5 billion years old.

“It is assumed that in the early days of evolution, when the Earth was cooling down, phosphorus was still present to a large extent in a partially reduced form and was only later gradually oxidized. The metabolism we have now discovered fits very well into the early phase of the evolution of microorganisms,” Bernhard Schink explains.

The biochemical mechanism that the bacterium uses for its metabolism is therefore not new, but has most probably been preserved from the primeval times of our planet: back when life on our planet began and the first microorganisms had to feed on inorganic compounds such as phosphite. Thus the new scientific findings provide clues to the early biochemical evolution on our planet. In addition, they provide the key to a biochemical mechanism that makes life possible in very hostile places, possibly even on alien planets.

Who would have thought at the end of the 1980s that a piece of paper would set all this in motion…

Reference: “AMP-dependent phosphite dehydrogenase, a phosphorylating enzyme in dissimilatory phosphite oxidation” by Zhuqing Mao, Jennifer R. Fleming, Olga Mayans, Jasmin Frey, David Schleheck, Bernhard Schink and Nicolai Müller, 3 November 2023, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2309743120

MIT’s new AI can make holograms in real-time


It’s efficient enough to run on smartphones — no supercomputers necessary.


Holograms could make virtual reality more immersive, improve our 3D printers, and even help doctors diagnose and treat patients — if only they weren’t so difficult to create.

“It’s often been said that commercially available holographic displays will be around in 10 years, yet this statement has been around for decades,” MIT researcher Liang Shi told MIT News.

Now, Shi and his colleagues at MIT have developed a technique to generate holograms in real-time — and it’s so efficient, it could be done on a laptop or smartphone.

What Is a Hologram?

Pop culture has made the term “hologram” synonymous with deceased rapper Tupac Shakur’s “performance” at Coachella 2012, but that wasn’t technically a hologram (it was a mirror-based optical illusion called “Pepper’s Ghost”).

A hologram is a flat image that appears to be three-dimensional.

Consider a standard photograph of an apple. No matter how you move your head or angle the photograph, it’s always going to look like the same flat image of an apple.

Flat pictures like this are created by recording the light waves reflecting off whatever is in a camera’s view when its shutter is clicked.

But a hologram of an apple would have depth. As you moved your head or the hologram itself, you’d feel like you were seeing new angles of the fruit.

Holograms use both the brightness and the phase of each light wave to give the viewer the sense that they’re looking at something three-dimensional — even though they aren’t.
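
In the language of wave optics (a standard description, not specific to MIT's method), the light arriving at each point $(x, y)$ of a display can be written as a complex amplitude:

$$U(x, y) = A(x, y)\, e^{\,i\varphi(x, y)}$$

A photograph records only the brightness $A$; the phase $\varphi$, which encodes how far each wavefront has traveled and therefore the scene's depth, is thrown away. A hologram controls both terms, which is why it can reproduce depth.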

A Hologram-Making AI

Traditionally, holograms were created using laser beams, but the images resulting from that technique could only be displayed as hard copies that were difficult to reproduce. The method couldn’t translate to video, either — only static images.

Computer-based techniques for making holograms can overcome those limitations, but they involve running physics-based simulations, which require a lot of processing power.


It could take a supercomputer cluster minutes to generate a single hologram, and the final result still might not be photorealistic.

To speed up the process while cutting down on the computational burden, the MIT team developed a new AI-based technique they call “tensor holography.”

The researchers started by creating a training dataset of 4,000 computer-generated images and their matching photorealistic holograms. Each of the images included the color and depth information for every pixel.

They then trained an AI with this data, teaching it how to generate a hologram for almost any given 2D image. By the time it was finished, it could produce a photorealistic hologram in milliseconds.
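
The article doesn't include the model itself, but the input/output structure it describes (per-pixel color and depth in, a hologram out) can be sketched as a small convolutional network. Everything below, from the class name to the layer sizes and the two-channel amplitude/phase output, is an illustrative assumption rather than the researchers' actual architecture:

```python
# A minimal sketch, assuming a CNN that maps a 4-channel RGB-D image
# (red, green, blue, depth) to a 2-channel hologram (amplitude, phase).
# Layer sizes are invented for illustration; this is not MIT's model.
import math
import torch
import torch.nn as nn

class ToyHologramNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, kernel_size=3, padding=1),  # -> amplitude, phase
        )

    def forward(self, rgbd):
        out = self.net(rgbd)
        amplitude = torch.sigmoid(out[:, :1])     # brightness in [0, 1]
        phase = torch.tanh(out[:, 1:]) * math.pi  # phase in [-pi, pi]
        return amplitude, phase

# One RGB-D frame in, an amplitude map and a phase map out.
model = ToyHologramNet()
rgbd = torch.rand(1, 4, 192, 192)  # batch, channels, height, width
amplitude, phase = model(rgbd)
print(amplitude.shape, phase.shape)  # [1, 1, 192, 192] each
```

A fully convolutional design like this also hints at why the memory footprint can be so small: the parameter count depends on the layers, not on the image size.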

“We are amazed at how well it performs,” researcher Wojciech Matusik told MIT News.

The AI requires less than 1 MB of memory — a fraction of what’s on most smartphones. The tech needed to calculate depth information also comes standard on many of today’s phones, meaning the devices could easily support the hologram-making process.

“It’s a considerable leap that could completely change people’s attitudes toward holography,” Matusik said. “We feel like neural networks were born for this task.”

New Drug Discovery Could Make Re-Growing Teeth Possible



Losing a tooth in adulthood is hard to remedy, since humans can renew their teeth only once, in childhood. After the milk teeth are replaced with permanent teeth, we lose the tooth renewal capability, and currently, when any part of a tooth dies, it cannot be brought back, except with artificial replacements.

Scientists at Kyoto University and the University of Fukui have come up with a new study that offers some hope — according to the paper published in Science Advances, an antibody for one gene, uterine sensitization associated gene-1 or USAG-1, was able to stimulate tooth growth in animal studies, Medical Xpress reports.

The researchers state that the molecules behind tooth development were already known. “The morphogenesis of individual teeth depends on the interactions of several molecules including BMP, or bone morphogenetic protein, and Wnt signaling,” explains Katsu Takahashi, one of the lead authors of the study and a senior lecturer at the Kyoto University Graduate School of Medicine. USAG-1 interacts with both BMP and Wnt, and the researchers knew that suppressing it would benefit tooth growth. “What we did not know was whether it would be enough.”

The researchers examined the effects of several monoclonal antibodies, a drug class often used to treat cancers and in vaccine development, directed against USAG-1. One of the antibodies was able to disrupt the interaction of USAG-1 with BMP only, without any side effects.

When the researchers experimented with this antibody, they found that BMP signaling is essential for determining the number of teeth in mice, and amazingly, a single administration was able to generate a whole new tooth.

The antibody was subsequently tested in ferrets as well, and these findings are promising since ferrets have dental patterns similar to those of humans. As the next step, the researchers want to test the antibodies on larger animals such as pigs and dogs.

“Conventional tissue engineering is not suitable for tooth regeneration. Our study shows that cell-free molecular therapy is effective for a wide range of congenital tooth agenesis,” concluded Manabu Sugai, another author of the study.

The Hidden Dangers of Statins: How Cholesterol Drugs May Damage Heart Health


Statins rank among the most prescribed drugs globally, taken by 1 in 4 adults over 45 in the U.S. alone[1]. For decades, patients have relied on these cholesterol-lowering medications to prevent heart disease. But emerging research indicates statins may inadvertently accelerate coronary artery calcification and impair heart muscle function[2],[3].

Statins: The Cholesterol Myth and A Billion Dollar Industry

The medical establishment heralded statins as heroes in the “war on heart disease” based on the disproved hypothesis that cholesterol accumulation is the primary driver of atherosclerosis. Statins suppress cholesterol production, which drug makers, national guidelines panels (largely run by pharma insiders)[4], and many doctors insist equates to cardiovascular disease protection, no matter the collateral damage incurred[5].

But as the American public pops over a quarter billion statin pills annually, evidence against this “cholesterol myth” continues mounting. No one disputes statins effectively lower cholesterol. The question is – at what cost for the promised benefit?

Over 300 Adverse Effects: A “Magic Pill” Too Good to Be True

As early as the 1990s, research revealed over 100 adverse health effects associated with statin medications – today numbering over 300[6], ranging from muscle damage to diabetes. Yet sales reached $25 billion in the U.S. last year alone[7].

But recent studies suggest statins may achieve the opposite of their intended effect by accelerating the coronary artery calcification that triggers heart attacks. One expert review describes statins potentially acting as “mitochondrial toxins” that damage muscles, including the heart itself[2].

Statins Deplete Heart & Cardio-Protective Nutrients

The study authors warn that long-term statin use inhibits coenzyme Q10 synthesis – critical for energy production in muscles[8]. Consider that the heart is a muscle that never stops exerting itself, and therefore has the highest requirement for energy synthesis and coenzyme Q10 in the body. It is no surprise, then, that studies link low CoQ10 to worsening heart failure[9]. Statins also deplete or impair vitamin K2[10], selenium[11], and minerals like zinc[12], which are proven to prevent vascular calcification and protect heart function[13]. Nutrient depletion apparently outweighs any purported anti-inflammatory effects on the cardiovascular risk/benefit scale.

Research: Do Statins Worsen Heart Disease Risks?

According to experts, most doctors fail to recognize statin-induced cardiomyopathy and instead attribute muscle damage symptoms to aging rather than drug toxicity[2]. Meanwhile, research focusing on heart muscle impacts continues to build:

  • A 2022 study published in Arteriosclerosis directly linked statins with accelerated coronary artery calcium deposition within vessels[3].
  • Another report found that patients who halted statins and supplemented the depleted coenzyme Q10 reversed stiffness and dysfunction in over half of those studied – confirming statins as the culprit[14].
  • A 2019 study warned statins accelerate calcification of heart valves and blockage of ventricular veins among other effects rarely monitored[15].
  • A 2017 study in Expert Review of Clinical Pharmacology concluded that “statin therapy can no longer be defended as the final word in prevention of cardiovascular disease”[16].

Who Benefits? Weighing Statin Heart Risks Versus Rewards 

Considering their demonstrated broad-spectrum toxicity and newly discovered adverse impacts on the heart itself, experts argue patients require fully informed consent before starting long-term statin use. With over 30 million Americans diagnosed with heart disease or type 2 diabetes at elevated CVD risk, demand continues growing[17], alongside expanding research on the cardiotoxic footprint of aggressively lowering lipids as national policy.

For individuals at high risk of vascular events due to uncontrolled hypertension, obesity or insulin resistance – dietary changes, exercise habits, and metabolism-regulating nutraceuticals (like berberine or fish oil) may optimize cholesterol levels without compromising coenzyme Q10 status long-term like statins[18]. Those undergoing short-term statin treatment post-heart attack require coenzyme Q10 replenishment to avoid cardiomyopathy. The conventional standard of care still largely favors limited statin use for secondary prevention – but the tide may be turning.

While the scale now tips towards statin avoidance as a precautionary approach, there are countless patients who are still not aware of the overblown benefits versus the underreported risks of cholesterol-lowering medications. But one certainty persists in light of accumulating safety signals: patients (and their doctors) deserve truth and transparency around the possibility that cholesterol-lowering pills may actually do more harm than good.


42 Factors That Affect Blood Glucose?! A Surprising Update


Adding 20 new factors, a whole new category on behavior and decisions, and research on unexpected things that impact blood sugar and diabetes.

Ever heard someone explain diabetes with a frustrating level of simplicity? “You’ll have on-target blood sugars as long as you eat right, exercise, and take your medicine.”

If only it were that easy, and if only vague advice was all we needed to hear.

One of our missions in diaTribe – and in this column – has been to debunk this myth.

Of all the topics I’ve covered in Adam’s Corner, one article has resonated with readers more than any other: “How many factors actually affect blood glucose?” Since its publication in 2014, over 400,000 people have viewed that initial list of 22 factors – quadruple the next most-viewed column (Low Carb vs. High Carb).

Why has it been so popular? I think the 22 factors really speak to the complexity of living with diabetes: even if I “eat right,” “exercise,” and “take my medication,” there are so many blood-sugar-related variables in play at any given time. Plus, all these factors interact in infinitely complicated ways: “What happens to my blood glucose after five hours of sleep, a low-carb breakfast, lots of exercise, high stress, and a big cup of coffee?”

I bring this up because it’s the 150th issue of diaTribe, and I’ve often thought about what I would add to that list of 22 factors if I were to revisit the article – especially given all I’ve learned since then from so many people with diabetes, healthcare providers, and scientists. This article expands on the original – sharing 20 additional factors and a whole new category of factors to consider. See the complete list of 42 factors below, followed by descriptions of the new additions.

I know what you’re thinking – 42 factors that affect blood glucose? Are you kidding?!

Yes, it is indeed daunting, but I also hope it’s a reminder of what each of us takes on daily: A LOT! Plus, this list reveals many levers we can pull when trying to improve. For my toolkit to manage some of this madness, get Bright Spots & Landmines as a free PDF (or name your own price) or at Amazon for $2-$6.

The arrows in the full list indicate the general effect these factors seem to have on blood glucose based on: (i) my own experience; (ii) available research; and (iii) what I’ve learned from others with diabetes. A sideways arrow indicates a neutral effect. Not every individual will respond in the same way (and even within the same person, responses may differ from day to day or over time). Certain factors may also apply more to type 1 vs. type 2 diabetes (or the other way around). Factors with both up and down arrows are of course the most challenging – they may increase blood glucose or decrease it. The best way to see how a factor affects you is through personal experience – check your blood glucose more often or wear CGM and look for patterns.

New FOOD Factors

#2

Carb type: ratio of fiber to total carbs, sugar to total carbs, and liquid vs. solid. When we published the original 22 factors piece, we only mentioned “carbohydrates” in the food category. But it should be two factors in one: the number of carbs AND the type of carbs. As I’ve written before, not all carbs are created equal. My personal major sources of carbs – green veggies, nuts, seeds, chia pudding, low-carb/high-fiber tortillas, Quest Bars, berries – tend to have 50%-80% of the carbs from fiber and very low sugar. For example, a serving of chia pudding has 20 grams of carbs, with 16 grams from fiber. Foods with a high fiber to total carbs ratio have a lower impact on my blood glucose versus foods with the same amount of total carbs and no fiber. In addition, the more grams of carbs that come from sugar, the higher the impact on blood glucose – even if total carbs are the same. Last, format also matters so much – liquid carbs will usually increase blood glucose more quickly than solid carbs, even if the overall carbs are equal.
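
To make the ratio concrete, here is a tiny illustrative helper (my own sketch, not a diaTribe tool), using the chia pudding numbers from the paragraph above:

```python
# Illustrative only: fraction of a food's total carbohydrates that comes
# from fiber, per the ratio discussed above.
def fiber_to_carb_ratio(total_carbs_g: float, fiber_g: float) -> float:
    """Return the share of total carbs contributed by fiber."""
    return fiber_g / total_carbs_g

# The chia pudding example: 20 g total carbs, 16 g of them from fiber.
print(f"{fiber_to_carb_ratio(20, 16):.0%} of carbs from fiber")  # -> 80%
```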

#7

Meal timing, especially dinner. I’ve found eating a large late-night dinner, no matter what’s in it, often results in high overnight blood sugars. The converse is also true: a lighter, earlier dinner seems to improve my overnight numbers. Separately, I’ve also tried intermittent fasting (“time-restricted eating”) more than 35 times now – first meal at 12 noon, last meal at 8pm. Stay tuned for a future column on this topic – this is in no way a recommendation, just noting something new that I’m trying. In short, meal timing is both a Bright Spot and a Landmine, depending on how this factor is implemented.

#8

Dehydration. Over the years, several diaTribe readers have emailed sharing that dehydration seems to increase their blood glucose levels. Indeed, in a randomized, controlled 2001 study, dehydration did raise blood glucose levels for those in a fasted state. The New York Times also reported in 2012 that dehydration increases levels of the hormone vasopressin, which pushes the liver to produce blood sugar – their headline was that drinking water can reduce the risk of diabetes! Luckily, it’s becoming easier and easier to find water, so this is a great way to be proactive. 

#9

Personal microbiome (gut bacteria)? There is a lot of ongoing research on the impact of gut bacteria (“microbiome”) on blood glucose levels and insulin sensitivity. Dr. Eran Segal’s lab at the Weizmann Institute has done some compelling work on this topic with CGM and microbiome data, summarized in this TED talk: What is the best diet for humans? They have published a number of high profile papers, including one on how gut bacteria predicts an individual’s glucose response to bread (Cell Metabolism 2017), another on personalized nutrition with CGM and microbiome data (Cell 2015), and a third on how artificial sweeteners affect the microbiome and glucose responses (Nature 2014).

New MEDICATION FACTORS

#13

Steroid administration (e.g., prednisone).  Our original 22 factors piece called out “medication interactions,” which technically would include steroids. But after many frustrated diaTribe reader messages on this topic, I really wanted to call this one out separately – it’s critical! Steroids like prednisone can significantly increase blood glucose levels, in part by telling the liver to increase glucose production. This Joslin article reports that once prednisone is stopped, blood glucose levels usually return to normal fairly quickly.

#14

Niacin (Vitamin B3). Another factor I learned from diaTribe readers, and indeed, studies show niacin does increase blood glucose levels modestly. Niacin is typically prescribed to improve blood lipid levels, including HDL cholesterol and triglycerides (i.e., to improve heart health). According to a Medscape article (based on a 2005 study), the increased blood glucose levels seen with niacin did not translate into worse heart outcomes. Still, this is another factor to keep in mind if your glucose levels are higher than expected.

New ACTIVITY Factors

#17

Level of fitness/training. Back when I started cycling in 2011, I would see dramatic drops in blood glucose. Now that I’m more accustomed to it, I see smaller glucose drops for the same amount of time cycling. This factor is especially important for someone starting a new activity (or starting any exercise); you may see profound blood glucose drops initially, which may get smaller over time as level of fitness improves.

#18

Time of day. I find morning exercise causes a smaller drop in blood glucose, at least relative to other times of day. I tend to be more insulin resistant in the morning, which could explain the effect. Kelly (diaTribe’s Editor-in-Chief) is the opposite and is much more insulin sensitive in the morning! Your mileage may vary, and understanding more about your morning sensitivity is especially possible through CGM.

#19

Food and insulin timing before exercise. Though this factor is also captured in others, it’s critical to call out in the exercise category – even with the exact same exercise on two identical days, how I time my food and insulin makes an enormous difference on post-exercise blood glucose. I discuss three big timing mistakes in Bright Spots & Landmines, and a common one is eating too close to starting exercise, which can cause low glucose during activity (food has not been absorbed) followed by a significant high afterwards. The other two insulin-related mistakes, not reducing bolus insulin enough and suspending basal insulin (pump) immediately before activity, are covered in chapter 3 of the book.

New BIOLOGICAL Factors

#22

Recent hypoglycemia. This phenomenon is sometimes called “hypoglycemia begets hypoglycemia.” For instance, if I’ve experienced hypoglycemia in the past 12 hours, I’m more likely to experience hypoglycemia again. One reason for this “vicious cycle,” as explained in a 1993 article by world renowned hypoglycemia expert Dr. Philip Cryer, is that recent hypoglycemia impairs the body’s defense mechanisms against lows. When another low comes up, it’s harder to recognize the symptoms and/or the body has a harder time avoiding it. This was seen in the study testing the MiniMed 530G’s low glucose suspend feature, as well as in at least one CGM study.

#23

Overnight blood sugars? My overnight blood sugars seem to have a big impact on my next-day time-in-range (70-140 mg/dl): if I spend all night high – especially over 180 mg/dl – I’m more likely to fight high blood sugars the whole next day. Conversely, when I spend most of the night in range, the next day gets off to a far better start, and I seem to spend more time in range. In the 22 factors article, I talked about the dawn phenomenon (4:00-5:00 AM rise in blood glucose from hormones), but my blood sugars over the course of the entire night sleeping are a distinct factor. Here’s an example from my own data last month that illustrates what I’ve seen: after a night spent high, 24-hour time-in-range was just 24%, versus 79% after a night mostly in range. Spending all night high made the next day a challenge. I could not find research on overnight time-in-range driving next-day time-in-range, but my own CGM data has confirmed this time and time again.

#27

Intramuscular insulin delivery. Here’s another one for people on insulin! In the previous 22 factors piece, I mentioned two injection/infusion set factors, but this third one is equally relevant. Injecting or pumping insulin into a muscular/low-body-fat area can increase the risk of hypoglycemia – especially if it happens before activity. For instance, when I go for a bike ride, I see a far bigger drop in blood sugar when my pump site is on my leg vs. other locations. My friends on injections have shared similar sentiments.

#31

Puberty. High levels of hormones secreted during puberty – growth hormone, testosterone, estrogen, cortisol – can increase insulin resistance. According to some estimates, adolescents with diabetes may need as much as 30%-50% more insulin than adults to keep their numbers within range. JDRF has a valuable teen toolkit with more details.

#32

Celiac disease. Untreated celiac, leading to a damaged small intestine, can increase the risk of hypoglycemia because the small intestine may no longer be able to absorb nutrients properly. Beyond Celiac also notes that untreated celiac may contribute to “irregular blood glucose levels.” According to the Celiac Disease Foundation, ~6% of people with type 1 diabetes have celiac, which is six times higher than the general population. There is no established link between type 2 diabetes and celiac disease.

New ENVIRONMENTAL Factors

#36

Outside temperature (especially for type 2 diabetes)? This is one I’ve heard from diaTribe readers, though most online articles narrowly focus on how hot weather leads to dehydration which leads to high blood glucose. (See the dehydration factor above.) There seems to be more to the story. This scientific review on body temperature regulation in diabetes (2016) summarizes a lot of research (192 citations!), and I was struck by some evidence that cold exposure can improve insulin sensitivity in type 2 diabetes (“it appears that mild cold exposure may be useful for the management of type 2 diabetes”). Some people with diabetes also report that sitting in the sun drops their blood glucose – this blog post from Columbia’s Dr. Jane K. Dickinson hypothesizes that the blood vessel dilation (expansion) from heat might be responsible (similar to the effect in a hot shower or hot tub). In the comments section, others confirm that cold weather drives their glucose numbers down – perhaps related to the body working harder (shivering) to stay warm. More research is clearly needed, especially to disentangle temperature from other related factors (exercise, hydration, meter accuracy, expired insulin), but there is more than a signal that outdoor temp is an independent biological factor that affects BG.

#37

Sunburn. diaTribe readers, along with many sources (CDC, Cleveland Clinic, ADA), note that sunburn stresses the body and can increase blood glucose. I covered “stress/illness” in the previous 22 factors, but sunburn is so common and non-obvious that I wanted to call it out specifically.

New Category! Behavioral and Decision-Making Factors

Our original article covered the above five categories of factors, but this sixth category is equally important: my behavior and decision-making really impact my blood glucose, both directly and indirectly, and for better and for worse. Here’s what I would add:  

#39

Frequency of glucose checks. More frequent glucose data ensures I’m driving my diabetes safely and able to steer my blood glucose back onto the road (in range) if I’m going high or low. Plus, glucose data provides useful feedback on what I did well or what I might do differently next time. In studies of both fingersticks and CGM, the more frequently someone obtains a glucose reading, the better they do (time-in-range, hypoglycemia, A1c, etc.) – see the plot that Abbott (Freestyle Libre) shared last year, showing the relationship between frequency of daily glucose checks vs. A1c and hypoglycemia in over 237,000 people. Some might believe that glucose-check frequency is simply a marker of diabetes motivation (correlation), but I personally disagree. Glucose data is a key factor in diabetes decision making and learning.

#40

Default options/choices. Small tweaks in the environment – or the way different options are presented – can really impact our choices. In turn, those choices directly impact our blood glucose, especially related to eating and insulin dosing. Here are two I think about a lot, based on great books I’ve read on this topic (Nudge, Switch, Mindless Eating, and The Undoing Project):

  • Plate/bowl size: As I noted in Bright Spots & Landmines, the same amount of food on a smaller plate looks more filling. This has implications for carb counting and the amount of food eaten at one time. At diaTribe, we recommend switching to 8-inch dinner plates. The same is also true of bowls in my experience – the larger the bowl, the more I will eat.
  • Visual prompts: When there is junk food sitting around – especially if it’s highly visible on the counter – I’m more likely to eat it. When bread is brought to the table at a restaurant, I’m more likely to have a nibble, and then two, and then finish half a loaf. When I store glucose tabs everywhere in my life, I’m far more likely to treat my low with a predictable correction, rather than opening the fridge and binging. In short, the visual prompts in my environment directly impact what I choose, driving big changes in blood glucose.

#41

Decision making biases. I think these five biases have an especially notable impact on choices and blood glucose levels – it might sound a bit geeky, but stay with me here! I’m also excited to know what you think on this, so please share comments here.

  • Present bias (Hyperbolic Discounting): When given two similar rewards, most of us prefer the reward that arrives sooner rather than later. Would you prefer $5 right now or $10 in three weeks? This is arguably the biggest challenge in diabetes – it’s easier to prioritize what feels good now (e.g., sugary foods), rather than the longer-term benefits of keeping blood sugars in range. This affects food choices in a huge way, but also motivation to exercise. It’s easy to make the wrong choice – and see resulting high blood sugars – when the feel-good benefits of doing so are immediate.
  • Loss aversion – We tend to prefer avoiding losses rather than acquiring equivalent gains: it’s better to not lose $10 than to find $10. Hypoglycemia is the best example of a loss that everyone with diabetes wants to avoid. This can lead to accepting more time with high blood sugars, especially for those without access to CGM or strips. Loss aversion impacts all sorts of diabetes choices, such as what glucose target to aim for (e.g., 150 vs. 100 mg/dl), when to correct a high blood sugar, what blood sugar to go to sleep with, and more.
  • Negativity bias – It’s easier to recall unpleasant memories compared with positive memories. This is the thesis of Bright Spots & Landmines – we all tend to focus on the things going wrong, which puts our focus on mistakes and self-blame. This focus on the negative is sometimes helpful, but can also lead to a lot of guilt, frustration, and depleted motivation. Finding Bright Spots – those things that are working and should be replicated – helps overcome the negativity bias and has had a big positive impact on my blood sugars.  
  • Selective matching – People often perceive patterns where none exist. For those with arthritis, this might mean looking for changes in the weather when experiencing increased pain, but paying little attention to the weather when pain is stable. As Dr. Amos Tversky summarized in 1995, “[A] single day of severe pain and extreme weather might sustain a lifetime of belief in a relation between them.” Since there are so many factors that affect blood glucose, it’s easy to draw inaccurate conclusions about cause-and-effect. For instance, the one time I ate ____, my blood sugar stayed in-range – therefore, this is a great food to eat. In reality, I always try to confirm my blood-sugar hypotheses with multiple experiments and tests.
  • Representative bias – When evaluating a situation, the most immediate examples that come to mind often drive the decision – even if those aren’t representative of the real trend. For instance, after one night spent with low blood sugars, I’m often tempted to change my whole insulin regime, even if the previous six nights had no lows.

#42

Family relationships/social pressures. At the recent JDRF Mission Summit, psychologist Dr. Weissberg-Benchell pointed out the link between diabetes distress and blood glucose outcomes. Interestingly, she mentioned how high levels of family conflict and family distress are linked to higher A1c’s in kids/teens with diabetes. On this point I’d add social pressures, as I mentioned in a recent column about choosing what to eat at a pizza restaurant. To me, this is another factor worth calling out – a challenging family or social environment can directly impact blood sugars. 

Gut-Brain Circuits For Sugar and Fat Cravings Combine to Trigger Overeating


Understanding why we overeat unhealthy foods has been a long-standing mystery. And while we recognize that food acts as a strong reinforcer that guides our food decisions, the precise circuitry in our brains behind this is unclear. A study headed by scientists at Monell Chemical Senses Center has now unraveled internal neural wiring, revealing separate fat and sugar craving pathways, and it points to a concerning result: combining these pathways triggers our desire to eat more than usual.

The newly reported research provides new insights into what controls “motivated” eating behavior, suggesting that a subconscious internal desire to consume a diet high in both fats and sugar has the potential to counteract dieting efforts. The findings may also offer up a viable therapeutic target for obesity. “Food is nature’s ultimate reinforcer,” said Monell research lead Guillaume de Lartigue, PhD. “But why fats and sugars are particularly appealing has been a puzzle. We’ve now identified that nerve cells in the gut, rather than taste cells in the mouth, are a key driver. We found that distinct gut-brain pathways are recruited by fats and sugars, explaining why that donut can be so irresistible.”

De Lartigue and colleagues reported on their results in Cell Metabolism, in a paper titled “Separate gut-brain circuits for fat and sugar reinforcement combine to promote overeating.” In their paper the team concluded, “These data provide additional support that there are separable circuits for fat and sugar reinforcement and suggest that foods rich in both fats and sugars combine to additively recruit both sugar and fat reward circuits, promoting higher levels of motivation to consume obesogenic diets.”

Food is a powerful natural reinforcer that guides feeding decisions, the authors wrote. “The sharp rise in global obesity rates has been attributed to changes in the food environment that promote overconsumption of palatable calorie-dense foods that are rich in fats and sugars.”  The vagus nerve sends internal sensory information from the gut to the brain about the nutritional value of food. But, the molecular basis of the reward in the brain associated with what we eat has been incompletely understood. “Fats and sugars both increase vagal firing in nerve recording experiments,” they noted, “however, it remains unknown whether different subpopulations of vagal neurons sense specific macronutrients.”

For their reported study the team used cutting-edge technology to directly manipulate fat or sugar neurons in the vagus nerve system in mice, and demonstrated that both types of neurons cause a dopamine release in the brain’s reward center. They discovered two dedicated vagus nerve pathways: one for fats and another for sugars. These circuits, originating in the gut, relay information about what we have eaten to the brain, setting the stage for cravings.  Their data, they noted, “… clearly indicate that fats and sugars recruit separate peripheral vagal circuits that convey information about the type, and possibly concentration, of nutrients from the gut to the brain.”

In this illustration, fat, sugar, and the combination of both (chocolate) navigate a gut-brain maze. The blue path represents the sugar route, the green path signifies the fat route, and the yellow path represents the combined impact of fats and sugars. Each path leads to the brain, but the combined route has a greater impact, triggering heightened dopamine release in the reward circuits, emphasizing the synergistic effect of fat-sugar combinations on neural responses. [Isadora Braga, de Lartigue lab, Monell Center]

To determine how fats and sugars affect the brain, the team stimulated gut vagal nerves with light. This, in turn, induced the mice to actively seek stimuli, in this case food, that engage these circuits. The results indicated that sugar and fat are sensed by discrete neurons of the vagus nerve and engage parallel but distinct reward circuits to control nutrient-specific reinforcement. “Using activity-dependent genetic capture of vagal neurons activated in response to gut infusions of nutrients, we demonstrate the existence of separate gut-brain circuits for fat and sugar sensing that are necessary and sufficient for nutrient-specific reinforcement,” they stated.

But the story doesn’t end there. The team also found that simultaneously activating both the fat and sugar circuits creates a powerful synergy. “The activation of separate fat or sugar circuits can increase motivated feeding behavior, but strikingly, the combination of fats and sugars synergize to supra-additively increase the activity of neurons along the gut-reward circuit, increase dopamine release, and promote overeating independently of calories,” they stated. “It’s like a one-two punch to the brain’s reward system,” said de Lartigue. “Even if the total calories consumed in sugar and fats stays the same, combining fats and sugars leads to significantly more dopamine release and, ultimately, overeating in the mice.”

This finding sheds light on why dieting can be so challenging. Human brains may be subtly programmed to seek out high-fat, high-sugar combinations, regardless of conscious efforts to resist. “The communication between our gut and brain happens below the level of consciousness,” said de Lartigue. “We may be craving these types of food without even realizing it.”

The authors further explained, “Our data support the idea that increased caloric intake of Western diet results from hijacking interoceptive circuits that separately reinforce fats and sugars. Because interoceptive signaling occurs below the level of consciousness, the motivation to consume obesogenic diets may occur without cognitive perception.”

The team predicts that this line of research offers hope for future development of anti-obesity strategies and treatments. Targeting and regulating gut-brain reward circuits could offer a novel approach to curb unhealthy eating habits. “…manipulating interoceptive gut-reward circuits may provide a viable therapeutic target for treating obesity by promoting voluntary reduction in the consumption of obesogenic diets,” they concluded. “Understanding the wiring diagram of our innate motivation to consume fats and sugars is the first step towards rewiring it,” added de Lartigue. “This research unlocks exciting possibilities for personalized interventions that could help people make healthier choices, even when faced with tempting treats.”

14 Evidence-Based Medicinal Properties of Coconut Oil


While coconut oil has dragged itself out of the muck of vast misrepresentation over the past few years as a ‘deadly saturated fat,’ it still does not get the full appreciation it truly deserves. Not just a “good” fat, coconut oil is an exceptional healing agent as well, with loads of useful health applications.

Some examples of this “good” saturated fat’s therapeutic properties include: 

  • Fat-Burning: Ironic, isn’t it? A saturated fat which can accelerate the loss of midsection fat (the most dangerous kind). Well, there are now two solid, human studies showing just two tablespoons a day (30 ml), in both men and women, is capable of reducing belly fat within 1-3 months.
  • Brain-Boosting: A now famous study, published in 2006 in the journal Neurobiology of Aging, showed that the administration of medium chain triglycerides (most plentifully found in coconut oil) in 20 subjects with Alzheimer’s disease or mild cognitive impairment, resulted in significant increases in ketone bodies (within only 90 minutes after treatment) associated with measurable cognitive improvement in those with less severe cognitive dysfunction.[i]
  • Clearing Head Lice: When combined with anise spray, coconut oil was found to be superior to the insecticide permethrin (0.43%).[ii]
  • Healing Wounds: Coconut has been used for wound healing since time immemorial. Three of the identified mechanisms behind these healing effects are its ability to accelerate re-epithelialization, improve antioxidant enzyme activity, and stimulate higher collagen cross-linking within the tissue being repaired.[iii] Coconut oil has even been shown to work synergistically with traditional treatments, such as silver sulphadiazine, to speed burn wound recovery.[iv]
  • NSAID Alternative: Coconut oil has been demonstrated to have anti-inflammatory, analgesic and fever-reducing properties.[v]
  • Anti-Ulcer Activity: Interestingly, coconut milk (which includes coconut oil components), has been shown to be as effective as the conventional drug sucralfate as an NSAID-associated anti-ulcer agent.[vi]
  • Anti-Fungal: In 2004, 52 isolates of Candida species were exposed to coconut oil. The most notorious form, Candida albicans, was found to have the highest susceptibility. Researchers remarked: “Coconut oil should be used in the treatment of fungal infections in view of emerging drug-resistant Candida species.”[vii]
  • Testosterone-Booster: Coconut oil was found to reduce oxidative stress in the testes of rats, resulting in significantly higher levels of testosterone.[viii]
  • Reducing Swollen Prostate: Coconut oil has been found to reduce testosterone-induced benign prostate growth in rats.[ix]
  • Improving Blood Lipids: Coconut oil consistently improves the LDL:HDL ratio in the blood of those who consume it. Given this effect, coconut oil can no longer be dismissed as ‘that saturated fat which clogs the arteries.’
  • Fat-Soluble Nutrient Absorption: Coconut oil was found to be superior to safflower oil in enhancing tomato carotenoid absorption.[x]
  • Bone Health: Coconut oil has been shown to reduce oxidative stress within the bone, which may prevent structural damage in osteoporotic bone.[xi] [Note: Osteoporosis is a Myth, as presently defined by the T-Score]
  • Sunscreen: Coconut oil has been shown to block out UV rays by 30%. Keep in mind that this is good, insofar as UVA rays are damaging to the skin, whereas UVB rays are highly beneficial (when exposure is moderate).[i] Make sure to check this list of other sun-blocking oils.
  • Insect Repellant: Amazingly, coconut oil was found to be more effective than DEET at repelling insects. Read our article on the topic here: Coconut Oil Beats Toxic DEET at Repelling Insects.

Of course, when speaking about coconut oil, we are only looking at one part of the amazing coconut palm. Each component, including coconut hull fiber, coconut protein, and coconut water, has experimentally confirmed therapeutic applications.

Osteoporosis Is Scurvy of the Bone, Not Calcium Deficiency




“A joyful heart is good medicine, but a broken spirit dries up the bones.”~Proverbs 17:22

It saddens me to see older women diagnosed with “osteopenia” or “osteoporosis” listening to their doctors and taking supplemental calcium and even problematic drugs called bisphosphonates. These are irrational, dogmatic, harmful approaches to the problem of degrading bone as we age. In my time practicing nephrology and internal medicine, I saw numerous patients suffering from vascular disease while taking the recommended doses of calcium. X-rays revealed perfect outlines of calcified blood vessels and calcified heart valves. 


Pictured here is a calcified breast artery, often seen in women who are being treated for hypertension. The primary drug used in high blood pressure, a thiazide diuretic, causes the body to retain calcium and lose magnesium and potassium. We incidentally note these types of calcifications in the large arteries of the entire body, not just the breasts. I believe these problems are avoidable.

The matrix of bone will incorporate calcium and nutrients where they belong as long as the proper hormones and nutrients are present. Needless to say, gravitational force in the form of weight-bearing exercise is essential and should be the foundation of a healthy skeleton. Don’t be afraid to exercise with some weight in a backpack if you have no disk disease or low back pain.

You still have to look at what you can do nutritionally, and in interpersonal relationships, to help your body heal itself. Supplements are no replacement for good nutrition. After all, scientists are constantly discovering new things about food and its interaction with the body.

The first thing to do is either google or look in your reference books to find foods rich in vitamin C, vitamin K2, magnesium, and minor minerals such as boron and silica, which is also important for bones. Remember, too, that depression has many causes. Sometimes the cause can be nutritional deficiencies, and sometimes depression can result from entrapment in unhealthy family dynamics. Controversially, I would also say that depression can have spiritual origins.

But if time feels of the essence, then supplementation is one route that could be taken. While the medical profession supplements with calcium and Fosamax, in my opinion a more constructive supplementation regimen could include vitamin C, vitamin K2, vitamin D3 (in winter months; sun in summer), plus boron, silica, and magnesium. These are all far more important to preventing fracture and keeping bone healthy than calcium.

Calcium will ultimately land in the muscles of the heart, the heart valves, and the blood vessels, leading to cardiovascular disease. However, if you are getting enough vitamins C, D3, and K2, your body will direct the calcium you ingest from your food to where it belongs, not to your heart and blood vessels.

Vitamin C does several things to strengthen bones:

  1. It mineralizes the bone and stimulates bone-forming cells to grow.
  2. It prevents too much degradation of bone by inhibiting bone-absorbing cells.
  3. It dampens oxidative stress, which is what aging is.
  4. It is vital in collagen synthesis.

When vitamin C is low, just the opposite happens. Bone cells that degrade bone, called osteoclasts, proliferate, and bone cells that lay down mineral and new bone, called osteoblasts, are not formed.

Studies have shown that elderly patients who fractured bones had significantly lower levels of vitamin C in their blood than those who hadn’t fractured.[1] Bone mineral density – the thing the tests measure – is higher in those who supplement with vitamin C, independent of estrogen level.[2],[3]

Vitamin K2 is well known among holistic practitioners to be important in cardiovascular and bone health. Supplementing this is also a good idea if bone or heart issues are a concern. Read more here.

And of course, good old vitamin D3, with a level around 50-70 ng/ml, will help keep the immune system functioning well and the bones strong.

This may seem like a lot of supplementing, yet to me it is a worthwhile endeavor that will keep much more than the bones strong. These days, getting enough vitamin C is not so easy with diet alone. With the toxic load we all carry, even with the most pristine diets, we require more vitamin C internally than our ancestors did. Adults would do well to take 2-5 grams per day of sodium ascorbate as a general supplement. If you have active kidney stones or kidney disease, please check with your doctor first.

Humans, monkeys, and guinea pigs don’t make any vitamin C, which leaves us on our own to get our needs met. Cats, weighing only about 10-15 pounds, synthesize more than 15 times the RDA of vitamin C recommended for humans. Goats are about the size of an adult human, and under no stress they synthesize 13 grams per day; under stress that can rise to 100 grams. Do not fear taking vitamin C. It is one of the most non-toxic and safe supplements known. Use liposomal vitamin C, sodium ascorbate, or ascorbic acid, never Ester-C or calcium ascorbate. If you prefer a natural plant-based source, camu-camu is very high in C; however, its harvest does threaten the rainforest.

Depression: It’s Not Your Serotonin


Depression: It's Not Your Serotonin

Millions believe depression is caused by ‘serotonin deficiency,’ but where is the science in support of this theory?

“Depression is a serious medical condition that may be due to a chemical imbalance, and Zoloft works to correct this imbalance.”

Herein lies the serotonin myth.

Living in one of only two countries in the world that permit direct-to-consumer drug advertising, you have undoubtedly been subjected to promotion of the “cause of depression.” A cause that is not your fault, but rather a matter of too few little bubbles passing between the hubs in your brain! Don’t add that to your list of worries, though, because there is a convenient solution awaiting you at your doctor’s office…

What if I told you that, in 6 decades of research, the serotonin (or norepinephrine, or dopamine) theory of depression and anxiety has not achieved scientific credibility?

You’d want some supporting arguments for this shocking claim.

So, here you go:

The Science of Psychiatry is Myth

Rather than some embarrassingly reductionist, one-deficiency-one-illness-one-pill model of mental illness, contemporary exploration of human behavior has demonstrated that we may know less than we ever thought we did. And that what we do know about root causes of mental illness seems to have more to do with the concept of evolutionary mismatch than with genes and chemical deficiencies.

In fact, even with a meta-analysis of over 14,000 patients in hand, Dr. Insel, head of the NIMH, had this to say:

“Despite high expectations, neither genomics nor imaging has yet impacted the diagnosis or treatment of the 45 million Americans with serious or moderate mental illness each year.”

To understand what imbalance is, we must know what balance looks like, and neuroscience, to date, has not characterized the optimal brain state, nor how to even assess for it.

A New England Journal of Medicine review on Major Depression stated:

“… numerous studies of norepinephrine and serotonin metabolites in plasma, urine, and cerebrospinal fluid as well as postmortem studies of the brains of patients with depression, have yet to identify the purported deficiency reliably.”

The data has poked holes in the theory and even the field of psychiatry itself is putting down its sword. One of my favorite essays by Lacasse and Leo has compiled sentiments from influential thinkers in the field – mind you, these are conventional clinicians and researchers in mainstream practice – who have broken rank, casting doubt on the entirety of what psychiatry has to offer around antidepressants:


Humble Origins of a Powerful Meme

In the 1950s, reserpine, initially introduced to the US market as an antihypertensive, was noted to deplete brain serotonin stores in subjects, with resultant lethargy and sedation. These observations colluded with the clinical note that an anti-tuberculosis medication, iproniazid, invoked mood changes after five months of treatment in 70% of a 17-patient cohort. Finally, Dr. Joseph Schildkraut threw fairy dust on these mumbles and grumbles in 1965 with his hypothetical manifesto entitled “The Catecholamine Hypothesis of Affective Disorders,” stating:

“At best, drug-induced affective disturbances can only be considered models of the natural disorders, while it remains to be demonstrated that the behavioral changes produced by these drugs have any relation to naturally occurring biochemical abnormalities which might be associated with the illness.”

Contextualized by the ripeness of a field struggling to establish biomedical legitimacy (beyond the therapeutic lobotomy!), psychiatry was ready for a rebranding, and the pharmaceutical industry was all too happy to partner in the effort.

Of course, the risk inherent in “working backwards” in this way (noting effects and presuming mechanisms) is that we tell ourselves that we have learned something about the body, when in fact, all we have learned is that patented synthesized chemicals have effects on our behavior. This is referred to as the drug-based model by Dr. Joanna Moncrieff. In this model, we acknowledge that antidepressants have effects, but that these effects in no way are curative or reparative.

The most applicable analogy is that of the woman with social phobia who finds that drinking two cocktails eases her symptoms. One could imagine how, in a 6-week randomized trial, this “treatment” could be found efficacious and recommended for daily use and even prevention of symptoms – and how her withdrawal symptoms after 10 years of daily compliance could lead those around her to believe that she “needed” the alcohol to correct an imbalance. This analogy is all too close to the truth.

Running With Broken Legs

Psychiatrist Dr. Daniel Carlat has said:

“And where there is a scientific vacuum, drug companies are happy to insert a marketing message and call it science. As a result, psychiatry has become a proving ground for outrageous manipulations of science in the service of profit.”

So, what happens when we let drug companies tell doctors what science is? We have an industry and a profession working together to maintain a house-of-cards theory in the face of contradictory evidence.

We have a global situation in which increases in prescribing are resulting in increases in the severity of illness (including the number and length of episodes) relative to those who have never been treated with medication.

To truly appreciate the breadth of evidence suggesting that antidepressants are ineffective and unsafe, we have to get behind the walls that the pharmaceutical companies erect. We have to unearth the unpublished data – data they were hoping to keep in the dusty catacombs.

A now-famous 2008 study in the New England Journal of Medicine by Turner et al. sought to expose the extent of this data manipulation. They demonstrated that, from 1987 to 2004, 12 antidepressants were approved on the basis of 74 studies. Thirty-eight were positive, and 37 of these were published. Thirty-six were negative (showing no benefit): 3 of these were published as such, 11 were published with a positive spin (always read the data, not the authors’ conclusions!), and 22 were unpublished.
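To see what those counts do to the published record, here is a back-of-the-envelope tally – a minimal sketch in Python, using only the figures cited above – comparing how positive the literature looks to a journal reader with how positive the full registered evidence base actually was:

```python
# Tally of the Turner et al. (2008) counts cited above.
# The "apparent" rate counts the 11 negative-but-spun trials as positive,
# since that is how they read in print.

total_trials = 74
positive = 38                        # trials judged positive
positive_published = 37
negative_published_as_negative = 3   # of the 36 negative trials
negative_spun_positive = 11
negative_unpublished = 22

published = (positive_published + negative_published_as_negative
             + negative_spun_positive)
apparent_rate = (positive_published + negative_spun_positive) / published
true_rate = positive / total_trials

print(f"Published trials: {published} of {total_trials}")
print(f"Positive rate in the published literature: {apparent_rate:.0%}")
print(f"Positive rate across all registered trials: {true_rate:.0%}")
```

Roughly 94% of what a clinician could actually read looked positive, while barely half (51%) of the trials actually run were.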

In a 1998 tour de force, Dr. Irving Kirsch, an expert on the placebo effect, published a meta-analysis of 3,000 patients who were treated with antidepressants, psychotherapy, placebo, or no treatment, and found that only 27% of the therapeutic response was attributable to the drug’s action.

This was followed up by a 2008 review, which invoked the Freedom of Information Act to obtain access to unpublished studies, finding that, when these were included, antidepressants outperformed placebo in only 20 of 46 trials (less than half!), and that the overall difference between drugs and placebos was 1.7 points on the 52-point Hamilton scale. This small increment is clinically insignificant, and likely accounted for by medication side effects strategically employed (sedation or activation).
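For a sense of scale, a quick calculation puts that 1.7-point gap in context. The 3-point bar for clinical significance used below is the threshold commonly attributed to NICE and cited by Kirsch; treat it as an assumption rather than a universal standard:

```python
# Sizing the 1.7-point drug-placebo difference reported above.

hamilton_scale_max = 52    # maximum score on the Hamilton scale, per the review
drug_placebo_gap = 1.7     # mean drug-placebo difference reported
clinical_threshold = 3.0   # assumed bar for a clinically meaningful difference

print(f"Gap as a share of the full scale: {drug_placebo_gap / hamilton_scale_max:.1%}")
print(f"Gap as a share of the threshold:  {drug_placebo_gap / clinical_threshold:.0%}")
```

That is about 3% of the scale’s range, and only just over half of the assumed threshold for a clinically meaningful difference.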

When active placebos were used, a Cochrane review found that differences between drugs and placebos disappeared, giving credence to the assertion that inert placebos inflate perceived drug effects.

The finding of a tremendous placebo effect in the treatment groups was also echoed in two different meta-analyses by Khan et al., who found a 10% difference between placebo and antidepressant efficacy, and comparable suicide rates. A trial examining the role of “expectancy,” or belief in antidepressant effect, found that patients lost their perceived benefit if they believed they might be getting a sugar pill, even if they were continued on their formerly effective treatment dose of Prozac.

The largest non-industry-funded study, costing the public $35 million, followed 4,000 patients treated with Celexa (not blinded, so they knew what they were getting) and found that half of them improved at 8 weeks. Those who didn’t were switched to Wellbutrin, Effexor, or Zoloft, or “augmented” with Buspar or Wellbutrin.

Guess what? It didn’t matter what was done, because they remitted at the same unimpressive rate of 18–30% regardless, with only 3% of patients in remission at 12 months.

How could it be that medications like Wellbutrin, which purportedly primarily disrupt dopamine signaling, and medications like Stablon, which theoretically enhances the reuptake of serotonin, both work to resolve this underlying imbalance? Why would thyroid hormone, benzodiazepines, beta blockers, and opiates also “work”? And what does depression have in common with panic disorder, phobias, OCD, eating disorders, and social anxiety that all of these diagnoses would warrant the same exact chemical fix?

Alternative Options

As a holistic clinician, one of my bigger pet peeves is the use of amino acids and other nutraceuticals with “serotonin-boosting” claims. The integrative practitioners who recommend them have taken a page from the allopathic playbook and are seeking to copycat what they perceive antidepressants to be doing.

The foundational “data” for the modern serotonin theory of mood come from tryptophan depletion methods, which involve feeding volunteers amino acid mixtures lacking tryptophan, and which are rife with complications of interpretation.

Simply put, there has never been a study that demonstrates that this intervention causes mood changes in any patients who have not been treated with antidepressants.

In an important paper entitled “Mechanism of acute tryptophan depletion: Is it only serotonin?”, van Donkelaar et al. caution clinicians and researchers about the interpretation of tryptophan research. They clarify that there are many potential effects of this methodology, stating:

“In general, several findings support the fact that depression may not be caused solely by an abnormality of 5-HT function, but more likely by a dysfunction of other systems or brain regions modulated by 5-HT or interacting with its dietary precursor. Similarly, the ATD method does not seem to challenge the 5-HT system per se, but rather triggers 5HT-mediated adverse events.”

So if we cannot confirm the role of serotonin in mood and we have good reason to believe that antidepressant effect is largely based on belief, then why are we trying to “boost serotonin”?

Causing Imbalances

All you have to do is spend a few minutes on https://survivingantidepressants.org/ or https://beyondmeds.com/ to appreciate that we have created a monster. Millions of men, women, and children the world over are suffering as they attempt to discontinue psychiatric meds without clinical guidance (because this is NOT a part of medical training). I have been humbled, as a clinician who seeks to help these patients, by what these medications are capable of. Psychotropic withdrawal can make alcohol and heroin detox look like a breeze.

An important analysis by the former director of the NIMH argues that antidepressants “create perturbations in neurotransmitter functions,” causing the body to compensate through a series of adaptations which occur after “chronic administration,” leading to brains that function, after a few weeks, in a way that is “qualitatively as well as quantitatively different from the normal state.”

Changes in beta-adrenergic receptor density, serotonin autoreceptor sensitivity, and serotonin turnover are all part of the body’s struggle to compensate for the assault of the medication.

Andrews et al. call this “oppositional tolerance,” demonstrating through a careful meta-analysis of 46 studies that a patient’s risk of relapse is directly proportional to how “perturbing” the medication is, and is always higher than with placebo (44.6% vs. 24.7%). They challenge the notion that findings of decreased relapse on continued medication represent anything other than a drug-induced response to discontinuation of a substance to which the body has developed tolerance. They go a step further to add:

“For instance, in naturalistic studies, unmedicated patients have much shorter episodes, and better long-term prospects, than medicated patients. Several of these studies have found that the average duration of an untreated episode of major depression is 12–13 weeks.”

Harvard researchers also concluded that at least fifty percent of drug-withdrawn patients relapsed within 14 months. In fact:

“Long-term antidepressant use may be depressogenic . . . it is possible that antidepressant agents modify the hardwiring of neuronal synapses (which) not only render antidepressants ineffective but also induce a resident, refractory depressive state.”

So, when your doctor says, “You see, look how sick you are, you shouldn’t have stopped that medication,” you should know that the data suggests that your symptoms are withdrawal, not relapse.

Longitudinal studies demonstrate poor functional outcomes for those treated, with 60% of patients still meeting diagnostic criteria at one year (despite transient improvement within the first 3 months). When baseline severity is controlled for, two prospective studies support a worse outcome in those prescribed medication:

One in which the never-medicated group experienced a 62% improvement by six months, whereas the drug-treated patients experienced only a 33% reduction in symptoms; and a WHO study of depressed patients in 15 cities, which found that, at the end of one year, those who weren’t exposed to psychotropic medications enjoyed much better “general health,” that their depressive symptoms were “much milder,” and that they were less likely to still be “mentally ill.”

I’m not done yet. In a retrospective 10-year study in the Netherlands, 76% of those with unmedicated depression recovered without relapse, compared with 50% of those treated.

Unlike the mess of contradictory studies around short-term effects, there are no comparable studies that show a better outcome in those prescribed antidepressants long term.

First Do No Harm

So, we have a half-baked theory in a vacuum of science that the pharmaceutical industry raced to fill. We have the illusion of short-term efficacy and assumptions about long-term safety. But are these medications actually killing people?

The answer is yes.

Unequivocally, antidepressants cause suicidal and homicidal behavior. The Russian roulette of which patients are vulnerable to these “side effects” is only beginning to be elucidated, and may have something to do with genetic variants affecting the metabolism of these chemicals. Dr. David Healy has worked tirelessly to expose the data that implicate antidepressants in suicidality and violence, maintaining a database for reporting, writing, and lecturing about cases of medication-induced death that could make your soul wince.

What about our most vulnerable?

I have countless patients in my practice who report new onset of suicidal ideation within weeks of starting an antidepressant. In a population where there are only 2 randomized trials, I have grave concerns about postpartum women who are treated with antidepressants before more benign and effective interventions such as dietary modification and thyroid treatment. Hold your heart as you read through these reports of women who took their own and their children’s lives while treated with medications.

Then there is the use of these medications in children as young as 2 years old. How did we ever get the idea that this was a safe and effective treatment for this demographic? Look no further than data like Study 329, which cost GlaxoSmithKline $3 billion for its efforts to promote antidepressants to children. These efforts relied on ghost-written and manipulated data that suppressed a signal of suicidality, falsely represented Paxil as outperforming placebo, and contributed to an irrepressible mountain of harm done to our children by the field of psychiatry.

RIP Monoamine Theory

As Moncrieff and Cohen so succinctly state:

“Our analysis indicates that there are no specific antidepressant drugs, that most of the short-term effects of antidepressants are shared by many other drugs, and that long-term drug treatment with antidepressants or any other drugs has not been shown to lead to long-term elevation of mood. We suggest that the term “antidepressant” should be abandoned.”

So, where do we turn?

The field of psychoneuroimmunology now dominates the research, an iconic example of how medicine must surpass its own simplistic boundaries if we are going to begin to chip away at the roughly 50% of Americans who will struggle with mood symptoms, 11% of whom will be medicated for them.

There are times in our evolution as a cultural species when we need to unlearn what we think we know. We have to move out of the comfort of certainty and into the freeing light of uncertainty. It is from this space of acknowledged unknowing that we can truly grow. From my vantage point, this growth will encompass a sense of wonder – both a curiosity about what symptoms of mental illness may be telling us about our physiology and spirit, and a sense of humbled awe at all that we do not yet have the tools to appreciate. For this reason, honoring our co-evolution with the natural world, and sending the body a signal of safety through movement, diet, meditation, and environmental detoxification, represent our most primal and most powerful tools for healing.