Mediterranean Diet Awarded the Best Diet Plan 6 Years in Succession


(Shutterstock)

Each year, U.S. News & World Report ranks the best diet plans. This year marks the sixth consecutive year that the Mediterranean diet has been ranked the best diet overall.

The Mediterranean diet is a healthy departure from our typical modern diet.

Daily Diet Composition

The typical diet includes:

  1. Carbohydrates. White rice, white flour, and sugars are refined (also called simple or processed) carbohydrates. Whole, unprocessed carbohydrates include whole grains, fruits, and vegetables.
  2. Protein. Plant-based protein comes from beans and nuts; animal protein comes from animals, including beef, lamb, chicken, duck, and fish.
  3. Fat. Animal and vegetable fat can be further divided into saturated fatty acids, unsaturated fatty acids, long-chain fatty acids, and short-chain fatty acids. Some fatty acids are not found in the human body and must be obtained from food; they are called “essential fatty acids.” Similarly, “essential amino acids” cannot be produced by the body and must come from food.

In addition, food contains a large number of trace elements, minerals, and vitamins.

Problems with the Modern Diet

Under normal circumstances, most of the above ingredients are necessary for our bodies.

So why are some foods considered unhealthy? It is not always that the food itself is unhealthy, but if we consume too much of the same food, we veer away from a balanced diet.

For example, carbohydrates can produce energy, and saturated fatty acids in meat are important for human cellular structure and function. However, over-eating these or any type of food is unhealthy.

What are the main problems with our diet?

  • We consume too much sugar, both from sweets and from refined high-carb foods like pasta and white rice. Sugar is an addictive substance.
  • Too much fried food. Fried food is appetizing, but oil heated to high temperatures produces substances that are harmful to the body.
  • Animal meat is not as healthy as it once was. Today, most poultry and livestock are raised artificially. They are locked in iron cages, fed food containing growth hormones and chemicals, and injected with antibiotics to prevent them from getting sick.

Unless we seek out grass-fed, free-range options, the meat we eat nowadays is no longer natural animal protein. This has resulted in many health problems, including cardiovascular and cerebrovascular diseases, tumors, and autoimmune diseases. Other metabolic conditions, such as hypertension, high blood sugar, and hyperlipidemia, also result from modern lifestyles and eating habits.

The Mediterranean diet plan offers a healthy alternative.

Introduction to the Mediterranean Diet

The most prominent feature of the diet of Mediterranean countries is that plant foods are unprocessed. Fruits, vegetables, whole grains, beans, nuts, and seeds are consumed in their natural state.

The diet includes minimally processed olive oil, called extra virgin or virgin olive oil. Its production is relatively simple: physical pressing and oil-water separation yield 100 percent fresh olive oil. This olive oil, which has not been chemically modified and contains no additives, is popular in Mediterranean countries.

The Mediterranean diet includes fish, seafood, and poultry, with little red meat. Fruits serve as a sweet treat, and a bit of wine with meals rounds out the diet.

The Role of the Mediterranean Diet

The Mediterranean diet focuses less on animal protein and more on plant and healthy fats. It is made with very few processed ingredients and very little added sugar. Therefore, the Mediterranean diet looks particularly good within the context of modern-day eating habits and dietary content.

It all started in the 1950s when heart disease was not as prevalent in Mediterranean countries as in the United States. People began to study the diet of the region, which is reported to protect against diabetes, cardiovascular disease, and stroke and to prevent cognitive decline and Alzheimer’s disease. The diet also protects against certain cancers, including breast, stomach, liver, prostate, and cervical cancers. Not only that, but it can help people lose weight and reduce the incidence of arthritis.

When people practice this way of eating, they avoid other eating habits that can induce diseases.

Some say the Mediterranean diet is good for health mainly because of its healthier fats, such as olive oil. Olive oil provides monounsaturated fatty acids that lower total cholesterol and low-density lipoprotein (LDL) cholesterol. Monounsaturated fatty acids are also found in avocados and many nuts.

Fish rich in “good fats” (such as sardines, tuna, salmon, and other deep-sea fatty fish) contain omega-3 fatty acids, which can fight inflammation, lower blood lipids, and reduce the risk of stroke and heart failure.

Studies on the Mediterranean Diet

In a study published in the May 2022 edition of The Lancet, researchers divided more than 1,000 Spanish patients with coronary heart disease into two groups. One group was put on a Mediterranean diet, and the other on a low-fat diet. Both diets emphasized fruits and vegetables, but the Mediterranean diet emphasized foods higher in monounsaturated fatty acids, namely olive oil, nuts, beans, whole grains, poultry, and fish rich in omega-3 fatty acids. The low-fat diet contained fewer monounsaturated fatty acids and included lean fish and poultry.

The two diets also differed in their proportion of carbohydrates: the low-fat diet contained more carbohydrate-rich foods, such as potatoes.

Tracking the two groups for seven years to compare the risk of cardiovascular disease and stroke, the researchers found that people who followed the Mediterranean diet had a 22 percent lower risk of cardiovascular disease and stroke. This was especially evident among the men in each group.
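To see what a relative reduction like that means in absolute terms, here is a minimal sketch; the 10 percent baseline event rate is an assumed figure for illustration, not a number from the study:

```python
# Hypothetical illustration: converting a relative risk reduction
# into an absolute event rate. The baseline rate is assumed.
def reduced_risk(baseline_risk, relative_reduction=0.22):
    """Absolute risk after applying a relative risk reduction."""
    return baseline_risk * (1 - relative_reduction)

# If 10% of the comparison group had an event over seven years (assumed),
# the Mediterranean-diet group's rate would be about 7.8%.
print(round(reduced_risk(0.10), 3))
```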

Daily Application

In conclusion, the Mediterranean diet is easy to start and can provide long-term benefits to your health.

  • Remember to eat fish twice a week. The fish must be deep-sea wild-caught fish, not fish obtained from polluted water sources. Farm-raised fish may be polluted. For example, farm-raised salmon may absorb all kinds of heavy metals, which are harmful to the body.
  • Whenever you feel hungry, eat a small piece of whole wheat bread with two teaspoonfuls of unprocessed olive oil to relieve hunger.
  • Fresh fruit serves as a dessert in the Mediterranean diet. Be careful not to eat fruits that are too sweet.

It is beneficial to move our eating habits closer to the Mediterranean diet and, whenever possible, to choose organic and unprocessed foods.

3 Breathing Exercises to Calm the Brain, Reduce Stress and Cure Anxiety


In Taoist philosophy it's taught that "The wise man/woman breathes from his/her heels." (tache/Shutterstock)


When you breathe correctly, you pump cerebrospinal fluid into the brain to reduce stress and cure anxiety.

You’ve probably heard the expression, “just breathe through it.” When a situation is stressful, breathing deeply and evenly can help to cure anxiety and reduce stress. Why is that so? There’s a very important link between feeling calm, nasal breathing, better sleep, and brain health.

In Taoist philosophy, it’s taught that “The wise man/woman breathes from his/her heels.” Physically speaking, this phrase is a reference to the fact breathing deeply into the body is incredibly good for health. Today research is revealing how breathing affects the brain.

The human brain is bathed with crystal clear liquid called cerebrospinal fluid (CSF). CSF carries oxygen and nutrients to brain cells while removing waste products. Recent studies using magnetic resonance imaging show a link between CSF flow and breathing.

In this article, we’ll explore the process of breathing and how it affects the brain.

Cerebrospinal Fluid: The Brain’s Lifeblood

Perhaps the most important fluid in your body is the roughly 250 ml of cerebrospinal fluid that flows through a system of interconnected cavities in the brain called the ventricles. Cerebrospinal fluid is produced by the choroid plexus within the ventricles, and from there it circulates through the brain and down to the spinal cord.

Each day the entire volume of CSF is replaced about four times. During sleep, CSF moves more freely into brain tissue, flushing out the waste products that build up during the day. This is a big reason why sleep is so important.
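A quick back-of-the-envelope calculation, using only the figures quoted above, shows how fast that turnover is:

```python
# Rough arithmetic from the figures above: about 250 ml of CSF,
# with the entire volume replaced four times a day.
CSF_VOLUME_ML = 250
TURNOVERS_PER_DAY = 4

daily_production_ml = CSF_VOLUME_ML * TURNOVERS_PER_DAY  # 1000 ml/day
production_ml_per_min = daily_production_ml / (24 * 60)  # ~0.69 ml/min

print(daily_production_ml, round(production_ml_per_min, 2))
```

In other words, the body produces roughly a liter of fresh CSF every day, a little under a milliliter per minute.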

How Breathing Affects the Brain and CSF

A good night’s sleep is easier said than done for some, but there are ways to get better sleep by breathing consciously. Breathing influences CSF flow dynamics by changing pressure in the chest. Recent studies have shown how breath can affect the flow of CSF through the ventricles of the brain. This matters because healthy CSF flow helps clear waste from the brain.

If you sleep badly or wake up feeling tired or anxious, then it might indicate a sleep disorder. Conditions such as snoring, sleep apnea, and other issues are known to affect CSF flow to the brain.

Studies show that pressure in the chest influences the pressure in vessels like arteries and veins. It was previously thought that changes in CSF flow responded to arterial pressure during deep inhalation; however, it was recently discovered that the direct change of chest pressure during breathing is likely responsible.

Diaphragmatic breathing affects the pressure of the veins around the thoracic vertebrae (located in the mid-back), and the veins in the chest respond to these changes in pressure by pumping CSF into the spinal cord.

Breathing Shifts CSF Via Pressure in Chest Veins

The veins around the thoracic vertebral column transmit pressure upward to the brain. They form a sprawling network of smaller veins, called the venous plexus, that extends up into the epidural venous system of the spinal canal.

During inhalation and exhalation, the chest rises and falls, and the resulting pressure changes are transmitted upward to the CSF around the brain. Here’s how it works:

Breathing in (inspiration): Lowers chest pressure and empties the venous plexus. CSF flows down the spine.

Breathing out (expiration): Increases chest pressure and fills the venous plexus, pushing CSF up the spine into the head.

As you can see, breathing drives a rhythmic flow of CSF up and down the spinal cord.

Deep Breathing and the Brain

Most veins in the body have valves to stop blood from flowing backward. However, the thoracic plexus is valveless, and any pressure will cause a flow in either direction. More pressure from deep breathing causes more CSF to flow into the brain.

A 2013 study showed that the depth of breathing can change the rate of CSF movement through the brain, with deeper breaths pushing CSF further up into the brain. The researchers also tested breath holding and found that it, too, increases CSF flow.

These pressure changes likely also push CSF into the lymphatic system. So with each breath, CSF flows into your brain, and the body then drains it into the lymphatic system, where it is met by the immune system.

Easy Breathing Tips for Better Sleep and Stress Reduction

Now that we know how breathing bathes the brain in CSF, it’s important to know that how you breathe during your waking hours is reflected in your breathing pattern while you slumber. Priming your body for good breathing during sleep may help bathe the brain in CSF.

For better sleep and a healthier brain, and to reduce stress and anxiety, practice the following breathing exercises.

Step 1: Deep Breath to Reduce Stress

  • Lie on the floor with both hands over your stomach.
  • Seal the tongue firmly to the roof of the mouth, seal the lips and breathe deeply through the nose.
  • Breathe deeply into the diaphragm. Your hands should rise as the stomach expands. Breathe in for 4 seconds.
  • Take a slow exhale for 8 seconds.
  • Continue for 30 breaths and repeat 3 times.
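The timing in Step 1 can be sketched as a short script; this pacer is purely illustrative and not part of any breathing app:

```python
# Deep-breathing pacer for the exercise above:
# 4-second inhale, 8-second exhale, 30 breaths per round, 3 rounds.
INHALE_S = 4
EXHALE_S = 8
BREATHS_PER_ROUND = 30
ROUNDS = 3

def session_minutes(inhale=INHALE_S, exhale=EXHALE_S,
                    breaths=BREATHS_PER_ROUND, rounds=ROUNDS):
    """Total practice time, in minutes, for the full exercise."""
    return (inhale + exhale) * breaths * rounds / 60

print(session_minutes())  # 18.0 minutes for all three rounds
```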

Step 2: Expand Your Breathing Capacity

  • Repeat the steps above, and when you reach your capacity, make a conscious effort to extend your breathing.
  • Lengthen the exhale to 10-12 seconds.
  • Feel the rush of CSF to your brain. As your capacity expands, you will feel more comfortable with slow, deep breathing.

Step 3: Improve Your Spinal Posture

Remember, CSF moves up the spine into the brain as you breathe. Your spinal posture will influence that pathway. Here’s an exercise to increase core mobility with standing Chi Gung. Hold the following posture for two minutes:

  • Draw the body’s weight to the middle of the feet, slightly away from the heels.
  • Extend your arms in front of the body.
  • With every breath as the chest expands, shift your body weight forward, taking additional weight off the heels.
  • To balance the forward motion, extend the spine and stretch down through the heels.
  • Ensure the downward stretch and forward motion are exactly balanced so that there is no visible movement of the heels.
  • To an observer the heels appear to be in contact with the ground, but internally they are engaged in a downward stretch with each breath.
  • Feel the stability of the spine and visualize CSF flowing up the spine.

Your brain depends on deep breathing patterns to help bathe it in cerebrospinal fluid. Using your diaphragm to maximize pressure shifts in the chest cavity will help boost the flow of CSF to the brain.

Study Finds Link Between Common Chemicals and Inflammatory Skin Disease


Common chemicals may be to blame for your eczema. (pumatokoh/Shutterstock)


A new government study has found an association between diisocyanates, a class of widely used chemicals, and eczema, a chronic inflammatory skin disease.

People affected by eczema experience severe itching, skin redness, oozing from the skin, and scaly rashes, according to the National Institute of Allergy and Infectious Diseases (NIAID). These symptoms can be painful and can appear suddenly without any obvious trigger.

Eczema affects up to 20 percent of children and up to 10 percent of adults in industrialized countries. In the United States, the prevalence of eczema is three to six times higher than it was in the 1970s.

In a study published on Jan. 6 in Science Advances, a team of NIAID scientists tried to find out how certain environmental pollutants may be contributing to this increase. They focused on diisocyanates, which are used nationwide in polyurethane products such as foams, spray paints, and glues. The active portion of the diisocyanate molecule, the isocyanate side chain, is also a component known to trigger eczema.

For the study, the scientists conducted experiments on mice to explore exactly why diisocyanates exposure could directly induce eczema in mouse skin. The findings suggest that this has to do with how skin bacteria deal with isocyanate.

Specifically, scientists found that when bacteria living on healthy skin are exposed to isocyanate, they must adapt to survive the new environment. When they adapt, these bacteria shift their metabolism away from producing the lipids, or oils, that the skin needs to stay healthy.

This also means, according to the study, that it may be possible to treat eczema by replacing post-exposure skin bacteria with healthy ones that behave normally.

“To our knowledge, this is the first report demonstrating that pollution may promote disease by disrupting metabolism in commensal microbiota,” the scientists wrote.

With that said, scientists noted they need to further validate the association between environmental exposure to isocyanate or diisocyanates and eczema, and to determine whether this mechanism identified in mice works the same way in humans.

According to the American Academy of Dermatology (AAD), the best way to relieve itchy eczema is to get eczema under control, which takes time. However, the professional organization also says there are things parents can do to offer temporary relief from the itch on their kids’ skin, including:

  • Soak a clean towel in cool water and apply a cool compress to itchy skin.
  • Add colloidal oatmeal to bath water and let the child soak.
  • Soak the child in a bath and smear on ointment.
  • Distract the child by telling a story or playing with a toy.
  • Calm the stressed child.
  • Pinch skin near eczema to relieve itch.

To avoid making eczema itchier for children, the AAD advises parents to not tell their children to stop scratching. “This rarely works and can leave your child feeling stressed. Stress can cause eczema to flare,” the organization says.

The AAD also advises against anti-itch products unless the child’s dermatologist recommends one.

Catheter-associated superior vena cava syndrome


KEY POINTS

  • Catheter-associated thrombosis is the most common noninfectious complication of implantable venous access devices and can cause superior vena cava syndrome.
  • The diagnosis can be confirmed with Doppler ultrasonography or contrast-enhanced computed tomography.
  • Anticoagulation with or without catheter removal is the initial treatment of choice; endovascular intervention is reserved for patients who do not respond to anticoagulation.
  • Prophylactic approaches to catheter-associated thrombosis are not recommended, and the use of superior vena cava filters in deep vein thrombosis of the upper extremities should be avoided.

A 53-year-old man presented to the emergency department with a 3-week history of throbbing headaches, dizziness and cyanosis, with worsening symptoms over the previous 7 days. The patient had a history of colon cancer 2 years previously and had undergone a total colectomy complicated by a high-output ostomy. A reverse ostomy had been delayed until his body mass index decreased to reduce the risk of complications. He had an implantable venous access device in the right side of his chest that had been implanted for chemotherapy administration, frequent blood work and weekly hydration. Five weeks before presentation, his port had become blocked and he was therefore unable to receive intravenous hydration. His home care nurse had tried unsuccessfully to unblock it with tissue plasminogen activator. While he was waiting for an appointment with his family physician, he had been advised to increase his daily fluid intake to about 2.5–4.0 L per day.

On presentation, the patient had marked orthostatic changes (supine blood pressure 135/85 mm Hg, heart rate 70–90 beats/min; upright sitting blood pressure 80/50 mm Hg, heart rate 140 beats/min). His oxygen saturation was 84% on room air, improving to more than 94% on 3 L of supplemental oxygen. He was afebrile with normal mental status. He had a normal voice tone, and no signs of respiratory distress or stridor. He had marked cyanosis, facial and neck plethora, distended neck veins and engorged superficial chest wall vessels (Figure 1). The port in his right upper chest was tender to palpation. Cardiovascular and respiratory examinations were otherwise normal.

Figure 1:

Photographs of the head and upper chest of a 53-year-old man with catheter-associated superior vena cava syndrome, showing (A) facial and neck plethora, and (B) a prominent superficial venous pattern on the chest.

The patient’s complete blood count, coagulation profile, electrolytes, and creatinine were all within the normal range. An ultrasonogram of the upper extremities showed extensive thrombus in the right jugular vein, with minimal residual flow seen on colour Doppler ultrasonography. A computed tomography (CT) chest scan with contrast showed no pulmonary embolism and confirmed thrombus within the lumen of the right internal jugular vein. The superior vena cava (SVC) appeared almost completely occluded immediately above the right atrium distal to the catheter tip, with extensive venous collaterals in the mediastinum, suggestive of SVC syndrome (Figure 2).

Figure 2:

(A) An axial contrast-enhanced computed tomography scan of the chest of a 53-year-old man showed obstruction of the superior vena cava secondary to the indwelling catheter (arrow) and the adherent thrombus (*). (B) A coronal contrast-enhanced computed tomography scan of the chest showed extensive thrombus in the right brachiocephalic vein (short arrow), left brachiocephalic vein (empty arrow) and superior vena cava (long arrow), as well as enlarged mediastinal collateral veins.

We started the patient on intravenous crystalloid fluids and subcutaneous low-molecular-weight heparin. We removed the indwelling port as he was able to maintain hydration through oral intake. While in hospital, he underwent a malignancy workup, including CT of his abdomen, testing of his carcinoembryonic antigen level and a colonoscopy, all of which were negative. He continued to do well and was discharged on a 3-month course of edoxaban. At a 6-month follow-up, the patient was symptom-free with substantial improvement in his facial plethora and in his enlarged neck and chest veins (Figure 3).

Figure 3:

Photographs of a 53-year-old man, taken 6 months after his initial presentation with catheter-associated superior vena cava syndrome, showing improvement in his (A) head and neck vein distention, as well as (B) his superficial chest wall collateral veins.

Discussion

Thrombotic occlusion of central venous catheters can occur from the formation of a fibrin sheath around the catheter tip, a blood clot inside the catheter lumen, a partial or complete extraluminal venous thrombosis, or any combination of these.1–3 A thrombosis that occludes the vein is referred to as deep vein thrombosis (DVT). Central venous catheters predispose patients to thrombotic vascular occlusion from endothelial damage caused by friction from the catheter, turbulent blood flow or cytotoxic medications.2,4

Superior vena cava syndrome results from blood flow obstruction within the SVC. Malignant occlusion or compression is the most common cause of acquired SVC obstruction. However, with the more frequent use of intravascular devices such as catheters and pacemakers, nonmalignant causes now account for 28% of cases of SVC syndrome.1,2 Our patient’s SVC syndrome was caused by thrombosis associated with his central venous catheter.

Clinical presentation of SVC syndrome depends on the severity and location of the obstruction, and the development of collateral veins. Typical signs and symptoms of SVC syndrome include facial, neck, trunk or upper extremity swelling and pain (40%–100%); shortness of breath (54%–83%); jugular venous distension (27%); dilated anterior chest collateral veins (40%); and hoarseness, chemosis and plethora. Less commonly, patients can have symptoms of cerebral edema, including headache, confusion, dizziness and altered mental status.1,2 Duplex ultrasonography is generally the first investigation when catheter-associated thrombosis is suspected; however, signs and symptoms of SVC syndrome require prompt evaluation with contrast-enhanced CT.1 In some cases, magnetic resonance or direct contrast venography may be required.2,3 The usefulness of D-dimer testing to exclude device-associated DVT is limited for patients with central venous catheters or pacemakers.1

Management of fibrin sheath formation

Fibrin sheath formation is the most common cause of catheter dysfunction and is classically identified by being able to inject into the device but having difficulty aspirating from it.3 The first-line treatment is administration of tissue plasminogen activator, a thrombolytic agent, into the port chamber or catheter, allowing 30–120 minutes of dwell time. The tissue plasminogen activator catalyzes the conversion of clot-bound plasminogen to plasmin, which then activates the fibrinolysis cascade.5 Thrombolytic agents successfully restore the patency of the central venous catheter in 87%–90% of cases.2,3 If thrombolytics fail, consultation with an interventional radiologist is recommended for consideration of fibrin sheath stripping or investigation for other complications of central venous catheters.6

Catheter-associated thrombosis

Catheter-associated thrombosis is the most common noninfectious complication of implantable venous access devices, and occurs at a rate of 0.76 (among patients with devices for nonmalignant causes) to 1.71 (among patients with malignant disease) thromboses per 1000 catheter-days.7 In patients with malignant disease, the incidence varies between 1.2% and 13%.7
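Those per-catheter-day rates can be translated into an expected event count for a given duration of catheter use; a minimal sketch, with the 180-day duration assumed purely for illustration:

```python
# Illustrative use of the rates quoted above.
def expected_thromboses(rate_per_1000_days, catheter_days):
    """Expected number of thromboses for a given rate and duration."""
    return rate_per_1000_days * catheter_days / 1000

# A port in place for an assumed 180 days, at the higher
# (malignant-disease) rate of 1.71 per 1000 catheter-days:
print(round(expected_thromboses(1.71, 180), 3))  # ~0.308 expected events
```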

Catheter-associated thromboses account for about 5%–10% of all cases of DVT, with incidence rising owing to increasing use of central intravenous catheters.6 The incidence of pulmonary embolism from a catheter-associated thrombosis in an upper extremity has been estimated at 10%–15% of all cases.8 Post-thrombotic syndrome presenting with upper extremity pain and swelling has been reported in 7%–37% of patients.2,6,8

Patients at higher risk for catheter-associated thrombosis have left-sided device placement, concomitant infection, larger diameter catheters with multiple lumens, peripherally inserted central catheters, catheter tip malposition, a history of DVT or hereditary thrombophilias.1–3 For patients with malignant disease, risk increases with higher-grade and later-stage cancer, as well as with use of the catheter for chemotherapy.9

Management of catheter-associated thrombosis

The treatment of catheter-associated thrombosis improves symptoms, prevents embolization, decreases long-term morbidity and prevents chronic venous occlusion, loss of vascular access, recurrent thrombosis and post-thrombotic syndrome.1 Anticoagulation is the initial treatment for catheter-associated thrombosis involving proximal upper extremity deep veins. Low-molecular-weight heparin reduces the rate of post-thrombotic syndrome and is the preferred initial agent for the treatment of catheter-associated thrombosis.

Long-term anticoagulation with low-molecular-weight heparin is currently recommended for patients with active cancer, given its superiority over vitamin K antagonists, direct oral anticoagulants and warfarin in preventing recurrent thrombosis.1,3,6 Patients without malignant disease can be transitioned to oral anticoagulants after symptomatic improvement. If warfarin is used, its use should overlap with low-molecular-weight heparin for a minimum of 5 days or until a therapeutic international normalized ratio is reached. Limited data support the use of direct oral anticoagulants for catheter-associated thrombosis; however, given comparable outcomes to warfarin in the management of nonmalignant thrombosis in most other contexts, it is a reasonable option.1

Duration of therapy for catheter-associated thrombosis remains controversial. The current guideline recommends treatment for a minimum of 3 months after catheter removal, and longer if the catheter remains in place.1,3 If the catheter is functioning well and is still required, it need not be removed, and anticoagulation should continue while the catheter remains in place.1,3,6 Catheter removal is indicated if the device is no longer needed, is not functioning properly or is associated with infection. If symptoms persist or worsen despite anticoagulation, endovascular management of the thrombus can be considered.1

In patients with extensive catheter-associated thrombosis refractory to anticoagulation, catheter-directed thrombolysis or thrombectomy may be required for symptom management or to preserve the vascular access site. Patients who do not respond to treatment, typically those with little or no improvement after 3 months of anticoagulation, should be referred to medical centres with expertise in interventional radiology and vascular surgery.1 The benefits of endovascular therapy include a high rate of technical success, low risk of restenosis and low occurrence of procedural complications. In a recent meta-analysis, the patency rate for endovascular therapy in patients with benign SVC syndrome was between 75.8% and 86.3%.10

Superior vena cava filters should be considered only in patients with contraindications for anticoagulation as they have been associated with a 3.8% risk of major complications, including 2% risk of pericardial tamponade and 1% risk of aortic perforation.11

Prevention of catheter-associated central venous thrombosis

Routine use of thromboprophylaxis in patients with central venous catheters is not recommended.1 Thromboprophylaxis may be considered in high-risk patients with cancer when the perceived risk of thrombosis outweighs the risk of bleeding and the burden of anticoagulation.1,9 Central venous catheters should be used only when necessary, and the smallest catheters should be used, with removal when no longer needed.

The Evidence on Red Meat: Is it Carcinogenic or Healthy?


(Michael Dechev/Shutterstock)

For a long time, people have had different perceptions about red meat. Some believe that eating red meat causes cancer, and some who follow a carnivore diet believe that red meat has cured many of their diseases and made their bodies the best they have ever been. So should we consume red meat? How much, and how should we eat it?

Definition of Red Meat

Generally speaking, red meat is called “red” because it contains a protein called myoglobin. It keeps the muscles supplied with oxygen, and the ferrous ions it contains give the muscle its red color.

The amount of myoglobin determines the color of the meat.

Myoglobin has three different forms that can be converted into one another. Fresh beef is bright red on the outside and purple on the inside when cut. This is because the outer layer of the meat is in contact with oxygen, which changes the form of myoglobin. The inside retains the color of myoglobin because it has not been exposed to oxygen. Cooked or air-dried beef turns brown because the cooking and drying process further changes the form of myoglobin.

Defining red meat based solely on its color is somewhat confusing. For example, the flesh of tuna is pink, and some meat that is originally pink turns white when cooked. To simplify things, the World Health Organization (WHO) has defined red meat as the muscle meat of all mammals, including beef, veal, pork, lamb, mutton, horse, and goat.

Where Did the Claim That Red Meat Is Carcinogenic Come From?

Many people are afraid to consume red meat because they have heard it can cause cancer.

The truth is, for something to be classified as carcinogenic in scientific research, the suspected carcinogen is usually tested against three criteria to see how many of them it meets. If it meets all three, it will be classified as a carcinogen without a doubt.

To determine whether red meat is carcinogenic, it needs to go through the following process:

  1. The first step is to conduct animal experiments. A common practice is to divide the laboratory animals into two groups: one that regularly consumes a certain amount of red meat through diet, and one that consumes no red meat. At the end of the experiment, the two groups are compared to see if there is any difference in their risk of developing cancer.
  2. The second step is related to the mechanism of carcinogenesis. The carcinogenicity of red meat consumption is investigated using biochemistry or molecular biology.
  3. The third step is large-scale statistical surveys of the population, and it is mainly conducted with observational studies, such as telephone surveys or questionnaires. Individuals are divided into two groups, one that consumes more red meat and one that consumes less, and the incidence of cancer in these two groups is analyzed after several years.

Throughout the process, numerous experiments must be conducted and reach the same or similar conclusions before results can be considered sufficient evidence. It is inaccurate to conclude whether red meat causes cancer based on individual studies.

The WHO’s International Agency for Research on Cancer (IARC) divides agents that may cause cancer into four groups according to the level of evidence. If a substance meets all three of the above criteria, it is listed as a Group 1 carcinogen, alongside tobacco and alcohol.

At present, red meat does not meet the third criterion—that is, large-scale population surveys. In other words, unlike for tobacco and alcohol, there is not enough convincing data to prove that red meat consumption is directly related to cancer.

Therefore, red meat is listed by the WHO as a Group 2A carcinogen—meaning it is probably carcinogenic to humans, but its carcinogenic effect is not conclusive.

New Meta-Analysis Finds Weak Evidence of Association Between Red Meat Consumption and Cancer

A recent meta-analysis published in Nature Medicine also rigorously evaluated the evidence on the carcinogenicity of red meat consumption.

Researchers at the Institute for Health Metrics and Evaluation at the University of Washington collected and analyzed 55 studies from different populations around the world. The number of participants in each study ranged from 600 to more than 530,000, and follow-up times ranged from four to 32 years.

The researchers devised a five-star rating system to assess the risk of smoking, consumption of unprocessed red meat, and other factors (such as insufficient intake of vegetables) in relation to a person’s health outcomes (including breast cancer, colorectal cancer, Type 2 diabetes, ischemic heart disease, ischemic stroke, and hemorrhagic stroke). The system’s purpose was to visualize the relative likelihood of red meat causing cancer (with five stars suggesting very strong evidence of association, and one star suggesting no evidence of association).

The results of the study rated the association between consumption of unprocessed red meat and colorectal cancer, breast cancer, Type 2 diabetes, and ischemic heart disease at only two stars—that is, the evidence is weak. Additionally, the association between unprocessed red meat and ischemic or hemorrhagic stroke was also rated two stars.

The latest meta-analysis finds weak evidence of an association between red meat consumption and cancer. (The Epoch Times)

The researchers noted that while there were some studies linking the consumption of unprocessed red meat to an increased risk of disease incidence and mortality, they were “insufficient to make stronger or more conclusive recommendations.” Also, the researchers were unable to make a “strong recommendation for red meat intake level” due to wide uncertainty and the weak association between red meat consumption and cancer incidence (only two stars).

Although the results of the study are reassuring, the impact of red meat consumption on the body is worth exploring further.

Dr. Weldon Gilcrease, an associate professor in the oncology division at the University of Utah School of Medicine and a Huntsman Cancer Institute investigator, told The Epoch Times that it is indeed hard to “tease out the impact of a single risk factor” in many large-scale epidemiological studies. However, the link between diet, lifestyle, and cancer is real. For example, he said, people who move from Japan to the United States may see an increased risk of cancer under the influence of a Western diet and lifestyle. The relationship between red meat intake and cancer risk, however, may still depend on the amount of red meat an individual consumes.

On the other hand, red meat actually offers many nutritional benefits.

Red Meat Has More Nutrients Than Iron Supplements

Nutritionist and registered dietitian Amy Gonzalez mentioned in an interview with The Epoch Times that the nutrients in red meat are easily absorbed and utilized by the body. Red meat’s various nutrients are “packaged and matched,” so you can get more nutrition from a smaller “package.”


1. Iron supplementation from red meat is more efficient and safe

Red meat, such as beef and lamb, is one of the richest sources of iron and zinc. According to data from one study, 100 grams of lean beef provide about 1.8 mg of iron and 4.6 mg of zinc, or roughly 14 percent and 42 percent of the recommended dietary intakes of these nutrients. In addition, compared with the iron found in plants, the iron in meat is mostly heme iron, which the body absorbs more readily; meat protein also enhances the body’s absorption of iron.
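The percentages are simple ratios of the nutrient amount to the recommended dietary intake (RDI). A small sketch of that arithmetic, using RDI reference values of roughly 13 mg of iron and 11 mg of zinc back-calculated from the study’s figures (assumed values for illustration, not official recommendations):

```python
def percent_of_rdi(amount_mg: float, rdi_mg: float) -> int:
    """Return a nutrient amount as a rounded percentage of its RDI."""
    return round(amount_mg / rdi_mg * 100)

# Hypothetical RDI reference values inferred from the figures above.
IRON_RDI_MG = 13.0
ZINC_RDI_MG = 11.0

print(percent_of_rdi(1.8, IRON_RDI_MG))  # iron in 100 g lean beef → 14
print(percent_of_rdi(4.6, ZINC_RDI_MG))  # zinc in 100 g lean beef → 42
```

The same helper works for any of the nutrient figures quoted later in the article, given an assumed RDI.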

Gonzalez recommends that people with anemia and women with heavy menstrual bleeding moderately increase their red meat intake, because the iron in red meat is a “really good bioavailable source of iron.” It supplements iron more effectively than iron supplements do, and it is less likely to be consumed in excess.

The human body needs micronutrients (such as copper and zinc) and vitamin C to help it utilize iron and convert it into hemoglobin in red blood cells. Consuming red meat will also provide us with a variety of nutrients, all of which help the body utilize iron properly.

Similarly, the body absorbs zinc more efficiently from red meat than from plant-based foods. Red meat is also a good source of selenium: every 100 grams of lean beef provide about 17 micrograms of selenium, or about 26 percent of the recommended dietary intake for this nutrient.

2. Red meat contains highly digestible protein

Red meat is rich in protein: Every 100 grams of raw red meat contain 20 to 25 grams of protein, according to findings reported in Nutrition & Dietetics, the journal of Dietitians Australia. The protein content of cooked red meat can even reach 28 to 36 grams per 100 grams because water is lost during cooking. Red meat is a “complete” protein, providing all the amino acids the body needs, whereas many plant-based proteins are “incomplete” because they lack one or more essential amino acids.

The digestibility of protein in red meat reaches 94 percent. The Protein Digestibility Corrected Amino Acid Score is used to assess the quality of a protein, with the highest possible score of 1.0. Red meat scores around 0.9, while most plant-based foods score between 0.5 and 0.7.
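The PDCAAS arithmetic is simple to sketch: find the test protein’s scarcest essential amino acid relative to a reference pattern (the “limiting” ratio), multiply by digestibility, and cap the result at 1.0. A minimal illustration, where the reference pattern and food profiles are made-up numbers for demonstration, not measured values:

```python
def pdcaas(profile_mg_per_g, reference_mg_per_g, digestibility):
    """Protein Digestibility Corrected Amino Acid Score, capped at 1.0.

    profile_mg_per_g / reference_mg_per_g: mg of each essential amino
    acid per gram of test protein / reference protein.
    """
    limiting = min(profile_mg_per_g[aa] / reference_mg_per_g[aa]
                   for aa in reference_mg_per_g)
    return min(1.0, limiting * digestibility)

# Illustrative amino acid profiles (hypothetical numbers).
reference = {"lysine": 58, "leucine": 66, "threonine": 34}
beef = {"lysine": 89, "leucine": 66, "threonine": 33}   # near the reference
wheat = {"lysine": 25, "leucine": 71, "threonine": 32}  # lysine-limited

print(round(pdcaas(beef, reference, 0.94), 2))   # → 0.91, around 0.9
print(round(pdcaas(wheat, reference, 0.90), 2))  # → 0.39, low like many plants
```

The cap at 1.0 is part of the scoring method itself: a protein cannot score higher than the reference pattern no matter how abundant its amino acids are.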

3. Red meat is a source of high-quality fatty acids

The fats in red meat include saturated fats, monounsaturated fatty acids, and polyunsaturated fatty acids. Polyunsaturated fatty acids (PUFA), also known as the “good fats,” account for 11 to 29 percent of the total fatty acids in red meat.

It is worth noting that pasture-fed beef is the first choice for those who want high-quality fatty acids from red meat, as it contains more omega-3 fatty acids. Grain-fed beef, raised on grains such as corn, is relatively high in omega-6 fatty acids. Red meat such as beef and lamb also contains more omega-3 fatty acids than chicken.

4. Red meat is rich in vitamins, especially vitamin B12

Red meat is rich in B vitamins such as B3, B6, B12, and thiamine. One hundred grams of lean beef provide 2.5 micrograms of vitamin B12, or 79 percent of the recommended dietary intake for this nutrient, according to the same study. The older the animal, the richer its meat in B vitamins. Pork contains high levels of thiamine compared with other meats.

Red meat contains relatively little vitamin E, although fattier cuts contain more.

According to the Nutrition & Dietetics study mentioned earlier, 100 grams of cooked beef can provide 12 percent of the daily vitamin D requirement for middle-aged and elderly people aged 51 to 70, while 100 grams of cooked lamb can provide more than 25 percent of the daily vitamin D requirement. Therefore, for the elderly who spend less time outdoors, consuming these red meats can be another effective way to obtain vitamin D.

While Red Meat Is Nutritious, a Carnivore Diet Carries Risk

Some people are afraid of eating red meat, and some people eat red meat to the extreme.

Another huge controversy revolving around red meat is the all-meat/carnivore diet.

This diet involves eating only meat or animal products (all kinds of meat, fish, and eggs), and excluding any carbohydrates.

One of the reasons that people advocate for this diet is that it was the way of our hunter-gatherer ancestors.

However, describing the diet of ancient people as consisting mainly or only of meat can be misleading and inaccurate. This is because archaeological evidence and observational studies of several surviving primitive tribes suggest that our ancestors ate a wide variety of foods, including high-carb foods such as fruits, vegetables, starchy vegetables, and honey.

Omitting fruits and vegetables from your diet without a doctor’s advice and guidance can lead to adverse consequences.

A carnivore diet is extremely low in fiber, which can cause constipation. Constipation is not just an inability to defecate—it damages the body and mind in many ways.

A carnivore diet is high in saturated fat, which raises bad cholesterol in the blood and puts a person at risk for cardiovascular diseases. Consuming a lot of meat protein can also put undue stress on the kidneys. Furthermore, many processed meats, such as bacon and luncheon meat, are high in sodium, and a high-sodium diet can lead to kidney problems and high blood pressure.

What Is the Healthiest Way to Consume Red Meat?

1. Eat a variety of red meat alternately, two or three times a week

Gonzalez said that the recommended amount of red meat intake varies from person to person. For the average person, the approximate recommended intake of red meat is two to three servings per week, and each serving is about the size of a palm (about 100 grams).

Gilcrease recommends eating red meat no more than twice a week.

We can alternate various red meats in our daily diet. In addition to red meat, we should also consume a mix of other meats, such as various poultry and seafood.

2. Avoid frying and grilling, as low-temperature cooking is healthier

Cooking methods such as low-temperature roasting and poaching not only preserve the natural flavor of red meat, but also prevent the production of toxic substances.

Gonzalez explained that improper cooking methods, such as high-temperature frying and charcoal smoking, can char the meat. Toxic byproducts, including heterocyclic aromatic amines (HAAs) and polycyclic aromatic hydrocarbons (PAHs), are produced during this process. These substances are produced in greater amounts during high-temperature cooking compared with low-temperature cooking. Meat processing methods such as curing and smoking will produce carcinogenic chemicals including N-nitroso compounds (NOCs) and PAHs.

In addition, heat-processed meat produces advanced glycation end products (AGEPs), which are commonly found in canned and deli processed-meat products. According to the Nutrition & Dietetics study, AGEPs are a normal part of the body’s metabolism, but they can become pathogenic at very high levels in tissues and in circulation.

In Search of the Optimal Migraine Relief


Summary: Oral transmucosal delivery of eletriptan hydrobromide may provide faster and more effective relief for migraine sufferers.

Source: Malmö University

For migraine medication to be effective, it is vital that the active substance reach the bloodstream quickly. The pills currently on the market pass through the body’s metabolism, which lessens their effectiveness and delays relief.

A research team at Malmö University believes they can get around this by means of a shortcut in the mucous membrane in the mouth.

The active substances in migraine medicine are known as triptans. This is a collective name for tryptamine-based drugs that react with serotonin receptors and thereby inhibit certain signalling substances in the brain that can prompt the experience of pain. Serotonin is one of the most important signalling substances in the human nervous system and affects, among other things, sexual behaviour, appetite, sleep and pain.

In the research project Oral transmucosal delivery of eletriptan for neurological diseases, Sabrina Valetti and her research colleagues have chosen to work with eletriptan hydrobromide (EB), which is the triptan that has the least toxic effect on the heart.

“A regular triptan pill must pass through both the stomach and the liver, where a large part of the metabolism takes place. Studies show that more than half of the triptan dose is broken down on the way before it reaches the blood. We have investigated the possibility of getting EB directly into the blood vessels of the mouth through the mucosa under the tongue,” explains Valetti, who leads the project at the Biofilms Research Center for Biointerfaces.


“We know from patient studies that it is important for the substance to reach maximum concentration in the blood within two hours in order to have an effect. So we investigated what the expected concentration of EB was with our method after this time. We saw that the expected concentration was higher in the 3D human cells than those provided by regular migraine pills. This was also the case for the pig mucosa, but only if the pH value was raised,” she says, and continues:

“Our body has a buffer system that regulates and balances temporary pH variations and we saw no toxic effect on the mucosa during a four-hour period when the pH value was increased from 6.8 to 10.4. But what we don’t know is whether this is experienced as unpleasant in the mouth or not.”

The biggest challenge is that the mucous membrane is a relatively thick tissue, a barrier meant to protect us from a variety of external attacks. Last autumn, the team therefore carried out tests examining in detail the lipids believed to play a decisive role in the pig mucous membrane, in order to better understand this barrier effect.

Abstract

Oral transmucosal delivery of eletriptan for neurological diseases

Migraine is a highly prevalent neurological disease affecting circa 1 billion patients worldwide with severe, incapacitating symptoms that significantly diminish quality of life. As a self-medication practice, oral administration of triptans is the most common option, despite its relatively slow therapeutic onset and low drug bioavailability.

To overcome these issues, here we present, to the best of our knowledge, the first study on the possibility of oral transmucosal delivery of one of the safest triptans, namely eletriptan hydrobromide (EB).

Based on a comprehensive set of in vitro and ex vivo experiments, we highlight the conditions required for oral transmucosal delivery, potentially giving rise to similar, or even higher, drug plasma concentrations expected from conventional oral administration.

With histology and tissue integrity studies, we conclude that EB neither induces morphological changes nor impairs the integrity of the mucosal barrier following 4 h of exposure.

On a cellular level, EB is internalized in human oral keratinocytes within the first 5 min without inducing toxicity at the relevant concentrations for transmucosal delivery. Considering that the pKa of EB falls within the physiological range, we systematically investigated the effect of pH on both solubility and transmucosal permeation.

When the pH is increased from 6.8 to 10.4, the drug solubility decreases drastically from 14.7 to 0.07 mg/mL. At pH 6.8, EB gave rise to the highest drug flux and total permeated amount across mucosa, while at pH 10.4 EB shows a greater permeability coefficient and thus a higher ratio of permeated drug to applied drug. Permeation experiments with model membranes confirmed the pH-dependent permeation profile of EB.

The distribution of EB in different cellular compartments of keratinocytes is pH dependent. In brief, high drug ionization leads to higher association with the cell membrane, suggesting ionic interactions between EB and the phospholipid head groups. Moreover, we show that the chemical permeation enhancer DMSO can be used to enhance the drug permeation significantly (i.e., a 12- to 36-fold increase).

Taken together, this study presents important findings on transmucosal delivery of eletriptan via the oral cavity and paves the way for clinical investigations for a fast and safe migraine treatment.

People With Autism Experience Pain at a Higher Intensity


Summary: People with autism experience pain at a higher intensity than those not on the autism spectrum and are less adaptable to the sensation. This revelation contradicts the prevailing belief that those with ASD tend to be indifferent to pain.

Source: Tel Aviv University

A new study has examined the pain perception among people with autism and found that they experience pain at a higher intensity than the general population and are less adaptable to the sensation.

This finding is contrary to the prevalent belief that people with autism are supposedly ‘indifferent to pain’.

The researchers expressed the hope that the findings of their study will lead to more appropriate treatment on the part of medical staff, caregivers, and parents toward people with autism, who do not always express the experience of pain in the usual way.

The study was funded by the Israel Science Foundation, and was led by four researchers: Dr. Tami Bar-Shalita of the Sackler Faculty of Medicine at Tel Aviv University who initiated the study, in collaboration with Dr. Yelena Granovsky of the Technion and Rambam Medical Center, and Prof. Irit Weissman-Fogel and Prof. Eynat Gal of the University of Haifa. This study constitutes a framework for the theses of PhD students Tzeela Hofmann and Mary Klingel-Levy, and three articles based on it have already been published or approved for publishing.

The present study has been published in the prestigious PAIN journal.

Dr. Bar-Shalita explains: “Approximately 10% of the general population suffer from sensory modulation dysfunction, which means sensory hypersensitivity at a level that compromises normal daily functioning and quality of life.

“These people have difficulty, for example, ignoring or adapting to buzzing or flickering of fluorescent lights, humming of air conditioners or fans, or the crunching of popcorn by someone sitting next to them in the cinema. In previous studies in the lab we found that these people suffer from pain more than those without sensory modulation dysfunction.

“Since it is known that sensory modulation dysfunction occurs in people with autism at a rate of 70-90%, it constitutes a criterion for diagnosing autism, and is associated with its severity. We were interested in exploring pain perception in autism, so we asked: do people with autism hurt more than the general population? This question was hardly studied in the lab before we got started.”

According to the researchers, for many years the prevalent opinion was that ‘people with autism hurt less’ or that they were ‘indifferent to pain’. Actually, ‘indifference to pain’ is one of the characteristics presented in the current diagnostic criteria of autism. The proof of this was, supposedly, their tendency to inflict pain on themselves by self-harm.

Dr. Bar-Shalita: “This assumption is not necessarily true. We know that self-harm could stem from attempts to suppress pain, and it could be that they hurt themselves in order to activate, unconsciously, a physical mechanism of ‘pain inhibits pain’.”

This study is a laboratory pain study approved by the ethics committee of the academic institutions and Rambam Medical Center. The study included 52 adults with high-functioning autism (HFA) and normal intelligence – hitherto the largest reported sample in the world in studies on pain among people with autism.

The study made use of psychophysical tests commonly used in pain research to evaluate pain. These methods examine the link between stimulus and response: the researcher, using a computer, controls the duration and intensity of the stimulus, and the examinee is asked to rate the intensity of the pain they feel on a scale of 0 to 100.

The findings clearly showed that people with autism hurt more. Furthermore, their pain-suppression mechanism is less effective.

The researchers: “We conducted a variety of measurements, aimed among other things at examining whether the hypersensitivity to pain derives from a sensitized nervous system or from suppression of mechanisms that are supposed to enable adjustment and, over time, reduce the response to the stimulus.

“We found that in the case of people with autism, it is a combination of the two: an increase of the pain signal along with a less effective pain inhibition mechanism.”


Dr. Bar-Shalita concludes: “Our study constituted a comprehensive, in-depth study of the intensity of pain experienced by people with autism. The prevalent belief was that they are supposedly ‘indifferent to pain’, and there are reports that medical and other professional staff treated them accordingly.

“The results of our study indicate that in most cases, the sensitivity to pain of people with autism is actually higher than that of most of the population, while at the same time they are unsuccessful at effectively suppressing painful stimuli.

“We hope that our findings will benefit the professionals and practitioners handling this population and contribute to the advancement of personalized treatment.”

In additional articles soon to be published, the researchers have examined the brain activity of people with autism during pain stimuli, and sub-groups within this population concerning their perception of pain.

Abstract

Indifference or hypersensitivity? Solving the riddle of the pain profile in individuals with autism

Excitatory–inhibitory (E/I) imbalance is a mechanism that underlies autism spectrum disorder, but it is not systematically tested for pain processing. We hypothesized that the pain modulation profile (PMP) in autistic individuals is characterized by less efficient inhibitory processes together with a facilitative state, indicative of a pronociceptive PMP.

Fifty-two adults diagnosed with autism and 52 healthy subjects, age matched and sex matched, underwent quantitative sensory testing to assess the function of the (1) pain facilitatory responses to phasic, repetitive, and tonic heat pain stimuli and (2) pain inhibitory processes of habituation and conditioned pain modulation. Anxiety, pain catastrophizing, sensory, and pain sensitivity were self-reported.

The autistic group reported significantly higher pain ratings of suprathreshold single (P = 0.001), repetitive (46°C- P = 0.018; 49°C- P = 0.003; 52°C- P < 0.001), and tonic (P = 0.013) heat stimuli that were cross correlated (r = 0.48-0.83; P < 0.001) and associated with sensitivity to daily life pain situations (r = 0.39-0.45; P < 0.005) but not with psychological distress levels.

Hypersensitivity to experimental pain was attributed to greater autism severity and sensory hypersensitivity to daily stimuli.

Subjects with autism efficiently inhibited phasic but not tonic heat stimuli during conditioned pain modulation.

In conclusion, in line with the E/I imbalance mechanism, autism is associated with a pronociceptive PMP expressed by hypersensitivity to daily stimuli and experimental pain and less-efficient inhibition of tonic pain. The latter is an experimental pain model resembling clinical pain.

These results challenge the widely held belief that individuals with autism are indifferent to pain and should raise caregivers’ awareness of pain sensitivity in autism.

Efanesoctocog Alfa Prophylaxis for Patients with Severe Hemophilia A


Abstract

Background

Efanesoctocog alfa provides high sustained factor VIII activity by overcoming the von Willebrand factor–imposed half-life ceiling. The efficacy, safety, and pharmacokinetics of efanesoctocog alfa for prophylaxis and treatment of bleeding episodes in previously treated patients with severe hemophilia A are unclear.

Methods

We conducted a phase 3 study involving patients 12 years of age or older with severe hemophilia A. In group A, patients received once-weekly prophylaxis with efanesoctocog alfa (50 IU per kilogram of body weight) for 52 weeks. In group B, patients received on-demand treatment with efanesoctocog alfa for 26 weeks, followed by once-weekly prophylaxis with efanesoctocog alfa for 26 weeks. The primary end point was the mean annualized bleeding rate in group A; the key secondary end point was an intrapatient comparison of the annualized bleeding rate during prophylaxis in group A with the rate during prestudy factor VIII prophylaxis. Additional end points included treatment of bleeding episodes, safety, pharmacokinetics, and changes in physical health, pain, and joint health.

Results

In group A (133 patients), the median annualized bleeding rate was 0 (interquartile range, 0 to 1.04), and the estimated mean annualized bleeding rate was 0.71 (95% confidence interval [CI], 0.52 to 0.97). The mean annualized bleeding rate decreased from 2.96 (95% CI, 2.00 to 4.37) to 0.69 (95% CI, 0.43 to 1.11), a finding that showed superiority over prestudy factor VIII prophylaxis (P<0.001). A total of 26 patients were enrolled in group B. In the overall population, nearly all bleeding episodes (97%) resolved with one injection of efanesoctocog alfa. Weekly prophylaxis with efanesoctocog alfa provided mean factor VIII activity of more than 40 IU per deciliter for the majority of the week and of 15 IU per deciliter at day 7. Prophylaxis with efanesoctocog alfa for 52 weeks (group A) improved physical health (P<0.001), pain intensity (P=0.03), and joint health (P=0.01). In the overall study population, efanesoctocog alfa had an acceptable side-effect profile, and the development of inhibitors to factor VIII was not detected.

Conclusions

In patients with severe hemophilia A, once-weekly efanesoctocog alfa provided superior bleeding prevention to prestudy prophylaxis, normal to near-normal factor VIII activity, and improvements in physical health, pain, and joint health.

Source: NEJM

New Class of Factor VIII Replacement Therapy Prevents Bleeds in Severe Hemophilia A


Weekly dosing with efanesoctocog alfa reduced annualized bleeding rate by 77%


Treatment with investigational efanesoctocog alfa (formerly BIVV001) — a new class of factor VIII replacement therapy — prevented bleeding episodes in patients with severe hemophilia A, according to results from the phase III XTEND-1 trial.

Among 133 patients, almost all males, who received once-weekly prophylaxis with efanesoctocog alfa 50 IU/kg for 52 weeks (group A), the median annualized bleeding rate was 0 (interquartile range 0-1.04), and the estimated mean annualized bleeding rate was 0.71 (95% CI 0.52-0.97), reported Annette von Drygalski, MD, PharmD, of the University of California San Diego, and colleagues.

In an analysis involving a subgroup of 78 patients, switching from a pre-study standard-care factor VIII prophylaxis regimen to efanesoctocog alfa decreased the mean annualized bleeding rate from 2.96 to 0.69, a reduction of 77%, which showed superiority over pre-study prophylaxis (annualized bleeding rate ratio 0.23, 95% CI 0.13-0.42, P<0.001), they noted in the New England Journal of Medicine.
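The 77% figure is plain arithmetic on the two rates reported in the trial, and the 0.23 rate ratio is the same comparison expressed as after/before. A quick sketch of that check:

```python
def percent_reduction(before: float, after: float) -> float:
    """Relative decrease between two rates, as a percentage."""
    return (before - after) / before * 100

# Mean annualized bleeding rates before and after switching to
# efanesoctocog alfa prophylaxis, as reported in the trial.
before, after = 2.96, 0.69

print(round(percent_reduction(before, after)))  # → 77
print(round(after / before, 2))                 # rate ratio → 0.23
```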

In a smaller group of 26 men who received on-demand treatment with efanesoctocog alfa for 26 weeks, followed by once-weekly prophylaxis for 26 weeks (group B), the annualized bleeding rate decreased when patients switched from on-demand treatment to prophylaxis (21.42 versus 0.69).

Overall, 80% of group A and 85% of group B patients had no spontaneous bleeding episodes during the prophylaxis periods. When bleeding events did occur, a single injection of efanesoctocog alfa 50 IU/kg provided effective resolution for 97% of events.

Of note, von Drygalski and team said that once-weekly prophylaxis with efanesoctocog alfa provided mean factor VIII activity of more than 40 IU/dL for approximately 4 days after administration and of 15 IU/dL at day 7.

Furthermore, prophylaxis with efanesoctocog alfa for 52 weeks improved physical health (P<0.001), pain intensity (P=0.03), and joint health (P=0.01).

“Collectively, these results show that by maintaining high sustained factor VIII activity, once-weekly efanesoctocog alfa provided substantial improvements in clinical outcomes and quality of life for patients with severe hemophilia A,” the authors wrote.

Efanesoctocog alfa is currently under review with the FDA for the treatment of hemophilia A, with a target action date of February 28, according to developers Sanofi and Sobi.

Breaking Through the von Willebrand Factor ‘Ceiling’

The ability to provide high, sustained factor VIII activity in patients with hemophilia A has been constrained by the von Willebrand factor-imposed half-life ceiling.

While normalizing factor VIII levels helps protect patients with hemophilia A from spontaneous and traumatic bleeding, thus preserving joint health, von Drygalski and colleagues pointed out that the interaction between factor VIII and endogenous von Willebrand factor limits the half-life of current factor VIII replacement products to 8 to 19 hours.

“Therefore, maintaining factor VIII levels in the normal range (50 to 150 IU per deciliter) or levels that are close to normal (>40 to <50 IU per deciliter) with currently available factor VIII therapies requires frequent administration, which confers a substantial treatment burden on people with hemophilia and their caregivers,” they wrote.

Efanesoctocog alfa circulates independently of endogenous von Willebrand factor; it is designed to overcome that half-life ceiling and thus extends protection from bleeds with once-weekly dosing. In this study, efanesoctocog alfa had a geometric mean half-life of 47.0 hours (95% CI 42.3-52.2).

In an editorial accompanying the study, Cindy Leissinger, MD, of Tulane University School of Medicine in New Orleans, said a major advantage of efanesoctocog alfa “is its value as a true factor VIII replacement that can be used to treat acute bleeding and can be measured by standard laboratory assays to allow for monitoring and dose adjustments when needed.”

She also pointed out that while trials of factor VIII gene therapies also show prolonged production of factor VIII, the latest results suggest that factor VIII production in most patients gradually declines over a few years to trough levels demonstrated with weekly efanesoctocog alfa.

“In a crowded field of transformative therapies for hemophilia, efanesoctocog alfa stands out as a winner — a major therapeutic advance that achieves highly protective factor VIII levels with a once-weekly infusion,” Leissinger concluded.

Study Details

For this multicenter study, previously treated patients ages 12 years and older with endogenous factor VIII activity of less than 1 IU/dL (<1%) or a documented genotype known to produce severe hemophilia A were included.

Patients were excluded if they had a positive test for factor VIII inhibitor at screening or a history of a positive inhibitor test, clinical signs or symptoms of a decreased response to factor VIII, other known coagulation disorders, a history of hypersensitivity or anaphylaxis to factor VIII therapies, or major surgery within 8 weeks before screening.

Among the 133 patients in group A, mean age was 33.9 years, 99% were male, 53% were white, and 22% were Asian. Of those with evaluable data during the efficacy period, 65% had no bleeding episodes, while 93% had 0-2 bleeding episodes.

Of the 26 patients in group B, mean age was 42.8 years, and all were white men.

A total of 362 bleeding events occurred during the study, with the majority (74%) in group B during the on-demand treatment period.

Chronic escitalopram in healthy volunteers has specific effects on reinforcement sensitivity: a double-blind, placebo-controlled semi-randomised study


Abstract

Several studies of the effects on cognition of selective serotonin reuptake inhibitors (SSRIs), administered either acutely or sub-chronically in healthy volunteers, have found changes in learning and reinforcement outcomes. In contrast, to our knowledge, there have been no studies of the chronic effects of escitalopram on cognition in healthy volunteers. This is important in view of its clinical use in major depressive disorder (MDD) and obsessive-compulsive disorder (OCD). Consequently, we aimed to investigate the chronic effect of the SSRI escitalopram on measures of ‘cold’ cognition (including inhibition, cognitive flexibility, memory) and ‘hot’ cognition (including decision-making and, particularly, reinforcement learning). The study, conducted at the University of Copenhagen between May 2020 and October 2021, used a double-blind placebo-controlled design with 66 healthy volunteers, semi-randomised — balanced for age, sex and intelligence quotient (IQ) — to receive either 20 mg of escitalopram (n = 32) or placebo (n = 34) for at least 21 days. Questionnaires, neuropsychological tests and serum escitalopram measures were taken. We analysed group differences on the cognitive measures using linear regression models as well as innovative hierarchical Bayesian modelling of the Probabilistic Reversal Learning (PRL) task. The novel and important finding was that escitalopram reduced reinforcement sensitivity compared to placebo on both the Sequential Model-Based/Model-Free task and the PRL task. We found no other significant group differences on ‘cold’ or ‘hot’ cognition. These findings demonstrate that serotonin reuptake inhibition is involved in reinforcement learning in healthy individuals. Lower reinforcement sensitivity in response to chronic SSRI administration may reflect the ‘blunting’ effect often reported by patients with MDD treated with SSRIs.

Introduction

Serotonin, or 5-hydroxytryptamine (5-HT), is a monoamine neurotransmitter implicated in several cognitive and affective brain functions [1]. Drugs that target serotonin transmission, such as selective serotonin reuptake inhibitors (SSRIs), are the first-line pharmacological treatments for many neuropsychiatric disorders such as major depressive disorder (MDD), obsessive-compulsive disorder (OCD) and anxiety [2]. Understanding the modulatory role of serotonin on cognition and reinforcement learning is particularly important [3].

Many studies examining the modulatory effects of serotonin on cognition have been conducted in experimental animals [4,5,6]. In rats, impairing serotonin function disrupted reversal learning, whereas enhancing serotonin function improved reversal learning [4]. In marmoset monkeys, targeted neurotoxic serotonin depletion of the orbito-frontal cortex, but not of the caudate nucleus, consistently produced reversal deficits [5, 6]. Marmosets have also shown reduced reinforcement sensitivity following serotonin depletion [7].

In humans, the modulatory effects of serotonin on cognition have largely been examined through acute dietary tryptophan depletion (ATD) [8,9,10,11,12,13,14,15,16,17,18], or through acute SSRI administration [19,20,21,22,23]. ATD has been shown to affect measures of both ‘cold’ (rational and non-emotional) and ‘hot’ (social and emotional) cognition [13,14,15,16,17,18]. Specifically, ATD induces ‘waiting’ impulsivity and impulsive behaviours, impairs goal-directed behaviour, and shifts behavioural control toward habitual responding in appetitive conditions but toward goal-directed control in aversive conditions [10,11,12]. Effects of ATD have also been seen on reinforcement behaviour [13], reversal learning [9, 14], learning and memory [15], affective and social cognition [9, 16, 17] and moral judgement [18]. Studies examining the acute administration of SSRIs have shown impaired probabilistic learning [19, 20] and impaired cognitive flexibility [20], but increased long-term memory recall [21], emotion recognition [22], and harm aversion for moral judgements [23]. One study showed that response inhibition improved with SSRI administration [20], whereas another showed no effect [19]. Taken together, a wide range of cognitive functions is affected by serotonin modulation in healthy volunteers.

Given that SSRIs are administered chronically in the treatment of neuropsychiatric disorders, it is particularly important to understand the long-term effects of SSRI administration on cognition. Currently, only a few studies have examined SSRIs administered sub-chronically, over approximately 7 days [24, 25]. Short-term administration of antidepressants may ameliorate the negative biases in information processing that are often present in mood and anxiety disorders [24]. A recent study examined both the acute and short-term effects of SSRIs. The results showed that acute administration did not affect reinforcement learning, but short-term administration resulted in increased learning from punishment, with reduced learning from reward [25]. However, there was no statistical difference in performance between the acute and short-term administration, and therefore these results must be interpreted with caution. In addition, studies in patients with MDD have shown that SSRIs impair learning from negative feedback, while having negligible effect on learning from positive feedback [26]. These findings demonstrate the difficulty in understanding the modulatory role of SSRIs on various cognitive and motivational processes. One study gave a tryptophan-rich diet to middle-aged healthy volunteers for 19 days and showed that emotional bias to negative stimuli was reduced [27].

Understanding the acute effects of SSRIs on cognitive processes in healthy volunteers and patients with MDD is complex. This may be due to the differing possible pre- and post-synaptic actions [28]. In addition, there is some evidence that the neuroplasticity effects of SSRIs emerge only after more chronic administration (14–21 days) [29, 30]. As such, chronic administration of SSRIs may provide more robust results. Importantly, chronic SSRI administration is an experimental model that better mimics a treatment model of MDD. In addition, to our knowledge, no studies have examined the more chronic effects of SSRIs on a wide range of cognitive measures.

Escitalopram is the active S-enantiomer of the racemic SSRI citalopram (RS-citalopram) [31]. Removing the R-enantiomer and retaining only the active S-enantiomer improves the drug’s effects [28]: there are no higher-dose restrictions, and the lowest dose becomes more efficacious [28]. In addition, escitalopram shows very high selectivity for the serotonin transporter and is thus well suited for testing the pharmacological actions of SSRIs [28, 31]. Moreover, escitalopram is an effective treatment for moderate-to-severe MDD and is one of the best-tolerated SSRIs [28, 31].

In the present study, we used a double-blind placebo-controlled design to examine the effects of the SSRI escitalopram, administered for an average of 26 days, on a comprehensive set of measures of ‘cold’ and ‘hot’ cognition, including decision-making and computational measures of reinforcement learning. We hypothesised that SSRI treatment would affect reinforcement-related behaviour, probabilistic reversal learning, and response inhibition.

Discussion

To our knowledge, this is the first study to determine the effects of chronic escitalopram administration on a broad range of measures of ‘cold’ and ‘hot’ cognition, including reinforcement learning, in healthy volunteers. In this double-blind placebo-controlled study, a relatively large group of healthy volunteers received either escitalopram or placebo for an average of 26 days. The novel and important finding was that escitalopram had the specific effect of reducing reinforcement sensitivity in two independent tests, but had no effects on other measures of ‘cold’ or ‘hot’ cognition.

Reinforcement behaviour

The reinforcement sensitivity parameter, as modelled here, governs the degree to which a participant is driven by their reinforcement history [35]. Using this innovative approach, we found reduced reinforcement sensitivity in the escitalopram group in two different test paradigms, one on model-based vs model-free behaviour (Fig. 2) and the other in a standard PRL task. A previous study examining how reinforcement is influenced by serotonergic modulation showed that acute tryptophan depletion decreased reinforcement sensitivity by impairing the representation of reward outcome value [13]. This was only the case for reward sensitivity; there was no effect on punishment sensitivity [13]. In the present study, we did not find any effects on reward or punishment learning rates, whereas one study showed increased reward learning neural signals, specifically related to prediction error, following 2-week SSRI administration [39]. However, it is important to note the different methodologies used in these studies, which makes a direct comparison of the results difficult.
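To make the role of such a parameter concrete, the following is a minimal sketch — not the authors’ actual hierarchical Bayesian model, and all parameter names and values here are illustrative — of a delta-rule learner on a two-option probabilistic task. A sensitivity parameter scales the learned values inside a softmax choice rule, so that lower sensitivity means choices are driven less by reinforcement history:

```python
import math
import random

def simulate_prl(n_trials=200, alpha=0.3, sensitivity=5.0,
                 p_reward=0.8, seed=0):
    """Minimal two-armed probabilistic learner (illustrative only).
    `sensitivity` scales learned values in the softmax: at 0 the agent
    ignores its reinforcement history and chooses at random.
    Returns the fraction of trials on which the better option was chosen."""
    rng = random.Random(seed)
    q = [0.0, 0.0]          # learned values for the two options
    correct = 0
    for _ in range(n_trials):
        # Softmax choice: higher sensitivity -> choices track values more.
        p0 = 1.0 / (1.0 + math.exp(-sensitivity * (q[0] - q[1])))
        choice = 0 if rng.random() < p0 else 1
        # Option 0 is "correct": rewarded with probability p_reward.
        rewarded = rng.random() < (p_reward if choice == 0 else 1 - p_reward)
        outcome = 1.0 if rewarded else 0.0
        # Delta-rule update of the chosen option's value.
        q[choice] += alpha * (outcome - q[choice])
        correct += (choice == 0)
    return correct / n_trials

# A blunted (low-sensitivity) agent tends to track the better option
# less reliably than a high-sensitivity agent with the same learning rate.
high = simulate_prl(sensitivity=5.0)
low = simulate_prl(sensitivity=0.5)
```

The point of the sketch is that sensitivity and learning rate are separable: the group difference reported here concerns how strongly learned values drive choice, not how quickly values are updated from reward or punishment.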

Importantly, our results are of considerable relevance to patients’ experience of taking SSRIs chronically. Patients often report experiencing a ‘blunting’ effect [40,41,42]. This blunting effect has also been demonstrated for rewarding and punishing stimuli: participants receiving 7 days of SSRI treatment showed lower neural processing of both rewarding and aversive stimuli [43]. In light of our own results, it is possible that the clinical effectiveness of SSRIs for MDD is due to this reduced responsiveness to aversive stimuli. However, if positive affect is also reduced, this would lead to the more general blunting effect often reported by patients taking chronic SSRIs. This is supported by the present study, in which lower reinforcement sensitivity would suggest decreased control over behaviour by both rewarding and punishing stimuli. It may also be further supported by our finding that the escitalopram group had significantly higher dysfunction on the dimensions/phases corresponding to orgasm/completion on the CSFQ-14. It is possible that participants taking escitalopram experience greater sexual dysfunction because they experience less pleasure, which has been supported by previous reports [44]. However, this is speculative, as there are other mechanisms that may explain this effect [45].

‘Hot’ cognition

Our results showed no effects on other measures of ‘hot’ cognition. Studies have previously shown that acute and sub-chronic SSRI intervention affects emotion recognition, specifically for recognition of fear and happiness [22]. In our chronic administration study, we did not examine emotion recognition in each emotion, but rather examined affective bias, which was not affected significantly by escitalopram. We also did not find any effects of escitalopram on moral judgements as previously reported following acute treatment [18, 23].

‘Cold’ cognition

Our results showed no significant effects on any measures of ‘cold’ cognition, thus contrasting with some of our previous data obtained following acute administration [20]. Previous studies manipulating serotonin acutely have shown alterations in both ‘cold’ and ‘hot’ cognitive measures [8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23]. However, it is important to note that our study examined SSRI administration over a longer time period, which has not previously been much studied in the context of human cognition. Contrasting with the present findings, Skandali and colleagues [20], who used similar neuropsychological tests, showed that participants administered escitalopram acutely made more errors to criterion during Stage 1 and exhibited increased lose-shifting after misleading negative feedback in the PRL task. However, they did not conduct the same hierarchical Bayesian modelling employed in the present study. Such an analysis of the Skandali et al. [20] data similarly shows reduced reinforcement sensitivity in the escitalopram group compared to placebo controls (unpublished findings). In addition, in the present study there was no effect of escitalopram on performance of the 3D-IED task, suggesting that this result may be specific to learning under greater uncertainty, as the 3D-IED is deterministic and the PRL is probabilistic in nature. We showed no effect of escitalopram on response inhibition, whereas Skandali et al. [20] showed that acute escitalopram improved stop-signal reaction time. However, in line with our results, Chamberlain et al. [19] showed no acute effect of the SSRI citalopram on response inhibition.

There are a number of points to note when interpreting the results from the present study. First, it is important to acknowledge the differences between acute and chronic SSRI administration. Previous literature has suggested that neuro-adaptive changes might represent homoeostatic mechanisms by which the brain regulates neurotransmission in response to the drug [46, 47]. This may suggest that the acute effects are not as robust as longer-term effects, where this mechanism would have stabilised. In addition, acute administration of SSRIs does not seem to affect neuroplasticity, which does occur with chronic administration [29, 30]. Moreover, the synaptic mechanism of action for acute and chronic SSRI administration differs [28, 48]. A meta-analysis showed that within the first week of SSRI administration, 5-HT concentrations drop and then increase over the following two weeks of administration, although this varies slightly across brain regions [48]. We chose a duration of 3 weeks because it is associated with clinical benefit in patients with MDD and is consistent with translational studies of neuroplasticity effects. However, we cannot rule out that neuroplasticity effects might be greater with a longer duration of escitalopram.

Second, the approaches for inducing changes in serotonin vary, and this could account for the inconsistent findings. For example, previous studies manipulating serotonin acutely with different methods and using the same PRL task showed inconsistent results using conventional behavioural measures [9, 19, 20]. It should be noted that there is currently no way to reliably determine interstitial serotonin concentrations non-invasively in humans, which means that interpretation of the manipulations must be inferential. Finally, it is likely that the escitalopram effects are less discernible in our cognitively high-performing (average IQ > 110) healthy volunteers than in patients with MDD. Studies on MDD have found that SSRI intervention often normalises abnormal neural processing [49,50,51,52], which in turn improves cognitive functioning and, at a later time point, mood [52]. As healthy individuals are cognitively intact, the effects may differ from those in patients with MDD. Differential effects on cognition and mood can be seen when studies are conducted with healthy volunteers or patients with MDD [8, 53]. As such, the mechanism of SSRIs may be more restorative in MDD, an action unnecessary in healthy individuals. Given that SSRIs are chronically administered to patients with neuropsychiatric disorders, the present results are more clinically relevant than those of acute studies.

One possible limitation of the study was a significant between-group difference in guessing group allocation. However, the escitalopram group guessed their allocation at chance level (53% guessing correctly), and over 15% of the placebo group guessed they were on the active substance. It is difficult to know how our results on guessing group allocation compare with other studies, as this measure is frequently not reported in the literature. The results are unlikely to have been affected by this, given the lack of broad cognitive changes and the specificity of the effect.

Our results, importantly, showed a specific significant effect on reinforcement sensitivity: escitalopram reduced reinforcement sensitivity, which may partly explain the blunting effect often reported by patients receiving chronic SSRI treatment. This study also highlights the need for future studies to examine chronic administration of SSRIs beyond 21 days. In addition, future studies should examine the chronic effects of SSRI administration on a similarly extensive battery of ‘cold’ and ‘hot’ cognition, particularly reinforcement behaviour, in patients with neuropsychiatric disorders such as MDD or OCD.

Conclusion

In contrast with previous reports on the acute effects of SSRI administration, we did not find any significant effects on ‘cold’ cognitive measures after more chronic administration (mean 26 days). Using an innovative computational modelling approach, we did find significant effects specific to reinforcement learning: chronic escitalopram reduced reinforcement sensitivity compared to placebo. These novel findings provide strong evidence for a key role of serotonin in reinforcement learning. The results have important clinical implications, as they may reflect the blunting effect often reported by patients with neuropsychiatric disorders receiving chronic SSRI treatment.

Source: Nature