The World’s First Drug to Treat Untreatable ‘Bad Cholesterol’: Study



Muvalaplin may be the world’s first drug to treat previously untreatable “bad cholesterol,” according to a new Australian study.

The study, published on Aug. 28 in the Journal of the American Medical Association (JAMA), showed that muvalaplin reduced lipoprotein(a), or Lp(a), a genetically inherited carrier for “bad cholesterol,” by up to 65 percent after two weeks of daily treatment.

Around 1 in 5 people globally have high Lp(a) levels. A serum level above 30 mg/dl puts people at risk of heart disease and heart attack, and above 50 mg/dl raises the risk of stroke.

“When it comes to treating high Lp(a), a known risk factor for cardiovascular disease, our clinicians currently have no effective tools in their kit,” Dr. Stephen Nicholls, a cardiologist, the study’s lead researcher, and a professor at Monash University, said in a press release.

Dr. Nicholls believes the drug could be a “game changer.” Muvalaplin is the first oral drug specifically designed to target Lp(a), according to the study. “Not only do we have an option for lowering an elusive form of cholesterol, but being able to deliver it in an oral tablet means it will be more accessible for patients,” he said.

Since the study covered only phase 1 of the clinical trial, which tests for drug safety, “it remains uncertain whether Lp(a) lowering with muvalaplin will reduce cardiovascular risk,” the study authors wrote.

Only 89 healthy individuals took muvalaplin in phase 1 of the clinical trial. Phase 2 of the trial is ongoing and is due to be completed in 2024.

What Is Lipoprotein(a)?

Lipoprotein(a) is a type of low-density lipoprotein (LDL) particle, most infamous for carrying “bad cholesterol.”

Researchers have debated the accuracy of describing high-density lipoprotein (HDL) as “good cholesterol” and LDL as “bad cholesterol,” since the cholesterol itself is the same; it is simply transported by different carriers that perform different functions. Some researchers argue the harm lies in the carrier, not in the cholesterol transported inside it.

While normal LDL levels can be influenced through lifestyle and dietary interventions, Lp(a) levels cannot, since they are determined mainly by genes. As a result, very few treatments can lower Lp(a).

Like all LDL particles, the job of Lp(a) is to carry cholesterol and fat from the liver to the tissues in the body. Some of it may get stuck to the blood vessel walls, causing the cholesterol it carries to spill out and form plaques.

Lp(a) is made in the liver, where an extra string of protein known as apolipoprotein(a) is attached to the LDL particle. This extra protein chain makes Lp(a) stickier and may make it more likely to stick together or build up in blood vessel walls.

Normal low-density lipoprotein, LDL (left), and lipoprotein(a), Lp(a). (The Epoch Times)

What Does Muvalaplin Do to Lipoprotein(a)?

Muvalaplin works by blocking the first step in the liver that causes the two proteins to bind together and form a chain.

“This approach mimics naturally occurring variants,” the authors wrote. In people with these variants, the two proteins cannot interact, which results in naturally low Lp(a) levels.

The study followed 89 healthy individuals given muvalaplin at varying doses: half received a fixed dose, while the other half received an increasing dosage. A further 25 participants took a placebo.

Within 24 hours of the participants taking muvalaplin, researchers observed a reduction in Lp(a), with a further reduction on repeated dosing. Participants who took the highest dose saw the most rapid Lp(a) clearance.

Overall, participants saw a maximum Lp(a) reduction of 60 percent after adjusting for placebo effects, and about 93 percent of participants who took the highest dose had an Lp(a) level below 50 mg/dl after treatment.

Research from the Copenhagen General Population Study estimated that lowering Lp(a) by 50 mg/dl over a five-year period could reduce the risk of a recurrent cardiovascular event by 20 percent.

The study primarily tested for safety and tolerability. No deaths or serious adverse events were reported among the 89 participants who took muvalaplin.

Common adverse events included headaches, back pain, and fatigue among participants who took a fixed dose. For participants whose dose increased, common symptoms included headaches, diarrhea, abdominal pain, nausea, and fatigue.

Other Treatments for High Lipoprotein(a)

Apart from muvalaplin, lipoprotein apheresis can also lower Lp(a) levels.

The U.S. Food and Drug Administration (FDA) approved lipoprotein apheresis in 2018. The device filters and removes LDL, including Lp(a), from the blood.

Studies have shown that lipoprotein apheresis can reduce LDL and Lp(a) levels by up to 75 percent immediately after the treatment. Chronic apheresis could reduce the recurrence of cardiovascular events over two years.

Other drugs are still in clinical trials, such as olpasiran, a gene-silencing drug that disrupts the expression of Lp(a) in the body.

Doctor of pharmacy and cardiovascular research scientist James DiNicolantonio at Saint Luke’s Mid America Heart Institute told The Epoch Times via email that low-dose aspirin can also reduce cardiovascular risk in high Lp(a) individuals.

A 2002 Japanese study showed that taking 81 milligrams of aspirin daily reduced serum Lp(a) levels in people with high Lp(a) by approximately 80 percent. One report suggested that aspirin may be particularly effective in people with high Lp(a) rather than in the general population. Other papers have shown that low-dose aspirin significantly reduces the risk of cardiovascular events in individuals with high Lp(a). However, some studies have produced conflicting results.

Given that a safe and cheap medication already exists, Mr. DiNicolantonio wondered whether the benefit–risk profile of muvalaplin and lipoprotein apheresis would compare favorably with taking baby aspirin.

Covariate Adjustment in Cardiovascular Randomized Controlled Trials: Its Value, Current Practice, and Need for Improvement


Abstract

In randomized controlled trials, patient characteristics are expected to be well balanced between treatment groups; however, adjustment for characteristics that are prognostic can still be beneficial with a modest gain in statistical power. Nevertheless, previous reviews show that many trials use unadjusted analyses. In this article, we review current practice regarding covariate adjustment in cardiovascular trials among all 84 randomized controlled trials relating to cardiovascular disease published in the New England Journal of Medicine, The Lancet, and the Journal of the American Medical Association during 2019. We identify trials in which use of covariate adjustment led to a change in the trial conclusions. By using these trials as case studies, along with data from the CHARM trial and simulation studies, we demonstrate some of the potential benefits and pitfalls of covariate adjustment. We discuss some of the complexities of using covariate adjustment, including how many covariates to choose, how covariates should be modeled, how to handle missing data for baseline covariates, and how adjusted analyses are viewed by regulators. We conclude that contemporary cardiovascular trials do not make best use of covariate adjustment and that more frequent use could lead to improvements in the efficiency of future trials.
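As a rough illustration of that power gain, the sketch below simulates a simple two-arm trial and compares unadjusted and covariate-adjusted estimates of the treatment effect. It is purely illustrative: the sample size, effect size, and variable names are assumptions, not data from the trials reviewed in the article.

```python
# Illustrative simulation only: adjusting for a prognostic baseline covariate
# shrinks the standard error of the treatment-effect estimate (more power).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
treat = rng.integers(0, 2, size=n)              # 1:1 randomization
baseline = rng.normal(size=n)                   # prognostic covariate (standardized)
outcome = 0.3 * treat + 1.0 * baseline + rng.normal(size=n)  # true effect = 0.3

df = pd.DataFrame({"outcome": outcome, "treat": treat, "baseline": baseline})

unadjusted = smf.ols("outcome ~ treat", data=df).fit()
adjusted = smf.ols("outcome ~ treat + baseline", data=df).fit()

for name, fit in [("unadjusted", unadjusted), ("adjusted", adjusted)]:
    print(f"{name:>11}: effect = {fit.params['treat']:.3f}, SE = {fit.bse['treat']:.3f}")
# The adjusted standard error is smaller because the covariate explains part of
# the outcome variance, so the same trial tests the treatment with more power.
```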

Highlights

Too many contemporary cardiovascular trials do not use covariate adjustment in the primary analysis
Adjustment for a limited number of prognostic covariates is simple, has few risks, and is viewed as appropriate by regulators
Covariates used for adjustment should be prespecified before unblinding
Adjustment for prognostic covariates can offer a meaningful gain in statistical power

In 1918 We Faced the Flu Pandemic. Today, We’re Still Fighting the War.


 1918: The War We Lost

In 1918, the United States fought two wars. One it lost, and one it won.

You may have learned about World War I in history class, or even from your relatives. As a member of the Allied Forces, the United States defeated the Central Powers — a victory touted by history books, movies, and novels.

The second war, however, had a more elusive opponent. It descended perniciously, quietly claiming lives while armies concerned themselves with foxholes and mustard gas. In the first six months, this enemy killed 25 million people worldwide.

Ultimately, between 50 and 100 million lives — five percent of the world’s population at the time — would be lost as a result of the conflict.

This second enemy was, of course, the flu virus. By the time Americans realized that the country was under siege, it was too late to stop it. The flu made its way through the U.S., Europe, and Asia with terrifying speed; people who had been well in the morning dropped dead in the street by dinner time. Families that had already lost sons, fathers, and brothers to the war abroad dwindled as the virus attacked them, affecting the remaining young and healthy. In just one year, the average life expectancy for an American dropped by 12 years.

Over the century that followed, Americans would face three more pandemic flus, but none of them like the one in 1918. The 1957 pandemic flu killed roughly 1.1 million people worldwide; another in 1968 wiped out about another million globally. Most recently, the 2009 H1N1 pandemic flu killed between 151,700 and 575,400 people worldwide, according to estimates from the Centers for Disease Control and Prevention (CDC).

Today, a century after the 1918 pandemic, we know much more about the virus — how it spreads, how it kills. We now have influenza vaccines — unheard of in 1918 — that provide us with (albeit limited) protection. And sophisticated tracking mechanisms help us predict which flu viruses we might encounter in a given year.

We have not, however, completely vanquished the flu. In this particularly bad flu season in the U.S., we need little reminder that the virus is hardy and evolves rapidly. The flu that ravaged humanity in 1918 is not the same strain making headlines in 2018. Likewise, if another global pandemic flu is inevitable, we can’t assume the virus will be one we’ve seen before.

Today, our relationship with the flu has shifted from an adversarial, bellicose one to one of competition; we are running a race, no longer fighting a war. To survive another century, or another season, public health experts will need to stay one step ahead, armed with an arsenal provided by science and a war plan drafted from the history of the battle we lost.

Why (and How) the Flu Still Kills

A high fever, fluid in the lungs, crushing fatigue, and body aches — if you’ve ever come down with influenza, it likely needs no introduction. It’s often easy to distinguish the full-blown flu from the common cold because the flu’s symptoms tend to come on suddenly and with an intensity that makes it hard to deny.

When a person is infected by any pathogen — a virus or a bacterium — they usually won’t know it until that pathogen has started damaging cells. That kicks the immune system into gear, making you start to feel sick. The fever, aches, and mucus all too familiar to flu-sufferers aren’t from the virus itself, but rather are the side effects of the body’s attempt to vanquish it.

Even though our immune systems respond rapidly and with such force, they aren’t always successful in stopping the microbes wreaking havoc on the body’s cells. While most of us who get the flu just stay home and rest, the flu makes some people seriously ill — they have to be hospitalized. Some even die as a result of complications from the flu.

(The flu doesn’t directly cause death. Instead, the virus can induce an infection like pneumonia, or exacerbate an underlying condition. But oddly enough, it’s usually the body’s too-aggressive immune response to the flu that ultimately kills people).

A flu virus spreads when a healthy person ingests or inhales virus-infected droplets flung into the air by a sick person’s cough, sneeze, or mere breath. The CDC does not know exactly how many people get the flu each year. The agency doesn’t know how many people die from it either. People who come down with the flu don’t always seek medical attention. Even when they do, doctors don’t always test for it.

Those caveats make the data on this year’s flu season more striking: as of the first week of February, the number of flu cases in the United States was the highest since the 2009 pandemic, and hospitalizations at this point in the season were the highest since the CDC started tracking them in 2005. Both numbers are still climbing.

When we talk about the flu, we aren’t talking about a single virus. There are four types of influenza viruses — but only two of them cause serious illness in humans, Catherine Beauchemin, an associate professor of virophysics at Ryerson University in Toronto, explained to Futurism. You might remember hearing about H1N1 (the flu type that hit us in 2009) and H3N2 (the type of flu causing problems this year) — those Hs and Ns stand for hemagglutinin and neuraminidase, proteins found on the viruses’ surfaces that help the virus either enter cells (H) or detach from infected cells to go infect others (N). The numbers identify groups of strains with similar Hs and Ns.

The flu mutates remarkably quickly, changing dramatically to dodge our antibodies in the span of a flu season or two. That means it can infect people who previously contracted it.

That’s why we get flu shots every year. Even though researchers have a sophisticated global tracking system to anticipate which strain might affect a region in a given year, there’s still a surprising amount of guesswork involved.

Flu seasons typically occur during the colder months, when people are more likely to congregate indoors. Because the flu season is opposite in Australia, the CDC’s Epidemiology and Prevention Branch in the Influenza Division can track that country’s flu season about six months before flu season arrives in North America. As travelers move the virus from Australia to Europe, Asia, and the U.S., public health experts can anticipate which strain will likely be the one to make people sick in the northern hemisphere that year.

The system, and the vaccine made from it, is far from perfect though. “The issue is that the recommendations have to be made some six months before the vaccine is actually used,” Richard Webby, Director of the WHO Collaborating Center for Studies on the Ecology of Influenza in Animals and a member of the Department of Infectious Disease at St. Jude Children’s Research Hospital, told Futurism. Researchers need that time to analyze data from Australia’s flu season, then manufacture and distribute the vaccines.

For a virus that evolves so quickly, that lead time can also be problematic. “There have been instances where the viruses have changed between when the recommendations have been made and when the vaccine has been administered, leading to suboptimal performance,” Webby added. For example, the latest data on this year’s flu vaccine shows it’s around 17 percent effective, though that may change before the flu season ends.

This year’s flu virus, H3N2, isn’t like other strains that have circulated in recent years. It binds to cells differently, and seems to be mutating more rapidly, making it difficult to study and create a vaccine against. The strain also doesn’t grow well in eggs, where vaccine viruses are most commonly grown before being put into vaccines.

“We don’t have a flu vaccine problem so much as we have an H3N2 vaccine problem,” Ed Belongia, a vaccine researcher and director of the Center for Clinical Epidemiology and Population Health at Wisconsin’s Marshfield Clinic, recently told STAT News.

Although we can identify and classify them, track them, and create vaccines to defend against them, the viruses continue to evade us, evolving faster than we can keep up — sickening or killing people in the process.

Fighting the Flu of the Future

In 1918, many of the treatments we have today for secondary infections like pneumonia or strep throat either didn’t exist or were not yet widely available. That partially explains why the epidemic killed so many.

Today, the antiviral Tamiflu can quell symptoms within the first 48 hours of their onset, or even prevent them in the first place. But it’s pricey (a five-day course costs $100 minimum) and comes with risks, especially for children and teens, who are more likely to experience serious psychological side effects and “seizures, confusion, or abnormal behavior early during their illness,” according to the CDC.

In 1918, many people felt that the flu descended upon their community out of nowhere. Today, we can at the very least see the flu coming so our doctors and emergency rooms can be prepared — even if we don’t have weapons powerful enough to completely stop it yet.

One elegant solution is to track the spread of the flu using data from the smart devices sick people already use. Smart thermometer company Kinsa does just that. Over the past six years, the device’s 1 million users have gathered real-time data to track infectious disease with the help of “smart thermometers” and a smartphone app. Though it may seem counter-intuitive that a relatively small number of users could show how many people have the flu and where, the flu-tracking data over the past two years has lined up with CDC data — and the app gathers it much more quickly than public health agencies can. Nationally, the number of people with the flu is 39 percent higher than it was at this time last year, according to Kinsa’s most recent report.

Some are thinking bigger than treating or tracking the flu. The holy grail for flu treatment would be a vaccine that doesn’t change from year to year depending on the annual strain. If everyone could get the vaccine just once and be protected against all strains of flu for life, hundreds of thousands of lives could be saved every year.

We’re talking, of course, about a universal flu vaccine.

A team of researchers out of UCLA is genetically engineering flu viruses that could become candidates for a universal vaccine. The researchers engineered the viruses to stimulate a bigger, more targeted immune response than real-world strains. So far, the team has only developed the potential vaccine in the lab; the researchers hope to test two strains in animal models before moving into human trials.

Pharma company BiondVax Pharmaceuticals recently completed Phase 3 clinical trials for its universal vaccine candidate, which incorporates synthetic compounds. It has already received a patent in India. This type of vaccine targets specific areas on the surface of a flu virus that determine the phase and severity of the immune response. Being able to “ramp up” or “tamp down” different aspects of that process in animal models has convinced researchers that the vaccine could be useful in preventing other infectious diseases beyond the flu, such as HIV and malaria.

FluGen, a startup out of the University of Wisconsin-Madison, is also working with a genetically-mutated form of the virus to make a universal vaccine. According to FluGen’s website, the company’s genetically-altered viruses have had a gene deleted so that they “can infect cells, express the entire spectrum of influenza RNA and proteins, yet cannot produce any infectious virus particles.”

But getting there has involved substantial controversy. You have to break a few eggs to make an omelet; to create a vaccine against mutating flu viruses, you have to mutate a few flu viruses, all while taking care not to create some kind of super-virus. When researchers mutated the H1N1 virus from the 2009 pandemic, and when they recreated the 1918 pandemic flu, the global scientific community called their methods and safety into question.

Other researchers, like those on a team at Georgia State University, are harnessing nanoparticles to work toward a universal vaccine. Most vaccines target the outer surface of a virus’s protein, which varies across different viruses. But if nanoparticles could target further down, on a part of the protein called the stalk, a vaccine could have broader efficacy. In experiments detailed in a study published in Nature Communications in January 2018, mice inoculated with nanoparticles containing the stalk protein mounted an immune response and were completely protected against four different strains of the flu, including this year’s H3N2. The researchers will need to conduct more animal studies — first in ferrets, as their respiratory systems are quite similar to those of humans — before testing the vaccine on humans.

There are other logistical hurdles to a universal vaccine. There’s little financial incentive for pharmaceutical companies to develop vaccines, much less universal ones only administered once in a person’s life. Distribution of vaccines can be challenging and shortages are not uncommon. Plus, people just love to find reasons why they shouldn’t get the jab.

But these challenges are not insurmountable. A universal vaccine could be possible within a generation. How well it works, well, that’s another question.

As 1918 came to a close, the editors of the Journal of the American Medical Association published their final edition of the year. The editors reflected on what could be learned from the two wars humanity fought that year, then turned their attention to the future.

“Medical science for four and one-half years devoted itself to putting men on the firing line and keeping them there,” they wrote. “Now, it must turn with its whole might to combating the greatest enemy of all — infectious disease.” In another century, perhaps the flu of today — the damage it causes, the lives lost to it — will seem equally distant, perhaps even innocuous.

Black Lung Study Finds Biggest Cluster Ever Of Fatal Coal Miners’ Disease


In this historical image, a doctor reviews an X-ray of a patient with black lung disease. Federal researchers say they’ve now identified the largest cluster ever recorded of the most advanced stage of the disease.

Epidemiologists at the National Institute for Occupational Safety and Health say they’ve identified the largest cluster of advanced black lung disease ever reported, a cluster that was first uncovered by NPR 14 months ago.

In a research letter published Tuesday in the Journal of the American Medical Association, NIOSH confirms 416 cases of progressive massive fibrosis or complicated black lung in three clinics in central Appalachia from 2013 to 2017.

“This is the largest cluster of progressive massive fibrosis ever reported in the scientific literature,” says Scott Laney, a NIOSH epidemiologist involved in the study.

“We’ve gone from having nearly eradicated PMF in the mid-1990s to the highest concentration of cases that anyone has ever seen,” he said.

The clinics are operated by Stone Mountain Health Services and assess and treat coal miners mostly from Virginia, Kentucky and West Virginia, a region that includes what have historically been some of the most productive coalfields in the country.

“When I first implemented this clinic back in 1990, you would see … five [to] seven … PMF cases” a year, says Ron Carson, who directs Stone Mountain’s black lung program.

The clinics now see that many cases every two weeks, he says, and have had 154 new diagnoses of PMF since the fieldwork for the NIOSH study concluded a year ago.

“That’s an indication that it’s not slowing down,” Carson says. “We are seeing something that we haven’t seen before.”

A slide from a presentation by the National Institute for Occupational Safety and Health shows the progression from a healthy lung to advanced black lung disease.


Laney acknowledges that the full scope of what he calls an epidemic is still unknown. “Even with this number, which is substantial and unacceptable, it’s still an underestimate.”

“Nobody looks forward to dying”

PMF, or complicated black lung, encompasses the worst stages of the disease, which is caused by inhalation of coal and silica dust at both underground and surface coal mines. Miners gradually lose the ability to breathe, as they wheeze and gasp for air.

Edward Brown is a 55-year-old former coal miner with progressive massive fibrosis, or complicated black lung disease.

 

“I’ve seen it too many times,” said Charles Wayne Stanley, a Stone Mountain client with PMF, who spoke with NPR in 2016. “My wife’s grandpa … [I] watched him take his last breath. I watched my uncle die with black lung. You literally suffocate because you can’t get enough air.”

Lung transplants are the only cure, and they’re possible only when miners are healthy enough to qualify.

“[I] can’t breathe, you know. [I] can’t do nothing hardly like I used to,” says Edward Brown, a 55-year-old retired miner from Harlan, Ky., who was diagnosed with PMF at both Stone Mountain and another medical clinic.

“That’s all I got to look forward to is to get worser and worser,” Brown says, pausing for a deep sigh and nervous chuckle. “Nobody looks forward to dying, you know, but it’s a-comin’ and then that worries me.”

Brown’s age and disease fit another finding of the NIOSH study and a trend Carson first disclosed to NPR in December 2016.

“Miners are dying at a much younger age,” he says, noting that in the 1990s, the clinic’s PMF diagnoses typically involved miners in their 60s, 70s and 80s. Now the disease strikes miners in their 50s, 40s and even 30s with fewer years mining coal.

“A high proportion” of the miners in the NIOSH study had severely advanced disease and “coal mining tenure of less than 20 years, which are indications of exceptionally severe and rapidly progressive disease,” the study says.

The lung of deceased West Virginia coal miner Chester Fike was taken out during a double lung transplant when he was 60. He worked in the mines for 35 years.

 

The Stone Mountain study follows a NIOSH review of cases at a small clinic in Coal Run Village, Ky., in 2016. NIOSH researchers confirmed 60 diagnoses of PMF there in 20 months. That alarmed them because NIOSH had earlier reported only 99 cases nationwide in five years.

At the same time, an NPR survey of 11 black lung clinics in Kentucky, Virginia, Pennsylvania and Ohio identified 962 cases, 10 times the original NIOSH count. Since then, NPR’s ongoing survey of clinics has counted nearly 1,000 more cases.

The NPR investigation also found that the likely cause of the epidemic is longer work shifts for miners and the mining of thinner coal seams. Massive mining machines must cut rock along with the coal, and the resulting dust contains silica, which is far more toxic than coal dust.

The spike in PMF diagnoses is also due to layoffs and retirements brought on by the decline in coal mining. Miners who had put off getting checked for black lung earlier began streaming into clinics, especially if they needed the medical and wage replacement benefits provided by black lung compensation programs.

A public health emergency?

There is also concern for the 50,000 coal miners still working.

“They really need to declare this a public health emergency,” says Joe Wolfe, an attorney in Norton, Va., who helps miners file claims for black lung compensation.

“If you had 400 cases of E. coli, [NIOSH] would flood the area with technicians and doctors and nurses checking people’s health,” Wolfe adds. “There are people literally working in the mines right now … that have complicated black lung that do not have a clue.”

NIOSH doesn’t have that authority, according to David Weissman, who directs the agency’s respiratory health program in Morgantown, W.Va. Public health emergencies are declared by the secretary of the U.S. Department of Health and Human Services.

“But I will say that this is a very important problem. We’re very passionate about this problem,” Weissman says. “And we’re going to keep doing everything in our power to address it.”

Multiple NIOSH and independent studies are underway or planned to try to pinpoint the number of miners who have the disease, as well as the causes.

A mining disaster in slow motion

Jess Bishop, a black lung victim, takes his last breaths while his sons — also coal miners — keep vigil in Logan County, W.Va., in 1976. The disease spiked in the 1960s and ’70s but then plummeted with the passage of mine safety laws.

 

Coincidentally, new federal regulations that are supposed to limit exposure to dangerous levels of coal and silica dust were fully implemented in 2016, a few months before NPR first reported the PMF epidemic. The Trump administration recently announced a “retrospective study” of the new regulations, a move that has mine safety advocates concerned, especially given the epidemic of the disease caused by mine dust.

“It would be outrageous for any undercutting of those regulations that puts miners [back] in harm’s way and subjects even more of them to this terrible disease,” says Joe Main, the former mine safety chief at the federal Mine Safety and Health Administration.

“When we think we know as much as we thought we should know about the disease, the next day [there’s] worse information,” says Main. “It shows that the depth of the disease is worse than what we knew the day before.”

Main pushed for the tougher mine dust exposure limits. His successor at MSHA is David Zatezalo, a former mining company executive.

“We are not proposing to weaken this rule,” Zatezalo tells NPR in a written statement. “We are planning to collect feedback on the rule from stakeholders, which was both a commitment previously made by MSHA, and a directive from President Trump, who strongly supports America’s miners.”

Zatezalo did not respond to requests for an interview. His agency’s formal notice for the “retrospective study” labels it a “deregulatory” action, which implies less regulation.

At a congressional hearing today in Washington, Zatezalo was asked directly about his agency’s “retrospective study” of the tougher mine dust limits imposed by the Obama administration.

“Do you plan to rollback any aspect of the 2014 respirable dust rule?” asked Rep. Bobby Scott, D-Va., the ranking Democrat on the House Committee on Education and the Workforce.

David Zatezalo, the Assistant Secretary of Labor for Mine Safety and Health, was asked about the advanced black lung epidemic at a congressional hearing in Washington, D.C., on Feb. 6, 2018.

“I do not,” Zatezalo responded.

Zatezalo was also asked about his agency’s own description of the “retrospective study” of the new mine dust regulations as “deregulatory.”

“I can’t tell you why it was listed as a deregulatory item,” Zatezalo responded, unless, he added, that had something to do with the frequency of testing using new dust monitors.

“Each case of advanced black lung disease is an entirely preventable tragedy, and represents mine operators’ unwillingness to adequately control mine dust exposures, and safety regulators’ failure to set, monitor and enforce standards necessary to protect miners,” Scott said in a statement to NPR.

“MSHA should not bend to pressure from well-connected coal mine executives to roll back the regulations,” Scott added. “The Mine Safety and Health Administration (MSHA) cannot keep looking the other way while the burden of this preventable disease grows.”

The burden is clear on the walls of Ron Carson’s office at the Stone Mountain black lung clinic in St. Charles, Va. They’re lined with photographs and other mementos of clinic patients, some who died from the disease.

Carson describes a kind of mining disaster in slow motion, in which the disease takes years to develop, even though it strikes quicker now, and in which each death is solitary. He points to a half sheet of white paper tacked to his bulletin board. It shows a phrase he printed out from an article about black lung.

“Mining disasters get monuments,” Carson says, his voice softening. “Black lung deaths get tombstones. And I’ve seen many a tombstone in [the last] 28 years from black lung. And I’m seeing more now. A lot more now.”

NFL Career Doesn’t Shorten Players’ Lives


No elevated mortality rate or deaths attributed to CTE in 30-year study

Men who played professional football for a median 5 years showed no special mortality risk in the following decades, versus athletes who played just a few games in the National Football League (NFL), researchers found in a retrospective study.

NFL players with significant careers and those hired to play three games during a strike had similar rates of death over 30-year follow-up (4.9% versus 4.2%, adjusted HR 1.38, 95% CI 0.95-1.99), according to a group led by Atheendar Venkataramani, MD, PhD, of University of Pennsylvania in Philadelphia.

“Given the small number of events, analysis of longer periods of follow-up may be informative,” the authors noted in their study published online in the Journal of the American Medical Association. There were just 181 deaths in the whole cohort, leaving the analysis possibly underpowered; the oldest men in both cohorts were in their mid- to late-50s when the analysis was performed, the investigators acknowledged.
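For readers unfamiliar with how such an adjusted hazard ratio is estimated, here is a minimal sketch of fitting a Cox proportional hazards model, the standard tool for this kind of mortality comparison. The data are simulated, and the column names and covariates are illustrative assumptions, not the study’s actual variables.

```python
# Minimal sketch of estimating an adjusted hazard ratio with a Cox model.
# Simulated data only; not the NFL study's dataset or covariate list.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 3800
career = rng.integers(0, 2, size=n)             # 1 = career NFL player, 0 = replacement
bmi = rng.normal(31, 3, size=n)                 # example baseline covariate
# Toy survival model: event rate depends on the covariates
hazard = 0.002 * np.exp(0.3 * career + 0.05 * (bmi - 31))
time_to_death = rng.exponential(1 / hazard)
followup = np.minimum(time_to_death, 30)        # administrative censoring at 30 years
died = (time_to_death <= 30).astype(int)

df = pd.DataFrame({"years": followup, "died": died, "career": career, "bmi": bmi})
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
cph.print_summary()  # exp(coef) for "career" is the adjusted hazard ratio with its 95% CI
```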

The temporary players in the study were the 879 replacements hired to play during a 3-week NFL player strike in 1987. A temporary player was typically a former college football player, someone released from an NFL team during the preseason, or a former player from a rival league.

This group was compared with 2,933 NFL professionals who started their league careers from 1982 to 1992, with median careers of 5 seasons (interquartile range 2-8).

“NFL replacement players in the 1987 season serve as an appropriate comparison group for career NFL players under the assumption that replacement players were experienced athletes fit enough to obtain an NFL roster position with the exit of another professional athlete, but relatively unexposed to an NFL career (i.e., expected to play at most three games),” Venkataramani’s group wrote.

“The cleverly conceived cohort of replacement players had a much lower ‘dose’ of head trauma and other factors related to a career in the NFL than long-time professional players and presumably returned to much less physically demanding jobs after their participation as NFL replacement players,” agreed Steven DeKosky, MD, of the University of Florida in Gainesville, in an accompanying editorial.

Among career NFL players, the most common causes of death were cardiometabolic disease (35.4%), transportation injuries (13.9%), unintentional injuries (10.4%), and cancer (10.4%). For the replacements, the leading causes were cardiometabolic disease (51.4%), self-harm and interpersonal violence (13.5%), and cancer (10.8%).

Notably, the sole neurological cause of death was amyotrophic lateral sclerosis, responsible for 4.9% of deaths in the career NFL player group and none among replacement players.

DeKosky commented that “no dementia cases were reported as the cause of death in this 1982-1992 cohort despite concerns about chronic traumatic encephalopathy (CTE), which had not become a recognized issue in football until a little more than 10 years ago.”

“Although the life expectancy of professional football players was not significantly reduced based on the current evidence, the health of professional athletes should remain a focus of future research. Clinicians and researchers should now turn to the pressing issues of understanding how such repeated trauma leads to manifestations of neurodegenerative disease (and sometimes overlapping cognitive, neuropsychiatric, and movement disorders such as parkinsonism, tremor, and depression) and why and how altered tau protein plays a role in CTE,” he urged.

“There have been repeated calls for large longitudinal studies of former players. Some studies designed to address this issue are under way with the help and cooperation of former players. Such studies will help determine the actual incidence and prevalence of these neurodegenerative diseases and will provide both a perspective on the real risks associated with repeated subconcussive brain trauma and an understanding of the susceptibility to them,” according to DeKosky.

Venkataramani and colleagues said that the possibility that baseline differences may have biased the results despite adjustment was another major limitation of their study.

Stereotactic Radiosurgery Benefit for Brain Mets: Case Closed?


The use of stereotactic radiosurgery (SRS) alone in patients with limited (one to three) brain metastases results in less cognitive deterioration than when it is combined with whole brain radiotherapy (WBRT), investigators have reported.

In a study led by Paul D. Brown, MD, director of the CNS stereotactic radiotherapy program at the University of Texas MD Anderson Cancer Center, researchers determined there was less cognitive deterioration at 3 months in patients who underwent SRS alone (64%) than in patients who underwent SRS plus WBRT (92%).

The results, published online in the Journal of the American Medical Association, showed a “significant difference” in the level of cognitive deterioration, particularly considering the controversy surrounding the role WBRT should play in the treatment of patients with brain metastases, the authors wrote.

The use of WBRT has been associated with cognitive decline, and while previous randomized clinical trials have demonstrated improved intracranial tumor control with the combined use of WBRT and SRS for brain metastases, none have shown any significant survival advantage with adjuvant WBRT.

“Central to this issue is whether tumor progression anywhere in the brain is more detrimental to a patient’s well-being than the potential deterioration of cognitive function and quality of life associated with WBRT,” Brown and his colleagues wrote. “Because more than 200,000 individuals in the United States alone are estimated to receive WBRT each year, it is important that the potential benefits and risk of adjuvant WBRT be clearly defined.”

The study involved 213 patients from 34 institutions in North America who had between one and three brain metastases (all less than 3 cm in diameter); participants were randomized to receive SRS or SRS plus WBRT.

After excluding patients who died, did not return for a 3-month or subsequent evaluation, or did not complete the required baseline tests, 111 patients were available for evaluation.

There was less cognitive deterioration at 3 months with SRS alone (40 of 63 patients, 63.5%) than with SRS plus WBRT (44 of 48 patients, 91.7%), a difference of -28.2% (90% CI, -41.9% to -14.4%).

In addition, quality of life was higher at 3 months with SRS alone (mean change from baseline, −0.1 versus −12.0 points; mean difference, 11.9; 95% CI, 4.8-19.0 points).

The time to intracranial failure was shorter for those in the SRS-alone group compared with those in the SRS-plus-WBRT group (hazard ratio [HR], 3.6; 95% CI, 2.2-5.9), and there was no significant difference in functional independence between the two groups at three months.

Median overall survival was 10.4 months for patients receiving SRS alone and 7.4 months for those given the combined treatments (HR, 1.02; 95% CI, 0.75-1.38).

“In the absence of a difference in overall survival, these findings suggest that for patients with one to three brain metastases amenable to radiosurgery, SRS alone may be a preferred strategy,” the team concluded.

 Limitations to the study included the fact that a majority of the participants had lung cancer and the trial did not attempt to include other types of primary cancers. However, Brown and his colleagues noted that lung cancer is the predominant primary cancer reported in most brain metastases trials and that “there is no obvious biological basis to believe that the quality-of-life and cognitive effects of WBRT would vary between different primary cancers.”

The authors also noted that there was significant patient dropout in their trial, mostly due to death, and that clinicians and trial participants were not blinded to treatment.

In an editorial accompanying the study, titled “Whole Brain Radiotherapy for Brain Metastases: Is the Debate Over?,” Orit Kaidar-Person, MD, Carey K. Anders, MD, and Timothy M. Zagar, MD, wrote that the trial “confirms previous recommendations that WBRT should not be routinely added to SRS for patients with brain metastases of limited number or size.”

But, while there may be little role for WBRT in the type of patient enrolled in this particular study, the editorial argued that based on the findings, and “until proven otherwise,” WBRT could still have an important role to play in the treatment of patients not in that disease category.

“However, the study results cannot be extrapolated to infer that SRS is the standard for patients with four or more metastases or that WBRT no longer has a role in the treatment of brain metastases,” Kaidar-Person and colleagues wrote.

No Breast Cancer Risk Seen With IVF


Dutch study finds protective effect in some instances.

Women undergoing in vitro fertilization treatment did not seem to have an increased risk of breast cancer years later, according to a large Dutch study.

After a median follow-up of more than 2 decades, breast cancer incidence among IVF patients (standardized incidence ratio [SIR] 1.01, 95% CI 0.93-1.09) was comparable with that in a subfertile non-IVF comparison group (SIR 1.00, 95% CI 0.88-1.15), reported Alexandria W. van den Belt-Dusebout, PhD, of the Netherlands Cancer Institute in Amsterdam, and colleagues.

The difference in cumulative incidence of breast cancer at age 55 between the two groups was also nonsignificant (3.0% for the IVF group vs 2.9% for the non-IVF group, P=0.85), the authors wrote in the Journal of the American Medical Association. The group also noted that, in some IVF patients, the risk seemed to be lower.
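A standardized incidence ratio of this kind is simply the number of observed cancers divided by the number expected if general-population rates had applied to the cohort’s person-years. The sketch below makes that arithmetic concrete; the rates and person-years are invented for illustration and are not the OMEGA study’s figures.

```python
# Toy SIR calculation: observed cases vs. cases expected from population rates.
# Rates and person-years below are invented for illustration only.
population_rate_per_100k = {"30-39": 60.0, "40-49": 160.0, "50-59": 260.0}   # incidence per age band
cohort_person_years      = {"30-39": 150_000, "40-49": 220_000, "50-59": 130_000}
observed_cases = 840

expected_cases = sum(
    population_rate_per_100k[age] / 100_000 * cohort_person_years[age]
    for age in population_rate_per_100k
)
sir = observed_cases / expected_cases
print(f"expected = {expected_cases:.0f}, observed = {observed_cases}, SIR = {sir:.2f}")
# An SIR near 1.0 means the cohort developed about as many cancers as expected;
# the study's reported confidence intervals quantify the uncertainty around that ratio.
```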

In an email to MedPage Today, van den Belt-Dusebout said that because the use of IVF is relatively recent, long-term breast cancer risk was not yet known.

“Earlier studies [on IVF] that reported no increase of breast cancer based their conclusions on shorter follow-up and smaller numbers of breast cancers, whereas some studies reported increased risks in subgroups of IVF treated women,” she said. “Because of the conflicting results in the literature and methodological limitations of earlier studies, even in reviews and a meta-analysis, a large study with long follow-up was needed.”

This study included patients from a historical cohort examining subfertility (the OMEGA study) over a mean period of 21.1 years following treatment. Overall, the authors looked at data from 25,108 women who underwent IVF treatment from 1983 to 1995. Women had a mean age of 32.8 at baseline and a mean number of 3.6 IVF cycles.

They found 839 cases of invasive breast cancer and 109 cases of in situ breast cancer during the follow-up period.

 They found no increase in the incidence of breast cancer after ≥20 years following treatment in either group (SIR 0.92, 95% CI 0.73-1.15 in IVF group vs SIR 1.03, 95% CI 0.82-1.29 in non-IVF group).

The authors reported that IVF treatment appeared to reduce a woman’s risk of breast cancer in some instances; specifically, a higher number of IVF cycles (seven or more) was associated with a significantly decreased risk (HR 0.55, 95% CI 0.39-0.77) compared with one to two cycles (P=0.001 for trend).

“The finding that more IVF cycles was associated with further decreases in risk is intriguing,” said Nanette Santoro, MD, of University of Colorado School of Medicine in Aurora, in an email to MedPage Today. “Although the trend was not large, it implies that IVF cycles, which may expose women to high levels of both estrogen and progesterone, may actually be protective in a manner similar to how pregnancy is protective against breast cancer.”

Santoro, who was not involved with the research, added that the findings of the overall study support the widely held belief that IVF is a short exposure that should not have a major impact on a woman’s breast cancer risk.

Van den Belt-Dusebout said she was surprised both by the reduced risk associated with an increasing number of IVF cycles and by the reduced risk among women who responded poorly to their first IVF cycle (<4 collected oocytes versus >4 collected oocytes, HR 0.77, 95% CI 0.61-0.96).

 “This can be explained by the fact that these women, like women who respond poorly to their first ovarian stimulation, more often reach menopause at an early age,” she said. “An early age at menopause lowers the risk of developing breast cancer.”

However, not all IVF patients experienced a decline in breast cancer risk. Parous women (those who had previously given birth) had a significantly increased risk of breast cancer compared with nulliparous women (HR 1.35, 95% CI 1.16-1.73). Women who were age 40 or older at first birth had a more than two-fold increased risk of breast cancer compared with women younger than age 25 (HR 2.52, 95% CI 1.71-3.73).

Study limitations included more missing data in the non-IVF group (33% versus 16% in the IVF group). In addition, age at menopause and menopausal status were unknown for most women, because most women were not postmenopausal at the questionnaire completion and the study was based on IVF treatment protocols used until 1995.

Van den Belt-Dusebout told MedPage Today that the team plans to address these issues, including further study of the OMEGA cohort to identify postmenopausal breast cancer risk, as well as a separate study examining women undergoing IVF treatment from 1995 to 2001.

“It is not very likely that more recently treated women would have an increased risk of breast cancer, because these protocols are more like the natural menstrual cycle … the downregulation phase is shorter and the stimulation is milder than in the earlier periods,” she said.

Kidney Failure a Possible Risk of Prostate Cancer Hormone Treatment.


Hormone therapy for prostate cancer may dramatically increase a man’s risk of kidney failure, according to a new study.

Use of androgen deprivation therapy was tied to a 2.5-fold increase in a man’s chances of suffering acute kidney injury, Canadian researchers found in a review of more than 10,000 men receiving treatment for early-stage prostate cancer.

The study appears in the July 17 issue of the Journal of the American Medical Association.

Androgen deprivation therapy uses medication or surgery to reduce the amount of male hormones in a man’s body, which can then cause prostate cancer cells to shrink or grow more slowly.

It is a therapy usually reserved for advanced cases of prostate cancer, said study co-author Laurent Azoulay, a pharmacoepidemiologist at Jewish General Hospital‘s Lady Davis Institute, in Montreal. Previous research already has linked androgen deprivation therapy to a possible increased risk of heart attack.

These new findings tying hormone therapy to acute kidney injury — a rapid loss of kidney function with a 50 percent mortality rate — should prompt doctors to think twice before using androgen deprivation therapy to treat prostate cancer patients at little risk of dying from the disease, said Azoulay, also an assistant professor in McGill University‘s department of oncology.

“There is a big debate over who should receive androgen deprivation therapy, and the timing of use,” he said. “In patients whose prostate cancer has spread, the benefits outweigh the risk, but now there’s this jump to using [androgen deprivation therapy] in patients who would not typically die from prostate cancer. In that subgroup of patients, the risks might outweigh the benefit.”

Dr. Durado Brooks, director of prostate and colorectal cancers for the American Cancer Society, called the Canadian study “intriguing.”

“They did find what would appear to be a fairly strong association between androgen deprivation treatment and acute kidney injury,” Brooks said. “This is something that men and their clinicians need to be aware of and watching out for if they choose to go with androgen deprivation therapy as part of their treatment plan for prostate cancer.”

However, Brooks also noted that the study relied on past medical data and did not involve current prostate cancer patients compared against a control group.

“These results are suggestive that an association may exist, but they are not definitive,” Brooks said. “There will need to be other research looking at this.”

For the new study, the research team identified 10,250 men who had been diagnosed with nonmetastatic (not spreading) prostate cancer between 1997 and 2008, using patient data maintained by the United Kingdom. Researchers then tracked whether each patient had been hospitalized with acute kidney injury, and whether their kidney failure occurred during or after the hormone treatment.

Prostate cancer patients who received androgen deprivation therapy were 2.5 times more likely to suffer kidney failure, the study found. Their risk of acute kidney injury particularly increased if they received a combined androgen blockade, a therapy that uses different hormone-suppression methods to drastically decrease male and female hormone levels in the body.

Both male and female hormones play a large role in kidney function, Azoulay said, which could explain why androgen deprivation therapy can cause such drastic damage to the organ.

“Testosterone and estrogen have been shown to play an important role in renal [kidney] function,” he said. “It seems that testosterone has vessel-dilating effects, and estrogen has a protective effect against renal injury.”

Source: http://healthyliving.msn.com

 

Statins and Musculoskeletal Conditions, Arthropathies, and Injuries.


ABSTRACT

Importance  Statin use may be associated with increased musculoskeletal adverse events, especially in physically active individuals.

Objective  To determine whether statin use is associated with musculoskeletal conditions, including arthropathy and injury, in a military health care system.

Design  A retrospective cohort study with propensity score matching.

Setting  San Antonio Military Multi-Market.

Participants  Tricare Prime/Plus beneficiaries evaluated from October 1, 2003, to March 1, 2010.

Interventions  Statin use during fiscal year 2005. On the basis of medication fills, patients were divided into 2 groups: statin users (received a statin for at least 90 days) and nonusers (never received a statin throughout the study period).

Main Outcomes and Measures  Using patients’ baseline characteristics, we generated a propensity score that was used to match statin users and nonusers; odds ratios (ORs) were determined for each outcome measure. Secondary analyses determined adjusted ORs for all patients who met study criteria and a subgroup of patients with no comorbidities identified using the Charlson Comorbidity Index. Sensitivity analysis further determined adjusted ORs for a subgroup of patients with no musculoskeletal diseases at baseline and a subgroup of patients who continued statin therapy for 2 years or more. The occurrence of musculoskeletal conditions was determined using prespecified groups of International Classification of Diseases, Ninth Revision, Clinical Modification codes: Msk1, all musculoskeletal diseases; Msk1a, arthropathies and related diseases; Msk1b, injury-related diseases (dislocation, sprain, strain); and Msk2, drug-associated musculoskeletal pain.

Results  A total of 46 249 individuals met study criteria (13 626 statin users and 32 623 nonusers). Of these, we propensity score–matched 6967 statin users with 6967 nonusers. Among matched pairs, statin users had a higher OR for Msk1 (OR, 1.19; 95% CI, 1.08-1.30), Msk1b (1.13; 1.05-1.21), and Msk2 (1.09; 1.02-1.18); the OR for Msk1a was 1.07 (0.99-1.16; P = .07). Secondary and sensitivity analyses revealed higher adjusted ORs for statin users in all outcome groups.

Conclusions and Relevance  Musculoskeletal conditions, arthropathies, injuries, and pain are more common among statin users than among similar nonusers. The full spectrum of statins’ musculoskeletal adverse events may not be fully explored, and further studies are warranted, especially in physically active individuals.
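To make the matching step described in the abstract concrete, here is a minimal propensity-score-matching sketch on simulated data. The covariates, model, and matching rule are illustrative assumptions, not the study’s actual specification.

```python
# Illustrative propensity-score matching sketch with simulated data.
# Not the study's data or exact method; shown only to make the design concrete.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 5000
age = rng.normal(55, 10, n)
comorbidity = rng.integers(0, 2, n)
# Probability of being a statin user depends on baseline characteristics
p_statin = 1 / (1 + np.exp(-(-6 + 0.08 * age + 0.7 * comorbidity)))
statin = rng.binomial(1, p_statin)
df = pd.DataFrame({"age": age, "comorbidity": comorbidity, "statin": statin})

# 1. Estimate each patient's propensity to receive a statin from baseline covariates
ps_model = LogisticRegression().fit(df[["age", "comorbidity"]], df["statin"])
df["pscore"] = ps_model.predict_proba(df[["age", "comorbidity"]])[:, 1]

# 2. Match each statin user to the nonuser with the nearest propensity score
users, nonusers = df[df.statin == 1], df[df.statin == 0]
nn = NearestNeighbors(n_neighbors=1).fit(nonusers[["pscore"]])
_, idx = nn.kneighbors(users[["pscore"]])
matched = pd.concat([users, nonusers.iloc[idx.ravel()]])

# 3. Outcomes would then be compared within the matched pairs (e.g., via an odds ratio)
print(matched.groupby("statin")[["age", "comorbidity"]].mean())  # check covariate balance
```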

Source: JAMA

Oxygen Saturation and Outcomes in Preterm Infants.


BACKGROUND

The clinically appropriate range for oxygen saturation in preterm infants is unknown. Previous studies have shown that infants had reduced rates of retinopathy of prematurity when lower targets of oxygen saturation were used.

METHODS

In three international randomized, controlled trials, we evaluated the effects of targeting an oxygen saturation of 85 to 89%, as compared with a range of 91 to 95%, on disability-free survival at 2 years in infants born before 28 weeks’ gestation. Halfway through the trials, the oximeter-calibration algorithm was revised. Recruitment was stopped early when an interim analysis showed an increased rate of death at 36 weeks in the group with a lower oxygen saturation. We analyzed pooled data from patients and now report hospital-discharge outcomes.

RESULTS

A total of 2448 infants were recruited. Among the 1187 infants whose treatment used the revised oximeter-calibration algorithm, the rate of death was significantly higher in the lower-target group than in the higher-target group (23.1% vs. 15.9%; relative risk in the lower-target group, 1.45; 95% confidence interval [CI], 1.15 to 1.84; P=0.002). There was heterogeneity for mortality between the original algorithm and the revised algorithm (P=0.006) but not for other outcomes. In all 2448 infants, those in the lower-target group for oxygen saturation had a reduced rate of retinopathy of prematurity (10.6% vs. 13.5%; relative risk, 0.79; 95% CI, 0.63 to 1.00; P=0.045) and an increased rate of necrotizing enterocolitis (10.4% vs. 8.0%; relative risk, 1.31; 95% CI, 1.02 to 1.68; P=0.04). There were no significant between-group differences in rates of other outcomes or adverse events.

CONCLUSIONS

Targeting an oxygen saturation below 90% with the use of current oximeters in extremely preterm infants was associated with an increased risk of death.
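The headline relative risk follows directly from the two reported death rates. As a back-of-the-envelope check, the sketch below reproduces it along with an approximate confidence interval; the arm sizes are taken as a roughly even split of the 1187 infants, which is an assumption made only for illustration.

```python
# Back-of-the-envelope relative-risk check using the reported death rates.
# Arm sizes are assumed (~even split of 1187 infants) purely for illustration.
import math

n_low, n_high = 594, 593          # assumed arm sizes
p_low, p_high = 0.231, 0.159      # reported death rates at 36 weeks

rr = p_low / p_high
print(f"relative risk = {rr:.2f}  (reported: 1.45)")

# Approximate 95% CI on the log scale (Wald interval); the trial's exact
# interval may differ slightly depending on the method the investigators used.
deaths_low, deaths_high = round(p_low * n_low), round(p_high * n_high)
se_log_rr = math.sqrt((1 - p_low) / deaths_low + (1 - p_high) / deaths_high)
lo, hi = rr * math.exp(-1.96 * se_log_rr), rr * math.exp(1.96 * se_log_rr)
print(f"approx. 95% CI: {lo:.2f} to {hi:.2f}")
```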

Source: NEJM