Obesity Paradox in Heart Disease Challenged by New Analysis


A new analysis of long-term follow-up data from 10 population-based cohorts challenges the so-called obesity paradox — the counterintuitive finding from previous studies that patients with heart disease may live longer if they are overweight or obese.

“Our research differs from previous studies in that we have included a lifespan perspective — we started to follow people before they developed heart disease,” lead author Sadiya Khan, MD, of Northwestern University Feinberg School of Medicine, Chicago, Illinois, told Medscape Medical News.

“We found that obese people live shorter lives, and while overweight people had similar length of life to those of a normal weight, they developed cardiovascular disease earlier than people with normal BMI.”

She explained that previous studies have suggested that obesity may in some way be associated with lower mortality in cardiovascular disease (CVD), known as the obesity paradox. But these studies included patients who already had CVD at the time of the study, and this can introduce many biases, particularly the issue that obese patients may be diagnosed with heart disease earlier and so may appear to live with heart disease for a longer time.

“Our results provide a different context by following people before the onset of cardiovascular disease, which should therefore eliminate this ‘lead time bias’,” Khan said. “We found that obese people develop cardiovascular disease at a younger age and so they have more years with heart disease but in the context of a shorter lifespan.”

The paper was published online in JAMA Cardiology on February 28.

In the paper, the researchers state: “Taking a life course perspective, we observe that the obesity paradox…appears largely to be caused by earlier diagnosis of CVD. Study of inception cohorts of people at the time of cardiovascular diagnosis would not detect this finding, leading to unclear messaging about the true risks of being overweight.”

They say this “false reassurance” is akin to the phenomenon of lead-time bias observed in other situations, such as with cancer screening.

Commenting on the paper for Medscape Medical News, Naveed Sattar, MD, professor of cardiovascular and medical sciences at the University of Glasgow, United Kingdom, said that the problem with previous studies is that “not all heart disease is equal.”

“You can be a thin smoker or an obese nonsmoker,” Sattar said. “There are many different confounders. When you take a population with any chronic condition you see a different picture which may not tell the truth. We also have to consider when the patient developed heart disease. By starting to follow people before they developed heart disease, these researchers have removed one of the biggest confounders.”

Other strengths of these data are a large population, a wide age range, a long follow-up, and robust statistical methods to overcome other confounders, he added. “And when looking at this cleaner picture, we see clearly that lower BMI is better.”

“Better-quality studies such as this, which have longer-term follow-up and do not include people with disease at baseline, tend to find that lowest risks for bad outcomes are in the leaner people,” Sattar said. “We are understanding the complexities of these studies better now and there is more evidence that lowest risks for heart disease or death are in normal weight folk, not those who are overweight or obese.”

The study analyzed individual-level data from 190,672 in-person examinations across 10 large prospective cohorts with an aggregate of 3.2 million years of follow-up. All of the participants were free of CVD at baseline and had objectively measured height and weight to assess BMI.

Results showed that compared with individuals with a normal BMI (defined as a BMI of 18.5 to 24.9), lifetime risks for incident CVD were higher in people in the overweight (BMI, 25.0 to 29.9) and obese (BMI, 30.0 to 39.9) groups.

Compared with normal weight, overweight middle-aged men had a hazard ratio for incident CVD of 1.21; for overweight women, the hazard ratio was 1.32.

Obese men had a hazard ratio for CVD of 1.67, and the corresponding figure for obese women was 1.85.

The hazard ratios for the morbidly obese (BMI of 40.0 or greater) were 3.14 for men and 2.53 for women. All these results were statistically significant.

The researchers found that of the CVD subtypes, heart failure (HF) showed the strongest association with BMI category, with a fivefold increase in incident HF in middle-aged men with morbid obesity, which they say “has particularly important implications for focusing on weight management strategies for HF prevention.”

In terms of lifespan, normal-weight middle-aged men lived 1.9 years longer than obese men and 6 years longer than those who were morbidly obese. Normal-weight men had longevity similar to that of overweight men.

Normal-weight middle-aged women lived 1.4 years longer than overweight women, 3.4 years longer than obese women, and 6 years longer than morbidly obese women.

The researchers point out that “our findings suggest that earlier occurrence of CVD in those with obesity is most strongly associated with a greater proportion of life lived with CVD and shorter overall survival in adults aged 20 to 59 years at baseline.”

However, they note that the association of obesity with mortality may change at older ages, which may explain why some earlier studies in older individuals showed no difference in total life expectancy in older men and women with obesity.

“Our results provide critical perspective on the cardiovascular disease burden associated with overweight, highlight unhealthy years lived with increased cardiovascular morbidity, and challenge the prevalent view that overweight is associated with greater longevity compared with normal BMI,” they conclude.

Ultrasound First, Not CT, for Women’s Pelvic Pain


Using ultrasound to diagnose pelvic pain can reveal unexpected pathologies, according to new research presented at the American Institute of Ultrasound in Medicine 2015 Annual Convention in Lake Buena Vista, Florida.

One of the most interesting findings was that transvaginal ultrasound is useful in diagnosing appendicitis, said Daniel Ohngemach, a fourth-year student at the Hofstra North Shore–LIJ School of Medicine in Hempstead, New York.

“You could see it beautifully. In the follow-up transabdominal scan, the images were not nearly as good and probably wouldn’t have been considered diagnostic for appendicitis,” he told Medscape Medical News.

“It’s a modality people wouldn’t usually run to for appendicitis, but you can see things that might otherwise be obscured by bowel gas,” he explained.

His team searched the radiology department database at North Shore–LIJ to identify women who reported pelvic pain over a 3-year period. They found that ultrasounds performed in the emergency department or a hospital unit led to diagnoses doctors might not have predicted, including appendicitis, diverticulitis, colitis, tumors, small bowel obstruction, inflammatory bowel disease, pelvic inflammatory disease, fallopian tube torsion, endometriosis, and hernia.

Even when it is not diagnostic, an ultrasound, in combination with other tests, can lead to answers doctors don’t expect, Ohngemach said.

The thrust of the project was to make doctors aware that, beyond the many conditions for which ultrasound is considered diagnostic, the technique can reveal other conditions as well, which could help expedite patient care, he explained.

Ultrasound First

When nonpregnant women report pelvic pain, ultrasound rather than CT or MRI should be the first imaging choice, write Beryl Benacerraf, MD, from Brigham and Women’s Hospital in Boston, and colleagues in a report published in the April issue of the American Journal of Obstetrics and Gynecology (2015;212:450-455).

Ultrasound doesn’t expose the patient to radiation, is at least as reliable as CT, and can cost roughly one quarter as much, they report.

“Yet still today, many women with pelvic pain, masses, or flank pain first undergo CT scans and those with Müllerian duct anomalies typically have MRIs. Not uncommonly, CT or MRI of the pelvis yields indeterminate and confusing findings that then require clarification by ultrasound imaging,” they explain.

Ultrasound has come a long way since the “black dots on a white screen,” they point out. “Currently available 3D/4D volume ultrasound imaging can produce images of the female pelvis of comparable quality and orientation to those of MRI and CT, but without radiation and at relatively lower cost.”

It’s a rare radiologist who would choose CT first for pelvic pain, said Theodore Dubinsky, MD, from the University of Washington School of Medicine in Seattle.

“The only circumstance where they might do a CT first is for appendicitis, or maybe a kidney stone, but the vast majority of the time, ultrasound is first,” he told Medscape Medical News. For those two exceptions, CT might have a slightly higher sensitivity, but it is still valid to do the ultrasound first and then CT if the diagnosis is still unclear.

With ultrasound, “a lot of the joy of it is to be able to see things that CT people think we won’t be able to see,” said Dr Dubinsky.

CTs are often done first in emergency settings, acknowledged Maitray Patel, MD, from the Mayo Clinic in Scottsdale, Arizona. However, that might be less about best practices and more about available resources, he explained.

Most hospitals are set up so that CTs are easier to order than ultrasounds, he told Medscape Medical News. Some hospitals, including the Mayo in Arizona, don’t have 24/7 availability for ultrasounds. Someone can be called in to do one, but that adds to the wait time, especially in the middle of the night.

“CT scans are generally staffed 24/7,” Dr Patel said. Doctors in the emergency department “want to do what’s right for the patient, but they also want to do it as expeditiously as possible.”

He added that for women who are obese, it’s very difficult to do an ultrasound of the appendix. Ultrasound is also very operator-dependent; if the person available to do the ultrasound lacks the experience you’re looking for, a CT might be a better choice, he said.

Peripheral Neuropathy Treatment Taps Into Quantum Theory


A novel approach that combines injections of local anesthetic with electronic signal treatment appears to improve pain control in peripheral neuropathy, a condition that is notoriously difficult to treat.

Lead author Peter M. Carney, MD, a neurosurgeon based in Elkhart, Indiana, says the approach is based on the principles of quantum theory.

“What is unique about this is it doesn’t use pharmacology, it uses the principles of physics, and it follows the theory of Nobel Laureate and quantum theory founder Erwin Schrödinger,” he told Medscape Medical News. Schrödinger first proposed in 1943 that “living matter at the cellular level can be thought of in terms of quantum mechanics — pure physics and chemistry.”

“If we understand that you can change how living cells function by changing their mechanics at a cellular level, you can heal them, and that is what this does,” Dr. Carney explained.

The approach, called combined electrochemical therapy (CET), involves the injection of 1 or 2 mL of 0.5% bupivacaine just above the ankle to the 5 nerves of the foot, followed immediately by application of the electronic signaling technique.

The electronic signaling delivers complex “specific parameters of electroanalgesia employing both varied amplitudes and frequencies of electronic signals,” he reports.

He presented preliminary findings with the approach here at the American Academy of Pain Management (AAPM) 25th Annual Clinical Meeting.

Combined Electrochemical Therapy

For the study, 98 patients with peripheral neuropathy were treated with CET up to 2 times per week. The treatments last 15 to 30 minutes.

After an average of 17.6 treatments, the patients showed an average reduction in visual analogue scale (VAS) score of 3.7 points, while their average peripheral neuropathy function index (PNFI) improvement was 49.4%.

As many as 82% of patients reduced their VAS score by at least 30%, and 63% had a reduction in pain of at least 50%.

Four patients stopped after just 4 treatments because their VAS score was reduced by 50% to 100%.

Dr. Carney said 2 patients had minor adverse effects, including a patient who developed a blister at the site of an electrode and a 91-year-old patient who felt faint while receiving an injection.

He noted that, unlike other studies that treated only diabetic peripheral neuropathy, the current study treated patients with 5 different diagnostic types, including chemotherapy-induced peripheral neuropathy, idiopathic peripheral neuropathy, traumatic neuropathy, and other mixed neuropathies.

The results well exceed those typically seen with pregabalin treatment, Dr. Carney said. Compared with findings from a previous report of pregabalin therapy, CET in this study decreased the average VAS score 54% more than pregabalin did (P = .00006) and was 62% more effective than pregabalin at reducing the average pain score by 50% (P = .003).

Furthermore, adverse effects of pregabalin can be significant, with a randomized controlled trial showing at least 38% of patients having one or more adverse effects.

“These results suggest CET therefore has at least 95% fewer adverse events than pregabalin,” Dr. Carney writes.

Dr. Carney gave the details of a patient who had success with the therapy. Before her treatments, the 34-year-old woman with ovarian dysgerminoma and chemotherapy-induced peripheral neuropathy described her pain as “On a daily basis I feel like I’m walking on glass.”

After 17 sessions with CET, the woman’s VAS score dropped from 7 to just 1.5 (a 79% decrease) and her PNFI decreased from 44 to 0. “The patient was able to run a marathon 3 months after CET,” Dr. Carney reported.

Dr. Carney noted that in a pilot group of 10 patients, biopsies performed 2 to 4 months after CET showed that 7 of the 10 patients had regrown their nerves by an average of 81%. “That is impressive,” he said.

He underscored that the study was not a randomized controlled trial, but he called on major organizations to continue the research with rigorous, double-blinded trials.

“If these results are replicated, then millions of people with neuropathy who are in agony may be able to lead fuller lives,” he said.

The study was recognized as one of the meeting’s “Best Posters.”

Tom Watson, DPT, a physical therapist based in Bend, Oregon, and chair of the AAPM’s Education Committee, who moderated the session, commented that the study offered some intriguing insights.

“We really don’t fully understand exactly what peripheral neuropathy is, but if you use a technique to apply the anesthetic directly into the nerve electrically, that is intriguing,” he told Medscape Medical News.

“It could be that they are in some way altering the nerves. We don’t know, and we do need long-term follow-up studies to prove the efficacy, but I think it’s a great new theory and the results are promising.”

Breast Radiotherapy Not Linked to Cardiac Conduction Problems


Women who undergo breast cancer radiotherapy do not face an increased risk for subsequent heart conduction problems requiring a pacemaker as a result of their treatment, according to new research reported here at the European Society for Radiotherapy and Oncology (ESTRO) 3rd Forum.

“If a woman receives a pacemaker after breast cancer we can assure her it was not due to her therapy,” said study investigator Jens Christian Rehammar, MD, from Odense University Hospital in Denmark.

“Our study is quite unique, as there are no corresponding large studies in breast cancer patients only — previous studies are often mixed with patients with Hodgkin’s lymphoma,” he told Medscape Medical News.

The findings are good news considering recent evidence that an increase in other types of cardiac problems is associated with breast radiotherapy, Dr Rehammar said. He was referring to a 2013 study that found that for every additional gray of radiation to the heart there is a 7.4% increase in major coronary events (P < .001), including myocardial infarction, the need for coronary revascularization, and death from ischemic heart disease (N Engl J Med. 2013;368:987-998).

Analysis of Danish Data

Dr Rehammar’s study merged data from the Danish Breast Cancer Collaborative Group with data from the Danish Pacemaker and ICD Registry to compare women who did and did not receive breast radiotherapy and their likelihood of needing a subsequent pacemaker.

A total of 44,704 women were included in the analysis.

The study found that among 18,308 breast cancer patients who received radiotherapy, 179 (0.98%) received a pacemaker, 90 of whom had been treated for left-sided and 89 for right-sided breast cancers.

For comparison, 1.54% of the nonradiotherapy patients received a pacemaker.

After adjusting for year of treatment, age, and time since breast cancer diagnosis, the risk for cardiac conduction problems requiring a pacemaker was not significantly different between the two groups (relative risk, 1.06; P = .71), he said.

Asked by Medscape Medical News to comment on the findings, Ben Smith, MD, assistant professor of radiation oncology at The University of Texas MD Anderson Cancer Center in Houston, said, “Conduction problems have not really been studied before in breast patients, to my knowledge. I would have predicted this would be a negative study, but it is nice to have data to confirm that.”

Dr Smith, who wrote a recent review of cardiac effects of breast radiotherapy, added, “This study provides reassurance that the typical doses of radiation received by the heart during this era in Denmark did not produce clinically severe damage to the heart’s conduction system. When coupled with prior studies, this body of work suggests that the primary mechanism for cardiac damage following breast cancer radiation is vascular in origin. The conduction apparatus in the heart is far away from the breast, whereas the blood vessels that feed the heart are closer to the breast.”

MicroRNA Potential Biomarker for Esophagitis


In patients with eosinophilic esophagitis, microRNAs in the saliva might one day be used to diagnose and manage the disease, according to a pilot study.

“The technology we used to measure miRNAs is readily available and already inexpensive,” said senior investigator Faoud Ishmael, MD, from Pennsylvania State University College of Medicine in Hershey.

“It’s possible that a commercial assay could be available within the next 5 years or so,” he told Medscape Medical News.

The pilot study, presented here at the American Academy of Allergy, Asthma & Immunology 2015, is part of the ongoing search for noninvasive biomarkers of eosinophilic esophagitis, especially in children.

“The problem with eosinophilic esophagitis is that the only way to diagnose and monitor response is to perform invasive endoscopies to biopsy the esophagus, which can be difficult and possibly dangerous,” said Dr Ishmael.

But miRNAs, which are made by most cells in the body and are secreted into saliva and other body fluids, could serve as a “fingerprint” of the disease, he suggested.

“They are important regulators of inflammation, and their expression is altered in diseases such as eosinophilic esophagitis. About 2000 miRNAs have been discovered in humans so far, but only about 100 to 150 are easily detectable in biofluids, and the pattern of expression of these 100 or so varies across many diseases,” he explained.

Detectable in Biofluids

The study involved 15 adults with eosinophilic esophagitis. Each provided saliva samples at the time of diagnostic esophageal biopsy and after 2 months of swallowed fluticasone therapy.

Expression profiles from untreated eosinophilic esophagitis patients were compared with those from healthy control subjects. There was a significant relation between eosinophil count and the expression of miR-3613-3p (P = .005), but not the expression of miR-4668 (P = .06) or miR-570-3p, reported investigator Theodore Kelbel, MD, from the Penn State Hershey Medical Center.

After treatment, there was a significant reduction in the expression of miR-3613-3p and miR-4668 (P < .05) in eosinophilic esophagitis patients, but the expression of miR-570-3p did not change.

The expression of miR-3613-3p and miR-4668, however, did not reach the levels seen in the control group.

“Perhaps this is just a function of patients only being 2 months on therapy. If they’d been on therapy longer, maybe levels would have continued to go down,” said Dr Kelbel.

“We are now performing these studies in larger numbers of patients to confirm that our findings are valid and clinically useful,” Dr Ishmael reported.

The investigators found that miRNAs might be involved in asthma, and might have similar diagnostic potential in that disease. Dr Ishmael said they are hopeful that, in the future, “these miRNAs could be manipulated for a generation of new anti-inflammatory therapies.”

These findings suggest that “noninvasive diagnosis and/or eosinophilic esophagitis status monitoring may be possible in the near future,” said Ting Wen, PhD, from Cincinnati Children’s Hospital.

However, “whether this noninvasive approach will replace the current conventional method depends on the reproducibility and the to-be-proven specificity of the test,” he told Medscape Medical News. “There is still a long way to go.”

Dr Wen and his colleagues recently developed the eosinophilic esophagitis diagnostic panel, a molecular test that is currently used on biopsied tissue when the initial esophageal biopsy results are ambiguous.

A reliable noninvasive biomarker for the disease could “save patients the time, cost, and suffering of obtaining at least five esophageal biopsies,” he pointed out.

Other approaches currently being developed include blood miRNA, exhaled breath condensate miRNA, and esophageal lavage cytokine levels, Dr Wen reported. “All of these hold strong potential,” he said.

Intralymphatic Injections Safe for Grass Pollen Allergy


Intralymphatic injections of standard grass-pollen extract are well tolerated and could offer a new and convenient option for treating grass-pollen allergy, new research suggests.

The therapy involves ultrasound-guided injection into the right inguinal lymph node, and requires only three preseasonal doses, given 1 month apart, explained investigator Amber Patterson, MD, from Nationwide Children’s Hospital in Columbus, Ohio.

Intralymphatic immunotherapy is similar to subcutaneous immunotherapy “in that you do your course of treatment, but then you’re done,” without the monthly injections for 3 to 5 years, she explained.

Several European trials have demonstrated that intralymphatic injections are effective against grass, birch, and cat allergies.

“When I first read about this in Europe, I thought, ‘this is incredible, we need to be doing this here.’ It’s so nice to have options since a lot of people don’t do traditional allergy shots because it’s so inconvenient,” Dr Patterson told Medscape Medical News.

“The Europeans followed people 3 years out and saw sustained efficacy, so we’re banking on that,” she added.

Results from the study by Dr Patterson’s team were presented as a late-breaking poster here at the American Academy of Allergy, Asthma & Immunology 2015.

First Evidence

The double-blind trial involved 15 teenagers with grass-pollen-induced rhinoconjunctivitis, three of whom also reported mild intermittent asthma.

The patients were randomly assigned to receive three intralymphatic injections of either extract from North American grass pollen or placebo, administered at least 4 weeks apart and timed to be completed before the start of the grass-pollen season.

Escalating doses — 0.1 mL, 0.2 mL, and 0.5 mL — were administered with a 1.5-inch 25-gauge needle inserted into the subcapsular node space, Dr Patterson explained.

“The main aim was to show that we could take American extracts, which are standardized differently than European extracts, and that we could do this safely,” she said.

Patients were assessed for adverse events — such as erythema, pruritus, and edema — 2 and 5 hours after the injection and then 1 week later. There were no differences in safety scores between the treatment and placebo groups.

Of note, all the teenagers completed 100% of their injections, Dr Patterson reported. “When I told other allergists this, they were so surprised; you hardly ever see this kind of compliance.”

When she was asked about allergists’ access to ultrasound guidance for the injections, Dr Patterson explained that “we need to figure out the science first. Once that’s more established, we can figure out how to give the shots because it’s a pretty simple procedure.”

“It’s interesting,” said Hugh Sampson, MD, from Kravis Children’s Hospital at Mount Sinai in New York City, who was not involved in the study.

“It does seem safe and, if you can do it in three doses, you’re definitely going to get better compliance,” he told Medscape Medical News.

Shift From IV to Oral Sedation for Pain Procedures?


The transition from extensive use of intravenous (IV) sedation for interventional pain procedures to almost exclusive use of oral anxiolysis at an outpatient pain center led to significant improvements in patient satisfaction and recovery times, a new study shows.

But not all practitioners are on board with such a policy, citing potential safety issues.

“The shift from IV sedation to oral anxiolysis for procedures performed in an outpatient academic pain center has allowed successful interventional pain procedures to be completed with an increase in patient safety associated with patient awareness and feedback,” write the authors of the study, conducted at the San Antonio Military Medical Center, Fort Sam Houston, Texas.

For the study, presented here at the American Academy of Pain Medicine (AAPM) 31st Annual Meeting, first author Edward Lopez, MD, described a policy implemented over the past 2 years at the large academic military medical institution’s outpatient pain center: oral benzodiazepines, specifically diazepam, are used instead of IV sedation with agents such as opioids, propofol, or benzodiazepines before most invasive pain intervention procedures.

Under the policy, the proportion of such procedures performed with IV sedation declined from 44.4% to just 1.5% over the 2 years.

The authors note that IV sedation, particularly heavy sedation, can compromise safety by preventing the patient from providing feedback.

“You look at each patient differently and try to give what’s best for them, but for us, in general we find it’s safer to use an anxiolytic because the patient can give feedback and let you know if you’re going in the wrong direction or something,” Dr Lopez told Medscape Medical News.

The change in policy has resulted in improvements, such as reduced procedure time and patient recovery times, without reducing customer satisfaction, Dr Lopez said. Because the research is still preliminary, exact figures could not be provided, but Dr Lopez said the improvements are statistically significant.

He noted that the average dose of oral anxiolytics, specifically diazepam (Valium, Roche), is usually about 10 mg.

Dr Lopez stressed that the key to success with the approach is involving patients and keeping them informed about the process.

“I think a big part of this is doing the education up front, telling patients they’re going to get the medication and how it works, and explaining the procedure and the importance of letting us know if they feel tingling or something they think they shouldn’t be feeling,” he said.

“Making them a part of their healthcare makes a big difference.”

While the policy shift to oral anxiolysis at the San Antonio pain center was voluntary, pain specialists are increasingly finding themselves involuntarily having to take such measures as insurance companies back away from coverage of IV anesthesia for pain intervention procedures.

The shift is even prompting some pain clinics to take matters into their own hands when it comes to IV anesthesia, explained Shailen Jalali, MD, from the Greater Philadelphia Spine and Pain Center, Havertown, Pennsylvania.

“The reason [insurers] say they are no longer covering IV sedation is that it is safer to take the oral approach and IV sedation is usually unnecessary, but I frankly think the reality is they saw how much money can be saved by not having to pay for anesthesia,” he told Medscape Medical News.

Dr Jalali questions the suggestion that such procedures are safer without IV sedation.

“Sometimes it is in fact safer when the patient is comfortable and calm and not moving around and nervous,” he said. “If the patient is sedated, you can do the procedure quickly and efficiently and it can otherwise not be as safe.”

Dr Jalali said his center’s anesthesia group has gone so far as to launch a pilot program to help cover the costs of anesthesia for certain patients who need pain management procedures but don’t have coverage for them.

“In our surgical center, we’re taking a small amount of the facility fee to help pay for these services for the pilot program,” he said.

“The patients are given three options: They can just get local anesthesia; they can have conscious sedation; or they can have anesthesia and will only have to pay a certain amount to receive it,” he said.

With so many insurers declining coverage, the group can only extend the offer to help pay to a limited number of patients, Dr Jalali noted.

“We are giving it a try — we’ll see how it works and then we’ll regroup. Obviously it won’t be a money maker, but we think it will help patients.”

Samer Narouze, MD, chairman of the Center for Pain Medicine at Western Reserve Hospital, Cuyahoga Falls, Ohio, also expressed reservations about the idea of a policy using primarily oral sedation.

“I’ve mixed feelings about this abstract,” Dr Narouze told Medscape Medical News.

“I do both oral and IV sedation [and] my issue with oral sedation is that you cannot predict the timing of the sedation and the quality of sedation,” he said.

“One dose does not fit all; however, with careful IV sedation titration, you will have better control over the sedation.”

Dr Narouze added that, like Dr Jalali, he also sees challenges in terms of some insurers declining coverage but finds ways to work around it for the benefit and safety of the patient.

“I’m aware that some insurance carriers tend not to cover IV sedation, but still I provide this service to my patients — and eat the cost, as it is safer to have a calm, communicating patient during the procedure rather than an anxious, agitated, moving patient with a needle near delicate structures.”

Oral Regimen Temporarily Suppresses Resistant Gut Bacteria


An oral regimen of colistin and neomycin suppresses extended-spectrum beta-lactamase (ESBL)-producing bacteria in patients, but the effect disappears within 7 days, according to a new study.

There have been attempts to eradicate these resistant bacteria, but most previous research has been uncontrolled or used case studies. “This is the first really methodologically sound and well-controlled study in this area. We tried to improve the evidence base,” Stephan Harbarth, MD, associate professor of infectious diseases and infection control at Geneva University Hospitals in Switzerland, told Medscape Medical News.

This regimen could be used to combat outbreaks and prepare patients for surgery, said Dr. Harbarth, who presented findings here at the 23rd European Congress of Clinical Microbiology and Infectious Diseases.

In the single-center, double-blind, placebo-controlled trial, adults with a rectal swab that tested positive for ESBL-producing bacteria were randomized to 1 of 2 groups. A total of 27 patients received colistin (50 mg 4 times daily) and neomycin (250 mg 4 times daily) for 10 days, plus nitrofurantoin (100 mg 3 times daily) for 5 days if they had a positive urine culture at baseline; the other 27 patients received placebo.

The researchers conducted cultures (rectal, inguinal, urine) on day 6 of treatment, day 1 after the end of treatment, and day 7 after the end of treatment. The primary outcome was the detection of Enterobacteriaceae by rectal swab at 28 ± 7 days after the end of treatment. When primary outcome data were missing, they were imputed on the basis of the last observation.

There was no significant difference between the 2 groups at 28 ± 7 days after the end of treatment. However, there were differences during and soon after treatment.

Patients in the treatment group were more likely to experience liquid stool than those in the placebo group (25.9% vs 6.9%; P = .05). Inguinal Enterobacteriaceae colonization was present in 29 of 58 patients at baseline, whereas urinary colonization was present in 12 patients in the treatment group and 7 in the placebo group. There was no difference between the 2 groups for inguinal or urinary colonization at any point during the study.

In the treatment group, there was no significant change in minimum inhibitory concentration of colistin from baseline to 28 ± 7 days after the end of treatment.

Although the primary end point was not met and the regimen can’t be used for broad-scale control of infection, “it’s an infection-control measure that could be useful for suppression of this multiresistant carriage in the gut flora. In certain situations, like epidemics or prior to high-risk surgery, this could be a valuable intervention,” said Dr. Harbarth.

He pointed out that modifications to the regimen might be more successful. Different agents that are more active in the colon might improve outcome, probiotics could be used, and drug dosages could be refined. It could also be that one of the drugs used — neomycin — was underdosed in the study, Dr. Harbarth added.

“The results were interesting, but sadly disappointing,” session moderator Hilary Humphreys, MD, professor of clinical microbiology at the Royal College of Surgeons in Dublin, Ireland, told Medscape Medical News. This study confirms that “it’s very difficult with a temporary suppression regimen to see permanent eradication in the gastrointestinal tract, because there are huge numbers of bacteria and the physiology and the environment where those bacteria exist is very complex.”

Dr. Humphreys said he agrees that the regimen could be applied to outbreaks and during preparation for major surgery. “I think those are probably its major uses.”

Source: medscape.com