Iodine deficiency during pregnancy linked to lower IQ in offspring.


Iodine deficiency during pregnancy may be associated with adverse neurocognitive outcomes in offspring, according to findings by researchers in the United Kingdom published in The Lancet.

“Pregnant women and those planning a pregnancy should ensure adequate iodine intake; good dietary sources are milk, dairy products and fish. Women who avoid these foods and are seeking alternative iodine sources can consult the iodine fact sheet that we have developed, which is available on the websites of the University of Surrey and the British Dietetic Association. Kelp supplements should be avoided, as they may have excessive levels of iodine,” Sarah C. Bath, PhD, RD, of the department of nutritional sciences at the University of Surrey, said in a press release.

Bath and colleagues measured urinary iodine concentrations in stored samples from 1,040 first-trimester pregnant women and assessed IQ in the offspring at age 8 years and reading ability at age 9 years. The mother-child pairs were drawn from the Avon Longitudinal Study of Parents and Children (ALSPAC).

The researchers classified the cohort as mildly to moderately iodine deficient on the basis of a median urinary iodine concentration of 91.1 mcg/L (interquartile range [IQR], 53.8-143; iodine-to-creatinine ratio, 110 mcg/g; IQR, 74-170).

After adjusting for 21 socioeconomic, parental and child factors as confounders, data indicated that children of women with an iodine-to-creatinine ratio of less than 150 mcg/g were more likely to have scores in the lowest quartile for verbal IQ (OR=1.58; 95% CI, 1.09-2.30), reading accuracy (OR=1.69; 95% CI, 1.15-2.49) and reading comprehension (OR=1.54; 95% CI, 1.06-2.23) vs. those of mothers with ratios of at least 150 mcg/g. Furthermore, scores worsened progressively when the less than 150-mcg/g group was subdivided, researchers wrote.

In an accompanying commentary, Alex Stagnaro-Green, MD, MHPE, professor of medicine and obstetrics and gynecology at the George Washington University School of Medicine and Health Sciences, and Elizabeth N. Pearce, MD, associate professor of medicine at Boston University School of Medicine, wrote that this study, along with previous research, represents a call to action because of the documented link between iodine deficiency and poor neurocognitive outcomes.

“Absence of a public health policy in the face of clear documentation of moderate iodine deficiency and strong evidence of its deleterious effect on the neurodevelopment of children is ill advised,” they wrote. “Nor should unmonitored and adventitious dietary iodine sources continue to be relied on.”

Source: Endocrine Today

Effect of inadequate iodine status in UK pregnant women on cognitive outcomes in their children: results from the Avon Longitudinal Study of Parents and Children (ALSPAC)

Summary

Background

As a component of thyroid hormones, iodine is essential for fetal brain development. Although the UK has long been considered iodine replete, increasing evidence suggests that it might now be mildly iodine deficient. We assessed whether mild iodine deficiency during early pregnancy was associated with an adverse effect on child cognitive development.

Methods

We analysed mother-child pairs from the Avon Longitudinal Study of Parents and Children (ALSPAC) cohort by measuring urinary iodine concentration (and creatinine to correct for urine volume) in stored samples from 1040 first-trimester pregnant women. We selected women on the basis of a singleton pregnancy and availability of both a urine sample from the first trimester (defined as ≤13 weeks’ gestation; median 10 weeks [IQR 9–12]) and a measure of intelligence quotient (IQ) in the offspring at age 8 years. Women’s results for iodine-to-creatinine ratio were dichotomised to less than 150 μg/g or 150 μg/g or more on the basis of WHO criteria for iodine deficiency or sufficiency in pregnancy. We assessed the association between maternal iodine status and child IQ at age 8 years and reading ability at age 9 years. We included 21 socioeconomic, parental, and child factors as confounders.

Findings

The group was classified as having mild-to-moderate iodine deficiency on the basis of a median urinary iodine concentration of 91·1 μg/L (IQR 53·8–143; iodine-to-creatinine ratio 110 μg/g, IQR 74–170). After adjustment for confounders, children of women with an iodine-to-creatinine ratio of less than 150 μg/g were more likely to have scores in the lowest quartile for verbal IQ (odds ratio 1·58, 95% CI 1·09–2·30; p=0·02), reading accuracy (1·69, 1·15–2·49; p=0·007), and reading comprehension (1·54, 1·06–2·23; p=0·02) than were those of mothers with ratios of 150 μg/g or more. When the less than 150 μg/g group was subdivided, scores worsened progressively from 150 μg/g or more, to 50–150 μg/g, to less than 50 μg/g.
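For readers unfamiliar with the statistics, the adjusted odds ratios above come from logistic regression, but the basic calculation can be illustrated from a 2×2 table. The counts below are hypothetical, chosen only so that the crude odds ratio lands near the reported 1·58; the study's own estimates were adjusted for 21 confounders.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: children in the lowest verbal-IQ quartile
# among deficient (<150 ug/g) vs. sufficient mothers.
or_, lo, hi = odds_ratio_ci(120, 280, 80, 295)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The Wald interval is the textbook large-sample approximation; the confidence limits published in the paper come from the fitted regression model, not from a crude table like this one.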

Interpretation

Our results show the importance of adequate iodine status during early gestation and emphasise the risk that iodine deficiency can pose to the developing infant, even in a country classified as only mildly iodine deficient. Iodine deficiency in pregnant women in the UK should be treated as an important public health issue that needs attention.

Source: Lancet


Cherries May Prevent Gout Flares.


Patients with gout were less likely to report acute attacks following 2-day periods of cherry or cherry extract intake than following periods with no cherry intake, according to data reported in Arthritis & Rheumatism by Yuqing Zhang, DSci, and colleagues from Boston University School of Medicine in Massachusetts.

Dr. Zhang, who is professor of medicine and epidemiology at Boston University School of Medicine, told Medscape Medical News that cherry intake during a 2-day period was associated with a 35% lower risk for gout attacks and that cherry extract intake was associated with a 45% lower risk.

Risk for gout attacks was reduced by 75% when cherry intake was combined with allopurinol use. Dr. Zhang said, “We found that if subjects took allopurinol alone, it reduced the risk of gout attack by 53%; if subjects took cherry alone, it reduced the risk by 32%; if they took both, the risk of gout attack was reduced by 75%.”
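The percentage risk reductions quoted above are simply the complement of the corresponding odds ratios. A minimal sketch, with odds ratios back-calculated from the reported figures (illustrative only; these are not the paper's actual point estimates, which came from conditional logistic regression):

```python
def risk_reduction_pct(odds_ratio):
    """Percent lower risk implied by an odds ratio below 1
    (a reasonable reading when the outcome is uncommon)."""
    return (1 - odds_ratio) * 100

# Odds ratios back-calculated from the reported risk reductions.
for label, or_ in [("cherry intake", 0.65),
                   ("cherry extract", 0.55),
                   ("cherries plus allopurinol", 0.25)]:
    print(f"{label}: OR {or_} -> {risk_reduction_pct(or_):.0f}% lower risk")
```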

These associations were discovered in a case-crossover study of 633 individuals with physician-diagnosed gout who were prospectively recruited and followed online for 1 year. When a participant reported a gout attack, the researchers asked about the onset date of the gout attack, symptoms and signs, medications, and potential risk factors (including daily intake of cherries and cherry extract) during the 2 days before the attack. Patients served as their own controls, so the same information was assessed for 2-day control periods not associated with gout attacks. A cherry serving was defined as one-half cup or 10 to 12 cherries.
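Because each patient serves as his or her own control, the crude case-crossover estimate reduces to a matched-pair (McNemar-type) odds ratio: attacks with cherry intake only in the 2-day hazard period, divided by attacks with intake only in the matched control period. A sketch with made-up discordant counts:

```python
def case_crossover_or(hazard_only, control_only):
    """Crude matched-pair odds ratio for a case-crossover design:
    hazard_only  = attacks with exposure only in the 2 days before onset
    control_only = attacks with exposure only in the control period"""
    return hazard_only / control_only

# Made-up counts, chosen to land near the reported 35% reduction;
# the study's published estimates were multivariable-adjusted.
or_ = case_crossover_or(78, 120)
print(f"Crude OR = {or_:.2f}")
```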

Participants had a mean age of 54 years; 88% were white and 78% were male. Among the participants, 35% ate fresh cherries, 2% consumed cherry extract, and 5% consumed both fresh cherries and cherry extract. Researchers documented 1247 gout attacks during the 1-year follow-up period, with 92% occurring in the joint at the base of the big toe.

Factors associated with increased serum uric acid levels, such as increased alcohol consumption and purine intake, or use of diuretics, were associated with increased risk for recurrent gout attacks.

“Our findings indicate that consuming cherries or cherry extract lowers the risk of gout attack,” Dr. Zhang said in a press release. “The gout flare risk continued to decrease with increasing cherry consumption, up to three servings over two days.” Further cherry intake was not associated with additional benefit.

“However, the protective effect of cherry intake persisted after taking into account patients’ sex; body mass (obesity); purine intake; and use of alcohol, diuretics, and antigout medications,” according to the release.

The authors speculate that cherries may decrease serum uric acid levels by increasing glomerular filtration or reducing tubular reabsorption. They also note that cherries and cherry extract contain high levels of anthocyanins, which possess anti-inflammatory properties.

Dr. Zhang told Medscape Medical News, “While our study findings are promising, randomized clinical trials should be conducted to confirm whether cherry products could provide a nonpharmacological preventive option against gout attacks. Until then we would not advocate on the basis of the current findings that individuals who suffer from gout abandon standard therapies and opt for cherry extract products as an alternative.”

In an accompanying editorial, Allan Gelber, MD, from Johns Hopkins University School of Medicine in Baltimore, Maryland, and Daniel Solomon, MD, from Brigham and Women’s Hospital and Harvard Medical School in Boston, write that the findings are promising but reiterate the need for randomized clinical trials to confirm that consumption of cherry products could prevent gout attacks.

Dr. Gelber told Medscape Medical News, “For the patient who asks his/her doctor ‘Doc, what can I do, myself, to decrease my chance of developing another gout attack, above and beyond the medications you have prescribed for me?’ our response would include that one of the options is dietary modification. Previously, physician recommendations included advocating for moderation in alcohol consumption, weight reduction, and decreasing high-purine foods from the diet…but now there are new data supporting a beneficial role in eating cherries to reduce one’s risk for recurrent gout attacks.”

Dr. Gelber noted that the most definitive support for the recommendation to eat cherries as a strategy to reduce gout risk would come from a randomized clinical trial. “Just as with new medications that come down the pipeline, dietary interventions ought also be subject to the rigor of a clinical trial. Such a study could be undertaken. There is logistical challenge to undertaking such a trial since cherry fruit is broadly available. But, in a controlled setting, such a trial would be feasible,” he said.

Source: Medscape.com


Experts debate benefits of routine nerve monitoring in thyroid surgery.


Although many clinicians use intraoperative nerve monitoring during thyroid surgery, data do not necessarily associate the practice with improved outcomes. The question of whether it should be used routinely was up for discussion at the American Thyroid Association 82nd Annual Meeting.

Potential benefits

Jennifer E. Rosen, MD, FACPS, assistant professor of surgery and molecular medicine at Boston University School of Medicine, said that nerve monitoring may be beneficial from a cost standpoint, explaining that post-operative permanent nerve injury and post-operative permanent hypothyroidism are the driving force behind the majority of lawsuits in thyroid surgery.

She also highlighted several uses for intraoperative nerve monitoring in thyroidectomy. For instance, it offers more than visual confirmation when identifying the recurrent laryngeal nerve, Rosen said. Additionally, nerve monitoring can help identify abnormalities in the anatomy of the nerve and aid in dissection. Further, she noted, nerve monitoring has value as a prognostic tool in terms of postoperative neural function.

The major question, however, is whether intraoperative nerve monitoring prevents nerve injury or paralysis during thyroidectomy. Although the data are not uniformly positive, this may be attributable to several factors, according to Rosen: whether the surgeon performs pre-operative and post-operative laryngoscopy and in what setting; how many procedures the surgeon performs per year; what techniques are used; and more.

If a surgeon is going to use nerve monitoring, he or she should do it routinely, Rosen said. The surgeon should also perform pre- and post-operative laryngoscopy and voice assessment, as well as be very aware and knowledgeable about the type of equipment and approach to surgery that is being used.

“Based on the preponderance of evidence and an interpretation of the strengths and limitations of the data on which we base our decisions, and with some qualifications based on the type of surgery, the setting and the surgeon, then yes, [intraoperative nerve monitoring] should be done routinely,” she said.

A lack of data

However, David J. Terris, MD, FACS, Porubsky Professor and chairman of the department of otolaryngology at Georgia Health Sciences University and surgical director of the Thyroid Center, pointed out that the published scientific evidence does not support the routine use of nerve monitoring in thyroid surgery.

“It’s important to consider this in two different ways: what is the logic behind nerve monitoring vs. what about the data actually supporting the use of nerve monitoring? We want to consider those separately,” he said.

Terris cited four studies that failed to prove a connection between nerve monitoring and improved functional outcomes in thyroid surgery. For example, results from a trial conducted at 63 centers in Germany and involving 29,998 nerves demonstrated no differences in the nerve monitoring group when compared with the nerve identification and dissection group (although each of these methods was superior to an approach in which the nerve is not sought and identified). Similarly, researchers for another study involving 1,804 nerves at risk concluded there was no benefit to nerve monitoring (although both nerve monitoring and nerve stimulation with twitch palpation without nerve monitoring were able to predict nerve injury).

The potential for added costs, including a $300 endotracheal tube, additional time in the operating room and from $500 to $1,000 in surgical fees, is another possible downside to nerve monitoring, according to Terris. Complications such as airway obstruction, tongue necrosis and increased parasympathetic tone associated with clamping the vagus nerve are also concerns, he said. Moreover, clinicians may become reliant on the technology for identifying the nerve.
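Terris's cost figures can be combined into a rough per-case estimate. The tube and fee numbers come from the talk; the operating-room time figures below are assumptions for illustration only.

```python
# Figures cited by Terris:
ENDOTRACHEAL_TUBE = 300           # EMG endotracheal tube, dollars
SURGICAL_FEE_RANGE = (500, 1000)  # added surgical fees, dollars

# Assumed for illustration (not from the talk):
EXTRA_OR_MINUTES = 10             # added setup/monitoring time
OR_COST_PER_MINUTE = 40           # nominal operating-room rate

extra_or_cost = EXTRA_OR_MINUTES * OR_COST_PER_MINUTE
low = ENDOTRACHEAL_TUBE + SURGICAL_FEE_RANGE[0] + extra_or_cost
high = ENDOTRACHEAL_TUBE + SURGICAL_FEE_RANGE[1] + extra_or_cost
print(f"Rough added cost per case: ${low}-${high}")
```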

“One concern is training a new generation of surgeons who have inferior anatomical skills,” he said. “The bottom line is that [nerve monitoring] adds expense; has its own potential for complications; induces a false sense of security; and there’s no evidence that it does what it’s supposed to do, which is prevent injury.” Despite these shortcomings, Dr. Terris indicated that he himself generally uses nerve monitoring because of subtle advantages associated with it and the incremental surgical information that it provides.

Source: Endocrine Today.