Using Precisely-Targeted Lasers, Researchers Manipulate Neurons in Worms’ Brains and Take Control of Their Behavior.


In the quest to understand how the brain turns sensory input into behavior, Harvard scientists have crossed a major threshold. Using precisely-targeted lasers, researchers have been able to take over an animal’s brain, instruct it to turn in any direction they choose, and even to implant false sensory information, fooling the animal into thinking food was nearby.

As described in a September 23 paper published in Nature, a team made up of Sharad Ramanathan, an Assistant Professor of Molecular and Cellular Biology and of Applied Physics; Askin Kocabas, a Post-Doctoral Fellow in Molecular and Cellular Biology; Ching-Han Shen, a Research Assistant in Molecular and Cellular Biology; and Zengcai V. Guo of the Howard Hughes Medical Institute was able to take control of Caenorhabditis elegans — tiny, transparent worms — by manipulating neurons in the worms’ “brain.”

The work, Ramanathan said, is important because, by taking control of complex behaviors in a relatively simple animal — C. elegans has just 302 neurons — researchers can begin to understand how its nervous system functions.

“If we can understand simple nervous systems to the point of completely controlling them, then it may be a possibility that we can gain a comprehensive understanding of more complex systems,” Ramanathan said. “This gives us a framework to think about neural circuits, how to manipulate them, which circuit to manipulate and what activity patterns to produce in them.”

“Extremely important work in the literature has focused on ablating neurons, or studying mutants that affect neuronal function, and mapping out the connectivity of the entire nervous system,” he added. “Most of these approaches have discovered neurons necessary for specific behaviors by destroying them. The question we were trying to answer was: Instead of breaking the system to understand it, can we essentially hijack the key neurons that are sufficient to control behavior and use these neurons to force the animal to do what we want?”

Before Ramanathan and his team could begin to answer that question, however, they needed to overcome a number of technical challenges.

Using genetic tools, the researchers engineered worms whose neurons gave off fluorescent light, allowing them to be tracked during experiments. They also altered the worms’ genes so that their neurons were sensitive to light, meaning the neurons could be activated with pulses of laser light.

The largest challenges, though, came in developing the hardware necessary to track the worms and target the correct neuron in a fraction of a second.

“The goal is to activate only one neuron,” he explained. “That’s challenging because the animal is moving, and the neurons are densely packed near its head, so the challenge is to acquire an image of the animal, process that image, identify the neuron, track the animal, position your laser and shoot that particular neuron — and do it all in 20 milliseconds, or about 50 times a second. The engineering challenges involved seemed insurmountable when we started. But Askin Kocabas found ways to overcome these challenges.”

The system researchers eventually developed uses a movable table to keep the crawling worm centered beneath a camera and laser. They also custom-built computer hardware and software, Ramanathan said, to ensure the system works at the split-second speeds they need.
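For readers who want a concrete picture of that closed loop — acquire an image, identify the neuron, re-center the stage, fire the laser, all within roughly 20 milliseconds — the sketch below shows the general shape of such a cycle in Python. It is purely illustrative: the class and function names are invented placeholders, not the Harvard team’s actual hardware interfaces or software.

```python
import random
import time

FRAME_BUDGET_S = 0.020  # ~20 ms per cycle, i.e. roughly 50 updates per second


class DummyRig:
    """Stand-in for the camera, the movable stage and the steerable laser."""

    def acquire_frame(self):
        # A real system would grab a fluorescence image here.
        return [[random.random() for _ in range(64)] for _ in range(64)]

    def locate_neuron(self, frame):
        # Placeholder: treat the brightest pixel as the target neuron's position.
        return max(
            ((r, c) for r in range(len(frame)) for c in range(len(frame[0]))),
            key=lambda rc: frame[rc[0]][rc[1]],
        )

    def recenter_stage(self, position):
        pass  # a real stage would move to keep the worm centered under the optics

    def fire_laser(self, position):
        pass  # a real system would pulse light at the chosen, light-sensitized neuron only


def run_closed_loop(rig, n_cycles=50):
    """Acquire, identify, re-center and illuminate, checking the 20 ms budget."""
    for _ in range(n_cycles):
        t0 = time.perf_counter()
        frame = rig.acquire_frame()
        target = rig.locate_neuron(frame)
        rig.recenter_stage(target)
        rig.fire_laser(target)
        elapsed = time.perf_counter() - t0
        if elapsed > FRAME_BUDGET_S:
            print(f"cycle overran budget: {elapsed * 1e3:.1f} ms")


if __name__ == "__main__":
    run_closed_loop(DummyRig())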

The end result, he said, was a system capable of not only controlling the worms’ behavior, but their senses as well. In one test described in the paper, researchers were able to use the system to trick a worm’s brain into believing food was nearby, causing it to make a beeline toward the imaginary meal.

Going forward, Ramanathan and his team plan to explore what other behaviors the system can control in C. elegans. Other efforts include designing new cameras and computer hardware with the goal of speeding up the system from 20 milliseconds to one. The increased speed would allow them to test the system in more complex animals, like zebrafish.

“By manipulating the neural system of this animal, we can make it turn left, we can make it turn right, we can make it go in a loop, we can make it think there is food nearby,” Ramanathan said. “We want to understand the brain of this animal, which has only a few hundred neurons, completely and essentially turn it into a video game, where we can control all of its behaviors.”

Source: http://www.sciencedaily.com

 

Human Brains Outpace Chimp Brains in Womb.


Humans’ superior brain size in comparison to their chimpanzee cousins traces all the way back to the womb. That’s according to a study reported in the September 25 issue of Current Biology, a Cell Press publication, that is the first to track and compare brain growth in chimpanzee and human fetuses.

“Nobody knew how early these differences between human and chimp brains emerged,” said Satoshi Hirata of Kyoto University.

Hirata and colleagues Tomoko Sakai and Hideko Takeshita now find that human and chimp brains begin to show remarkable differences very early in life. In both primate species, the brain grows increasingly fast in the womb initially. After 22 weeks of gestation, brain growth in chimpanzees starts to level off, while that of humans continues to accelerate for another two months or more. (Human gestation time is only slightly longer than that of chimpanzees, 38 weeks versus 33 or 34 weeks.)

The findings are based on 3D ultrasound imaging of two pregnant chimpanzees from approximately 14 to 34 weeks of gestation and comparison of those fetal images with images of human fetuses. Although early brain differences had been suspected, no one had measured the volume of chimpanzee brains as they develop in the womb until now.

The findings are part of a larger effort by the research team to explore differences in primate brains. In another Current Biology report published last year, they compared brain development in chimps versus humans via magnetic resonance imaging (MRI) scans of three growing chimpanzees from the age of six months to six years.

“Elucidating these differences in the developmental patterns of brain structure between humans and great apes will provide important clues to understand the remarkable enlargement of the modern human brain and humans’ sophisticated behavior,” Sakai said.

The researchers say they now hope to explore fetal development in particular parts of the brain, including the forebrain, which is critical for decision making, self-awareness, and creativity.

Source: http://www.sciencedaily.com

Basic Infection Control and Prevention Plan for Outpatient Oncology Settings.


An estimated 1.5 million new cases of cancer were diagnosed in the United States in 2010 [1]. With improvements in survivorship and the growth and aging of the U.S. population, the total number of persons living with cancer will continue to increase [2]. Despite advances in oncology care, infections remain a major cause of morbidity and mortality among cancer patients [3-5]. Increased risks for infection are attributed, in part, to immunosuppression caused by the underlying malignancy and chemotherapy. In addition, patients with cancer come into frequent contact with healthcare settings and can be exposed to other patients in these settings with transmissible infections. Likewise, patients with cancer often require the placement of indwelling intravascular access devices or undergo surgical procedures that increase their risk for infectious complications. Given their vulnerable condition, great attention to infection prevention is warranted in the care of these patients.

In recent decades, the vast majority of oncology services have shifted to outpatient settings, such as physician offices, hospital-based outpatient clinics, and nonhospital-based cancer centers. Currently, more than one million cancer patients receive outpatient chemotherapy or radiation therapy each year [6]. Acute care hospitals continue to specialize in the treatment of many patients with cancer who are at increased risk for infection (e.g., hematopoietic stem cell transplant recipients, patients with febrile neutropenia), with programs and policies that promote adherence to infection control standards. In contrast, outpatient oncology facilities vary greatly in their attention to and oversight of infection control and prevention. This is reflected in a number of outbreaks of viral hepatitis and bacterial bloodstream infections that resulted from breaches in basic infection prevention practices (e.g., syringe reuse, mishandling of intravenous administration sets) [7-10]. In some of these incidents, the implicated facility did not have written infection control policies and procedures for patient protection or regular access to infection prevention expertise.


Scope

A. Intent and Implementation

This document has been developed for outpatient oncology facilities to serve as a model for a basic infection control and prevention plan. It contains policies and procedures tailored to these settings to meet minimal expectations of patient protections as described in the CDC Guide to Infection Prevention in Outpatient Settings. The elements in this document are based on CDC’s evidence-based guidelines and guidelines from professional societies (e.g., Oncology Nursing Society).

This plan is intended to be used by all outpatient oncology facilities. Those facilities that do not have an existing plan should use this plan as a starting point to develop a facility-specific plan that will be updated and further supplemented as needed based on the types of services provided. Facilities that have a plan should ensure that their current infection prevention policies and procedures include the elements outlined in this document. While this plan may essentially be used exactly “as is,” facilities are encouraged to personalize the plan to make it more relevant to their setting (e.g., adding facility name and names of specific rooms/locations; inserting titles/positions of designated personnel; and providing detailed instructions where applicable).

This plan does not replace the need for an outpatient oncology facility to have regular access to an individual with training in infection prevention and for that individual to perform on-site evaluations and to directly observe and interact regularly with staff. Facilities may wish to consult with an individual with training and expertise in infection prevention early on to assist with development and implementation of their infection control plan and to ensure that facility design and work flow are conducive to optimal infection prevention practices.

B. Aspects of Care That Are Beyond the Scope of This Plan

This model plan focuses on the core measures to prevent the spread of infectious diseases in outpatient oncology settings. It is not intended to address facility-specific issues or other aspects of patient care such as:

  • Infection prevention issues that are unique to blood and marrow transplant centers (a.k.a. bone marrow transplant or stem cell transplant centers)
  • Occupational health requirements, including recommended personal protective equipment for handling antineoplastic and hazardous drugs as outlined by the Occupational Safety and Health Administration and the National Institute for Occupational Safety and Health
  • Appropriate preparation and handling (e.g., reconstituting, mixing, diluting, compounding) of sterile medications, including antineoplastic agents
  • Clinical recommendations and guidance on appropriate antimicrobial prescribing practices and the assessment of neutropenia risk in patients undergoing chemotherapy

For more information on these topics, refer to the list of resources provided in Appendix D of the plan.


References

  1. American Cancer Society. Cancer Facts & Figures 2010 Tables & Figures.
  2. Warren JL, Mariotto AB, Meekins A, Topor M, Brown ML. Current and future utilization of services from medical oncologists. J Clin Oncol 2008;26:3242−7.
  3. Kamboj M, Sepkowitz KA. Nosocomial infections in patients with cancer. Lancet Oncol 2009;10:589−97.
  4. Maschmeyer G, Haas A. The epidemiology and treatment of infections in cancer patients. Int J Antimicrob Agents 2008;31:193−7.
  5. Guinan JL, McGuckin M, Nowell PC. Management of health-care−associated infections in the oncology patient. Oncology 2003;17:415−20.
  6. Halpern MT, Yabroff KR. Prevalence of outpatient cancer treatment in the United States: estimates from the Medical Panel Expenditures Survey (MEPS). Cancer Invest 2008;26:647−51.
  7. Macedo de Oliveira A, White KL, Leschinsky DP, Beecham BD, Vogt TM, Moolenaar RL, et al. An outbreak of hepatitis C virus infections among outpatients at a hematology/oncology clinic. Ann Intern Med 2005;142:898−902.
  8. Watson JT, Jones RC, Siston AM, Fernandez JR, Martin K, Beck E, et al. Outbreak of catheter-associated Klebsiella oxytoca and Enterobacter cloacae bloodstream infections in an oncology chemotherapy center. Arch Intern Med 2005;165:2639−43.
  9. Greeley RD, Semple S, Thompson ND, High P, Rudowski E, Handschur E, et al. Hepatitis B outbreak associated with a hematology-oncology office practice in New Jersey, 2009. Am J Infect Control 2011 Jun 8. Epub ahead of print.
  10. Herndon E. Rose Cancer Center shut down; patients advised to get screening. Enterprise-Journal. July 31, 2011. Accessed September 9, 2011.

Source: CDC

A Call for Caution on Antipsychotic Drugs.


You will never guess what the fifth and sixth best-selling prescription drugs are in the United States, so I’ll just tell you: Abilify and Seroquel, two powerful antipsychotics. In 2011 alone, they and other antipsychotic drugs were prescribed to 3.1 million Americans at a cost of $18.2 billion, a 13 percent increase over the previous year, according to the market research firm IMS Health.

Those drugs are used to treat such serious psychiatric disorders as schizophrenia, bipolar disorder and severe major depression. But the rates of these disorders have been stable in the adult population for years. So how did these and other antipsychotics get to be so popular?

Antipsychotic drugs have been around for a long time, but until recently they were not widely used. Thorazine, the first real antipsychotic, was synthesized in the 1950s; not just sedating, it also targeted the core symptoms of schizophrenia, like hallucinations and delusions. Later, it was discovered that antipsychotic drugs also had powerful mood-stabilizing effects, so they were used to treat bipolar disorder, too.

Then, starting in 1993, came the so-called atypical antipsychotic drugs like Risperdal, Zyprexa, Seroquel, Geodon and Abilify. Today there are 10 of these drugs on the market, and they generally have fewer neurological side effects than the first-generation drugs.

Originally experts believed the new drugs were more effective than the older antipsychotics against such symptoms of schizophrenia as apathy, social withdrawal and cognitive deficits. But several recent large randomized studies, like the landmark Catie trial, failed to show that the new antipsychotics were any more effective or better tolerated than the older drugs.

This news was surprising to many psychiatrists — and obviously very disappointing to the drug companies.

It was also soon discovered that the second-generation antipsychotic drugs had serious side effects of their own, namely a risk of increased blood sugar, elevated lipids and cholesterol, and weight gain. They can also cause a potentially irreversible movement disorder called tardive dyskinesia, though the risk is thought to be significantly lower than with the older antipsychotic drugs.

Nonetheless, there has been a vast expansion in the use of these second-generation antipsychotic drugs in patients of all ages, particularly young people. Until recently, these drugs were used to treat a few serious psychiatric disorders. But now, unbelievably, these powerful medications are prescribed for conditions as varied as very mild mood disorders, everyday anxiety, insomnia and even mild emotional discomfort.

The number of annual prescriptions for atypical antipsychotics rose to 54 million in 2011 from 28 million in 2001, a 93 percent increase, according to IMS Health. One study found that the use of these drugs for indications without federal approval more than doubled from 1995 to 2008.

The original target population for these drugs, patients with schizophrenia and bipolar disorder, is actually quite small: The lifetime prevalence of schizophrenia is 1 percent, and that of bipolar disorder is around 1.5 percent. Drug companies have had a powerful economic incentive to explore other psychiatric uses and target populations for the newer antipsychotic drugs.

The companies initiated dozens of clinical trials to test these drugs against depression and, more recently, anxiety disorders. Starting in 2003, the makers of several second-generation antipsychotics (also known as atypical neuroleptics) have received F.D.A. approval for the use of these drugs in combination with antidepressants to treat severe depression, which they trumpeted in aggressive direct-to-consumer advertising campaigns.

The combined spending on print and digital media advertising for these new antipsychotic drugs increased to $2.4 billion in 2010, up from $1.3 billion in 2007, according to Kantar Media. Between 2007 and 2011, more than 98 percent of all advertising on atypical antipsychotics was spent on just two drugs: Abilify and Seroquel, the current best sellers.

There is little in these alluring advertisements to indicate that these are not simple antidepressants but powerful antipsychotics. A depressed female cartoon character says that before she started taking Abilify, she was taking an antidepressant but still feeling down. Then, she says, her doctor suggested adding Abilify to her antidepressant, and, voilà, the gloom lifted.

The ad omits critical facts about depression that consumers would surely want to know. If a patient has not gotten better on an antidepressant, for instance, just taking it for a longer time or taking a higher dose could be very effective. There is also very strong evidence that adding a second antidepressant from a different chemical class is an effective and cheaper strategy — without having to resort to antipsychotic medication.

A more recent and worrisome trend is the use of atypical antipsychotic drugs — many of which are acutely sedating and calming — to treat various forms of anxiety, like generalized anxiety disorder and even situational anxiety. A study last year found that 21.3 percent of visits to a psychiatrist for treatment of an anxiety disorder in 2007 resulted in a prescription for an antipsychotic, up from 10.6 percent in 1996. This is a disturbing finding in light of the fact that the data for the safety and efficacy of antipsychotic drugs in treating anxiety disorders is weak, to say nothing of the mountain of evidence that generalized anxiety disorder can be effectively treated with safer — and cheaper — drugs like S.S.R.I. antidepressants.

There are a small number of controlled clinical trials of antipsychotic drugs in generalized anxiety or social anxiety that have shown either no effect or inconsistent results. As a consequence, there is no F.D.A.-approved use of an atypical antipsychotic for any anxiety disorder.

Yet I and many of my colleagues have seen dozens of patients with nothing more than everyday anxiety or insomnia who were given prescriptions for antipsychotic medications. Few of these patients were aware of the potential long-term risks of these drugs.

The increasing use of atypical antipsychotics by physicians to treat anxiety suggests that doctors view these medications as safer alternatives to the potentially habit-forming anti-anxiety benzodiazepines like Valium and Klonopin. And since antipsychotics have rapid effects, clinicians may prefer them to first-line treatments like S.S.R.I. antidepressants, which can take several weeks to work.

Of course, physicians frequently use medications off label, and there is sometimes solid empirical evidence to support this practice. But presently there is little evidence that atypical antipsychotic drugs are effective outside of a small number of serious psychiatric disorders, namely schizophrenia, bipolar disorder and treatment-resistant depression.

Let’s be clear: The new atypical antipsychotic drugs are effective and safe. But even if these drugs prove effective for a variety of new psychiatric illnesses, there is still good reason for caution. Because they have potentially serious adverse effects, atypical antipsychotic drugs should be used when currently available treatments — with typically fewer side effects and lower costs — have failed.

Atypical antipsychotics can be lifesaving for people who have schizophrenia, bipolar disorder or severe depression. But patients should think twice — and then some — before using these drugs to deal with the low-grade unhappiness, anxiety and insomnia that come with modern life.

Source: NY Times

 

New approaches to thyroidectomy prompt discussion.


During an academic debate held here, two presenters focused on the pros and cons of conventional vs. minimally invasive approaches to thyroidectomy.

In the process, some of the concerns that patients now harbor, particularly in terms of cosmetic effects, came to the forefront.

“The frontiers of thyroidectomy today focus on minimizing pain and maximizing cosmesis and preventing long hospital stays,” Carmen C. Solorzano, MD, professor of surgery and director of the Vanderbilt Endocrine Surgery Center, said during a presentation at the American Thyroid Association 82nd Annual Meeting.

Gold standard

Conventional thyroidectomy consists of a Kocher incision and requires elevation of large flaps, often with the use of a surgical drain, to allow complete exposure of the thyroid gland, according to Solorzano, whereas a minimally invasive approach involves a small cervical incision and less extensive dissection. These approaches include minimally invasive video-assisted thyroidectomy (MIVAT), minimal-incision and endoscopic minimally invasive thyroidectomy, but not remote approaches to the thyroid gland, such as robotic facelift thyroidectomy.

Solorzano, who spoke in favor of the conventional approach, noted that a meta-analysis showed that the rate of recurrent nerve palsy between the two approaches was the same, although cosmetic satisfaction and pain scores were better in the minimally invasive thyroidectomy group. The conventional approach, however, was associated with shorter operative times, lower cost and wider applicability, she said. Additionally, conventional thyroidectomy remains the standard approach for Graves’ disease, which usually involves very large glands, and bulky cancer, as these would be difficult to remove through small incisions.

“The fact remains that one of the drawbacks to the minimally invasive approach is that it is only appropriate in about 5% to 30% of cases,” Solorzano said. “Major limitations are thyroid size, thyroiditis or toxic glands and cancer or adenopathy.”

Nevertheless, patients can still experience the benefits associated with minimally invasive surgery, according to Solorzano, as long as surgeons adapt by considering cosmesis with smaller incisions in the skin crease, using magnification and lighting, and paying attention to the edges of the wound.

“The conventional thyroidectomy remains the gold standard approach to removing the thyroid gland,” Solorzano said. “The minimally invasive approach remains an option but is limited by thyroid size and pathology.”

For select patients

Although the approach is not appropriate for all patients, according to Maisie L. Shindo, MD, FACS, patients and physicians may benefit from MIVAT, which is similar to a laparoscopic procedure: a high-definition camera allows the surgeon to dissect while viewing a monitor.

“An advantage of the high definition camera is you can really see the nerve in magnified view and then just take out the thyroid,” Shindo, who is director of thyroid and parathyroid surgery at Oregon Health & Science University, said.

She also cited data from several studies suggesting that patients who underwent MIVAT experienced somewhat better outcomes than those who underwent conventional thyroidectomy. A 2002 prospective study comparing post-operative pain at 24 and 48 hours after the procedure, for instance, indicated that pain was lower in the MIVAT group. Similarly, a 2004 study showed that patients in the MIVAT group experienced better cosmetic and pain results than those in the conventional approach group.

Additionally, a study comparing minimally invasive thyroidectomy without video assistance with a mini-incision approach revealed that pain was significantly lower among patients who underwent the minimally invasive approach, according to Shindo.

She expressed concern, however, about the use of MIVAT in patients with thyroid cancer where the surgeon would likely be performing a total thyroidectomy and potentially removing lymph nodes as well, and noted that becoming skilled in using MIVAT requires time.

“My argument is that MIVAT is safe with the appropriate patient selection,” Shindo said. “It does provide a small incision and less pain, but there is a learning curve like with any other laparoscopic procedure. You have to be very experienced because there can be anatomic variations, so you have to be aware of that.”

Perspective

 

David J. Terris

  • I thought both of the speakers made very balanced and informed presentations. It’s always a challenge assessing new technology and new procedures, and I thought they both did a great job of presenting fair arguments about the procedures.

Much of the discussion was about minimally invasive techniques, but there was mention of robotic surgery, and it was clear that neither speaker was necessarily supportive of that approach. I think they drew an important distinction between minimally invasive surgery and robotic remote access surgery, because sometimes the lines get blurred by the uninformed, who may think that robotic surgery must be minimally invasive. For other procedures, such as robotic prostatectomy, it is in many respects minimally invasive. But when we refer to thyroid surgery and remote locations like the armpit or behind the ear, there’s more dissection involved just to get to where the thyroid gland is. The reason the robot is so valuable in those cases is that you’re working down a long tunnel: the miniaturized instruments provide tremendous 3-D visualization, and their maneuverability in that small space is so superior that if you’re going to do remote access surgery, it’s much easier with the robot. But the remote access technique itself is more invasive, and I was pleased to see that each of the speakers emphasized that point.

Source: Endocrine Today.

 

Inadvertent prescription of gelatin-containing oral medication: its acceptability to patients.


When prescribing, doctors usually consider only the ‘active’ component of any drug’s formulation, ignoring the agents which make up the bulk of the tablet or capsule, collectively known as excipients. Many urological drugs contain the excipient gelatin, which is universally of animal origin; this may conflict with the dietetic ideals of patients. A questionnaire-based study, undertaken between January and June 2010 in a mixed-ethnicity inner-city population presenting with urological symptoms, asked which patients preferred not to ingest animal-based products, who would ask about the content of their prescribed treatment and who would refuse to take that medication if alternatives were available. Ultimately, the authors sought to find out how many patients had been inadvertently prescribed gelatin-containing oral medications and to suggest ways in which prescriptions might be made more congruous with an individual patient’s dietetic wishes. This study demonstrated that 43.2% of the study population would prefer not to take animal product-containing medication even if no alternative were available. Fifty-one per cent of men with lower urinary tract symptoms were also found to have inadvertently been prescribed gelatin-containing products against their preferred dietary restriction. Educating healthcare professionals about excipients, and getting them to ask about a patient’s dietetic preferences, may help avoid inadvertent prescription of the excipient gelatin in oral medications. Substitution of gelatin with vegetable-based alternatives and clearer labelling on drug packaging are alternative strategies to help minimise the risks of inadvertently contravening a patient’s dietetic beliefs when prescribing oral medication.

Discussion

Our current study shows that, in our inner-city catchment area, 40% of patients would prefer to take oral medication which contains no animal products; this is considerably in excess of the 20% of our population which might practise dietetic restriction on a religious basis. Patients practising dietetic restriction do not ask what the formulation contains before commencing drug treatment, which puts them at risk of transgressing their beliefs. This may simply be due to ignorance about what an oral medication contains, to not reading, or not being able to read, the patient information leaflet which lists the excipients, or to a belief that the doctor or pharmacist involved in the prescription would tell them if the medication contained ingredients that might contravene their beliefs. We already know that doctors are fairly ignorant about the issue of excipients in medication,4 while data from 2008–2009 show that pharmacists are dispensing only 50% of the generic oral medications prescribed.7 This indicates that pharmacists are tailoring prescriptions to an individual patient’s needs, and this could, possibly, include reasons of dietetic preference. Those patients preferring to avoid the ingestion of animal products do, however, appear willing to take gelatin-containing medication if no effective alternative oral drug were available, as is permitted by dietarily restrictive religions.8

This observational study may include bias as a consequence of the questionnaire design and the limited use of interpreters to elicit the most appropriate responses from the multicultural population studied. Equally, the dietetic preferences of our local population may not be fully generalisable to other communities, so our conclusions should be applied elsewhere with caution.

This study does, however, highlight the importance of asking about cultural and lifestyle factors in the prescription and dispensing of oral medications, given that more than 50% of men with lower urinary tract symptoms were already receiving medications which transgressed their dietetic preference. This is, almost certainly, a much bigger issue for the 860 million non-urological preparations prescribed in the UK each year, whose excipient content is not easily identified.2 These data highlight the necessity to prescribe, and dispense, with diligence across the totality of the pharmacopoeia,9 as is recommended for best practice.10

Although this study shows there is significant potential for transgressing an individual’s dietetic preferences, many excipients, such as gelatin, have non-animal-based alternatives. Agar agar (E406) and carrageenan (E407) are derived from various seaweed species and are already used as gelling agents in the food industry and in some oral medications.11 Their universal incorporation in oral medications would negate the potential for dietetic transgression.

It is tempting to suggest several measures that might decrease inadvertent prescription of animal product-containing medications: educating healthcare professionals about the issue of dietetic preference, making it easier to identify a medication’s constituents, or altering the manufacturing process so that only non-animal excipients are used in drug formulation.

In particular, we would recommend that every doctor needs to be aware that it is not just the active drug being dispensed but a whole group of other agents which may have relevance to an individual patient’s compliance with treatment when oral treatments are prescribed. We would also suggest that it should be easier to find out what the composition of a drug’s formulation is; hierarchical constituent listing, as used on food packaging,12 is one option, although labelling to denote products of vegetarian composition might be easier. Adoption of the Vegetarian Society’s ‘Seedling’ symbol,13 a voluntarily applied symbol used on accredited merchandise,14 might be an easily identified logo to denote a medication containing no animal ingredients.

In conclusion, we feel that our service evaluation has identified a number of issues for further research and which, clearly, have ethical implications for doctors across the totality of drug prescribing. Systems to help patients, doctors and pharmacists identify those oral medications containing animal-based products require evolution. This would facilitate choice for patients about the oral medication they take, whatever their dietetic beliefs, and would conform to best practice in medical care.

Source: BMJ.

 

 

Increased risk of inflammatory bowel disease in women with endometriosis: a nationwide Danish cohort study

Abstract

Background An association between endometriosis and certain autoimmune diseases has been suggested. However, the impact of endometriosis on risk of inflammatory bowel disease (IBD) remains unknown.

Objective To assess the risk of Crohn’s disease (CD) and ulcerative colitis (UC) in an unselected nationwide Danish cohort of women with endometriosis.

Design By use of national registers, 37 661 women hospitalised with endometriosis during 1977–2007 were identified. The relative risk of developing IBD after an endometriosis diagnosis was calculated as observed versus expected numbers and presented as standardised incidence ratios (SIRs) with 95% CIs.
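As a brief aside on the method: a standardised incidence ratio is simply the observed number of cases divided by the number expected from general-population rates, with a confidence interval conventionally derived from the Poisson distribution. A minimal sketch in Python, using invented counts rather than the study’s data, might look like this:

```python
from scipy.stats import chi2


def sir_with_ci(observed, expected, alpha=0.05):
    """Standardised incidence ratio (observed/expected) with an exact Poisson CI."""
    sir = observed / expected
    lower = 0.5 * chi2.ppf(alpha / 2, 2 * observed) / expected            # Garwood lower bound
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / expected  # Garwood upper bound
    return sir, lower, upper


# Invented counts for illustration only (not the study's figures):
print(sir_with_ci(observed=300, expected=200))  # an SIR of 1.5 with its 95% CI
```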

Results Women with endometriosis had an increased risk of IBD overall (SIR=1.5; 95% CI 1.4 to 1.7) and of UC (SIR=1.5; 95% CI 1.3 to 1.7) and CD (SIR=1.6; 95% CI 1.3 to 2.0) separately, even 20 years after a diagnosis of endometriosis (UC: SIR=1.5; 95% CI 1.1 to 2.1; CD: SIR=1.8; 95% CI 1.1 to 3.2). Restricting analyses to women with surgically verified endometriosis suggested even stronger associations (UC: SIR=1.8; 95% CI 1.4 to 2.3; CD: SIR=1.7; 95% CI 1.2 to 2.5).

Conclusion The risk of IBD in women with endometriosis was increased even in the long term, hence suggesting a genuine association between the diseases, which may either reflect common immunological features or an impact of endometriosis treatment with oral contraceptives on risk of IBD.

Source: BMJ.

 

 

 

Evidence supports optional use of RAI for papillary thyroid cancer.


The use of radioactive iodine for the management of papillary thyroid cancer has been recommended for years, but researchers said it should not be a “blanket treatment” for all patients.

Guidelines for the management of well-differentiated thyroid cancer (WDTC) recommend routine use of radioactive iodine (RAI) in patients with T3 disease or distant metastases, and selective use in patients with more limited disease.

However, Iain J. Nixon, MBChB, clinical fellow in the head and neck surgery department of Memorial Sloan-Kettering Cancer Center, told Endocrine Today that, because of a lack of evidence, the American Thyroid Association’s guidelines are not definitive for most patients when it comes to treatment.

“Over the years, different groups have looked at outcomes of patients who were treated with RAI. And initially, it dramatically improved patient outcomes. But, as treatment has progressed over the years and surgery is better now than it was in the 1940s, groups have discovered the benefit is probably limited to high-risk patients,” Nixon said. “We now know that high-risk patients benefit, but low-risk patients don’t. The difficulty for clinicians is that most patients are somewhere between those two extremes, and there isn’t very good guidance about who should and should not receive RAI in that middle group.”

Nixon and colleagues conducted a review of 1,129 patients (median age, 46 years) who underwent total thyroidectomy at Memorial Sloan-Kettering Cancer Center between 1986 and 2005. After an average follow-up of 63 months, the researchers found that some patients with early primary disease (pT1/T2) and low-volume metastatic disease in the neck (pT1/T2 N1) who were managed without RAI displayed positive outcomes.

“It’s not a study that proves whether RAI works or it doesn’t,” Nixon said. “The idea of it is to give clinicians who are interested in the concept of managing patients without RAI some evidence to back up that position.”

Among patients with advanced local disease (pT3/T4), some with pT3N0 disease were safely managed without RAI. The 5-year disease-specific survival (DSS) and recurrence-free survival (RFS) rates for the pT1/T2N0 group were 100% and 92%; for the pT1/T2N1 group, rates were 100% and 92%; and for the pT3/T4 group, rates were 98% and 87%, according to the data.

Despite the traditional recommendations, the researchers suggest that RAI should be administered on a case-by-case basis by a multidisciplinary team with extensive experience in managing thyroid cancer.

“Our experience is that in properly selected patients, it’s very safe to manage them without RAI,” Nixon said. – by Samantha Costa


Disclosure: The researchers report no relevant financial disclosures.

Perspective

 

Megan R. Haymart

  • Nixon and colleagues performed a retrospective review of 1,129 patients who underwent total thyroidectomy for thyroid cancer at a tertiary referral center between 1986 and 2005. They evaluated mortality and cancer recurrence in the patients who received radioactive iodine after thyroid surgery versus those who did not. They found that select patients do well without radioactive iodine treatment. This study suggests that it is time for the pendulum to swing. Although radioactive iodine treatment has clear benefit in high-risk, iodine-avid patients, for many patients management with surgery alone may be adequate.
Source: Endocrine Today.

 

Experts debate benefits of routine nerve monitoring in thyroid surgery.


Although many clinicians use intraoperative nerve monitoring during thyroid surgery, data do not necessarily associate the practice with improved outcomes. The question of whether it should be used routinely was up for discussion at the American Thyroid Association 82nd Annual Meeting.

Potential benefits

Jennifer E. Rosen, MD, FACS, assistant professor of surgery and molecular medicine at Boston University School of Medicine, said that nerve monitoring may be beneficial from a cost standpoint, explaining that post-operative permanent nerve injury and post-operative permanent hypothyroidism are the driving forces behind the majority of lawsuits in thyroid surgery.

She also highlighted several uses for intraoperative nerve monitoring in thyroidectomy. For instance, it offers more than visual confirmation when identifying the recurrent laryngeal nerve, Rosen said. Additionally, nerve monitoring can help identify abnormalities in the anatomy of the nerve and aid in dissection. Further, she noted, nerve monitoring has value as a prognostic tool in terms of postoperative neural function.

The major question, however, is whether intraoperative nerve monitoring prevents nerve injury or paralysis during thyroidectomy. Although data are not completely positive, this may be due to several factors, according to Rosen, such as whether the surgeon performs pre-operative and post-operative laryngoscopy and in what setting; how many procedures the surgeon performs per year; what techniques are used; and more.

If a surgeon is going to use nerve monitoring, he or she should do it routinely, Rosen said. The surgeon should also perform pre- and post-operative laryngoscopy and voice assessment, as well as be very aware and knowledgeable about the type of equipment and approach to surgery that is being used.

“Based on the preponderance of evidence and an interpretation of the strengths and limitations of the data on which we base our decisions, and with some qualifications based on the type of surgery, the setting and the surgeon, then yes, [intraoperative nerve monitoring] should be done routinely,” she said.

A lack of data

However, David J. Terris, MD, FACS, Porubsky Professor and chairman of the department of otolaryngology at Georgia Health Sciences University and surgical director of the Thyroid Center, pointed out that the published scientific evidence does not support the routine use of nerve monitoring in thyroid surgery.

“It’s important to consider this in two different ways: what is the logic behind nerve monitoring vs. what about the data actually supporting the use of nerve monitoring? We want to consider those separately,” he said.

Terris cited four studies that failed to prove a connection between nerve monitoring and improved functional outcomes in thyroid surgery. For example, results from a trial conducted at 63 centers in Germany and involving 29,998 nerves demonstrated no differences in the nerve monitoring group when compared with the nerve identification and dissection group (although each of these methods was superior to an approach in which the nerve is not sought and identified). Similarly, researchers for another study involving 1,804 nerves at risk found no benefit to nerve monitoring (although both nerve monitoring and nerve stimulation with twitch palpation without monitoring were able to predict nerve injury).

The potential for added costs, including a $300 endotracheal tube, additional time in the operating room and from $500 to $1,000 in surgical fees, is another possible downside to nerve monitoring, according to Terris. Complications such as airway obstruction, tongue necrosis and increased parasympathetic tone associated with clamping the vagus nerve are also concerns, he said. Moreover, clinicians may become reliant on the technology for identifying the nerve.

“One concern is training a new generation of surgeons who have inferior anatomical skills,” he said. “The bottom line is that [nerve monitoring] adds expense; has its own potential for complications; induces a false sense of security; and there’s no evidence that it does what it’s supposed to do, which is prevent injury.” Despite these shortcomings, Terris indicated that he himself generally uses nerve monitoring because of the subtle advantages associated with it and the incremental surgical information it provides.

Source: Endocrine Today.

 

 

High cardiovascular risk in severely obese young children and adolescents.


Abstract

Objective To assess the prevalence of cardiovascular risk factors in severely obese children and adolescents.

Methods A nationwide prospective surveillance study was carried out from July 2005 to July 2007 in which paediatricians were asked to report all new cases of severe obesity in 2–18-year-old children to the Dutch Paediatric Surveillance Unit. Severe obesity is defined by gender- and age-dependent cut-off points for body mass index, based on Dutch National Growth Studies, corresponding to the adult cut-off point of 35 kg/m2. Paediatricians were asked to complete a questionnaire for every severely obese child regarding socio-demographic characteristics and cardiovascular risk factors (blood pressure, fasting blood glucose and lipids).
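As a minimal worked example of the definition used above: body mass index is weight divided by height squared, compared against a gender- and age-dependent cut-off. The sketch below is illustrative only; the cut-off value in it is an invented placeholder, not one of the published Dutch National Growth Study cut-offs.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2


def is_severely_obese(weight_kg, height_m, cutoff_kg_m2):
    """Flag severe obesity by comparing BMI with the gender- and age-dependent cut-off.

    The cut-off must be looked up in the Dutch National Growth Study tables;
    the value used in the example below is a made-up placeholder.
    """
    return bmi(weight_kg, height_m) >= cutoff_kg_m2


# Example: a child weighing 70 kg at 1.50 m, against an invented cut-off of 28 kg/m2
print(round(bmi(70, 1.50), 1))                       # 31.1
print(is_severely_obese(70, 1.50, cutoff_kg_m2=28))  # True
```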

Results In 2005, 2006 and 2007, 94%, 87% and 87%, respectively, of paediatricians in the Netherlands responded to the monthly request from the Dutch Paediatric Surveillance Unit and 500 children with newly diagnosed severe obesity were reported. 72.6% (n=363) of paediatricians responded to a subsequent questionnaire. Cardiovascular risk factor data were available in 255/307 (83%) children who were correctly classified as severely obese. 67% had at least one cardiovascular risk factor (56% hypertension, 14% high blood glucose, 0.7% type 2 diabetes and up to 54% low HDL-cholesterol). Remarkably, 62% of severely obese children aged ≤12 years already had one or more cardiovascular risk factors.

Conclusion A high proportion (two-thirds) of severely obese children have cardiovascular risk factors. Internationally accepted criteria for defining severe obesity and guidelines for early detection and treatment of severe obesity and comorbidity are urgently needed.

Source: BMJ.

 

 

Mood disorder as a specific complication of stroke

Appraising the impact of Folstein et al’s1 1977 report on ‘Mood disorder as a specific complication of stroke’ is a challenging task for someone who did not enter medical school until the mid-1980s. Stroke changed in the 1970s, and the view in retrospect appears unrecognisable. It was a dramatic change: from an intellectual backwater too dull for neurologists to even bother seeing, stroke became a hot topic, a disease to be studied in mega trials and a standard bearer for evidence-based medicine. Prior to the 1970s, with the exception of dysphasia, neuropsychiatric complications had been given scant thought — stroke was a disorder that affected how people walked. It was recognised that some elderly patients became depressed after stroke, but the prevailing view appeared to be “so what, they’re old and infirm, what do you expect?” It is against this backdrop that the work of researchers at Johns Hopkins has to be judged.

The importance of the paper was perhaps not the findings but the very fact that they published the study at all. Two years earlier, their Johns Hopkins colleague Robert Robinson had published a fascinating study demonstrating that experimentally induced strokes in rats led to alterations in cerebral metabolism of catecholamines that correlated with behavioural changes in the rats that mimicked depression.2 Folstein’s data appeared to be an early example of translational research and were widely disseminated, as they appeared to link laboratory based neurobiology with clinical practice. Tantalisingly, it seemed to offer a human model for studying the anatomy of depression. Appearing, as it did, contemporaneously with the development of cerebral imaging techniques, this was the impetus researchers had needed. Over the next two decades, 143 reports were published on this topic. Sadly, the theory of anatomical location of brain lesions as a simplistic explanation for mood disorder did not stand up to scrutiny.3 It was perhaps too good to be true; a salient reminder of the need for confirmation in humans of findings from animal models.

On critical analysis, the paper itself has suffered with the passage of time. Epidemiological techniques have advanced, as have expectations of sample sizes and analysis strategies. Future investigators submitting to the journal are unlikely to get a case control study past peer review without any statistical comparisons! But for all that, it is a well written report that gets its key messages across clearly and succinctly, perhaps because the manuscript was not cluttered with t tests and hazard ratios, and that is something editors welcome in any era.

And the key messages were important—the realisation that depression after stroke was not simply an understandable reaction to disability has stood the test of time. We now know that 33% of stroke patients suffer from depression (95% CI 29% to 36%).4 We now know that this depression leads to increased disability5 and probably increased mortality.6 Most importantly, we now know that antidepressants are effective in treating it.7 Countless patients round the world are benefiting from this knowledge and that is an impact that any researcher can be proud of.

Footnotes

  • Competing interests: None.
  • Provenance and peer review: Commissioned; not externally peer reviewed.

References

  1. Folstein MF, Maiberger R, McHugh PR. Mood disorder as a specific complication of stroke. J Neurol Neurosurg Psychiatry 1977;40:1018–20.
  2. Robinson RG, Shoemaker WJ, Schlumpf M, et al. Effect of experimental cerebral infarction in rat brain on catecholamines and behaviour. Nature 1975;255:332–4.
  3. Carson AJ, Machale S, Allen K, et al. Depression after stroke and lesion location: a systematic review. Lancet 2000;356:122–6.
  4. Hackett ML, Yapa C, Parag V, et al. Frequency of depression after stroke: a systematic review of observational studies. Stroke 2005;36:1330–40.
  5. Pohjasvaara T, Vataja R, Leppavuori A, et al. Depression is an independent predictor of poor long-term functional outcome poststroke. Eur J Neurol 2001;8:315–19.
  6. House A, Knapp P, Bamford J, et al. Mortality at 12 and 24 months after stroke may be associated with depressive symptoms at 1 month. Stroke 2001;32:696–701.
  7. Hackett ML, Anderson CS, House A, et al. Interventions for treating depression after stroke. Cochrane Database Syst Rev 2008;4:CD003437.

Source: BMJ.

 

 

Subclinical hyperthyroidism unrelated to overall, CV mortality.


The health risks associated with subclinical hyperthyroidism in patients aged at least 65 years are not entirely clear. However, data presented at the 82nd Annual Meeting of the American Thyroid Association suggest that the condition is not linked to overall or cardiovascular mortality.

“[Older patients with subclinical hyperthyroidism] are a group with a high prevalence of subclinical thyroid dysfunction and a high prevalence of comorbidities that make their management more complex,” researcher Anne R. Cappola, MD, ScM, associate professor of medicine at Penn Medicine and physician at the Perelman Center for Advanced Medicine in Philadelphia, told Endocrine Today.

Cappola said there are two clinical implications.

“One, thyroid function testing should be repeated in older people with subclinical hyperthyroidism to confirm testing prior to initiating management. Two, older people with subclinical hyperthyroidism are at increased risk of atrial fibrillation,” Cappola said.

Data from the Cardiovascular Health Study (CHS) were used to examine 5,009 community-dwelling patients aged 65 years and older who were not taking thyroid medications. Serum thyroid-stimulating hormone (TSH) and free thyroxine concentrations were measured in banked specimens from visits in 1989–1990, 1992–1993 and 1996–1997.

Within the CHS, researchers identified 70 patients with an average age of 73.7 years (60% women, 24% not white) with subclinical hyperthyroidism based on their first TSH measurement. They studied the persistence, resolution and progression of the disease during a 2- to 3-year period.

Using Cox proportional hazard models, researchers were able to determine the link between subclinical hyperthyroidism and CV risk and total mortality after more than 10 years of follow-up, with 4,194 euthyroid patients used as a reference group.
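To illustrate the kind of model mentioned above (this is not the investigators’ code, and the numbers are invented), a Cox proportional hazards fit relating an exposure flag to follow-up time and mortality can be sketched in Python with the lifelines library:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented toy data standing in for the analysis set: follow-up time in years,
# a death indicator and a 0/1 flag for subclinical hyperthyroidism.
df = pd.DataFrame({
    "years_followed": [10.2, 3.5, 12.0, 8.7, 11.4, 2.1, 9.3, 6.8],
    "died": [0, 1, 0, 1, 0, 1, 0, 1],
    "subclinical_hyperthyroid": [0, 1, 0, 0, 1, 1, 0, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="died")
print(cph.summary)  # hazard ratio for the exposure, with its confidence interval
```

The exponentiated coefficient on the exposure column plays the role of the hazard ratio reported in such analyses; the real study, of course, used the full CHS cohort rather than toy numbers.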

According to the data, among patients with subclinical hyperthyroidism who participated in follow-up thyroid testing or were taking thyroid medication at the time of follow-up (n=44), the condition persisted in 43%; 41% became euthyroid; 5% progressed to overt hyperthyroidism; and 11% began taking thyroid medication.

“Our study provides additional supportive data in both estimates of persistence of subclinical hyperthyroidism and risk of cardiovascular effects,” Cappola said.

Source: Endocrine Today.