Vitamin D Supplementation Prevents Fractures.


A meta-analysis suggests benefit at a dose of >800 IU daily, but factors relating to treatment adherence could have biased the results

Clinical trials have not consistently confirmed that vitamin D supplementation in older adults prevents fractures. To reconcile conflicting data, researchers pooled participant-level data from 11 randomized controlled trials of vitamin D supplementation that involved 31,000 older people (age, ≥65).

In intent-to-treat analyses, vitamin D supplementation lowered risks for hip fracture (hazard ratio, 0.90; P=0.07) and any nonvertebral fracture (HR, 0.93; P=0.03), but these reductions were not significant (because multiple comparisons were made, a P-value of <0.0125 was considered significant). However, the researchers conducted additional analyses that incorporated treatment adherence and supplement use outside the trial: Participants in the highest quartile of daily vitamin D intake (median, 800 IU; range, 792–2000) did have significantly lower risks for both hip fracture (HR, 0.70) and any nonvertebral fracture (HR, 0.86), compared with controls. In contrast, participants in the lowest three quartiles did not benefit from vitamin D supplementation. Some trials also involved calcium supplementation, but the vitamin D findings were independent of additional calcium intake. Vertebral fractures were not documented consistently in these trials.
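
The 0.0125 cutoff is consistent with a Bonferroni-type correction of the conventional 0.05 significance level for four comparisons; that reading is an inference from the numbers, since the summary does not spell out the adjustment. A minimal check in Python:

```python
# Bonferroni-style correction: divide the overall alpha across the
# number of comparisons, then judge each reported P-value against it.
# The count of four comparisons is an assumption inferred from 0.05/4.
alpha, comparisons = 0.05, 4
cutoff = alpha / comparisons  # 0.0125, matching the reported threshold

for endpoint, p in [("hip fracture (ITT)", 0.07),
                    ("any nonvertebral fracture (ITT)", 0.03)]:
    verdict = "significant" if p < cutoff else "not significant"
    print(f"{endpoint}: P={p} -> {verdict} at the corrected cutoff {cutoff}")
```

Both intent-to-treat P-values miss the corrected bar, which is why the headline reductions are reported as nonsignificant.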

Comment: These data suggest a threshold dose of about 800 IU of vitamin D daily for fracture prevention in older adults. The pooling of participant-level data, plus the incorporation of information on adherence and supplement use outside the trial, distinguish this analysis from previous ones. However, the departure from intent-to-treat analysis could be a problem here. The subgroup with the highest vitamin D intake was compared with all controls; thus, the analysis probably compared a high-adherence population to a mixed-adherence population and, if adherence is a marker for better health or healthier behaviors, this fact could confound the results. An editorialist adds that, for any given dose of supplemental vitamin D, a person with baseline deficiency is more likely to benefit than a person whose baseline vitamin D status is adequate.

Source: Journal Watch General Medicine

Can Botulinum Toxin Improve Disabling Tremor Caused by Multiple Sclerosis?


Ability to write, drink, and pour was improved in this placebo-controlled crossover trial.

Tremor can be a disabling and frustrating symptom in multiple sclerosis (MS). No treatment with a lasting and consistent functional benefit has been identified, despite much research. Now, investigators have tested the effect of botulinum toxin (BT) in 23 people with mostly moderate, upper-extremity tremor of posture, action, and intentional movement caused by MS, in a randomized, double-blind, crossover trial. Participants received BT injections directed toward agonist and antagonist muscles related to specific tremor patterns.

At baseline, participants had good proximal and distal upper-limb strength (median Medical Research Council rating, 5; interquartile range, 4.5–5.0) and high Expanded Disability Status Scale scores (median, 5.5; interquartile range, 4.0–6.5). Most (74%) had secondary-progressive MS; 26% had relapsing–remitting MS. At 6 and 12 weeks, median scores for tremor, writing, and spiral drawing improved in the BT-injection group but did not change in the placebo group; the between-group differences were significant. The BT group also improved on the 9-hole peg test and drinking from a cup at 12 weeks, and on pouring at 6 weeks. Quality-of-life scores did not change significantly. Adverse reactions included up to 2 weeks of weakness, which was mild (minimally detectable, not interfering with function) in 4 participants (2 after BT injection, 2 after placebo injection) and moderate (present but not interfering with most activities) in 12 participants after BT.

Comment: This study provides encouraging preliminary evidence that botulinum toxin may be considered for upper-limb multiple sclerosis tremor. Study strengths include blinding, a placebo arm, quantitative functional assessments out to 3 months, and primary funding independent of BT manufacturers (who were involved only by donating 40 vials of the therapy). The authors say their findings support modulation of stretch-elicited peripheral feedback, not weakening of the limb, as the mechanism of benefit in MS tremor. Although the treatment may not result in functional improvement for everyone, clinicians might consider it for MS patients with isolated tremor and preserved upper-extremity strength.

Source: Journal Watch Neurology


The Garden of Our Neglect: How Humans Shape the Evolution of Other Species.


As humans have come to dominate the planet, they have modified not only their own evolutionary course but also that of fellow species. Although such alterations help us survive, their unintended evolutionary consequences often produce harmful results that threaten our well-being.

For the vast majority of the history of our kind, we were in some ways no more sophisticated than crows, which use sticks to poke around in promising holes. Eventually, of course, we discovered fire and invented stone tools, which then led to guns, pesticides and antibiotics. Using these tools, we encouraged the survival of favored species: wheat, the yeast needed for beer, and cows for meat and milk—a garden of delights.

But we also encouraged a garden of neglect—a surprising number of resilient pests that have been able to survive in spite of our weapons. These species are now coming back to haunt us as toxins, pathogens or worse. Here are ten ways we have helped this garden of neglect prosper.

1. SHARP ROCKS, SOFT FLESH. In the beginning someone held aloft a sharpened rock. “Progress!,” he screamed out, or maybe, “Ouch!,” depending on which end he grabbed. With that first stone weapon and its many pointy descendants, life changed. Our initial impact would have been small. However, by 10,000 years ago we had extinguished many of the largest species on Earth—mastodons, mammoths, American cheetahs, giant kangaroos and many more. In our wake, we left behind smaller species more able to reproduce rapidly or escape detection in the first place.

As humans came to rely on tools to survive, those with hands better able to make and wield those tools were more likely to pass their genes to the next generation. Mary Marzke at Arizona State University in Tempe argues that human hand bones are quite different from those of other primates because of our use of tools. Our hands are better able to manage the subtle grips necessary for making and using tools to maim or kill other species. In response to our first tools, the animals around us changed. So did we.

2. BIG FISH, LITTLE FISH. Not only have we altered the course of big-game evolution on land but we’ve also effectively reduced the size of fishes in the sea. Fishermen prefer to catch big fishes, and fishing regulations tend to prohibit the harvest of the smallest individuals of a species. In response, fishes have evolved the ability to reproduce at a smaller size and/or younger age. If they can breed before they get big enough to be harvested, their genes stand a much higher chance of being passed on. American plaice, Atlantic cod, Atlantic herring, Atlantic salmon, brook trout and chinook salmon all appear to have evolved to grow more slowly and/or to reproduce at smaller sizes where and when they are heavily fished (Jorgenson et al., 2007; Palkovacs, 2011). Once, a large cod could eat a small boy. Now, a small boy could almost eat an entire cod.

3. RESISTANCE IS FUTILE. Bacteria have been evolving in response to threats from other species, including fungi, for hundreds of millions of years. Bacteria and fungi compete for food and often do so using chemical warfare. A fungus evolves an antibiotic and bacteria evolve resistance, so fungi evolve a new antibiotic. Recently, though, things changed. We invented (or rather stole from fungi) antibiotics, which allowed us to kill bacteria—and, importantly, treat bacterial infections. However, by using them too much, too incompletely or too indiscriminately, we drive the evolution of bacterial strains resistant to our drugs. Unlike fungi, we cannot retaliate by simply evolving new antibiotics. Hundreds of bacterial lineages have evolved resistance to more than a dozen of our antibiotics. In response, we are forced to discover new antibiotics, an endeavor that has proved ever more difficult.
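
To see why heavy or incomplete use speeds this along, consider a toy selection model; every number here is invented for illustration and is not data from any study:

```python
# Toy model of selection for antibiotic resistance: under drug exposure,
# a rare resistant variant survives better each generation and therefore
# compounds its advantage until it dominates the population.
sensitive, resistant = 1_000_000.0, 10.0    # starting cell counts (invented)
surv_sensitive, surv_resistant = 0.5, 0.95  # per-generation survival under drug
growth = 2.0                                # survivors double each generation

generation = 0
while resistant / (sensitive + resistant) < 0.99:
    sensitive *= surv_sensitive * growth
    resistant *= surv_resistant * growth
    generation += 1

print(f"Resistant cells exceed 99% of the population after {generation} generations.")
```

With these made-up rates the resistant strain takes over in about two dozen generations, which for fast-dividing bacteria can mean a matter of hours to days.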

4. GOING (ANTI)VIRAL. Viruses generally evolve even more quickly than bacteria. For example, drugs for HIV infection are typically taken together as a three-drug cocktail for one reason: HIV evolves quickly. The cocktail slows the evolution of full resistance. Even if HIV evolves resistance to one drug, the odds that it will evolve complete resistance to all three are far lower. Similarly, the flu that usually starts each year in Asia is different by the time it reaches North America. The flu virus evolves to get by not only as a function of how we respond to it but also in response to our population size and patterns of movement. It, and other viruses, even evolve within our bodies. The virus that makes you sick is almost inevitably different from the one you give someone else.

5. PESTICIDES. In wild grasslands, up to one third of the living mass of plants is eaten by herbivores. In our crop fields, just 10 percent is eaten. The difference is in part the result of the more than 2.3 billion kilograms of pesticides we use annually to control pests. In holding back the pests, though, we also kill many beneficial species and favor varieties resistant to our pesticides. Resistance to pesticides has evolved in hundreds of species of insects. In addition to pesticides for insects, farmers also use fungicides to kill fungi. Nearly all fungicides have led to the evolution of new resistant strains of plant pathogens (Gould).

6. HERBICIDES. Any patch of land, left alone, will tend to sprout with plants bent on outcompeting each other, rising higher and higher into the sky to win access to the sun. Once, we prevented such competition by weeding our fields and sorting crop seeds from weed seeds, one by one. This selection depended on visual acuity and caused multiple lineages of weeds to evolve seeds resembling those of our crops. Now we exclude weeds using herbicides, whether in our lawns or our fields, before they bear their seeds. The weeds evolve resistance to herbicides, becoming invisible to our chemicals rather than our eyes. More than a hundred species of weeds have evolved resistance to one or another herbicide. We clear the ground, till the soil and spray the fertilizer and herbicide, and when we do, row by row the resistant weeds grow.

7. ENVIRONMENTAL TOXINS. The environmental toxins we produce are everywhere. Often they influence the health and well-being of species around us; sometimes they also influence their evolution. PCBs (polychlorinated biphenyls) were once used in industrial coolants. PCBs are good coolants, but they are toxic: they kill fish and other animals, in part through a receptor in their bodies, AHR2. The fish with ordinary receptors simply died where PCBs were plentiful, leaving behind food and habitat. Those fish with slightly different receptors, to which the PCBs bound less well, survived and eventually thrived. PCBs were never meant to be used to control other species. Nevertheless, they had the effect of killing some (but not all) of the species and individuals they came into contact with, strongly favoring the individuals with resistance of one form or another. Nor are PCBs unique. Many of our pollutants—heavy metals such as cadmium, oil and others—appear to lead to rapid evolution of tolerant and, at least sometimes, toxic creatures.

8. OF MICE (AND RATS) AND MEN. Mice and rats have been following humans since at least the origins of agriculture more than 10,000 years ago, and we have probably been trying to kill them for nearly as long. More recently, however, we’ve been poisoning these pests, offering them tempting treats laced with deadly chemicals. Rats living in forests and other wild places are attracted to new foods in particular and so feed readily from such baits. Rats living with humans are not, at least not anymore. Present them with a new food and they will wait. Several authors have suggested that this “neophobia” in urban rats has evolved in response to the threat posed to rats and mice by our new “foods.” For now, the little we know about the evolution of neophobia fits with this idea. The clearest evolutionary change in rats and mice as a result of our interference has been the evolution of resistance to the rat poison warfarin. We then created superwarfarin to target these resistant populations, but resistance to this poison has recently evolved (Mayumi et al., 2008). Once again our garden of neglect is seemingly growing out of our control.

9. URBAN JUNGLE. Plant species living in urban environments tend to be surrounded by patches of habitat less suitable than the ones in which they are situated. Seeds that disperse far from their mothers are more likely to end up in those less suitable surroundings (think: concrete or pavement; Cheptou et al., 2008). As a consequence, some city plants have evolved to produce fewer, larger seeds that fall near them rather than smaller ones that can disperse farther away. Although this type of quick evolution lends a short-term survival advantage, it may leave these plants less able to adapt to a changing environment in the future. Meanwhile, thousands of other city species are acquiring new survival mechanisms despite the ways we build our cities, whether that means evolving the ability to eat concrete, call more loudly to their mates or simply find a place among our towers of glass and steel to hide.

10. THE NEW GALÁPAGOS. Our stone weapons and antibiotics are just a few of the tools we’ve created that have inadvertently helped shape the evolution of the species around us. Simply moving around has caused changes, too, many of which may be innocuous but all of which are unintentional. We have moved cane toads, wild pigs, mice, rats, weeds, sparrows, pavement ants and thousands of other species around the world with us. These species have responded to our tools, but they have also responded to the climate and organisms already present in the places we have introduced them. A recent study in Australia found that most of the hundreds of plant species introduced there show some evidence of recent evolution, post-introduction, with many of them apparently having evolved smaller, more drought-tolerant forms (Buswell et al., 2011). Cane toads introduced to Australia are evolving longer legs that aid in colonizing new habitats (for example, Phillips et al., 2007). Where cane toads are present, snakes are evolving smaller mouths (those with bigger mouths eat cane toads and, in doing so, die). Vultures introduced to the Canary Islands have evolved larger bodies (Agudo et al., 2010). Elsewhere, house sparrows (Johnston and Selander, 2008), cane toads, houseflies and many other species show evidence of evolving differently in different places. Each new place to which we introduce organisms is a kind of island, and the species, new versions of Darwin’s Galápagos birds.

Ultimately, whereas evolution can be whimsical (think: vampire bats), its general tendencies are predictable. It revisits its best-worn routes. If we continue to manage the world around us as we have managed it in the past, it’s likely we’ll continue to favor even more of those species that thrive despite us, species that are resistant to our drugs, pesticides and toxins. Such species might get bigger or more beautiful, but probably not. And a world filled with small, resistant species is not necessarily what we want. It’s time to use our knowledge of evolution and its well-worn paths to cultivate a new garden as we plan our future, one seeded with species that benefit rather than harm us.

Source: Scientific American.

Warming Oceans Mean Seafood Menu Changes


Warm water species are beginning to appear in the northerly seas around Britain.

The seas around Britain are starting to teem with fish species once deemed exotic as climate change raises water temperatures, forcing the formerly dominant occupants to flee northward toward the Arctic and opening the way for arrivals from the hotter south, according to marine and fisheries scientists.

The migration already observed is expected to grow in coming decades and could even force a change in the country’s fish menus. Once-local species are moving farther afield and therefore becoming more expensive to catch, while formerly foreign ones are becoming plentiful locally and therefore presumably cheaper and easier to harvest.

“People have started calling the North Sea the crucible of climate change. It has warmed by about a degree Celsius over the last 50 to 100 years, which is something like six times faster than pretty much any marine area around the world,” John Pinnegar, program director of the Marine Climate Change Centre at the government’s Centre for Environment, Fisheries and Aquaculture Science, told ClimateWire.

“We have seen quite a lot of warm-water fish becoming more abundant — things like anchovy, red mullet, sea bass — all of which are actually quite nice to eat,” he said. “Species that we traditionally got in the Bay of Biscay area are now showing up in the Irish Sea and into the North Sea. At the same time, things like cod, a cold-water fish, seem to be suffering and moving northward.

“The British have very traditional fish eating habits — historically consuming predominantly cod in the south and haddock in the north. Not many people are used to eating red mullet and sea bass. But eating habits can change, and that is partly what adapting to climate change could mean,” he added.

Pinnegar said sea bass not only has quadrupled in quantity in the seas off southern England in the past 20 years, but is now being found by anglers as far north as Scotland and is being commercially fished off the coast of Yorkshire, 250 miles north of its former northernmost range.

The chips remain, but the fish are foreigners
Pinnegar was lead author of the marine and fisheries section of a vast U.K. government report earlier this year on all aspects of the risks associated with climate change. Among other findings, the report shows significant warming of waters around the United Kingdom in studies from 1961 to 1990 across all seasons but particularly marked in autumn and winter.

Although the picture is complicated by factors such as the impact of commercial fishing, water-based recreational activities and the growth of human coastal populations, scientists say the rising acidity of the seas due to absorption of carbon dioxide from the atmosphere, along with changing salinity and oxygenation, is having an effect on fish and shellfish.

“Sea temperatures are rising — although it is hard to say whether this is a blip in geological terms or evidence of global warming. But with it we would expect to see some changes in the species distribution of fish,” said Richard Handy, director of the Ecotoxicology Research and Innovation Centre at Plymouth University’s School of Biomedical and Biological Sciences.

“As the seas warm, we would expect to see some of the species of fish we more usually associate with the warmer waters off Spain to start appearing in the U.K. There have even been reports of fishermen catching barracuda and types of shark they haven’t seen before,” he added.

Fish food moves north, too
While some of the colder-water species will move north to escape the heat, the rising water temperatures could also have an impact on those that remain as their metabolisms speed up with the warmth and they need to eat more. Meanwhile, the plankton and other species lower down the food chain have already moved on and so become scarcer.

The Marine Climate Change Impacts Partnership — a group combining scientists, government departments, nongovernmental organizations and the fishing industry — in its annual report card for 2012 said the dominant cold-water zooplankton species in the North Sea had declined 70 percent since the 1960s, while many plankton species had moved 10 degrees of latitude north in the same period. That equates to a distance of nearly 700 miles.
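
That “nearly 700 miles” follows from the standard figure of roughly 69 statute miles (about 111 kilometers) per degree of latitude:

$$10\ \text{degrees} \times 69\ \tfrac{\text{miles}}{\text{degree}} \approx 690\ \text{miles}$$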

At the same time, deepwater species like monkfish had moved steadily deeper to stay cool, while shallow-water species like sole had moved steadily shallower with the warmth.

The MCCIP report card also noted that some fish species had already moved north 30 to 250 miles over the past 30 years and said that by 2050 they could have added a further 135 to 375 miles.

Sole, seemingly perversely, has moved south against the trend. In previous winters it always migrated north from the Dutch coast’s shallow waters, which became unbearably cold in winter while the deeper North Sea remained relatively warm. Now it stays put as the water remains warm throughout the year.

The warming waters also appear to be a boon for squid, and in Scotland many trawlers are switching to catching them.

The changing movements and ranges of commercially exploited fish stocks have also produced some unexpected conflicts, with trawlers having to follow the fleeing fish farther afield and into the territories of other nations. This has already happened with mackerel moving from off Norway to off Iceland, while Spanish trawlers are starting to venture into U.K. waters in search of anchovy.

Some winners, some losers
“The model predictions suggest that Iceland and Greenland — Greenland in particular — and Norway are probably going to benefit in terms of fisheries from climate change, at least for the next 50 to 100 years,” Pinnegar said.

“Things like the cod populations are projected to really boom further north, and it is already starting to happen,” he added. “The herring populations are projected to do quite well, too. Those are really big commercial stocks.

“From the modeling that has been done, it looks like the U.K. is almost at the break-even point. We gain some species and we lose some species. But in terms of our fisheries, it will probably balance out almost,” he said.

But with the good comes the bad. Invasive species such as the zebra mussel have been extending their ranges northward, and some marine-borne bacteria usually associated with warmer water are also expected to move in.

The marine impact section of the U.K. government’s Climate Change Risk Assessment particularly notes the possibility of Vibrio cholerae, associated with outbreaks of cholera in humans from eating contaminated shellfish, arriving in force off U.K. shores as the temperature climbs. There have already been outbreaks in Spain, and scientists in the United Kingdom are on watch.

It also warns of the potential arrival of other water-borne Vibrios such as V. parahaemolyticus, which is associated with seafood bacterial gastroenteritis in humans. This is already very common in the United States, with more than 10,000 cases in a year, against about 40 in the United Kingdom.

“Assessments based upon global sites suggest that changing climatic conditions could result in increased rates of infection and illness in humans via shellfish and through bathing,” it says.

Source: Scientific American.


Coffee May Help Protect against Skin Cancer.


At least three cups of coffee a day appear to protect against basal cell carcinoma, the most common form of skin cancer, but more studies are necessary to confirm the association

Protection against skin cancer can be added to the list of health benefits that come with drinking coffee, a new study says.

Women who drank more than three cups of coffee daily were 21 percent less likely to develop basal cell carcinoma (BCC), compared with women who drank less than one cup of caffeinated coffee per month, the study showed. For men, this risk reduction was 10 percent.

“Most likely, the protective effect is due to caffeine,” said lead author Jiali Han, an associate professor at Harvard Medical School and Harvard School of Public Health in Boston. People in the study who drank decaffeinated coffee did not appear to have a lower risk of developing the skin cancer.

Additionally, the researchers found that the more caffeinated coffee that people in the study drank, the lower their risk of developing BCC, the most common type of skin cancer.

But the findings don’t mean that your cup of joe can substitute for daily sunscreen.

“I would hope that people would not decide to spend a lot more time in the sun because they are drinking coffee,” said Lorelei Mucci, an associate professor of epidemiology at the Harvard School of Public Health, who was not involved in the study. “There is a lot more about the prevention of BCC that we need to understand,” Mucci said.

Caffeine and skin cancer

BCC accounts for about 80 percent of all skin cancer cases, according to the American Cancer Society. An estimated 2.8 million cases are diagnosed each year in the U.S., according to the Skin Cancer Foundation. BCC does not readily spread to other parts of the body, and so it is rarely deadly. Chronic exposure to the sun or ultraviolet radiation in tanning booths is the major environmental factor that causes BCC.

The researchers analyzed data gathered from 113,000 nurses and health professionals during two long-term studies. Study participants completed questionnaires about their diets, and provided information regarding their cancer risk factors, including family history of melanoma, sunburn reactions, complexion and exposure to direct sunlight. They were also monitored for signs of skin cancer.

Over the 20-year study, 22,786 participants developed basal cell carcinoma, while 1,953 developed squamous cell carcinoma and 741 participants developed melanoma.

The researchers found that the reduction in the risk of developing BCC seen in those who drank coffee was similar to the reduction in risk in people who consumed similar amounts of caffeine from other sources, including tea, chocolate and soda. Still, coffee was the major source of caffeine among the study population, accounting for 78.5 percent of all caffeine intake.

No link was found between caffeinated coffee intake and melanoma, the most deadly form of skin cancer, or squamous cell carcinoma (SCC). However, because the numbers of study participants diagnosed with melanoma or SCC were small, it is unclear whether caffeine truly has no effect on these skin cancers, or whether more time would be needed to see an effect, Han said.

“In another 10 years or more, it may be clearer whether caffeine also helps stave off these other types of skin cancer,” Han said.

The study is not conclusive — it showed an association, not a direct cause-and-effect relationship between caffeinated coffee and skin cancer risk. Although mouse studies have shown that caffeine may prevent the development of SCC due to UV exposure, there is still no direct, convincing data showing coffee prevents skin cancer in people.

Han also emphasized that while it seems likely the benefit of the coffee comes from caffeine, researchers cannot yet know for sure. “There are lots of compounds in the coffee, including antioxidants. The process of decaffeination can wash out other compounds in the coffee, so we cannot 100 percent tease out that caffeine is the only factor responsible for the effect,” Han said.

Who reaps the most cancer-protective benefits from caffeine?

“Not everyone equally benefits from caffeine consumption,” Han said. The researchers would like to investigate which genes may explain why some people gain cancer protection from drinking caffeine, he said.

Coffee has recently been found to lower people’s risk of dying over a given period, and to decrease the risk of prostate, breast and endometrial cancer, said Mucci.

But the mechanisms at play in these conditions may be different, Mucci said. “For prostate cancer and endometrial cancers, the data show the same benefit of lower risk from caffeinated and decaffeinated coffee,” she said.

Coffee influences several body processes — it has antioxidant effects, helps insulin regulation and may lower inflammation, Mucci said. “It may be that different components of coffee are important for different cancers.”

Source: Scientific American.


Implantable Devices Could Detect and Halt Epileptic Seizures.


Epilepsy affects some 2.7 million Americans—more than Parkinson’s disease, multiple sclerosis and amyotrophic lateral sclerosis (Lou Gehrig’s disease) combined. More than half of patients can achieve seizure control with treatment, yet almost a third of people with epilepsy have a refractory form of the disease that does not respond well to existing antiepileptic drugs. Nor are these patients typically helped by the one implanted device—Cyberonics’ Vagus Nerve Stimulator (VNS)—that has had U.S. Food and Drug Administration approval for treatment of epilepsy since 1997.

Because epilepsy causes repeated, sudden seizures, people with the condition would benefit greatly from a therapy that can detect seizures just as they are starting or, eventually, predict them before they begin and prevent them from happening. A new generation of implantable devices is looking to pick up where medications—and even the VNS—often leave off, at least for people whose seizures routinely begin in one part of the brain (the seizure focus). “Closed-loop” devices are designed to monitor the seizure focus, detect patterns of electrical activity that indicate a seizure is beginning, and quickly respond without external intervention. Such responses could include electrical stimulation, cooling or focused drug delivery—all meant to interrupt the activity and stop the seizure.
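
In outline, such a device runs a simple monitor-detect-respond loop. The Python sketch below is a purely illustrative toy, not any manufacturer’s algorithm; the detection rule, threshold, window size and helper names (read_electrode, stimulate) are all invented for the example:

```python
import random

THRESHOLD = 2.0  # hypothetical detection threshold (arbitrary units)
WINDOW = 50      # number of recent samples the detector examines

def read_electrode(t):
    """Stand-in for a real electrode read: background noise, plus a
    synthetic high-amplitude burst standing in for seizure onset."""
    burst = 4.0 if 5_000 <= t < 5_200 else 0.0
    return random.gauss(0.0, 1.0) + burst

def seizure_onset(samples):
    """Toy detector: mean absolute amplitude over the recent window."""
    return sum(abs(s) for s in samples) / len(samples) > THRESHOLD

def stimulate(t):
    """Stand-in for the responsive intervention: an electrical pulse,
    focal cooling or a drug infusion in the devices described above."""
    print(f"intervention delivered at sample {t}")

# The closed loop: monitor -> detect -> respond, with no external trigger.
buffer = []
for t in range(10_000):
    buffer.append(read_electrode(t))
    if len(buffer) > WINDOW:
        buffer.pop(0)
    if len(buffer) == WINDOW and seizure_onset(buffer):
        stimulate(t)
        buffer.clear()  # resume monitoring after the response
```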

Closed-loop devices are considered a new frontier in epilepsy treatment because of their responsiveness. By comparison, the VNS is an open-loop device that stimulates the vagus nerve—a pair of nerves running from the brain stem to the abdomen—to deliver mild electrical pulses (which mitigate the electrical activity of seizures) to the brain on a consistent schedule rather than in response to detected seizure activity. The concept of a closed-loop device for epilepsy comes out of the cardiac world, jumping off from implanted defibrillators that monitor the heart and deliver stimulation in response to an event.

Responsive neurostimulation
So far, only one closed-loop device has reached human trials: NeuroPace’s Responsive Neurostimulation System (RNS), an electrical-stimulation implant with two leads, each containing four electrodes, placed in the brain at the seizure focus. The RNS detects electrical activity that denotes the start of a seizure and delivers direct electrical stimulation to interrupt the activity and normalize the area. The device is surgically positioned in a section of the skull, can be accessed via outpatient surgery when the battery has to be changed, and is imperceptible to the patient and others—all strong design advantages for patients and doctors. The implant, for which NeuroPace is now seeking FDA approval, also records information on electrical activity in the brain throughout the day for later review. The RNS has a laptop-based wand interface for remote patient monitoring.

Results of the RNS trials, which tested the implant in conjunction with medications, have been mixed: seizure frequency was reduced by about half in approximately 50 percent of patients. “For a patient to go through permanent implanting of the device on the skull, and electrodes over the brain, which is what is needed for RNS, you’d want it to eliminate most or all seizures, which isn’t the result in most patients,” says John Miller, director of the University of Washington School of Medicine’s Regional Epilepsy Center at Harborview in Seattle. Possible ways to improve the device’s effectiveness, Miller says, could include refining patient selection, improving electrode placement or honing the RNS’s detection process so that it can pick up seizure activity earlier.

Work in closed-loop electrical stimulation is also happening at Boston’s Center for Integration of Medicine and Innovative Technology, where researchers are effectively attempting to turn the VNS into a closed-loop device by developing a nonimplanted add-on system to detect early seizure activity and automatically fire the VNS in response. The VNS comes with a therapy magnet wristband that allows wearers to stimulate the device if they feel a seizure coming on (a sensation called an aura), but not everyone is physically able to do so once the aura begins. The CIMIT system automates the process, activating the VNS once the start of a seizure is detected through electroencephalogram and electrocardiogram readings.

Cool it
Another key area of closed-loop research is focal cooling. Here, an implant—after detecting the onset of a seizure by sensing a rise in brain temperature at the seizure focus, which may slightly precede the start of abnormal electrical activity—rapidly cools the involved region to halt the event. The warming associated with the seizure focus makes thermal detection and cooling a potentially promising technique. One center of focal cooling research is the University of Kansas Medical Center, where Ivan Osorio, professor of neurology, has collaborated with an international research partnership to design a prototype implant with funding from the U.S. Department of Energy. Work on cooling is also in progress at other sites, including Yale University and the University of Minnesota.

“I think cooling is the most promising approach,” says Miller, who collaborates on cooling research led by a University of Washington colleague. “If a particular cooling temperature can be found that prevents seizures, but does not injure the brain or interfere with normal brain function, it would be possible to maintain the region of brain around the seizure focus at that temperature all the time, so that it would not be necessary to detect the seizures to apply the therapy.”

Targeted drug delivery
The third possible mode of operation for closed-loop devices would use convection-enhanced drug delivery (CED). CED involves feeding seizure-halting medications directly to specific areas of brain tissue through an implanted catheter; the approach is designed to avoid the systemic side effects of giving medications orally and having them suffuse through the bloodstream in order to reach the brain.

Yet CED may ultimately prove more useful on a set infusion schedule, rather than linked to a responsive, automated seizure-detection system. “Our current conception of how CED would be used in epilepsy is that patients would receive periodic infusions of a long-lasting antiseizure agent into the epileptic brain region,” says Michael Rogawski, chair of neurology at the University of California, Davis, whose lab is working with British Columbia–based biopharmaceutical company MedGenesis Therapeutix to develop an implantable CED device for epilepsy. “Seizure control might be maintained for months,” he says. “This approach greatly simplifies the technical challenges in comparison with a device that must sense and deliver a drug on a moment-to-moment basis.”

Deep-brain stimulation
With electrical stimulation, too, some patients will find that an open-loop device that fires consistently works better—like the VNS, or Medtronic’s Deep Brain Stimulation (DBS) implant for epilepsy, which the FDA is now reviewing. Similar to the company’s widely used DBS technology for Parkinson’s disease, the DBS for epilepsy is placed within the brain and consistently stimulates a region called the anterior nucleus of the thalamus, which helps control the electrical excitability of the cortex.

Unlike closed-loop devices, which typically require a distinct seizure focus, the DBS can be used to treat patients whose seizures appear to engulf the entire brain, or large portions of it, at once. “If you look at the population of patients who have these very unlocalizable, diffuse seizure disorders, folks who are having many, many seizures a day and are just devastated—if you can control some of those seizures even in some of those patients, you’ve done a great good for the families and the patients,” says Dennis Spencer, chair of neurosurgery and director of the Epilepsy Surgery Program at Yale University School of Medicine. “We think that the DBS will open up a path for therapy.”

Closing the loop
Closed-loop technologies for epilepsy face several hurdles. Skeptics note that brain surgery poses significant risks, and that the benefits of implanted devices will not always outweigh those dangers. There are also concerns about the possibility of false positives—detection of electrical activity that turns out not to be a seizure. “If the intervention did cause a transient interruption in brain function, it would be undesirable for the patient,” Miller says. “For example, if the area that was being affected mediated language, the person might have a brief interruption in the ability to speak.”

Researchers also acknowledge that in a condition as variable as epilepsy, there will never be a single solution, such as cooling, stimulation or drug delivery alone. “We may need to use more than one modality to fully control epilepsy,” Osorio says. “But all of that hinges on the ability to detect in real time—and to quantify—seizures.”

Although the design of first-generation closed-loop devices is just beginning, theoretical development of the second generation is already underway. Because people with epilepsy never know when and where a seizure will occur, the goal of second-generation closed-loop devices will be finding a way to predict seizures before they begin and intervene to prevent them. “You can detect seizures, but you’re still detecting them too late to really have a major therapeutic possibility,” Spencer says. “Prediction is where we’re really looking to put our eggs—in that basket.”

Source: Scientific American.

Leggy Robot (Almost) Moves Like Jagger.


In popular fiction, robots have no rhythm—look no further than the “robot dance” for evidence of this. Yet rhythm—or the neurophysiological processes that enable humans to produce patterns of recurring movement—is the key to creating bots that move more like people. So says a team of University of Arizona engineers who claim to have built a set of robotic legs that mimic the human gait better than any other artificial life form to date.

Indeed, the video below makes a compelling case. Although the gait is a bit stiff, the robot legs flex and even have some swagger. M. Anthony Lewis, director of Arizona’s Robotics and Neural Systems Laboratory, and Theresa Klein, a Ph.D. student at the lab, are publishing a study on Friday in the Journal of Neural Engineering detailing how they were able to accomplish this.

Like many roboticists, Lewis and Klein looked to nature for inspiration. Humans have a central pattern generator (CPG) in their spinal cord’s lumbar region. The CPG is a neural network producing rhythmic signals that allow the body to generate the step cycle needed for locomotion. The CPG creates and controls these signals based on information it gathers from the legs, which indicate, for example, the slope and solidity of a surface as they walk.

Lewis and Klein’s robot features the simplest form of a CPG—just two neurons that fire signals alternately to produce a rhythm, as well as load sensors that determine force in the limb when each leg presses against a stepping surface. This setup is similar to the mental mechanism that allows human babies to learn to walk—a pair of neurons enables their little legs to work in rhythm with practice.
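
A classic way to model such a rhythm generator is a half-center oscillator in the style of Matsuoka: two model neurons inhibit each other and fatigue over time, so activity passes back and forth between them. The sketch below is illustrative only; it is not the controller published by Lewis and Klein, and the parameter values are assumptions chosen simply to produce a stable alternating rhythm:

```python
# Minimal Matsuoka-style half-center oscillator: two mutually inhibiting
# neurons with self-adaptation (fatigue) alternate their firing, the basic
# mechanism by which a CPG can drive opposing phases of a step cycle.
TAU, TAU_A = 0.25, 0.5          # membrane and adaptation time constants (s)
BETA, W, DRIVE = 2.5, 2.5, 1.0  # fatigue gain, mutual inhibition, tonic input
DT = 0.005                      # Euler integration step (s)

u = [0.1, 0.0]  # membrane states (slight asymmetry starts the rhythm)
v = [0.0, 0.0]  # adaptation (fatigue) states

def step(u, v):
    y = [max(x, 0.0) for x in u]  # rectified firing rates
    for i in range(2):
        j = 1 - i                 # index of the other, inhibiting neuron
        du = (-u[i] - BETA * v[i] - W * y[j] + DRIVE) / TAU
        dv = (-v[i] + y[i]) / TAU_A
        u[i] += DT * du
        v[i] += DT * dv
    return y

for k in range(4_000):            # 20 seconds of simulated time
    y = step(u, v)
    if k % 400 == 0:
        print(f"t={k * DT:5.2f}s  neuron1={y[0]:.2f}  neuron2={y[1]:.2f}")
```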

Each leg of the university’s robot consists of a hip, knee and ankle moved by nine muscle actuators. Muscle contraction is mimicked by motors that rotate to pull on Kevlar straps. Each strap features a load sensor that models a tendon in a human leg, sensing tension when a muscle is contracted and sending signals to the brain about how much force is being exerted and where.

Of course one of the primary goals of this research is to create more human-like movement in robots. But the researchers also hope their work helps better explain how humans walk and how spinal-cord-injury patients can recover the ability to walk if properly stimulated in the months following their injury.

Source: Scientific American.

Mixed Signals: Smart Phone Sensors Recruited to Deliver Indoor GPS.


Duke University researchers are developing a mobile app that uses wi-fi antennas, cellular radios and other detectors to guide smart phone users

Global positioning system (GPS) devices may not always provide spot-on directions, but they do provide drivers, cyclists and hikers with convenient access to digital map data of every square meter of the planet shadowed by satellites. Step indoors and you will find that same GPS receiver becomes an expensive paperweight.

Indoor GPS has been in the works for at least a decade, but the plethora of interfering signals from wi-fi, ultrasound, cellular and other devices makes it difficult for GPS units to come up with an accurate reading. Whereas a GPS discrepancy of a few meters while navigating city streets makes little difference to a driver, that same margin of error inside a big-box electronics store or hospital is likely to send users down the wrong aisle or hallway.

Instead of focusing on a single type of signal to map indoor areas, researchers at Duke University and Egypt-Japan University of Science and Technology are developing software to help smartphone users find their way by gathering information from a number of different signal types. Their UnLoc system—short for unsupervised indoor localization—gathers signal data using wi-fi antennas, cellular radios, compasses, gyroscopes and accelerometers.

UnLoc tags each of these signals as a virtual landmark. One example would be an elevator, whose distinct pattern of movement can be detected by a smartphone’s accelerometer, according to the researchers, who presented their UnLoc system on June 27 at the 10th International Conference on Mobile Systems, Applications, and Services (MobiSys 2012) in Low Wood Bay, England. Similarly, a particular corridor might be covered by a unique set of wi-fi access points that the smartphone can read. UnLoc envisions these kinds of signatures as internal landmarks within a building, according to the researchers, who are led by Romit Roy Choudhury, an associate professor of computer engineering at Duke. Landmark information could then be stored in individual phones or shared as part of a larger database that maps indoor environments in more detail.
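
In spirit, the approach pairs dead reckoning from the inertial sensors with a position reset whenever a recognized signal landmark is encountered, which is what keeps the error bounded. The Python sketch below is a guess at that general recipe, not the UnLoc implementation; the landmark signatures, coordinates and step length are all invented:

```python
import math

# Hypothetical landmark database: signal signature -> known (x, y) in meters.
LANDMARKS = {
    "elevator_accel_pattern": (12.0, 30.0),
    "corridor_wifi_fingerprint": (40.0, 8.0),
}

STEP_LENGTH = 0.7  # assumed average stride, in meters

class DeadReckoner:
    """Integrate steps and headings; snap to a landmark when one is seen.

    Dead-reckoning error grows with every step, so recognizing a landmark
    and resetting to its known position is what bounds the average error."""

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

    def on_step(self, heading_rad):
        # One accelerometer-detected step, heading from compass/gyroscope.
        self.x += STEP_LENGTH * math.cos(heading_rad)
        self.y += STEP_LENGTH * math.sin(heading_rad)

    def on_landmark(self, signature):
        # A recognized signature pins the walker to a surveyed location.
        if signature in LANDMARKS:
            self.x, self.y = LANDMARKS[signature]

tracker = DeadReckoner()
for _ in range(20):                            # walk east for 20 steps
    tracker.on_step(heading_rad=0.0)
tracker.on_landmark("elevator_accel_pattern")  # drift erased at the elevator
print(f"estimated position: ({tracker.x:.1f}, {tracker.y:.1f}) m")
```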

The researchers have demonstrated how UnLoc works in a YouTube video. In the video, Alex Mariakakis, a Duke undergraduate student who participated in the research, navigates an on-campus building using a Samsung Nexus S Android phone. As Mariakakis walks the building’s corridors, UnLoc superimposes a dot marking his progress on a map of the building’s floor plan, which is stored in the phone. The researchers say that UnLoc, which can track a user’s location even without a preinstalled floor plan, typically located between 10 and 20 landmarks per floor in the buildings where it was tested and was accurate to within about 1.7 meters.

As mobile devices have become more powerful, a number of academics and technology companies are developing indoor localization capabilities. For instance, Google, Microsoft and Stanford University are participating in the WiFiSLAM project, which focuses specifically on using wi-fi signals—as opposed to the variety of signals UnLoc purports to detect—to pinpoint a smartphone user’s location to an accuracy of within 2.5 meters.

Indoor localization is expected to be particularly useful in health care settings. Norway-based Sonitor Technologies sells an ultrasound-based indoor positioning system (IPS). Patients in a health care facility wear wristbands that emit ultrasonic waves, and microphones placed throughout the facility pick up the high-frequency sound. Since walls and doors effectively confine the signals to a room, Sonitor’s IPS avoids signal confusion with any radio-frequency identification tags or wi-fi hot spots used in the same building.

Source: Scientific American.

Weight-Loss Drug Wins U.S. Approval.


The obesity treatment shows promise for patients with diabetes despite concerns that it could cause heart complications.

After 13 suspenseful years, the US Food and Drug Administration (FDA) has approved a pill that could help to fight the US obesity epidemic.

Belviq (lorcaserin) is no wonder drug, but it can help people to lose about 3–4% of their body weight when combined with a healthy diet and exercise. The drug has been approved for use by obese people with a body mass index (BMI) greater than 30, and for a subset of overweight people (with a BMI of more than 27) who have health conditions such as high blood pressure, elevated cholesterol and type 2 diabetes.
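
For reference, BMI is weight divided by the square of height (the standard definition, not specific to this approval). For example, a 90-kilogram adult who is 1.75 meters tall falls just under the obesity cutoff:

$$\mathrm{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2} = \frac{90}{1.75^2} \approx 29.4$$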

“It’s a start in the right direction,” says Abraham Thomas, head of endocrinology at the Henry Ford Hospital in Detroit, Michigan, who chairs the FDA’s Endocrinologic and Metabolic Drugs Advisory Committee. “We don’t have the tools to really treat obesity.”

Developed by Arena Pharmaceuticals of San Diego, California, Belviq faced a high bar for safety. In 1997, the weight-loss drug fenfluramine was pulled from the market for causing heart-valve problems. In the past two years, the FDA has rejected a total of three obesity drugs because of concerns over safety or lack of efficacy. The FDA advisory committee recommended in March that all obesity drugs should go through tests for cardiovascular risks, which would extend already lengthy clinical trials.

The FDA had already rejected Arena’s first application for approval of Belviq in September 2010 because the compound seemed to produce tumours in rats and because the company could not statistically rule out an increase in the risk of heart-valve problems. Similar to fenfluramine, Belviq suppresses food cravings by mimicking the effects of serotonin in the brain, making people eat less and feel full. However, Belviq seems to activate only the serotonin 2C receptor in the brain, not the serotonin 2B receptor that is present in heart muscle.

The FDA’s turnaround this week came after Arena performed echocardiograms in nearly 8,000 people to measure heart-valve function, which revealed that there was no increase in heart-valve abnormalities among those taking the drug. The firm has agreed to run six post-marketing studies, including a long-term cardiovascular trial, and patients with congestive heart failure are advised not to take the drug.

“I felt the benefits outweighed the risk,” says Ida Johnson Spruill, the consumer representative on the FDA advisory committee and a diabetes specialist at the Medical University of South Carolina in Charleston. One-third of adults in the United States are obese, so regulators must balance the risks of a new weight-loss drug with the health consequences of obesity, including rising diabetes rates.

Weighing the benefits
Compared with placebo, Belviq’s efficacy is about the same as that of orlistat, which was first approved in 1999 and blocks the uptake of fat calories. A 90-kilogram patient on Belviq loses, on average, an extra 3 kilograms (6–7 pounds) or so after a year. “The good minimum weight loss would be in the 10–15-pound range,” notes endocrinologist Peter Savage of the National Heart, Lung, and Blood Institute in Bethesda, Maryland. “That doesn’t mean that people who lose 5–8 pounds don’t do well.”

It would also be a mistake to reject a drug that works well for a subset of the patient population, says Thomas. About 20% of people on the drug lost 10% or more of their body weight. The FDA recommends that patients who have not lost 5% of body weight by week 12 stop taking the drug.

Belviq has also shown promise for people with type 2 diabetes, who were twice as likely to keep their blood sugar under control as those on placebo.

“No medication works by itself,” says Patrick O’Neil, a clinical psychologist at the Medical University of South Carolina and lead author of the diabetes study. “It’s not a replacement for diet, exercise, and lifestyle modification, but it can augment such programmes.”

Source: Scientific American /Nature.

Bladder dysfunction in hereditary spastic paraplegia: a clinical and urodynamic evaluation.


Hereditary spastic paraplegia (HSP) is a degenerative central nervous system disorder characterized by progressive spasticity and hyperreflexia of the lower limbs. Often, patients with HSP experience symptoms of voiding dysfunction. Urodynamic evaluations of these patients are rarely reported in the literature and the etiology of voiding dysfunction remains unclear. The present study characterizes lower urinary tract dysfunction in a large series of patients.

Methods:

 

The medical records of 29 HSP patients who underwent urodynamic evaluation were retrospectively analyzed. The history of lower urinary tract symptoms was noted and the urodynamic findings analyzed.

Results:

 

Urgency was the dominant complaint (72.4%), followed by frequency (65.5%), urinary incontinence (55.2%) and hesitancy (51.7%). The urodynamic findings showed signs of central neurogenic bladder in 24 patients (82.7%), with detrusor overactivity (DO) in 15 patients (51.7%) and detrusor-sphincter dyssynergia (DSD) in 19 (65.5%). A post-void residual (PVR) of >10% of the voided volume was found in 12 patients (41.4%). There were significant relationships between detrusor overactivity and PVR (P=0.005), frequency (P=0.046) and nocturia (P=0.045). Ultrasound examination revealed no upper urinary tract complications.

Conclusion:

 

Despite the presence of DO and DSD, HSP patients do not seem to have a high risk of developing ultrasonographically assessed upper urinary tract complications after a mean follow-up of 22 years, in contrast to the spinal cord injury population. These results may guide practitioners in their decision-making about the appropriate evaluation and treatment of the bladder disturbances that accompany hereditary spastic paraplegia.

Source: Spinal cord research.