Most Cancer Is Beyond Your Control, Breakthrough Study Finds.


There’s a lot we can do to protect ourselves from certain cancers — don’t smoke, avoid prolonged exposure to the sun, and try not to breathe or ingest too many chemical pollutants in the air or our food. But scientists have always known that this was only part of the cancer story. There’s also heredity, but that only explains about 5% to 10% of cancer. The truth of the matter is that some tumors emerge simply at random. But how much of malignancy can be attributed to this unfortunate roll of the dice? What really causes cancer?

Indeed, when they charted the stem cell data for 31 types of tissue, they found a dramatic connection between the two variables — the more stem cell divisions a tissue accumulated, the higher its average incidence of cancer over a person’s lifetime. “Think of cancer as the risk of having an accident if you are driving a car,” says Tomasetti, a biostatistician who holds positions in the department of oncology at Johns Hopkins Kimmel Cancer Center and the Johns Hopkins Bloomberg School of Public Health. “If you drive the car on a cross country trip, your risk of an accident is much higher than if you take a local trip to the grocery store. The risk correlates to the length of the trip. The trip to the grocery store might be thought of as bone cancer, which has few stem cell divisions. While the cross country trip might be more like colon cancer, which has many more cell divisions.”

In fact, the correlation held strong among cancers that were both common and more rare. The more likely those cells would divide and develop DNA errors or mutations in the process that led to uncontrolled growth, the more likely that tissue would develop tumors.

“It was quite surprising to us. We think it’s pretty big,” he says. “About 65% of cancer incidence across tissue types appears to be explained by the number of stem cell divisions.”

Having a detailed understanding of both how large a tissue’s stem cell population is and how active it is could be a determining factor in whether that tissue is likely to develop cancer. Both the brain cells that can give rise to glioblastoma and medulloblastoma, and the colon, contain about the same number of stem cells, Tomasetti estimates — about one hundred million. But the colon stem cells divide about 6,000 times on average during a lifetime, compared to nearly zero for the brain stem cells. That leads to rates of colon cancer that are 22 times higher than rates of the brain tumors.
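A back-of-envelope version of that comparison can be run directly from the article’s estimates. Treating total lifetime divisions as simply stem cells times divisions per stem cell is an illustrative simplification, not the paper’s exact model:

```python
# Back-of-envelope comparison using the article's estimates. Treating
# total lifetime divisions as stem cells * divisions-per-cell is an
# illustrative simplification, not the paper's exact model.
colon_stem_cells = 1e8           # ~one hundred million
brain_stem_cells = 1e8           # roughly the same, per Tomasetti
colon_divisions_per_cell = 6000  # average lifetime divisions, colon
brain_divisions_per_cell = 0     # "nearly zero" for brain stem cells

colon_total = colon_stem_cells * colon_divisions_per_cell
brain_total = brain_stem_cells * brain_divisions_per_cell

print(f"colon: {colon_total:.1e} lifetime divisions")  # 6.0e+11
print(f"brain: {brain_total:.1e} lifetime divisions")
```

On this crude accounting the colon logs hundreds of billions more division events than the brain, which is the intuition behind the large gap in cancer rates.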


Such an explanation could also resolve some of cancer’s mysteries — why people who don’t smoke still get lung cancer in surprising numbers, or why rates of colon cancer are higher than rates of cancer in the small intestine, even though the colon is the shorter of the two. One reason, says Tomasetti, could have to do with the different stem cell activity in these tissues.

This finding potentially changes the landscape of cancer. In recent decades, cancer rates have come down due to aggressive efforts to educate and motivate people to take positive steps toward preventing cancer in the first place, such as quitting smoking and avoiding the sun’s ultraviolet rays. Have those messages been wrong?

Not exactly. Tomasetti says that the study shows that it’s time to redirect that cancer strategy a bit — not abandon it. For example, he and Vogelstein propose looking at cancers in two categories, those that are primarily due to genetic bad luck, and those that are due to that unfortunate roll of the genetic dice plus environmental or hereditary factors. So melanoma, ovarian cancer, many brain cancers, lung cancer among non-smokers, the most common leukemias and bone cancers, for example, are pretty much out of people’s control. They’re the result of the random mutations caused by the stem cells dividing in these tissues — bone, blood, ovaries, brain and skin — that make mistakes that turn malignant. For these cancers, changing your lifestyle or trying other interventions to stop the cancer from occurring in the first place won’t help. But being vigilant about screening, and picking up the first signs of trouble early, can be life saving.

For the other type of cancers, those that are the product of both stem cell mutations and heredity or other exposures, continuing with proven prevention methods, which include screening in cases of inherited disease, as well as quitting smoking and reducing exposure to radiation and carcinogens, is still critical. That’s what has lowered rates of lung cancer among smokers, for example, and colon cancer among those with hereditary disease.

“Everything we know about altering lifestyles to prevent cancer from the environmental point of view we absolutely need to continue doing,” says Tomasetti. “If anything it puts more stress on the need to spend even more money on early detection. It may be the key tool for quite a few cancer types.”

Tomasetti admits that two common cancers are missing from the study — breast cancer and prostate cancer. That’s because knowledge about their stem cell populations, and how often those tissues renew, isn’t quite as solid as it is for tissues such as colon. “We are working on that,” he says. “We hope this type of work highlighting the importance of self renewal will cause others to investigate these stem cell populations in more detail as well.”

In the meantime, he stresses that while we may not be able to prevent the tumors from forming, it’s still possible to treat them and potentially save lives by finding them early and removing them or using chemotherapy or radiation to keep them under control. “My biggest fear is that people will say forget about it, and then do nothing. The opposite is true. We need to do everything we did before, but we want to do it even more than before,” he says.

Why You Should Break Up With Processed Foods Forever



If I had to find just one good reason to eat processed foods, I’d have an impossible time coming up with one. But when it comes to reasons not to eat processed foods … well, I could talk your ear off. In short, I don’t eat processed foods because I care too much about sustaining my health to risk it on anything that might jeopardize it. So what do I eat?

The same things I advise my patients to eat: healing whole foods that deliver energy, vibrance and wellness. When you apply those three simple criteria to everything that goes in your mouth, eating well becomes a pretty simple exercise.

While this approach can be tough at first for those who are trying to turn around a lifetime of poor eating habits, the good news is that in time, with practice, the desire for processed food will fall away and eating well will become second nature.

If you are beginning the journey to better health, but finding it challenging, here are a few thoughts to remember as you work to free your body and mind of processed foods:

1. Processed foods make simple foods complicated (and unhealthy).

When referring to “processed foods,” we’re talking about foods that aren’t in their original, natural state when you buy them. Foods that come with a label listing more than two or three ingredients are generally considered to be processed.

For example, a bag of frozen organic spinach has only one ingredient, spinach — nothing has been added or taken away. A jar of raw almond butter will contain just almonds, so while some processing has taken place, nothing has been added. Then read the label on an average Lean Cuisine. There you’ll find upwards of 50 anything-but-natural ingredients listed! Now that’s what I call processed — taking simple food and pumping it full of stuff nobody ever asked for.
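The rule of thumb above (more than two or three listed ingredients usually means processed) is simple enough to sketch in a few lines; the cutoff and the sample labels here are illustrative only, not an official definition:

```python
def looks_processed(ingredients, max_ingredients=3):
    """Rough heuristic from the rule of thumb above: a label listing
    more than a few ingredients suggests a processed food.
    The cutoff is illustrative, not an official standard."""
    return len(ingredients) > max_ingredients

print(looks_processed(["spinach"]))   # False: one ingredient
print(looks_processed(["almonds"]))   # False: minimally processed
print(looks_processed(["chicken", "water", "modified corn starch",
                       "salt", "BHA", "artificial flavor"]))  # True
```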

Among processing’s many sins, the first one is that it complicates food, taking the streamlined, simple and pretty-close-to-perfect, then processing out the nutrients and processing in a boat-load of questionable ingredients.

2. Processed foods beat up your body.

A bigger, more alarming problem with processed foods is what’s going on inside them. Virtually all processed foods contain man-made ingredients whose long-term effects are either highly questionable, seriously detrimental or even possibly carcinogenic (e.g., azodicarbonamide, butylated hydroxyanisole (BHA), butylated hydroxytoluene (BHT) and aspartame, to name a few).

Chemical additives, artificial colors, artificial flavorings, fillers, high fructose corn syrup, vegetable oils, trans-fats and preservatives abound in processed foods, and the trouble is we don’t fully know the amount of damage they may be inflicting on our bodies.

We do know there’s mounting evidence to suggest a link between processed food consumption and our skyrocketing rates of obesity, diabetes, cancer and heart disease, which if you ask me is reason enough to dump them.

With fresh, organic, whole foods however, there’s no need to worry about the long-term health fallout. Whole foods are just as healthy as nature made them, with all their nutrients and health-sustaining properties intact.

3. Processed foods can make you sick or kill you.

The bigger the transformation and the more steps your food passes through to go from raw material to finished “product,” the fewer nutrients survive. They’re literally pounded, pulverized, liquefied, extruded and processed out.

Producers are less concerned with preserving nutrients than they are with turning a profit. They do so by producing the maximum amount of product at the lowest cost, and manufacturing it to maximize shelf life — none of which happens without taking chemical liberties, tossing in a few more preservatives and sacrificing nutrients along the way.

Problem is, despite industry claims to the contrary, many of the common preservatives and artificial colors in processed foods have been linked to a variety of health problems, including moderate-to-severe allergies, neurologic disorders and even cancer. Not very appetizing, eh?

Real, unprocessed or minimally processed foods, on the other hand, are far less likely to cause damage or make you sick. Better yet, they tell you when they’re no longer fresh. They’ll start to wilt or smell, lose their color, start sprouting or grow mold – all to naturally signal that their nutrients are starting to pass their peak, no “sell-by” stamp required.

4. Processed foods are designed with addiction in mind.

Can you make a cheese doodle? A Dorito? An Oreo? Probably not, as few of us possess the lab skills or chemical ingredients needed to create Franken-foods — and that’s just as well. What’s so diabolical about processed foods is that their lack of nutrients, good fats, fiber or protein, and excesses of salt and sugar, wind up encouraging the release of your body’s feel-good chemicals.

That release triggers the desire for more sweet or salty crappy foods with no nutritional payoff. If this is happening multiple times a day, it’s easy to see how people wind up trying to fill a belly that’s never satisfied, and it’s weight gain, here you come. For example, most people find it virtually impossible to be satisfied by just one sugar-packed, quickly digested, fiber- and nutrient-free Oreo cookie, so they’ll likely eat a bunch before stopping, and even then, only reluctantly.

By contrast, just one whole piece of fruit, like an orange or a serving of blueberries, is usually enough. Why? Because the fruit will deliver a much larger nutritional payload, including fiber, water and slowly-metabolized carbs, without setting off intense cravings.

5. Want to stay chubby? Processed foods can help!

As evolved as we may think we are, when it comes to processed foods, many of us are closer to lab monkeys than we’d like to admit, repeatedly hitting the processed-food pleasure bar, having fallen prey to the addictive flavors which have been carefully baked right in.

The processed food industry helps keep you fat by devoting countless resources to identifying and developing flavors with appeal. They create sweet, salty, never-fully-satisfying foods, full of the bad fats, that can put you into an almost perpetual state of craving. With your satiety switch suppressed, overeating becomes the norm. The food manufacturers win, and you lose everything but the weight.

6. After eating a Big Mac and fries, nobody ever said, “Wow, I feel fantastic!”

Processed foods are talking to you, but are you listening? Do you feel great after eating a fast food meal? Do you feel energetic after a few slices of pizza? Didn’t think so.

The fact that many people wind up feeling lethargic, sleepy and even depressed after eating processed food is the body’s way of saying this isn’t a good way to eat. Listen to your body. It knows! Eating foods that are unprocessed or minimally processed will deliver actual nourishment, i.e. vitamins, minerals, phytonutrients, that will make you feel good and supply the long-lasting energy your body needs to function at its best.

7. When at least 80% of your diet is nutrient-rich, whole foods, you’ll be likely to optimize your health.

For some, cold turkey is the simplest the way to release the addictive grip of processed foods, while others succeed by slowly tapering off. However you choose to go about it though, look for foods as close to their fresh, unfettered, original state as possible, to minimize your ingestion of chemicals, additives and artificial flavors.

If access to fresh produce is limited, supplement with frozen, which is often just as good as fresh. Look for meat and poultry that’s been raised responsibly, humanely, grass-fed or pasture-raised, without antibiotics, hormones or genetically modified feeds. Let go of food in pouches, boxes and cans. When you get to the point where at least 80% or more of your diet is made up of nutrient-rich, whole foods, you’ll tip the scales in your favor and make a significant positive impact on your health.

Nanoscale neighbors: First use of transformation optics to accurately analyze nonlocality in 3D plasmonic systems.


The ubiquitous van der Waals interaction – a consequence of quantum charge fluctuations – includes intermolecular forces such as attraction and repulsion between atoms, molecules and surfaces. The most long-range force acting between particles, it influences a range of phenomena including surface adhesion, friction and colloid stability. Calculating van der Waals forces is typically a simple task when parallel surfaces are more than 10 nanometers apart, but it becomes quite difficult when, for example, a pair of nanospheres is less than five nanometers apart. Moreover, at the latter scale the effect of nonlocality (where a material’s response at a given point depends on the electromagnetic fields in the surrounding region, not just at that point) must be considered, introducing complexity into, and thereby further hampering, analysis.

Recently, however, scientists at Imperial College London proposed a simple analytic solution, showing – for the first time, the researchers say – that nonlocality in 3D plasmonic systems can be accurately analyzed using transformation optics. (Plasmons are quasiparticles arising from the quantization of plasma oscillations at optical frequencies; by mapping a complex geometry onto a simpler, more symmetric one, transformation optics determines how electromagnetic radiation will propagate.) The scientists also suggest that their results deepen the underlying understanding of nonlocal effects in plasmonic nanostructures.

Prof. Sir John Pendry discussed the paper that he, Dr. Yu Luo and Dr. Rongkuo Zhao published in the Proceedings of the National Academy of Sciences. “Nonlocality introduces computational complexity which makes doing the calculations difficult,” Pendry tells Phys.org. “We’ve found a workaround that greatly simplifies the calculations by replacing the nonlocal system with a local system that reproduces the results to a high degree of accuracy.” Specifically, the scientists showed that nonlocality in 3D plasmonic systems can be accurately analyzed using the transformation optics approach – the first time that the technique has been applied to van der Waals forces – which they applied to solve the problem of including nonlocal effects when two nanoscale bodies interact.

“The key to successfully exploiting transformation optics,” Pendry points out, “is to choose the right transformation. In our case we were able to transform the problem of two nearly-touching spheres into the much more symmetric problem of two concentric spheres.” In so doing, the researchers had to address two challenges:

The absorption spectrum for a dimer of spherical particles. Contour plots of the absorption cross section vs. frequency and separation for a pair of gold nanospheres with equal radii of (A) 5 and (B) 30 nm. Comparison of the authors’ analytical calculations with local and nonlocal numerical simulations for two closely separated (δ = 0.2 nm) gold spheres with equal radii of (C) 5 and (D) 30 nm. Credit: Luo Y, Zhao R, Pendry JB (2014) van der Waals interactions at the nanoscale: The effects of nonlocality. Proc Natl Acad Sci USA 111(52):18422–18427.

· the problem involved several length scales, meaning that they had to take into account the spheres themselves (~10 nm) as well as the spacing between them, which they tried to push to the limit of one atomic spacing (~0.2 nm)

· the fact that the forces depend on contributions from many different frequencies over a range of almost 100 eV

Pendry notes that researchers are only now beginning to explore the consequences of nonlocality in nanoscale surface phenomena, and are in the process of building reliable models. “The nanoscale forces in our paper are just one instance of where it’s important to treat nonlocality, where the main complication is that the response of a system at a given point depends not just on the electromagnetic fields at that point, but on the fields in the surrounding region as well – a problem that many traditional approaches fail to address.”

In their paper, the scientists found that nonlocality dramatically weakens the field enhancement between the spheres, and thereby the van der Waals interaction. “van der Waals forces – although long range relative to standard chemical bonds – are only significant when surfaces are quite close to one another,” Pendry explains. “The standard local theory predicts infinite force in the limit that surfaces touch – but of course this is nonsense. Therefore, predictions that make sense and can be compared to experiments need to take nonlocality into account.”
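The divergence Pendry describes can be seen even in the classical local treatment. Here is a minimal sketch using the standard nonretarded Hamaker approximation for two equal, closely spaced spheres; this is not the paper’s nonlocal calculation, and the Hamaker constant for gold is an assumed order-of-magnitude value:

```python
# Local-theory (Hamaker) van der Waals attraction between two equal
# spheres at a gap d << R:  F = A * R / (12 * d**2).
# The 1/d**2 blow-up as d -> 0 is the unphysical "infinite force"
# prediction that the nonlocal treatment corrects.
A = 3e-19   # Hamaker constant for gold, joules (assumed typical value)
R = 5e-9    # sphere radius: 5 nm, matching one case in the paper

def hamaker_force(d):
    """Magnitude of the attractive force (newtons) at gap d (meters)."""
    return A * R / (12 * d ** 2)

for d in (1e-9, 0.5e-9, 0.2e-9):   # gaps approaching contact
    print(f"gap {d * 1e9:.1f} nm -> force {hamaker_force(d):.2e} N")
```

Each halving of the gap quadruples the predicted force, so the local formula has no sensible limit at contact, which is why nonlocal corrections matter at sub-nanometer separations.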

Relatedly, the paper states that chemical bonding – while not an explicit concern in this study – will dominate the final approach just before the surfaces touch at a few tenths of a nanometer, at which point direct contact of the charges will come into play through electron tunneling. “The forces we consider are complementary to chemical bonding,” Pendry clarifies, “in that the current theoretical approach to chemical bonds exploits the local density approximation. In other words, just as a study of pure van der Waals forces omits chemical bonding, so a pure local density study of bonds has nothing to say about the longer range dispersion forces that we calculate. Of course, at some stage the two have to come together…but for that to happen we need experimental input – and theoretical studies of the van der Waals forces are the first steps in making this happen.”

The approach described in the paper makes analytical investigation of 3D nonlocal problems feasible while providing insight into the understanding of nonlocal effects in plasmonic nanostructures. “Calculations are always difficult when treating singular structures – by which we mean situations such as the nearly touching spheres considered in our paper – but also the interaction of needle-sharp points with surfaces,” Pendry explains. “Using transformations to unravel the singularity reveals how the forces work in each of these situations, and in fact often enables us to show a common origin.” For example, regarding how their results might influence the development of functional subnanometer substrates, he adds that “any nanomechanical system must consider the effects of van der Waals forces – and our paper is an attempt to further our understanding of these problems.”

Looking ahead, Pendry tells Phys.org that van der Waals forces are just the first step in a series of investigations the scientists have already planned. “On the near horizon is heat transfer between surfaces that are close but not in physical contact: Electromagnetic fluctuations responsible for the van der Waals force also enable heat to leap across the gap – an effect different from, and much stronger than, radiative cooling.” (Radiative cooling is the process by which a body loses heat by thermal radiation.) “In the longer term, we’ll try to generalize our theory of quantum friction, whereby surfaces which are close but not in physical contact can experience frictional drag. Nonlocality is also an important issue in these effects.”

In closing, Pendry notes that several other areas of research might benefit from their study, given that transformation optics is a very general technique in electromagnetic theory. “The present study is just one in a whole series of applications. We’ve already seen many studies of its application to invisibility, and we have used it extensively to study intense field enhancements in plasmonic structures, such as surface enhanced Raman spectroscopy. In fact, virtually any problem that has electromagnetic radiation interacting with a physical structure could potentially benefit from transformation optics – and in the case of plasmonic systems, nonlocality will always be an important issue whenever surfaces in close proximity are considered.”

Russia building a new high speed train that will travel to Beijing in just 48 hours.


Russia plans to build a new high speed railway, with trains that would speed from Moscow to Beijing in just 48 hours.
At the moment, it takes about seven days to travel between the two cities, and the route requires changes.

According to Romanian website Glasul, the Kremlin has awarded the project to China Railway High-speed (CRH), a subsidiary of the state-controlled China Railway (CR), which is working in a joint-venture with the local firm Uralvagonzavod.
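As a rough sanity check on the 48-hour claim, the implied average speed is easy to compute; the route length used here is an assumption for illustration, since the article gives no distance:

```python
# Implied average speed for the proposed Moscow-Beijing service.
route_km = 7000   # assumed route length in km (not stated in the article)
hours = 48        # the headline journey time

avg_speed = route_km / hours
print(f"required average speed: {avg_speed:.0f} km/h")  # 146 km/h
```

Even on this conservative distance, the service would need to sustain roughly 146 km/h including stops, well within modern high-speed rail capability but far above the current seven-day route’s pace.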

The simple math that explains why you may (or may not) get cancer


Why? That’s the first word on many lips after a cancer diagnosis. “It’s a perfectly reasonable question,” says Bert Vogelstein, a cancer geneticist at Johns Hopkins University in Baltimore, Maryland, who has spent a lifetime trying to answer it. Thanks to his friendship with a recently minted Ph.D. in applied mathematics, the two now propose a framework arguing that most cancer cases are the result of biological bad luck.

As the number of stem cell divisions in a tissue rises, so does the chance of cancer striking that site.

In a paper this week in Science, Vogelstein and Cristian Tomasetti, who joined the biostatistics department at Hopkins in 2013, put forth a mathematical formula to explain the genesis of cancer. Here’s how it works: Take the number of cells in an organ, identify what percentage of them are long-lived stem cells, and determine how many times the stem cells divide. With every division, there’s a risk of a cancer-causing mutation in a daughter cell. Thus, Tomasetti and Vogelstein reasoned, the tissues that host the greatest number of stem cell divisions are those most vulnerable to cancer. When Tomasetti crunched the numbers and compared them with actual cancer statistics, he concluded that this theory explained two-thirds of all cancers.
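The recipe in that paragraph can be written out as a short sketch; the tissue numbers below are invented placeholders, not data from the paper:

```python
def lifetime_stem_cell_divisions(total_cells, stem_fraction,
                                 divisions_per_stem_cell):
    """The three steps described above: count the organ's cells, take
    the long-lived stem cell fraction, multiply by how many times each
    stem cell divides over a lifetime."""
    stem_cells = total_cells * stem_fraction
    return stem_cells * divisions_per_stem_cell

# Two hypothetical tissues (placeholder numbers, not the paper's data):
tissue_a = lifetime_stem_cell_divisions(1e10, 0.01, 5000)  # 5e11 divisions
tissue_b = lifetime_stem_cell_divisions(1e10, 0.01, 50)    # 5e9 divisions

# More lifetime stem cell divisions -> more chances for a cancer-causing
# mutation in a daughter cell, so tissue A is the more vulnerable site.
print(tissue_a > tissue_b)  # True
```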

“Using the mathematics of evolution, you can really develop an engineerlike understanding of the disease,” says Martin Nowak, who studies mathematics and biology at Harvard University and has worked with Tomasetti and Vogelstein. “It’s a baseline risk of being an animal that has cells that need to divide.”

The idea emerged during one of the pair’s weekly brainstorming sessions in Vogelstein’s office. They returned to an age-old question: How much of cancer is driven by environmental factors, and how much by genetics? To solve that, Tomasetti reasoned, “I first need to understand how much is by chance and take that out of the picture.”

By “chance” Tomasetti meant the roll of the dice that each cell division represents, leaving aside the influence of deleterious genes or environmental factors such as smoking or exposure to radiation. He was most interested in stem cells because they endure—meaning that a mutation in a stem cell is more likely to cause problems than a mutation in a cell that dies more quickly.

Tomasetti searched the literature to find the numbers he needed, such as the size of the stem cell “compartment” in each tissue. Plotting the total number of stem cell divisions over a lifetime against the lifetime risk of cancer in 31 different organs revealed a correlation. As the number of divisions rose, so did risk.

Colon cancer, for example, is far more common than cancer of the duodenum, the first stretch of the small intestine. This is true even in those who carry a mutated gene that puts their entire intestine at risk. Tomasetti found that there are about 10^12 stem cell divisions in the colon over a lifetime, compared with 10^10 in the duodenum. Mice, by contrast, have more stem cell divisions in their small intestine—and more cancers—than in their colon.

The line between mutations and cancer isn’t necessarily direct. “It may not just be whether a mutation occurs,” says Bruce Ponder, a longtime cancer researcher at the University of Cambridge in the United Kingdom. “There may be other factors in the tissue that determine whether the mutation is retained” and whether it triggers a malignancy.

That said, the theory remains “an extremely attractive idea,” says Hans Clevers, a stem cell and cancer biologist at the Hubrecht Institute in Utrecht, the Netherlands. Still, he points out, the result “hinges entirely on how good the input data are.”

Tomasetti was aware that some of the published data may not be correct. In 10,000 runs of his model, he skewed where various points on the graph were plotted. Always, “the result was still significant,” he says, suggesting the big picture holds even if some of the data points do not. In mathematical jargon, the graph showed a correlation of 0.81. (A correlation of 1 means that by knowing the variable on the x-axis—in this case, the lifetime number of stem cell divisions—one can predict the y-axis value 100% of the time.) Squaring that 0.81 gives 0.65—an indicator of how much of the variation in cancer risk in a tissue is explained by variation in stem cell divisions (see graph above).
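The statistics in that paragraph are easy to reproduce: a standard Pearson correlation, then the squaring step that turns the reported r of 0.81 into the roughly-two-thirds figure. The sanity-check dataset at the end is invented:

```python
import math

def pearson_r(xs, ys):
    """Standard Pearson correlation coefficient for paired data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# The article's headline arithmetic: r = 0.81 on the divisions-vs-risk
# plot, and r**2 = 0.6561, the "about 65%" of variation explained.
r = 0.81
r_squared = r ** 2  # 0.6561

# Perfectly linear paired data gives r = 1 (sanity check, invented data):
print(round(pearson_r([1, 2, 3], [2, 4, 6]), 6))  # 1.0
```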

For Vogelstein, one major message is that cancer often cannot be prevented, and more resources should be funneled into catching it in its infancy. “These cancers are going to keep on coming,” he says.

Douglas Lowy, a deputy director of the National Cancer Institute in Bethesda, Maryland, agrees, but also stresses that a great deal of “cancer is preventable” and efforts to avert the disease must continue.

Although the randomness of cancer might be frightening, those in the field see a positive side, too. The new framework stresses that “the average cancer patient … is just unlucky,” Clevers says. “It helps cancer patients to know” that the disease is not their fault.

How Space Station Tech Is Helping the Fight Against Cancer


One of the tools used in the fight against cancer is, quite literally, out of this world.

The neuroArm

Research performed on the International Space Station and its predecessors, along with technology developed initially for work in space, play important roles in understanding the disease and improving treatments.

When the Soviet Union launched Salyut 1, the first space station, into orbit in the 1970s, humans began spending more and more time in extremely low-gravity environments. On the International Space Station today, the effective gravity ranges from one-thousandth to one-millionth of the force experienced on Earth. These weightless environments are also known as “microgravity” environments, offering an invaluable platform for cancer research in space.
In a recent article published in the journal Nature Reviews Cancer, cell biologist Jeanne Becker, of Nano3D biosciences in Houston, explored how microgravity environments in space stations of the past and present allow biologists to study the cells in three-dimensional growth environments similar to those experienced in the human body.

IGAR Performing a Biopsy

Getting rid of gravity

On Earth, gravity flattens the cells in a lab, but in space they retain their rounded shapes. At the same time, in microgravity, the cells arrange themselves into three-dimensional groupings, or aggregates, that bear a strong resemblance to what happens inside the human body. Becker was the principal investigator for a space station experiment that focused on ovarian cancer cells, according to a NASA statement.

Since 2003, the Japan Aerospace Exploration Agency (JAXA), has studied the high-quality crystals formed by protein molecules in space, where microgravity no longer causes flows based on density differences and the sinking of heavier particles. The resulting orderly formation of protein crystals may hold the key to treating diseases. One newfound protein, H-PGDS, plays a useful role in the treatment of muscular dystrophy.

Another study, Cellbox-Thyroid, examines cancer at a cellular level. Building on findings from a previous investigation, Cellbox-Thyroid studies the spherical structure of cancer cells in microgravity and how they spread, potentially providing an improved understanding on what drives the cells.

Not all space research requires a station.

One team of scientists, led by Daniela Grimm, a researcher with the Laboratory of Space Medicine and Space Pharmacology at Aarhus University in Denmark, studied the Science in Microgravity Box (SIMBOX) on the Shenzhou 8 spacecraft, an unmanned Chinese spacecraft that docked with that country’s Tiangong 1 space module in 2011. The team determined that some tumors seem to become less aggressive in microgravity than they are on Earth. Grimm and her colleagues continue to search for as many genes and proteins as possible that are affected by microgravity. [10 Do’s and Don’ts to Reduce Your Risk of Cancer]

An artist’s rendering of IGAR performing a biopsy. The robotic instrument is based on technology aboard the International Space Station.

Radiation in space

Doctors have plenty of experience fighting cancer on the ground. But astronauts in space are constantly bombarded by cosmic rays, a different form of radiation than is experienced on Earth, where gamma and X-ray radiation prevail. The different types of radiation can produce different changes in human DNA, the genetic material present in nearly every cell in the body.

“In space, living organisms are constantly exposed to cosmic rays resulting in damages in DNA,” Honglu Wu, of NASA’s Johnson Space Center in Houston, told Space.com by email.

“Whether the repair of these damages in space is different from that on the ground will impact the accuracy of assessment of health risks in the astronauts and the mutation rate of microorganisms in space.”

Wu serves as principal investigator for a new study, MicroRNA Expression Profiles in Cultured Human Fibroblast in Space (Micro-7), which examines the effects of microgravity on DNA damage and repair. The experiment induces DNA damage using a chemotherapy drug and observes how microgravity affects the mutations not only in DNA but also in the microRNAs that regulate gene expression. Changes in microRNA in microgravity can affect how the cell responds to DNA damage in space.

Although the experiment focuses on cancer caused by long-duration radiation exposure in space, it may also have ramifications for treating mutations on Earth.

“One of the challenges in radiotherapy of cancer is resistance of certain tumor types to radiation,” Wu said. [Top 10 Cancer-Fighting Foods]

“If our study can identify microRNAs that are associated with the repair of DNA damages, manipulating these microRNAs in tumor cells will hopefully increase the sensitivity to radiation treatment.”

Embryo Rad is a radiation experiment that will search for the transgenerational effects of radiation exposure in rodents. Frozen mouse embryos will fly in the radiation environment of space. On their return to Earth, they will be implanted in surrogate mothers. Scientists will observe possible changes in life spans, as well as cancer development or gene mutations, in order to better understand secondary cancer risks involved in ground-based radiation therapy for humans.

Paige Nickason
Paige Nickason, the first patient to have brain surgery performed by the robotic neuroArm, points to the location where she had a tumor removed.

Space arm operates on Earth

A robotic arm aboard the space station has inspired a medical tool to combat cancer. The Canadian Space Agency’s Canadarm, Canadarm2 and Special Purpose Dexterous Manipulator (Dextre) helped build and maintain the space station, providing heavy lifting and berthing capabilities. But for neurosurgeon Garnette Sutherland, they also provided the seed for a robot that could operate inside a magnetic resonance imaging (MRI) machine.

Sutherland contacted Macdonald Dettwiler and Associates (MDA), the company that built the space station arms, about the possibility of creating a new medical tool. The company’s space engineers worked in collaboration with the University of Calgary to create neuroArm.

By operating inside an MRI machine, neuroArm can perform microsurgery guided by detailed brain images. It is also capable of performing biopsies. A doctor controls the tool from outside, while the arm itself filters out the tremors and unintentional movements of the surgeon’s hand, allowing for greater precision.

“In building neuroArm, engineers from MDA were challenged to recreate the sight, sound, and touch of surgery at a remote workstation,” Sutherland told Space.com by email.

“We sometimes say, when using neuroArm, its telecapability allows the merging of the precision and accuracy of machine technology with the executive capacity of the human brain.”

Because Sutherland is a neurosurgeon, the tool has primarily been used to operate on brain tumors, and has been utilized in approximately 60 cases.

The Canadian Space Agency arms also inspired another robotic tool, this one to combat breast cancer. The Image-Guided Autonomous Robot (IGAR) likewise works in combination with an MRI scanner, taking biopsies with an accuracy of about 0.04 inches (1 millimeter). That precision improves sampling, reduces pain and decreases time spent in the MRI suite, thus reducing costs.

Airway Fistula Closure after Stem-Cell Infusion


Large-airway defects and tracheobronchial dehiscence after lung resection present a problem for clinicians because there are few effective methods of treatment.1 Bronchopleural fistula is a pathologic connection between the airway and the pleural space that may develop after lung resection. For many patients with empyema, the presence or absence of a fistula makes the difference between recovery, chronic illness, and death.2,3

In our previous preclinical experiments, we found that bronchoscopic transplantation of mesenchymal stem cells derived from bone marrow could close a bronchopleural fistula with the extraluminal proliferation of fibroblasts and the development of collagenous matrix.4 Encouraged by this result and by functional human organ replacement elsewhere,5 we transplanted autologous bone marrow–derived mesenchymal stem cells bronchoscopically to treat a 42-year-old male firefighter in whom bronchopleural fistula had developed after right extrapleural pneumonectomy for early-stage malignant mesothelioma. The presence of the bronchopleural fistula was confirmed on flexible bronchoscopy and chest computed tomography.

Repair of an Airway Fistula after Stem-Cell Infusion.

The patient underwent bone marrow aspiration followed by mesenchymal stem-cell isolation and expansion; bronchoscopy was performed, and 10 million autologous bone marrow–derived mesenchymal stem cells were injected into the pars membranacea of the right main bronchial stump as close as possible to the orifice of the fistula.

At 60 days, bronchoscopy showed complete healing of the resection line, and the orifice that was observed before stem-cell implantation was no longer visible (Figure 1C). An analysis of biopsy samples showed a hyperplastic respiratory epithelium lying on a fibrotic lamina propria, and bands of smooth-muscle fibers were reduced and replaced by fibroblasts. Immunocytochemical staining for p40, the DNp63 isoform that is considered to be highly specific for differentiation of squamous and basal cells, showed a well-defined layer of basal cells and basal-cell hyperplasia consistent with repair. Computed tomography showed interruption of the fistula at its orifice from the right bronchus, where the cells were injected (Figure 1D).

The bronchoscopic transplantation of bone marrow–derived mesenchymal stem cells in our patient appeared to help close this small-caliber post-resectional bronchopleural fistula. Further work is required to determine whether this approach can be replicated.

New Type of More Problematic Mosquito-Borne Illness Detected in Brazil


A second form of the painful chikungunya virus has appeared in Brazil—one that could more easily spread, including to the U.S.

Aedes albopictus mosquito

When a mosquito-borne disease first arrived in the Western Hemisphere last year, humans were relatively lucky. The disease, which causes crippling joint pain persisting for weeks or even months and for which there is no known therapy or vaccine, hopscotched from the Caribbean islands to eventually land in the U.S. and the rest of the Americas. But the type of chikungunya creeping across the region then was one that could only readily spread via Aedes aegypti, a mosquito that is uncommon in the U.S.

That ecological happenstance provided some modicum of protection. Chikungunya spread by bites from Aedes aegypti was first detected in Saint Martin last year and in the U.S. this summer. The smaller range of that type of mosquito, however, has helped ensure the disease has not spread widely in the U.S. Right now, chikungunya is primarily limited to Florida and the territories of Puerto Rico and the U.S. Virgin Islands. Only 11 cases in Florida have been confirmed as locally transmitted in the U.S. (although another 1,545 were brought in by travelers from other locations).

Americans were particularly fortunate that the other predominant strain of chikungunya—one that derives from Africa and has fueled significant outbreaks in Asian countries for the past decade—was not seen in this hemisphere. The African strain has been accumulating mutations that allow it to be spread more easily by Aedes albopictus. That bug is common in the eastern U.S. and can survive colder temperatures. It also lays its eggs in a wider variety of settings, making it more difficult to exterminate. Chikungunya (pronounced chik-un-GUHN-ya) is named for the joint pain it causes, literally meaning “that which bends up” in the Makonde language of southeastern Africa.

Credit: CDC

Yet new findings from Brazil suggest that risk to the Americas could be on the rise. Pedro Vasconcelos, director of the Evandro Chagas Institute, Brazil’s confirmatory laboratory, warns that in one of the country’s 26 states it has detected the more problematic African strain of chikungunya. That form of chikungunya is the second to arrive in Brazil, joining the Asian-derived strain carried by A. aegypti that is already circulating throughout the Western Hemisphere, he told Scientific American.

The majority of Brazil’s cases, Vasconcelos says, are in Bahia state along the eastern coast, the same place where the African strain is appearing, so officials think that form of chikungunya is the most common in Brazil. The country currently has more than 200 confirmed cases. Fortunately, the African strain seen in Brazil does not appear to have developed several mutations detected in Southeast Asia. Such genetic adaptations, if present, could make the virus as much as 100 times more infectious to mosquitoes, says Stephen Higgs, a chikungunya expert at Kansas State University. Such single-point mutations could still develop, however, and it is hard to predict how likely that will be, Vasconcelos says. The mutations effectively lower the threshold for what it takes for a mosquito to become infected with chikungunya, replicate the virus in its body and pass it on to humans with its bite.

Brazil’s summer starts next month, a season of copious rain that will create more ideal breeding grounds for the mosquitoes, which can then go on to bite humans and spread chikungunya. The appearance of the African genotype of chikungunya “is just going to make a bad situation worse,” says Scott Weaver, an expert in human infections and immunity at The University of Texas Medical Branch at Galveston.

Having two genotypes of chikungunya in Brazil will not necessarily increase the risk of spreading more chikungunya in the U.S., says Higgs. But global travel and trade could bring the strain now in Brazil to the U.S. The cooling season here will mitigate the situation, says J. Erin Staples, a medical epidemiologist with the Centers for Disease Control and Prevention. “We’re getting into our winter period in the U.S., which will protect most people in the continental United States, but travelers to Brazil or other areas with chikungunya should take preventive steps,” she says, referring to wearing long sleeves or pants and using potent insect repellant.

There are two top vaccine candidates for chikungunya right now, but neither has completed the rigorous testing required before they would be available to patients. One has not yet been tested in humans, and the other has not made it through all the mandatory stages of testing to ensure it is effective at preventing the disease. Exactly which organization or nation might fund the mass production of these vaccines, assuming they prove effective, also remains an open question. “There are so many things we don’t know about this pathogen,” Higgs said November 4 at the annual meeting of the American Society of Tropical Medicine and Hygiene, “especially when it comes into new territories.”

7 things Back to the Future predicted for 2015


In the cult film Back to the Future Part II, Doc Brown and Marty McFly land in 2015, a futuristic world of flying cars and hoverboards. As the New Year dawns, which of their predictions were hits – and misses?


In 1989, a year of double denim, flat tops and the Beastie Boys, 2015 seemed like a very long time away indeed. So far away that it was the setting for Back to the Future II, the second in the cult film trilogy directed by Robert Zemeckis.

Fans watched Doc Brown (Christopher Lloyd), Marty McFly (Michael J Fox) and his girlfriend Jennifer Parker (Elisabeth Shue) take off for the future at the end of the original film – and, in the sequel, they touch down 30 years later in 2015.

From food to fashion, technology and transport, Marty is bowled over by what he finds there.

So, as the New Year dawns, and we find ourselves in the very year Zemeckis envisioned – just how much did Back to the Future get right?

And what was too far-fetched even for 2015?

ADHD is Treatable


One of the world’s leading pediatric neuroscientists, Dr. Bruce D. Perry, M.D., Ph.D., recently stated publicly that Attention Deficit/Hyperactivity Disorder (ADHD) is not ‘a real disease,’ and warned of the dangers of giving psychostimulant medications to children. Speaking to the Observer, Dr. Perry said that ADHD is best understood as a description of a wide range of symptoms that many children and adults exhibit, most of which every one of us displays at some point during our lives.

“It is best thought of as a description. If you look at how you end up with that label, it is remarkable because any one of us at any given time would fit at least a couple of those criteria,” he said.

Dr. Perry is a senior fellow of the ChildTrauma Academy in Houston, Texas, a highly respected member of the pediatric community, and the author of several books on child psychology, including The Boy Who Was Raised as a Dog: And Other Stories from a Child Psychiatrist’s Notebook–What Traumatized Children Can Teach Us About Loss, Love, and Healing, and Born for Love: Why Empathy Is Essential–and Endangered.

His comments come at a time when ADHD diagnoses in the UK and the US are sky-rocketing and prescriptions of stimulant medications to children are rising rapidly, with many parents and concerned activists growing suspicious of the pharmaceutical industry’s motivations in promoting drugs for children. Ritalin, Adderall, Vyvanse and other mind-altering stimulant medications are increasingly prescribed to children between the ages of 4 and 17. Dr. Perry warned that such medications may harm the overall physical and mental development of the child, citing studies in which the drugs proved detrimental to the health of young animals.

“If you give psychostimulants to animals when they are young, their reward systems change. They require much more stimulation to get the same level of pleasure. So on a very concrete level they need to eat more food to get the same sensation of satiation. They need to do more high-risk things to get that little buzz from doing something. It is not a benign phenomenon.

“Taking a medication influences systems in ways we don’t always understand. I tend to be pretty cautious about this stuff, particularly when the research shows you that other interventions are equally effective and over time more effective and have none of the adverse effects. For me it’s a no-brainer.”

Given that ADHD is complex and the term is more of a blanket label for a wide range of behavioral symptoms, it is important to consider the root causes of those symptoms before pharmaceutical intervention is considered. Citing potential remedies, Dr. Perry suggested an approach that focuses attention on the parents and the child’s environment, while also recommending natural interventions such as yoga and an improved diet.

“There are a number of non-pharmacological therapies which have been pretty effective. A lot of them involve helping the adults that are around children,” he said. “Part of what happens is if you have an anxious, overwhelmed parent, that is contagious. When a child is struggling, the adults around them are easily dysregulated too. This negative feedback process between the frustrated teacher or parent and dysregulated child can escalate out of control.

“You can teach the adults how to regulate themselves, how to have realistic expectations of the children, how to give them opportunities that are achievable and have success, and coach them through the process of helping children who are struggling.

“There are a lot of therapeutic approaches. Some would use somatosensory therapies like yoga, some use motor activity like drumming. All have some efficacy. If you can put together a package of those things: keep the adults more mannered, give the children achievable goals, give them opportunities to regulate themselves, then you are going to minimise a huge percentage of the problems I have seen with children who have the problem labelled as ADHD.”

Many people may disagree with the assertion that ADD/ADHD should not be considered a disease. The fact remains, however, that the myriad symptoms associated with these increasingly common ‘disorders’ can often be addressed and relieved without creating dependency on pharmaceutical medications, which disrupt the mind and body in ways that are not fully understood or even researched.