Mind Aglow: Scientists Watch Thoughts Form in the Brain


A new technology shows real-time communication among neurons and promises to reveal brain activity in unprecedented detail.

In a mouse brain, cell-based detectors called CNiFERs change their fluorescence when neurons release dopamine.

When a single neuron fires, it is an isolated chemical blip. When many fire together, they form a thought. How the brain bridges the gap between these two tiers of neural activity remains a great mystery, but a new kind of technology is edging us closer to solving it.

The glowing splash of cyan in the photo above comes from a type of biosensor that can detect the release of very small amounts of neurotransmitters, the signaling molecules that brain cells use to communicate. These sensors, called CNiFERs (pronounced “sniffers”), for cell-based neurotransmitter fluorescent engineered reporters, are enabling scientists to examine the brain in action and up close.

This newfound ability, developed as part of the White House BRAIN Initiative, could further our understanding of how brain function arises from the complex interplay of individual neurons, including how complex behaviors like addiction develop. Neuroscientist Paul Slesinger at Icahn School of Medicine at Mount Sinai, one of the senior researchers who spearheaded this research, presented the sensors Monday at the American Chemical Society’s 252nd National Meeting & Exposition.

Current technologies have proved either too broad or too specific to track how tiny amounts of neurotransmitters in and around many cells might contribute to the transmission of a thought. Scientists have used functional magnetic resonance imaging to look at blood flow as a surrogate for brain activity over fairly long periods of time or have employed tracers to follow the release of a particular neurotransmitter from a small set of neurons for a few seconds. But CNiFERs make for a happy medium; they allow researchers to monitor multiple neurotransmitters in many cells over significant periods of time.

When a CNiFER comes in contact with the neurotransmitter it is designed to detect, it fluoresces. Using a tiny sensor implanted in the brain, scientists can then measure how much light the CNiFER emits, and from that infer the amount of neurotransmitter present. Because they comprise several interlocking parts, CNiFERs are highly versatile, forming a “plug-and-play system,” Slesinger says. Different sections of the sensor can be swapped out to detect individual neurotransmitters. Prior technology had trouble distinguishing between similar molecules, such as dopamine and norepinephrine, but CNiFERs do not.
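
In principle, turning that measured light into a chemical readout is a calibration problem. The minimal sketch below assumes the sensor’s fractional fluorescence change follows a simple Hill-type dose-response curve with known parameters; the function and the numbers are illustrative placeholders, not the team’s published calibration.

```python
# Illustrative only: estimate neurotransmitter concentration from a CNiFER-style
# fluorescence readout, assuming the response follows a Hill-type dose-response
# curve. The calibration parameters below are hypothetical placeholders.

def concentration_from_response(delta_f_over_f, r_max=1.0, ec50=50.0, n=1.0):
    """Invert R = r_max * C**n / (ec50**n + C**n) to recover C.

    delta_f_over_f : measured fractional change in fluorescence (unitless)
    r_max          : maximal fractional response of the sensor
    ec50           : concentration giving a half-maximal response (assumed nM)
    n              : Hill coefficient
    """
    r = min(max(delta_f_over_f, 0.0), 0.999 * r_max)  # keep the inversion well defined
    return ec50 * (r / (r_max - r)) ** (1.0 / n)

# Example: a 40% fluorescence change maps to roughly 33 nM under these assumptions.
print(round(concentration_from_response(0.40), 1))
```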

The sensors are being tested in animals to examine particular brain processes. Slesinger and his colleagues have used CNiFERs to look more closely at a classic psychological phenomenon: Pavlovian conditioning. Just as Pavlov trained his dog to salivate at the sound of a dinner bell, Slesinger and his team trained mice to associate an audio cue with a food reward. At the beginning of the experiment, the mice experienced a release of dopamine and norepinephrine when they received a sugar cube. As the animals became conditioned to associate the audio cue with the sugar, however, the neurotransmitter release occurred earlier, eventually coinciding with the audio cue rather than the actual reward.

Mouse studies might be a far cry from the kind of human impact that neuroscience ultimately strives toward—better treatments for Parkinson’s patients or concussion sufferers, for example—but this is where it all begins. Slesinger is especially interested in using CNiFERs to study addiction. A more nuanced understanding of how addiction develops in mouse brains could help identify novel targets to combat addiction in people.

Focused Ultrasound Cuts Hand Tremor in Trial


Focused ultrasound thalamotomy produced short-term improvement in hand tremor in patients with essential tremor, researchers reported.

In a randomized sham-controlled trial, patients who received the therapy had greater improvements in physician-rated hand tremor scores at 3 months (47% improvement versus 0.1% improvement, P<0.001), Jeffrey Elias, MD, of the University of Virginia Health Sciences Center, and colleagues reported in the New England Journal of Medicine.

But editorialist Elan Louis, MD, of Yale, noted that data reported in a supplementary appendix (available only online) showed that overall tremor scores (i.e., not just hand tremor) rose by 23% and physician-rated tremor scores rose by 38% at month 12, relative to their nadir at one month post-treatment.
“Whether this loss of efficacy, which is also seen to some extent with deep-brain stimulation, is due to disease progression or tolerance is not clear,” Louis wrote, adding that he considered the former “less likely.”
The investigators also noted that the treatment came with sensory and gait disturbances affecting roughly one-third of patients.
The FDA approved the device used in the current study for MRI-guided focused ultrasound thalamotomy in essential tremor last month. The current study was the manufacturer’s primary registration trial.
The procedure involves ablating tissue with high-intensity sound waves. Medical therapies for essential tremor include beta blockers — particularly propranolol — the antiepileptic drug primidone, and other drugs that enhance GABA transmission.

However, efficacy of these treatments is limited, so some patients try deep brain stimulation, which is the standard surgical procedure for this condition. It was approved by the FDA in 1997 for this indication, but there have been few randomized controlled trials of the technology, so evidence of its efficacy remains scant, Elias and colleagues wrote.
To assess whether focused ultrasound thalamotomy could provide another option for patients with moderate-to-severe essential tremor who haven’t responded to medical therapies, the researchers enrolled 76 patients (mean age 71, mean disease duration 17 years) and randomized them 3:1 to unilateral focused ultrasound (which, if effective, would improve tremor in only the contralateral hand) or to a sham procedure.
The primary outcome was difference in change from baseline to 3 months in hand tremor, rated on a 32-point scale in which higher scores indicate a more severe tremor.
After 3 months, patients in the sham group were allowed to cross over to the active treatment.
Overall, the researchers found that scores improved more after focused ultrasound than with the sham procedure (from scores of 18.1 to 9.6 versus 16 to 15.8) — a 47% improvement compared with a 0.1% improvement, and a mean difference of 8.3 points (P<0.001). No significant change in tremor scores for the ipsilateral hand was seen in treated patients.
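
For readers checking the arithmetic, the headline percentage is just the relative drop on that 32-point hand-tremor scale, computed here from the rounded group means quoted above (the trial’s other reported percentages come from its own statistical model and won’t all reproduce exactly from these rounded figures).

```python
# Rough check of the headline figure from the rounded means quoted in the text.
# Higher scores on the 32-point scale indicate more severe tremor.

def percent_improvement(baseline, follow_up):
    """Relative reduction in tremor score, as a percentage of baseline."""
    return 100.0 * (baseline - follow_up) / baseline

print(f"{percent_improvement(18.1, 9.6):.0f}%")  # ~47% in the thalamotomy group
```
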
The improvement in the focused ultrasound group persisted at one year, with a 40% improvement from baseline, they reported (P<0.001).
Patients who crossed over into the treatment group improved by 55% at 3 months and by 52% at 6 months, they found (P<0.001).
In terms of secondary outcomes, those who had focused ultrasound had greater improvements in disability (62% reduction versus 3% reduction, P<0.001) and in quality of life (46% improvement versus 3% improvement, P<0.001) at 3 months.
The researchers did note higher rates of adverse events in the thalamotomy group, including gait disturbance (36%) and paresthesias or numbness (38%); these persisted at one year in 9% and 14% of patients, respectively.
In the accompanying editorial, Louis highlighted “several important concerns.” First, the study was limited to 1 year, so benefits further down the road aren’t clear; studies with longer follow-up are needed, he said.
Beyond the rebound in overall and physician-rated tremor scores, Louis noted that not all patients benefited from the procedure: 9 of 56 had a percent change below 20%. Some skulls may be too thick for the procedure to work properly, he added.
Unlike with deep-brain stimulation, he warned, a thalamotomy creates a fixed brain lesion — and altered sensation remained in 14% of patients at 1 year, he wrote.
“Even with these concerns and caveats, pros and cons, the procedure will take its place among other surgical procedures for medically refractory essential tremor,” Louis wrote. “Given the perception that it is less invasive than other approaches because it does not involve burr holes and intracerebral electrodes, as well as the evidence that patients with essential tremor are perhaps particularly harm avoidant, the procedure may allow more patients to avail themselves of a surgical option for the treatment of this often disabling disease.”
He also called for a head-to-head comparison with deep-brain stimulation.

Zika Virus Can Persist for Months in Newborns, Case Study Suggests


With prolonged infection may come more tissue damage.

An infant born with microcephaly, but with an otherwise normal physical examination at birth, had evidence of the Zika virus in serum, saliva, and urine nearly 2 months after birth, a case report from Brazil found.

The mother of the male infant was potentially infected during her third trimester of pregnancy, and the baby was born at term (40 weeks) with microcephaly. Laboratory testing found evidence of Zika virus in the infant up through 2 months of age, and he began displaying neurological symptoms at 6 months of age, Danielle B.L. Oliveira, PhD, of Universidade de São Paulo in Brazil, and colleagues, reported in a research letter in the New England Journal of Medicine.

The authors said that despite being born with microcephaly, the infant had normal vision and hearing tests, and analysis of cerebrospinal fluid was normal at birth, with no abnormalities detected during an initial physical examination. In fact, the infant showed “no obvious illness or evidence of any immunocompromising condition” on day 54 of life.

“If Zika is shown to persist as a threat to infected newborns long after in utero exposure, there are serious implications for monitoring and managing exposed babies, even if there are no clinical manifestations noted at birth,” Irwin Redlener, MD, of Columbia University Mailman School of Public Health in New York City, who was not involved with the research, told MedPage Today via email.

But similar to the findings in a recent study, brain imaging revealed that the infant had reduced brain volume in the frontal and parietal lobes, with calcifications in subcortical areas. A polymerase chain reaction test was positive for Zika in serum, urine, and saliva at day 54 of life and positive in serum on day 67. The test was negative on day 216, although the authors noted that Zika-specific IgG titers were higher than in the first and second samples — potentially indicating that the infant had mounted an immune response to the virus.

“Prolonged viral shedding in the infant … may have had a role in the damage the virus was able to incite,” said Amesh Adalja, MD, a spokesperson for the Infectious Diseases Society of America. “It will be important to conduct more research in this vein in order to determine how common prolonged shedding is and if it is associated with a worsened clinical course,” he told MedPage Today via email.

At 6 months of age, the infant showed evidence of neuropsychomotor developmental delay, with global hypertonia, or spasticity, and spastic hemiplegia — a constant state of contraction of muscles on one side of the body, often associated with cerebral palsy. This is also consistent with recent research showing a delayed onset of symptoms in some infants with congenital Zika virus infection.

 The other interesting detail about this case was that not only did the mother appear to contract Zika virus later in her pregnancy, but she may have done so through “suspected” sexual transmission from the father. The authors reported that the mother stayed in São Paulo for the duration of her pregnancy, but the father traveled to northeastern Brazil. The father then had symptoms of Zika virus infection when the mother was 23 weeks pregnant, but she did not show symptoms until 26 weeks.

“This report provides evidence that a third trimester infection with Zika, which has been generally considered to be lower risk than earlier periods in a pregnancy, is not always benign and can lead to microcephaly,” added Adalja.

AML: Conjugate Produces High Remission Rates in Older Patients


Early promising results for antibody-drug conjugate delivered with hypomethylating agents.

Older adults who are newly diagnosed with acute myeloid leukemia (AML) are often not sufficiently fit to withstand the rigors of remission induction therapy with cytarabine and an anthracycline such as daunorubicin or idarubicin. Other patients may decline intensive therapy due to frailty or concerns about toxicities. For these patients, clinicians in the United States often prescribe lower-intensity therapy with the hypomethylating agents decitabine and/or azacitidine, but both agents are associated with low response rates and limited clinical benefits, according to treatment information from the National Cancer Institute.

However, an investigational therapy consisting of a conjugated monoclonal antibody combined with hypomethylating agents (HMAs) has been shown in early clinical trials to induce high complete or near-complete remission rates in older adults with AML.

At the 2016 annual congress of the European Hematology Association, Amir T. Fathi, MD, from Massachusetts General Hospital Cancer Center in Boston, reported data from a phase I study of older adults with AML who were treated with a combination of the antibody-drug conjugate vadastuximab talirine (33A; Seattle Genetics) and either azacitidine or decitabine.

Among 49 patients evaluable for efficacy, the combined rate of complete remissions (CR) or CR with incomplete recovery of counts (CRi) was 71%.

“The high remission rate in this traditionally high-risk group and difficult-to-treat population is very compelling,” Fathi said. “Response rates were higher, and were achieved more quickly than would be expected from historical data associated with HMA therapy alone.”

Target: CD33

33A is a highly potent antibody-drug conjugate designed to deliver a cytotoxic agent to myeloid leukemia cells. The agent is targeted to CD33 receptors that are expressed on leukemic blasts in nearly all cases of AML. The antibody is conjugated to two molecules of a pyrrolobenzodiazepine (PBD) dimer. Upon binding to the receptors, the conjugate is internalized and transported to cellular lysosomes where the PBD dimer is released via proteolytic cleavage of the linker, resulting in a crosslinking of DNA, and leading to cell death.

The cell-killing activity of this agent had been shown in preclinical studies to be enhanced when delivered in combination with hypomethylating therapy, Fathi noted.

For the phase I trial, the combination of 33A and a hypomethylating agent was tested in 53 adults with a median age of 75. The patients all had CD33-positive AML, and all had declined to undergo an intensive chemotherapy induction regimen. Five patients had previously received low-intensity therapy for myelodysplastic syndromes, and the remaining 48 patients had not received any prior therapy for AML.

Enrollment criteria included an Eastern Cooperative Oncology Group (ECOG) performance status score of 0 or 1. Nineteen of the patients had adverse cytogenetic-risk disease, and 30 had intermediate-risk disease.

The patients received 33A in intravenous infusions of 10 mcg/kg delivered in an outpatient setting every 4 weeks on the last day of a hypomethylating therapy regimen — either azacitidine at 75 mg/m2 for 7 days, or decitabine at 20 mg/m2 for 5 days. Patients who had clinical benefit could be continued on treatment until disease relapse or unacceptable toxicity.

Responses were assessed by investigators according to the International Working Group for Diagnosis, Standardization of Response Criteria, Treatment Outcomes, and Reporting Standards for Therapeutic Trials in Acute Myeloid Leukemia (IWG). CRi was defined as either a platelet count of ≥100,000/µL or neutrophils of ≥1,000/µL.

The combined CR/CRi rate among the 49 patients evaluable for response at the time of data cutoff was 71%, with no difference in the rate of complete or near-complete remissions between patients treated with azacitidine or decitabine.

The overall response rate (CR, CRi, and partial responses) was 76%. Encouragingly, Fathi said, many higher-risk patients had responses, including 15 of 18 patients with adverse cytogenetics, and 16 of 22 with underlying myelodysplasia.

Eight of the 19 patients who had a CR met the criteria for minimal residual disease, as did 5 of 15 who achieved a CRi.

Overall survival follow-up was ongoing at the time of the presentation. After a median follow-up of 12.58 months, the estimated median overall survival for the first 25 patients enrolled in the study was 12.75 months, and as of the most recent follow-up, 27 patients were alive and remained on study.

The median relapse-free survival was 7.7 months; 30- and 60-day mortality rates were 2% and 8%, respectively. There were no treatment-related deaths reported.

Safety Profile

Patients generally tolerated the therapy well, Fathi said. Grade 3 or greater adverse events reported in at least 20% of patients included, in order of frequency, febrile neutropenia (47% of patients), thrombocytopenia (42%), anemia (34%), and neutropenia (28%).

Other common treatment-emergent adverse events were fatigue, nausea, constipation, decreased appetite, and peripheral edema.

Fathi noted that a phase III trial to evaluate 33A in combination with hypomethylating agents in previously untreated older AML patients is now open for enrollment.

How Isaac Newton could help you beat the casino at roulette


Imagine walking into a casino with a computer strapped to your chest. Solenoid electromagnets thump against your body telling you where to place your bet on the roulette table. Suddenly, you start getting electric shocks. You rush to the toilet to undertake emergency repairs, hoping that the casino staff do not realise what is happening.

In the late seventies, graduate student Doyne Farmer and colleagues did just that – with purpose-built computers that could predict where a roulette ball would land.

The project, described in the book The Newtonian Casino (published as The Eudaemonic Pie in the US), was, however, difficult and fraught with technical problems. The team never really found a reliable way of doing it. But decades later, is it any closer to becoming a reality?

In a game of roulette, the croupier spins a wheel in one direction and a ball in the other. Players then place bets on where the ball will land by choosing a single number, a range of numbers, red or black, or odd or even.

Our understanding of the physics behind the movement of the ball and wheel is pretty solid – governed by Newton’s laws of motion. As the ball slows, gravity takes hold and it falls into one of the numbered compartments.

It is predictable when the ball will leave the rim. However, once it does, the route it takes to a numbered slot is less so. This is because the ball bounces around as it strikes various obstacles.

Every roulette wheel is slightly different. Atmospheric conditions continually change and the wheel itself has features that encourage randomness – such as the size of the frets between the numbers and the diamond-shaped obstacles that intercept the ball as it falls down to the wheel. This means that you cannot predict the exact number where the ball will land.

But you only need to know which area of the wheel the ball will land in to gain a massive advantage over the casino – more than 40 percent. This is a huge swing from the 5.26 percent margin that US casinos have over players – often referred to as the house edge.

In Europe it is only 2.7 percent, as the wheel has only one zero (a US wheel has two zeroes).
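
Those house-edge figures fall straight out of the payout rules: a single-number bet pays 35 to 1, but a US wheel has 38 pockets (1 to 36 plus 0 and 00) and a European wheel has 37. The short expected-value calculation below reproduces both percentages.

```python
# Expected loss per unit staked on a single-number bet that pays 35 to 1.
# US wheels have 38 pockets (1-36, 0, 00); European wheels have 37 (1-36, 0).

def house_edge(pockets, payout=35):
    """Return the casino's expected take as a percentage of the stake."""
    p_win = 1.0 / pockets
    player_expectation = p_win * payout - (1.0 - p_win)
    return -player_expectation * 100.0

print(f"US wheel:       {house_edge(38):.2f}%")  # 5.26%
print(f"European wheel: {house_edge(37):.2f}%")  # 2.70%
```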

Sweaty experiments

When Farmer and his team entered the casino for the first time, two people were wearing computers. One had a computer built into his shoes, with the task of inputting data by tapping switches under the toes.

This computer performed two main functions. One was to adjust parameters for each wheel before a game, such as the rate at which the ball and wheel slowed down, and the velocity of the ball when it fell off the track. They also had to determine whether the wheel exhibited any tilt.

The second job was during live play. The player with the shoe computer tapped the toe switches each time a certain point (typically the double zero) on the wheel passed by and also when the ball passed by.

Using this information, the program could calculate the speed of both the wheel and the ball – thus knowing when the ball would start to fall.

Knowing the relative positions of the ball and the wheel meant that a prediction could be made about where the ball would finally land. The computer then had to transmit the prediction to the person wearing the second computer. This was achieved by weak radio signals.
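
In outline, the calculation the shoe computer performed can be sketched in a few lines: convert the time between clicks into angular velocities, work out when the ball will slow to the speed at which it leaves the track, and project where the ball and wheel will be relative to each other at that moment. The version below is a heavily idealised sketch (constant deceleration, level wheel, no bounce), not the team’s actual program.

```python
import math

# Idealised sketch of the roulette prediction: constant deceleration, a level
# wheel and no bounce after the ball leaves the track. The real system fitted
# these parameters for each wheel and handled many more effects.

def angular_velocity(t_prev, t_now):
    """Angular speed implied by one full revolution between two toe-switch clicks (rad/s)."""
    return 2.0 * math.pi / (t_now - t_prev)

def predict_fall_angle(ball_omega, ball_decel, fall_off_omega, wheel_omega,
                       ball_angle=0.0, wheel_angle=0.0):
    """Angle, relative to the wheel, at which the ball starts to fall."""
    # Time until the ball slows to the speed at which it leaves the rim.
    t_fall = (ball_omega - fall_off_omega) / ball_decel
    # Angles swept by ball and wheel in that time (they spin in opposite directions).
    ball_final = ball_angle + ball_omega * t_fall - 0.5 * ball_decel * t_fall ** 2
    wheel_final = wheel_angle - wheel_omega * t_fall
    return (ball_final - wheel_final) % (2.0 * math.pi)

# Example with made-up timings: ball clicks 0.40 s apart, wheel clicks 1.20 s apart.
ball_omega = angular_velocity(0.0, 0.40)     # ~15.7 rad/s
wheel_omega = angular_velocity(0.0, 1.20)    # ~5.2 rad/s
angle = predict_fall_angle(ball_omega, ball_decel=1.5, fall_off_omega=6.0,
                           wheel_omega=wheel_omega)
print(f"Bet on the sector about {math.degrees(angle):.0f} degrees from the reference pocket")
```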

Shoe computer. Hydro.tiger/Wikimedia

The second computer, strapped to someone else, received the radio signals and conveyed this information to the player by the solenoid electromagnets that thumped that player’s stomach.

A code had been developed that relayed the predicted number, with the player placing bets on that number and several numbers either side to account for the randomness. So that the casinos could not easily see what they were doing, the team altered their betting patterns slightly – for example, by not betting on all of the consecutive numbers.

However, this never gave them the 40 percent advantage observed in the lab – mainly due to technological problems such as short circuits caused by sweating, loose wires and lost radio connections.

It took several years for the team (which by then comprised about 20 people who had worked on the project to varying degrees) to develop an improved computer system. Both computers were now built into custom-made shoes. This protected the operator from shocks and also made the equipment harder for the casino to detect.

The other innovation was that the computers were set in resin blocks, with only the toe-operated switches and the solenoids – which now drummed against the feet – left exposed. This was to combat problems such as loose wires and sweating.

They then entered Binion’s casino in Las Vegas ready for an all-out assault. Once the parameters had been set, the first prediction was to bet in the third octant – which included the numbers 1, 13, 24 and 36. The ball landed in 13 and the team got paid off at 35-1.

The years of work looked promising, but the solenoids eventually started to act randomly, so the accurate predictions from one computer were not being transmitted to the other. The team suspected it was due to the electronic noise present in casinos. Eventually they had no choice but to abandon the idea.

Would it work today?

The main issue in the late seventies and early eighties was that the team had to build their own computers from scratch, literally – they had to design the computer, buy all the components and get busy with a soldering iron.

Technology has evolved. These days, all the required processing power could be fitted into a single unit. You could imagine a system based on a mobile phone where the camera videos the ball and the wheel and image processing software extracts the relevant data so that the prediction software can calculate the final position of the ball.

But certain challenges still remain. If several people are involved, which is the best way to avoid detection, how can you work as a team and pass data? Perhaps the use of free wifi in many casinos could be a solution?

Another problem is how best to hide the fact that you are using an electronic device to predict where the ball will land, when you need to input data and receive the prediction. Here, suitably connected glasses may be one workaround, used in tandem with toe-operated switches.

The hardest challenge, however, is the casino itself. They are certainly unlikely to simply let you have a camera pointed at the roulette wheel, especially if you are winning.

If they caught you, they would be likely to ask you to leave – and it is often illegal to use such devices in any case. But with a little creativity it may not be long before scientists prove they are able to outsmart casinos.

The older we get, the happier we are, study finds


Something to look forward to.

People are happier and enjoy better mental health as they get older, a new study has found, so if your 20s are stressing you out right now, don’t worry – better things are coming.

While previous research has found that our enjoyment of life increases as we age – something called the ageing paradox, since extra years are linked to disease and frailty – the new findings show the phenomenon occurs steadily throughout our lives from adulthood on.

A team from the University of California, San Diego examined the physical and mental health of 1,546 adults randomly selected from San Diego County, with participants aged between 21 and 100 years old.

In terms of mental health measures – including satisfaction with life, and low levels of perceived stress, anxiety, and depression – the old appeared to win out over the young.

“Their improved sense of psychological well-being was linear and substantial,” said one of the team, geriatric neuropsychiatrist Dilip Jeste. “Participants reported that they felt better about themselves and their lives year upon year, decade after decade.”

In contrast with the older generations, the younger participants in the study showed higher levels of perceived stress, symptoms of depression, and anxiety – with the youngest, those aged in their 20s and 30s, having the roughest time of it.

“This ‘fountain of youth’ period is associated with far worse levels of psychological well-being than any other period of adulthood,” said Jeste.

While many of us might assume that the increasing physical hardship of getting older would take its toll on our happiness and mental health, research indicates that this isn’t necessarily the case.

“Some investigators have reported a U-shaped curve of well-being across the lifespan, with declines from early adulthood to middle age followed by an improvement in later adulthood,” Jeste said. “The nadir of mental health in this model occurs during middle age, roughly 45 to 55. However, we did not find such a mid-life dip in well-being.”

Instead, the data Jeste’s team collected “suggest the possibility of a linear improvement in mental health beginning in young adulthood,” they report in The Journal of Clinical Psychiatry.

It’s not all good news though. As you might expect, the older participants did demonstrate worse physical and cognitive functioning than the younger people in the study – which raises the question: why exactly do the old seem to enjoy their lives so much more than the young?

Researchers think the answer could lie in how we develop a new focus in life as we get older, which may help us find greater satisfaction from simple, attainable things.

“When people face endings they tend to shift from goals about exploration and expanding horizons to ones about savouring relationships and focusing on meaningful activities,” ageing researcher Laura Carstensen from the Stanford Centre on Longevity, who wasn’t involved with the study, told Deborah Netburn at the Los Angeles Times.

“When you focus on emotionally meaningful goals, life gets better, you feel better, and the negative emotions become less frequent and more fleeting when they occur.”

Jeste suggests that the improved psychological well-being could also stem from the wisdom that comes with age, including becoming more skilled at emotional regulation and making complex social decisions.

We learn, he says, “not to sweat out the little things. And a lot of previously big things become little.”

The researchers acknowledge that their study provides just a cross-sectional snapshot in time, comparing older people today to younger people today. In other words, it’s not comparing generations over time, which means it doesn’t take into account how, for example, young people may face greater financial or environmental stresses today than their grandparents’ generation did when they were young.

But changes in the brain could also make things seem easier or less negative as we get older. A brain imaging study from 2004 found that older participants showed reduced activity in the amygdala – a region in the brain that plays a primary role in emotional reactions – when shown negative images, suggesting that emotional responses to unpleasant things may become more subdued as we age.

At this stage, there are a lot of hypotheses about what’s going on here, but any solid answers will require a lot more research before we can be sure why life seems to get better even as we get closer to its end.

“There’s lots of speculation about why older people are happier and having better moods even when their cognitive and physical health is in decline, but we still don’t have anything that fully explains what is going on,” psychologist Arthur Stone from the USC Dornsife Centre for Self-Report Science, who wasn’t part of the research, told the Los Angeles Times. “It’s a big puzzle, and an important puzzle.”

The human impact on the natural environment is actually slowing down


Researchers have found that while the impact of human activity on the planet is continuing to grow, it’s now doing so at a slower rate than our economic and population growth.

This means that humans are still taking over the planet at the expense of many species and the natural world at large, but the upside is that the slowdown gives us reason for hope, as it suggests we’re getting better at managing what we take from the environment.

“Seeing that our impacts have expanded at a rate that is slower than the rate of economic and population growth is encouraging,” said lead researcher Oscar Venter from the University of Northern British Columbia in Canada. “It means we are becoming more efficient in how we use natural resources.”

The team, comprising researchers from the Wildlife Conservation Society (WCS) and eight universities from around the world, used satellite data and on-ground surveys to track how human activity altered natural habitats across the globe between 1993 and 2009.

In that relatively short timeframe, the global population grew 23 percent, and the global economy grew 153 percent, the researchers found. By contrast, the global human footprint grew only 9 percent in that time – still a troubling statistic, but at least it’s markedly less than the population and economic growth.
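
To see why a 9 percent rise in the footprint against 23 percent population growth reads as an efficiency gain, it helps to divide the index numbers (a back-of-the-envelope reading of the article’s aggregate figures, not a statistic reported by the study).

```python
# Back-of-the-envelope reading of the 1993-2009 growth figures quoted above.
# Each quantity is treated as an index (1993 = 1.0); dividing gives the implied
# change in footprint per person and per unit of economic output.

footprint_growth = 0.09    # +9% global human footprint
population_growth = 0.23   # +23% global population
economy_growth = 1.53      # +153% global economy

per_capita_change = (1 + footprint_growth) / (1 + population_growth) - 1
per_gdp_change = (1 + footprint_growth) / (1 + economy_growth) - 1

print(f"Footprint per person:      {per_capita_change:+.0%}")  # about -11%
print(f"Footprint per unit of GDP: {per_gdp_change:+.0%}")     # about -57%
```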


The researchers think the increasing efficiency can be attributed to the Environmental Kuznets Curve (EKC) hypothesis, which suggests that environmental pressures are at their worst when industrial societies are in the early stages of development, but then begin to slow down relative to financial growth as markets modernise.

In other words, as countries around the world become more industrialised and developed, their human footprint starts to ease off.

But while that slower footprint growth is a silver lining of sorts, the overall picture of how much humanity has trampled all over the planet is more sobering.

“Our maps show that three quarters of the planet is now significantly altered and 97 percent of the most species-rich places on Earth have been seriously altered,” said one of the team, James Watson from the University of Queensland in Australia. “There is little wonder there is a biodiversity crisis.”

The researchers have made their data available online with an interactive website that allows you to compare the human footprint as it was in 1993 and how it looked by 2009, and lets you explore areas where human activity is putting increasing or decreasing pressure on the environment.

The maps track changes in the extent of built environments, cropland, and pasture land, plus monitor population density and human infrastructure such as night-time lights, railways, and roads.

The increasing human footprint means that habitats that once showed no sign of human activity are becoming smaller in area, albeit more slowly than they once were.

In 1993, approximately 27 percent of the world’s non-Antarctic land area had no measurable human footprint – but the next 16 years saw our activity encroach upon some 23 million square kilometres (8.9 million square miles) of previously intact natural habitat.

And despite how scary that might sound, the good news is that, according to the findings, this kind of incursion is slowing down, meaning responsible environmental practices are making headway around the world.

“Sustainable development is a widely espoused goal, and our data demonstrates clear messages of how the world can get there,” said Venter. “Concentrate people in towns and cities so their housing and infrastructure needs are not spread across the wider landscape, and promote honest governments that are capable of managing environmental impacts.”

The researchers found that the most improvement was taking place in wealthy nations and places that had low levels of corruption. But on the other hand, because rich countries consume more than poor countries, they’ve actually got more ground to make up.

“In broad terms, industrial nations and those with lower corruption appear to be doing a better job of slowing the expansion of their human footprint than poorer countries with weak governance,” said researcher Bill Laurance from James Cook University in Australia. “But the wealthy countries have a much higher per-capita footprint, so each person there is consuming a lot more than those in poorer nations.”

The researchers hope that policy makers will use the maps and data to focus conservation efforts on the untouched habitat we have left, while working to remove existing pressures humans have placed upon the environment.

It’s a big job, but at least we have a brilliant new map to help show the way.

“Humans are the most voracious consumers planet Earth has ever seen. With our land-use, hunting and other exploitative activities, we are now directly impacting three quarters of Earth’s land surface,” said Laurance. “The bottom line is that we need to slow rampant population growth, especially in Africa and parts of Asia, and demand that people in wealthy nations consume less.”

This is what Earth will look like in 100 years


At this point, you’re probably fully aware of how hot it is. But in case you’re unaware: It’s really, really hot.

In fact, 2016 is likely to be the hottest year on record, with temperatures 2.3 degrees Fahrenheit (1.3 degrees Celsius) above pre-industrial averages.

That brings us dangerously close to the 2.7-degree-Fahrenheit (1.5-degree-Celsius) limit set by international policymakers for global warming.
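
The figures above switch between Fahrenheit and Celsius; for temperature differences (as opposed to absolute temperatures) the conversion is simply a factor of 9/5, which is where numbers like 2.7°F for 1.5°C come from.

```python
# Converting temperature *differences* between Celsius and Fahrenheit:
# a change of X degrees C corresponds to a change of X * 9/5 degrees F
# (no +32 offset, because these are anomalies, not absolute temperatures).

def delta_c_to_f(delta_c):
    return delta_c * 9.0 / 5.0

for delta_c in (1.3, 1.5, 2.0):
    print(f"{delta_c}°C above pre-industrial ≈ {delta_c_to_f(delta_c):.1f}°F")
# 1.3°C ≈ 2.3°F, 1.5°C ≈ 2.7°F, 2.0°C ≈ 3.6°F, matching the figures in the text.
```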

“There’s no stopping global warming,” Gavin Schmidt, a climate scientist who is the director of NASA’s Goddard Institute for Space Studies, told Business Insider. “Everything that’s happened so far is baked into the system.”

That means that even if carbon emissions dropped to zero tomorrow, we’d still be watching human-driven climate change play out for centuries. And, as we all know, emissions aren’t going to stop tomorrow. So the key thing now, Schmidt said, is slowing climate change down enough to make sure we can adapt to it as painlessly as possible.

This is what Earth could look like within 100 years if we do manage to slow it down, barring huge leaps in renewable energy or carbon-capture technology.

“I think the 1.5-degree [2.7-degree F] target is out of reach as a long-term goal,” Schmidt said. He estimated that we will blow past that by about 2030.

But Schmidt is more optimistic about staying at or under 3.6 degrees Fahrenheit, or 2 degrees Celsius, above preindustrial levels – the level of temperature rise the UN hopes to avoid.

Let’s assume we land between those two targets. At the end of this century, we’re already looking at a world that is on average 3 degrees or so Fahrenheit above where we are now.

But average surface temperature alone doesn’t fully capture climate change. Temperature anomalies – or how much the temperature of a given area is deviating from what would be ‘normal’ in that region – will swing wildly.

For example, the temperature in the Arctic Circle last winter soared above freezing for one day. It was still cold by Florida standards, but it was extraordinarily hot for the Arctic. That’s abnormal, and it will start happening a lot more.

That means years like this one, which had the lowest sea-ice extent on record, will become common. Summers in Greenland could become ice-free by 2050.

Source: Journal of Advances in Modeling Earth Systems

Even 2015 was nothing compared with 2012, when 97 percent of the Greenland Ice Sheet’s surface started to melt in the summer. It’s typically a once-in-a-century occurrence, but we could see this kind of extreme surface melt every six years by the end of the century.

On the bright side, ice in Antarctica will remain relatively stable, making minimal contributions to sea-level rise.

But in our best-case scenarios, oceans are on track to rise 2 to 3 feet (0.6 to 0.9 metres) by 2100. Even a sea-level rise below 3 feet (0.9 metres) could displace up to 4 million people.

Oceans not only will have less ice at the poles, but they will also continue to acidify in the tropics. Oceans absorb about a third of all carbon dioxide in the atmosphere, causing them to warm and become more acidic.

If climate change continues unabated, nearly all coral reef habitats could be devastated. Under our best-case scenario, half of all tropical coral reefs are still threatened.

But the oceans aren’t the only place heating up. Even if we curb emissions, summers in the tropics could increase their extreme-heat days by half after 2050. Farther north, 10 percent to 20 percent of the days in a year will be hotter.

But compare that with the business-as-usual scenario, in which the tropics will stay at unusually hot temperatures all summer long. In the temperate zones, 30 percent or more of the days will be what is now unusual.

Even a little bit of warming will strain water resources. In a 2013 paper, scientists used models to estimate that the world could see more severe droughts more frequently – about a 10 percent increase. If unchecked, climate change could cause severe drought across 40 percent of all land, double what it is today.

And then there’s the weather. If the extreme El Niño event of 2015 to 2016 was any indication, we’re in for much more dramatic natural disasters. More extreme storm surges, wildfires, and heat waves are on the menu for 2070 and beyond.

Right now, humanity is standing on a precipice. We can ignore the warning signs and pollute ourselves into what Schmidt envisions as a “vastly different planet” – roughly as different as our current climate is from the most recent ice age.

Or we can innovate solutions. Many of the scenarios laid out here assume we’re reaching negative emissions by 2100 – that is, absorbing more than we’re emitting through carbon-capture technology.

Schmidt says we are likely to reach 2100 with a planet somewhere between “a little bit warmer than today and a lot warmer than today”.

But the difference between ‘a little’ and ‘a lot’ on the scale of Earth is one of millions of lives saved, or not.

NASA Scientists Discover Unexpected Mineral on Mars


Scientists have discovered an unexpected mineral in a rock sample at Gale Crater on Mars, a finding that may alter our understanding of how the planet evolved.

NASA’s Mars Science Laboratory rover, Curiosity, has been exploring sedimentary rocks within Gale Crater since landing in August 2012. In July 2015, on Sol 1060 (the number of Martian days since landing), the rover collected powder drilled from rock at a location named “Buckskin.” Analyzing data from an X-ray diffraction instrument on the rover that identifies minerals, scientists detected significant amounts of a silica mineral called tridymite.

This low-angle self-portrait of NASA’s Curiosity Mars rover shows the vehicle at the site from which it reached down to drill into a rock target called “Buckskin.” Bright powder from that July 30, 2015, drilling is visible in the foreground.

This detection was a surprise to the scientists, because tridymite is generally associated with silicic volcanism, which is known on Earth but was not thought to be important or even present on Mars.

The discovery of tridymite might induce scientists to rethink the volcanic history of Mars, suggesting that the planet once had explosive volcanoes that led to the presence of the mineral.

Scientists in the Astromaterials Research and Exploration Science (ARES) Division at NASA’s Johnson Space Center in Houston led the study. A paper on the team’s findings has been published in the Proceedings of the National Academy of Sciences.

“On Earth, tridymite is formed at high temperatures in an explosive process called silicic volcanism. Mount St. Helens, the active volcano in Washington State, and the Satsuma-Iwojima volcano in Japan are examples of such volcanoes. The combination of high silica content and extremely high temperatures in the volcanoes creates tridymite,” said Richard Morris, NASA planetary scientist at Johnson and lead author of the paper. “The tridymite was incorporated into ‘Lake Gale’ mudstone at Buckskin as sediment from erosion of silicic volcanic rocks.”

The paper also will stimulate scientists to re-examine the way tridymite forms. The authors examined terrestrial evidence that tridymite could form at low temperatures from geologically reasonable processes and not imply silicic volcanism. They found none. Researchers will need to look for ways that it could form at lower temperatures.

“I always tell fellow planetary scientists to expect the unexpected on Mars,” said Doug Ming, ARES chief scientist at Johnson and co-author of the paper. “The discovery of tridymite was completely unexpected. This discovery now begs the question of whether Mars experienced a much more violent and explosive volcanic history during the early evolution of the planet than previously thought.”

Recycled Concrete’s Time Has Come


Recycling in the construction industry has become big business. It’s easy to find new life for everything from old windows to wood beams. But until now, concrete — the most widely used building material on Earth — has been largely left behind.

That’s a problem for the environment, says Notre Dame engineering professor Yahya Kurama, because concrete has a huge carbon footprint. “It’s very intensive in terms of its demands on energy, water, land space, everything.” Producing concrete accounts for 5 percent of the world’s annual human-generated CO2 emissions. In the U.S., it — along with other demolished building materials — takes up nearly half of all landfill space.

A second life for concrete

To reduce such harm, the industry has concentrated on things like reducing new concrete production and finding new uses for concrete byproducts.

In the United States, recycled concrete is used in sidewalks and roads, but not for load-bearing structures. Kurama and his team, along with scientists from the University of Texas at Tyler and New Mexico State University, set out to determine whether it was strong and durable enough to be used to construct buildings.

“Currently there’s a lot more supply of recycled concrete aggregates than demand,” he pointed out. “What we’re trying to do is bring up the demand and at the same time generate the engineering background that these materials can be used in a higher-level application.”

Researchers are testing two types of concrete aggregates – from pre-cast, unused slabs (left) and from demolished structures, with construction debris (right).

Kurama’s team is studying different recycled aggregate combinations in hopes of supplying that demand. Graduate student Michael Brandes says they’re interested in two sources for recycled concrete.

“The traditional RCA, or recycled concrete as we call it, is something that comes from a bridge that was demolished, a building that was demolished,” he said. “Basically what that means is it has the opportunity to accumulate a lot of other materials — wood chips, asphalt, brick — from the construction site. We don’t want to have to sort that out, because it’s an added cost.”

The other, cleaner source is rejected material from a precast concrete plant, which has no steel, wood or other construction debris mixed in. In both cases, the material is crushed down as aggregate and mixed with fresh cement to make a new product.

As strong as new

The team is testing both types of recycled concrete to determine durability and many other qualities: life-cycle costs, weight-bearing abilities, statistical variabilities and properties of the aggregates. Kurama says they are also working out how they might engineer around any differences between the recycled materials and traditional concrete. Currently, without that research, federal building codes bar its use in load-bearing structures, like walls and floors.

Notre Dame engineering professor Yahya Kurama (left) and graduate student Michael Brandes look over a container of aggregate.

Kurama says using recycled material reduces concrete’s environmental impact by about half, from decreased water usage and less mining to decreased transportation costs, because materials are often on site or close by. His analysis also showed that in some instances recycled concrete is stronger than its natural counterpart.

“Nobody’s going to see an immediate effect of this,” he says of his work, “but if you think about the impact of built infrastructure 20 years, 30 years, 40 years, 50 years down the road, this will have a big impact in terms of reducing concrete’s impact on our environment.”

That’s good news, because it’s expected the world will need 4.4 billion metric tons of concrete a year by 2050.