New hologram technology created with tiny nanoantennas.


Researchers have created tiny holograms using a “metasurface” capable of the ultra-efficient control of light, representing a potential new technology for advanced sensors, high-resolution displays and information processing.

The metasurface, thousands of V-shaped nanoantennas formed into an ultrathin gold foil, could make possible “planar photonics” devices and optical switches small enough to be integrated into computer chips for information processing, sensing and telecommunications, said Alexander Kildishev, associate research professor of electrical and computer engineering at Purdue University.

Laser light shines through the nanoantennas, creating the hologram 10 microns above the metasurface. To demonstrate the technology, researchers created a hologram of the word PURDUE smaller than 100 microns wide, or roughly the width of a human hair.

“If we can shape characters, we can shape different types of light beams for sensing or recording, or, for example, pixels for 3-D displays. Another potential application is the transmission and processing of data inside chips for information technology,” Kildishev said. “The smallest features – the strokes of the letters – displayed in our experiment are only 1 micron wide. This is a quite remarkable spatial resolution.”

Laser light shines through the metasurface from below, creating a hologram 10 microns above the structure. (Xingjie Ni, Birck Nanotechnology Center)

Findings are detailed in a research paper appearing on Friday (Nov. 15) in the journal Nature Communications.

Metasurfaces could make it possible to use single photons – the particles that make up light – for switching and routing in future computers. While using photons would dramatically speed up computers and telecommunications, conventional photonic devices cannot be miniaturized because the wavelength of light is too large to fit in tiny components needed for integrated circuits.

Nanostructured metamaterials, however, are making it possible to reduce the wavelength of light, allowing the creation of new types of nanophotonic devices, said Vladimir M. Shalaev, scientific director of nanophotonics at Purdue’s Birck Nanotechnology Center and a distinguished professor of electrical and computer engineering.

“The most important thing is that we can do this with a very thin layer, only 30 nanometers, and this is unprecedented,” Shalaev said. “This means you can start to embed it in electronics, to marry it with electronics.”

The layer is about 1/23rd the width of the wavelength of light used to create the holograms.
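As a quick back-of-the-envelope check, the "1/23rd" figure implies the laser wavelength used; the sketch below only rearranges the numbers quoted in the article (the 30-nanometer thickness and the 1/23 ratio):

```python
# Back-of-the-envelope check of the thickness-to-wavelength ratio quoted above.
# The 30 nm thickness comes from the article; the laser wavelength is inferred.
thickness_nm = 30.0
ratio = 23.0  # the layer is "about 1/23rd the width of the wavelength"
implied_wavelength_nm = thickness_nm * ratio
print(f"Implied laser wavelength: {implied_wavelength_nm:.0f} nm")  # ~690 nm, visible red light
```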

The Nature Communications article was co-authored by former Purdue doctoral student Xingjie Ni, who is now a postdoctoral researcher at the University of California, Berkeley; Kildishev; and Shalaev.

Under development for about 15 years, metamaterials owe their unusual potential to precision design on the scale of nanometers. Optical nanophotonic circuits might harness clouds of electrons called “surface plasmons” to manipulate and control the routing of light in devices too tiny for conventional lasers.

The researchers have shown how to control the intensity and phase, or timing, of laser light as it passes through the nanoantennas. Each antenna has its own “phase delay” – how much light is slowed as it passes through the structure. Controlling the intensity and phase is essential for creating working devices and can be achieved by altering the V-shaped antennas.

The work is partially supported by the U.S. Air Force Office of Scientific Research, the Army Research Office and the National Science Foundation. Purdue has filed a provisional patent application on the concept.

Far-Off Planets Like the Earth Dot the Galaxy.


The known odds of something — or someone — living far, far away from Earth improved beyond astronomers’ boldest dreams on Monday.

Astronomers reported that there could be as many as 40 billion habitable Earth-size planets in the galaxy, based on a new analysis of data from NASA’s Kepler spacecraft.

One out of every five sunlike stars in the galaxy has a planet the size of Earth circling it in the Goldilocks zone — not too hot, not too cold — where surface temperatures should be compatible with liquid water, according to a herculean three-year calculation based on data from the Kepler spacecraft by Erik Petigura, a graduate student at the University of California, Berkeley.

Mr. Petigura’s analysis represents a major step toward the main goal of the Kepler mission, which was to measure what fraction of sunlike stars in the galaxy have Earth-size planets. Sometimes called eta-Earth, it is an important factor in the so-called Drake equation used to estimate the number of intelligent civilizations in the universe. Mr. Petigura’s paper, published Monday in the journal Proceedings of the National Academy of Sciences, puts another smiley face on a cosmos that has gotten increasingly friendly and fecund-looking over the last 20 years.

“It seems that the universe produces plentiful real estate for life that somehow resembles life on Earth,” Mr. Petigura said.

Over the last two decades, astronomers have logged more than 1,000 planets around other stars, so-called exoplanets, and Kepler, in its four years of life before being derailed by a mechanical pointing malfunction last winter, has compiled a list of some 3,500 more candidates. The new result could steer plans in the next few years and decades to find a twin of the Earth — Earth 2.0, in the argot — that is close enough to here to study.

The nearest such planet might be only 12 light-years away. “Such a star would be visible to the naked eye,” Mr. Petigura said.

His result builds on a report earlier this year by David Charbonneau and Courtney Dressing of the Harvard-Smithsonian Center for Astrophysics, who found that about 15 percent of the smaller and more numerous stars known as red dwarfs have Earth-like planets in their habitable zones. Using slightly less conservative assumptions, Ravi Kopparapu of Pennsylvania State University found that half of all red dwarfs have such planets.

[Video: Galaxy contains billions of potentially habitable planets, say University of California at Berkeley and University of Hawaii at Manoa astronomers. Watch on YouTube.]

Geoffrey Marcy of the University of California, Berkeley, who supervised Mr. Petigura’s research and was a co-author of the paper along with Andrew Howard of the University of Hawaii, said: “This is the most important work I’ve ever been involved with. This is it. Are there inhabitable Earths out there?”

“I’m feeling a little tingly,” he said.

At a news conference Friday discussing the results, astronomers erupted in praise of the Kepler mission and its team. Natalie Batalha, a Kepler leader from the NASA Ames Research Center, described the project and its members as “the best of humanity rising to the occasion.”

According to Mr. Petigura’s new calculation, the fraction of stars with Earth-like planets is 22 percent, plus or minus 8 percent, depending on exactly how you define the habitable zone.

There are several caveats. Although these planets are Earth-size, nobody knows what their masses are and thus whether they are rocky like the Earth, or balls of ice or gas, let alone whether anything can, or does — or ever will — live on them.

There is reason to believe, from recent observations of other worlds, however, that at least some Earth-size planets, if not all of them, are indeed rocky. Last week, two groups of astronomers announced that an Earth-size planet named Kepler-78b that orbits its sun in 8.5 hours has the same density as the Earth, though it is too hot to support life.

“Nature,” as Mr. Petigura put it, “knows how to make rocky Earth-size planets.”

Also, the number is more uncertain than it might have been because Kepler’s pointing system failed before it could complete its prime survey. As a result, Mr. Petigura and his colleagues had to extrapolate from planets slightly larger than Earth and with slightly smaller, tighter orbits. For the purposes of his analysis “Earth-size” was anything from one to two times the diameter of the Earth, and Earth-like orbits were between 200 and 400 days.

Dr. Batalha said, “We don’t yet have any planet candidates that are exact analogues of the Earth in terms of size, orbit or star type.”

Though Kepler itself is sidelined while astronomers devise a new program it can accomplish with less flexible pointing ability, it has sent back so much data that there is still a whole year’s worth of results left to analyze, Dr. Batalha said. “Scientists,” she said, “are going to work on Kepler data for decades.” Kepler was launched in 2009 to perform a kind of cosmic census, monitoring the brightness of 150,000 far-off stars in the Cygnus and Lyra constellations, looking for dips in brightness when planets pass in front of them.

Mr. Petigura and his colleagues restricted themselves to a subset of some 42,000 brighter and well-behaved stars. They found 603 planets, of which 10 were between one Earth and two Earths in diameter, and circled in what Mr. Petigura defined as the habitable zone, where they would receive between a quarter of the light the Earth gets, and four times as much. In the solar system, that zone would spread from inside the orbit of Venus to just outside the orbit of Mars.
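The flux-based definition above maps directly onto orbital distances. Since starlight falls off as the inverse square of distance, a planet around a sun-like star receives a quarter of Earth's sunlight at twice Earth's distance and four times Earth's sunlight at half of it. A minimal sketch of that conversion (the 0.25x–4x bounds come from the article; the function name is illustrative):

```python
import math

# Convert the article's flux-based habitable-zone definition (between 1/4x
# and 4x the sunlight Earth receives) into orbital distances for a sun-like
# star. Flux falls off as 1/d^2, so d = 1 AU / sqrt(relative_flux).
def hz_distance_au(relative_flux):
    """Orbital distance (AU) at which a planet around a sun-like star
    receives `relative_flux` times the flux Earth gets at 1 AU."""
    return 1.0 / math.sqrt(relative_flux)

inner = hz_distance_au(4.0)   # hottest edge: 4x Earth's flux
outer = hz_distance_au(0.25)  # coldest edge: 1/4 of Earth's flux
print(f"Habitable zone: {inner:.1f} AU to {outer:.1f} AU")  # 0.5 AU to 2.0 AU
```

The resulting 0.5 to 2 AU range matches the article's description of a zone stretching from inside the orbit of Venus to just outside the orbit of Mars.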

Meanwhile, in an innovation borrowed from other data-intensive fields like particle physics, Mr. Petigura designed a computer pipeline so that he could inject fake planets into the data — 40,000 in all — and see how efficiently his program could detect planets of different sizes and orbits. In addition to that correction, he and his colleagues had to correct for geometry; only about one in 100 planet systems is aligned edge-on so that earthlings would see the telltale wink of an exoplanet transit.
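The logic of injection-recovery is simple even though the real pipeline was not: inject known fakes, measure what fraction the detector recovers, then scale up the raw tally by that completeness and by the geometric alignment odds. The toy sketch below is not the authors' actual pipeline; the detector, threshold, and numbers other than the 40,000 injections and the roughly 1-in-100 geometry figure are illustrative:

```python
import random

# Illustrative injection-recovery sketch (not the authors' actual pipeline):
# inject fake transit signals, count how many the detector finds, and use the
# recovered fraction (completeness) to correct the raw planet tally.
random.seed(42)

def toy_detector(signal_strength):
    # Stand-in for a real transit-search pipeline: weak signals are missed.
    return signal_strength > 0.5

# Inject fake "planets" with random signal strengths and measure completeness.
n_injected = 40_000  # the article quotes 40,000 injected fakes
recovered = sum(toy_detector(random.random()) for _ in range(n_injected))
completeness = recovered / n_injected

# Correct a raw detection count for both completeness and transit geometry
# (only about 1 in 100 systems is aligned edge-on, per the article).
raw_detections = 10          # Earth-size, habitable-zone candidates found
geometric_prob = 1 / 100     # fraction of systems aligned to transit
true_count = raw_detections / completeness / geometric_prob
print(f"completeness ~ {completeness:.2f}, implied true count ~ {true_count:.0f}")
```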

“It was a ton of work,” he recalled, explaining that he had to try out tens of billions of different periods for each star in order to find planets.

Sara Seager, an exoplanet astronomer at the Massachusetts Institute of Technology who was not involved in the work, said the pipeline testing had made the results believable. “I would say that small planets are everywhere and very common,” she said, “no matter how you slice and dice the data. But Kepler is dead and we have no way to get any further data. So we’ll have to be satisfied with this as the final word, for now.”

There may be other planets like ours.


We are not alone.

There are likely “tens of billions” of Earth-like planets in our Milky Way galaxy, according to a study released Monday by astronomers from the University of California-Berkeley and the University of Hawaii.


“Planets like our Earth are relatively common throughout the Milky Way galaxy,” said astronomer Andrew Howard of the University of Hawaii, who estimates the number at about 40 billion.

In fact, the nearest Earth-like planet may be “only” 12 light years away, which is roughly 72 trillion miles.

In all, about 8.8 billion stars in our galaxy have planets that are nearly the size of Earth and also have a surface temperature conducive to the development of life. But many more stars (those not similar to our sun) also have planets where life could form, which is where the 40 billion-planet figure comes from.

Like Goldilocks tasting the porridge, temperatures must be “just right” for life to develop: Planets must have a so-called “habitable zone” with “lukewarm temperatures, so that water would not be frozen into ice or vaporized into steam but instead remain a liquid, because liquid water is now understood to be the prerequisite for life,” said Geoffrey Marcy, a professor of astronomy at Berkeley.

The discovery was based on the most accurate statistical analysis yet of all the observations from the Kepler telescope, a space observatory launched in 2009 specifically designed to locate planets around other stars.

The research was based mainly on an exhaustive, three-year search of Kepler data undertaken by Erik Petigura, a graduate student at the University of California, Berkeley.

“Now, for the first time, humanity has a measure of how common Earth-size planets are around sun-like stars,” Marcy added.

Howard says the new estimate of planets means there are 40 billion chances “for life to get started and to evolve.”

“The findings are robust, but you have to read the fine print to understand that the numbers are somewhat uncertain,” noted MIT astronomer Sara Seager, who was not part of the study. “Overall the result speaks to the growing findings that small planets are everywhere.”

“For the past couple of years there has been an emerging consensus that Earth-size planets are common, so in that sense, the result is not hugely surprising,” said astronomer David Kipping of the Harvard-Smithsonian Center for Astrophysics, who was also not part of the study. “What is special about this work is the huge effort of the authors to develop a completely independent way of measuring this occurrence rate to that of the Kepler team.”

And going beyond our galaxy, Marcy reminds us that the Milky Way is just a typical galaxy within our universe, which contains hundreds of billions of galaxies, each of which has about the same number of sun-like stars as does our Milky Way.

“With tens of billions of Earth-like planets in each galaxy, our entire universe must contain billions of billions of Earth-like planets,” Marcy said.

The study was published online Monday in the Proceedings of the National Academy of Sciences using data from the Kepler telescope. The $591 million Kepler telescope is now crippled and nearing the end of its four-year mission.

Astronomers answer key question: How common are habitable planets?


UC Berkeley and University of Hawaii astronomers analyzed all four years of Kepler data in search of Earth-size planets in the habitable zones of sun-like stars, and then rigorously tested how many planets they may have missed. Based on this analysis, they estimate that 22 percent of stars like the sun have potentially habitable Earth-size planets, though not all may be rocky or have liquid water, a presumed prerequisite for life.

NASA’s Kepler spacecraft, now crippled and its four-year mission at an end, nevertheless provided enough data to complete its mission objective: to determine how many of the 100 billion stars in our galaxy have potentially habitable planets.

Based on a statistical analysis of all the Kepler observations, University of California, Berkeley, and University of Hawaii, Manoa, astronomers now estimate that one in five stars like the sun has a planet about the size of Earth with a surface temperature conducive to life.

“What this means is, when you look up at the thousands of stars in the night sky, the nearest sun-like star with an Earth-size planet in its habitable zone is probably only 12 light years away and can be seen with the naked eye. That is amazing,” said UC Berkeley graduate student Erik Petigura, who led the analysis of the Kepler data.

“It’s been nearly 20 years since the discovery of the first extrasolar planet around a normal star. Since then we have learned that most stars have planets of some size and that Earth-size planets are relatively common in close-in orbits that are too hot for life,” said Andrew Howard, a former UC Berkeley post-doctoral fellow who is now on the faculty of the Institute for Astronomy at the University of Hawaii. “With this result we’ve come home, in a sense, by showing that planets like our Earth are relatively common throughout the Milky Way galaxy.”

Petigura, Howard and Geoffrey Marcy, UC Berkeley professor of astronomy, will publish their analysis and findings online the week of Nov. 4 in the journal Proceedings of the National Academy of Sciences.

Earth-size may not mean habitable

“For NASA, this number – that every fifth star has a planet somewhat like Earth – is really important, because successor missions to Kepler will try to take an actual picture of a planet, and the size of the telescope they have to build depends on how close the nearest Earth-size planets are,” Howard said. “An abundance of planets orbiting nearby stars simplifies such follow-up missions.”

The team cautioned that Earth-size planets in Earth-size orbits are not necessarily hospitable to life, even if they orbit in the habitable zone of a star where the temperature is not too hot and not too cold.

“Some may have thick atmospheres, making it so hot at the surface that DNA-like molecules would not survive. Others may have rocky surfaces that could harbor liquid water suitable for living organisms,” Marcy said. “We don’t know what range of planet types and their environments are suitable for life.”

Last week, however, Howard, Marcy and their colleagues provided hope that many such planets actually are rocky. They reported that one Earth-size planet discovered by Kepler – albeit, a planet with a likely temperature of 2,000 Kelvin, which is far too hot for life as we know it – is the same density as Earth and most likely composed of rock and iron, like Earth.

“This gives us some confidence that when we look out into the habitable zone, the planets Erik is describing may be Earth-size, rocky planets,” Howard said.

Transiting planets

NASA launched the Kepler space telescope in 2009 to look for planets that cross in front of, or transit, their stars, which causes a slight diminution – about one hundredth of one percent – in the star’s brightness. From among the 150,000 stars photographed every 30 minutes for four years, NASA’s Kepler team reported more than 3,000 planet candidates. Many of these are much larger than Earth – ranging from large planets with thick atmospheres, like Neptune, to gas giants like Jupiter – or in orbits so close to their stars that they are roasted.
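The "one hundredth of one percent" dip can be checked with a one-line geometric argument: the fractional loss of light equals the planet-to-star area ratio, (Rp/Rs)². A hedged sketch using standard radii (the radius values are textbook figures, not from the article):

```python
# Rough check of the transit depth quoted above ("about one hundredth of one
# percent"): the fractional dip equals the planet-to-star area ratio (Rp/Rs)^2.
R_EARTH_KM = 6_371.0    # mean Earth radius
R_SUN_KM = 695_700.0    # nominal solar radius

depth = (R_EARTH_KM / R_SUN_KM) ** 2
print(f"Earth transiting the Sun dims it by {depth:.4%}")  # roughly 0.008%
```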

To sort them out, Petigura and his colleagues are using the Keck Telescopes in Hawaii to obtain spectra of as many stars as possible. This will help them determine each star’s true brightness and calculate the diameter of each transiting planet, with an emphasis on Earth-diameter planets.

Astronomers use the term “habitable zone” to indicate an orbit not so far from the star that water freezes, and not so close that water vaporizes. Habitable zones are orbital regions where the heat from the star creates lukewarm temperatures at which liquid water can exist, and liquid water is the presumed prerequisite for life. Credit: Petigura/UC Berkeley, Howard/UH-Manoa, Marcy/UC Berkeley

Independently, Petigura, Howard and Marcy focused on the 42,000 stars that are like the sun or slightly cooler and smaller, and found 603 candidate planets orbiting them. Only 10 of these were Earth-size, that is, one to two times the diameter of Earth and orbiting their star at a distance where they are heated to lukewarm temperatures suitable for life. The team’s definition of habitable is that a planet receives between four times and one-quarter the amount of light that Earth receives from the sun.


A census of extrasolar planets

What distinguishes the team’s analysis from previous analyses of Kepler data is that they subjected Petigura’s planet-finding algorithms to a battery of tests in order to measure how many habitable zone, Earth-size planets they missed. Petigura actually introduced fake planets into the Kepler data in order to determine which ones his software could detect and which it couldn’t.

“What we’re doing is taking a census of exoplanets, but we can’t knock on every door. Only after injecting these fake planets and measuring how many we actually found, could we really pin down the number of real planets that we missed,” Petigura said.

Analysis of four years of precision measurements from Kepler shows that 22±8% of Sun-like stars may have Earth-sized planets in the habitable zone. Credit: Erik A. Petigura.

Accounting for missed planets, as well as the fact that only a small fraction of planets are oriented so that they cross in front of their host star as seen from Earth, allowed them to estimate that 22 percent of all sun-like stars in the galaxy have Earth-size planets in their habitable zones.
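The orientation correction has a clean geometric form: for a randomly inclined circular orbit, the chance of a transit is roughly the stellar radius divided by the orbital distance, R_star/a. A minimal sketch (radius and AU constants are standard values, not from the article):

```python
# Hedged sketch of the geometric alignment correction: the probability that a
# randomly oriented planetary orbit transits its star, as seen from Earth, is
# roughly R_star / a (stellar radius over orbital semi-major axis).
R_SUN_KM = 695_700.0
AU_KM = 149_597_870.7

def transit_probability(a_au, r_star_km=R_SUN_KM):
    """Chance that a randomly inclined orbit of semi-major axis a_au transits."""
    return r_star_km / (a_au * AU_KM)

# For an Earth-like 1 AU orbit the odds are about 1 in 215; close-in planets,
# which dominate Kepler's sample, transit considerably more often.
p_earth = transit_probability(1.0)
print(f"Transit probability at 1 AU: {p_earth:.3%} (~1 in {1 / p_earth:.0f})")
```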

“The primary goal of the Kepler mission was to answer the question, When you look up in the night sky, what fraction of the stars that you see have Earth-size planets at lukewarm temperatures so that water would not be frozen into ice or vaporized into steam, but remain a liquid, because liquid water is now understood to be the prerequisite for life,” Marcy said. “Until now, no one knew exactly how common potentially habitable planets were around Sun-like stars in the galaxy.”


All of the potentially habitable planets found in their survey are around K stars, which are cooler and slightly smaller than the sun, Petigura said. But the team’s analysis shows that the result for K stars can be extrapolated to G stars like the sun. Had Kepler survived for an extended mission, it would have obtained enough data to directly detect a handful of Earth-size planets in the habitable zones of G-type stars.

“If the stars in the Kepler field are representative of stars in the solar neighborhood, … then the nearest (Earth-size) planet is expected to orbit a star that is less than 12 light-years from Earth and can be seen by the unaided eye,” the researchers wrote in their paper. “Future instrumentation to image and take spectra of these Earths need only observe a few dozen nearby stars to detect a sample of Earth-size planets residing in the habitable zones of their host stars.”

In January, the team reported a similar analysis of Kepler data.

How to Build a Happier Brain.


There is a motif, in fiction and in life, of people having wonderful things happen to them, but still ending up unhappy. We can adapt to anything, it seems—you can get your dream job, marry a wonderful human, finally get a million dollars or a million Twitter followers—but eventually we acclimate and find new things to complain about.

If you want to look at it on a micro level, take an average day. You go to work; make some money; eat some food; interact with friends, family or co-workers; go home; and watch some TV. Nothing particularly bad happens, but you still can’t shake a feeling of stress, or worry, or inadequacy, or loneliness.

According to Dr. Rick Hanson, a neuropsychologist, a member of U.C. Berkeley‘s Greater Good Science Center‘s advisory board, and author of the book Hardwiring Happiness: The New Brain Science of Contentment, Calm, and Confidence, our brains are naturally wired to focus on the negative, which can make us feel stressed and unhappy even though there are a lot of positive things in our lives. True, life can be hard, and legitimately terrible sometimes. Hanson’s book (a sort of self-help manual grounded in research on learning and brain structure) doesn’t suggest that we avoid dwelling on negative experiences altogether—that would be impossible. Instead, he advocates training our brains to appreciate positive experiences when we do have them, by taking the time to focus on them and install them in the brain.

I spoke with Hanson about this practice, which he calls “taking in the good,” and how evolution optimized our brains for survival, but not necessarily happiness.

“Taking in the good” is the central idea of your book. Can you explain what that is as a practice and how it works in the brain?

The simple idea is that we all want to have good things inside ourselves: happiness, resilience, love, confidence, and so forth. The question is, how do we actually grow those, in terms of the brain? It’s really important to have positive experiences of these things that we want to grow, and then really help them sink in, because if we don’t help them sink in, they don’t become neural structure very effectively. So what my book’s about is taking the extra 10, 20, 30 seconds to enable everyday experiences to convert to neural structure so that increasingly, you have these strengths with you wherever you go.

Do you want to explain how that actually works in terms of brain structure? What is the connection between having this good experience and making tangible changes in the brain?

There’s a classic saying: “Neurons that fire together, wire together.” What that means is that repeated patterns of mental activity build neural structure. This process occurs through a lot of different mechanisms, including sensitizing existing synapses and building new synapses, as well as bringing more blood to busy regions. The problem is that the brain is very good at building brain structure from negative experiences. We learn immediately from pain—you know, “once burned, twice shy.” Unfortunately, the brain is relatively poor at turning positive experiences into the neural structure of emotional learning.
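The "fire together, wire together" idea is the textbook Hebbian learning rule: a connection strengthens in proportion to correlated activity on both sides of the synapse. The toy sketch below is a standard illustration of that rule, not a model from Hanson's book:

```python
# Minimal, illustrative Hebbian update ("neurons that fire together, wire
# together"): a synaptic weight grows only when pre- and post-synaptic
# activity coincide. A textbook sketch, not a model from Hanson's book.
def hebbian_update(weight, pre, post, learning_rate=0.1):
    """Strengthen the connection in proportion to correlated activity."""
    return weight + learning_rate * pre * post

w = 0.0
for _ in range(5):  # repeated co-activation strengthens the connection...
    w = hebbian_update(w, pre=1.0, post=1.0)
print(f"weight after repeated co-firing: {w:.1f}")

for _ in range(5):  # ...while uncorrelated activity leaves it unchanged
    w = hebbian_update(w, pre=1.0, post=0.0)
print(f"weight after uncorrelated activity: {w:.1f}")
```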

On page one of the intro you said: “Positive thinking … is usually wasted on the brain.” Can you explain how positive thinking is different from taking in the good?

That’s a central, central question. First, positive thinking by definition is conceptual and generally verbal. And most conceptual or verbal material doesn’t have a lot of impact on how we actually feel or function over the course of the day. I know a lot of people who have this kind of positive, look on the bright side yappity yap, but deep down they’re very frightened, angry, sad, disappointed, hurt, or lonely. It hasn’t sunk in. Think of all the people who tell you why the world is a good place, but they’re still jerks.

I think positive thinking’s helpful, but in my view, it’s not so much positive thinking as clear thinking. I think it’s important to be able to see the whole picture, the whole mosaic of reality: both the tiles that are negative and the tiles that are neutral and positive. Unfortunately, we have brains that are incentivized toward seeing the negative tiles, so if anything, deliberately looking for the positive tiles just kind of levels the playing field. But deep down, I’m a little leery of the term positive thinking because I think it could imply that we’re overlooking the negative, and I think it’s important to face the negative.

The second reason why I think most positive thinking is wasted on the brain goes to this fundamental distinction between activation and installation. When people are having positive thinking or even most positive experiences, the person is not taking the extra 10, 20 seconds to heighten the installation into neural structure. So it’s not just positive thinking that’s wasted on the brain; it’s most positive experiences that are wasted on the brain.

Why did our brains evolve to focus on the negative?

As our ancestors evolved, they needed to pass on their genes. And day-to-day threats like predators or natural hazards had more urgency and impact for survival. On the other hand, positive experiences like food, shelter, or mating opportunities, those are good, but if you fail to have one of those good experiences today, as an animal, you would have a chance at one tomorrow. But if that animal or early human failed to avoid that predator today, they could literally die as a result.

That’s why the brain today has what scientists call a negativity bias. I describe it as like Velcro for the bad, Teflon for the good. For example, negative information about someone is more memorable than positive information, which is why negative ads dominate politics. In relationships, studies show that a good, strong relationship needs at least a 5:1 ratio of positive to negative interactions.

Positive experiences use standard memory systems: moving from short-term buffers to long-term storage. But to move from a short-term buffer to long-term storage, an experience needs to be held in that short-term buffer long enough for it to transfer to long-term storage—but how often do we actually do that? We might be having one passing, normal, everyday positive experience after another—getting something done, looking outside at flowers blooming, children laughing, chocolate tasting great—but these experiences are not transferring to storage or leading to any lasting value.

(mape_s/flickr)

When you’re trying to avoid these threats, that’s what you call, in the book, “reactive mode” for the brain. But even though we’re wired to dwell on negative things, you still say the default state is still the relaxed or “responsive mode,” right?

Let’s take the example of zebras, borrowing from Robert Sapolsky’s great book Why Zebras Don’t Get Ulcers. Zebras in the wild spend most of their time in a state of relative well-being. Sometimes they’re hungry, but often they’re in a fairly relaxed place; they’re eating grass, they’re with each other in the herd. They’re in the responsive mode of the brain, what I call the green zone. Then all of a sudden, a bunch of lions attack. All the zebras go into the reactive mode, they have this burst of fight-or-flight stress, they go into the red zone, and then this episode of stress, as Sapolsky writes, ends quickly one way or another. And then they go back to the responsive mode.

So, Mother Nature’s plan is for us to spend long periods in the responsive mode. And it’s good for animals to seek to rest in the responsive mode, which is when the body repairs itself. But we have also evolved the capacity to switch out of the responsive mode very, very quickly, for a fight or flight or freeze purpose. And then we need to learn intensely what happened, to try to avoid going there ever again. So the resting state is actually very good for humans, for our long-term physical and mental health. On the other hand, it’s very important for us to learn from our negative experiences to try to prevent them in the future.

You write that people are more likely to get stuck in the reactive mode today, but if modernity takes care of most of our basic needs, why are we more likely to be in the reactive mode today than, say, in the wild?

It’s a deep question. I think it’s easy to sentimentalize hunter-gatherer life. There was a lot about it that was very hard: there was no pain control, there was no refrigeration, there was no rule of law. Childbirth was a dangerous experience for many people. There’s a lot about modernity that’s good for the Stone Age brain. We do have the ability in the developed world—far from perfect, of course—to control pain. We have modern medicine, sanitation, flush toilets and so forth and, in many places, the rule of law. But on the other hand, modernity exposes us to chronic mild to moderate stresses, which are not good for long-term mental or physical health.

For me, one of the takeaways from that is to repeatedly internalize the sense of having our three core needs met: safety, satisfaction, and connection. By repeatedly internalizing that self-sense, we essentially grow the neural substrates of experiencing that those needs are met, even as we deal with challenges, so that we become increasingly able to manage threats or losses or rejections without tipping into the red zone.

Could you talk a little more about those core needs—safety, satisfaction, and connection, and how to meet them?

There are certain kinds of key experiences that address key issues. For example, experiences of relaxation, of calming, of feeling protected and strong and resourced, those directly address issues of our safety system. And having internalized again and again a sense of calm, a person is going to be more able to face situations at work or in life in general without getting so rattled by them, without being locked into the reactive mode of the brain.

In terms of our need for satisfaction, there are experiences of gratitude, gladness, accomplishment, of feeling successful, of feeling that there’s a fullness in your life rather than an emptiness or a scarcity. As people increasingly install those traits, they’re going to be more able to deal with issues such as loss, or being thwarted, or being disappointed.

Lastly, in terms of our need for connection, the more that people can have a sense of inclusion or a sense of being seen, or appreciated, or liked or loved; the more that people can cultivate the traits of being compassionate, kind, and loving themselves, the more that they’re going to be able to stay in a responsive mode of the brain, even if they deal with issues in this connection system like being rejected or devalued or left out by somebody else.

Do people differ in the sort of mode that they tend to be in, reactive or responsive, based on their personal history or personality?

The short answer, I’m sure, is yes. There’s a general finding in psychology that, on average, about a third of our personal characteristics are innate, and roughly two-thirds are acquired one way or another. And so, it’s true, I think, that some people are just by tendency more reactive, more sensitive, fiery. They come out of the box that way. On the other hand, anybody can gradually develop themselves over time through repeatedly internalizing positive experiences and also learning from negative ones. There’s been research on the development of resilience, as well as many anecdotal tales of people who were very reactive because they grew up in a reactive environment—a lot of poverty or chaos in their home or within the family—but then over time, become increasingly sturdy and even-keeled as they navigate the storms of life.

You said in the book that regular exercise can be a factor; can you explain how that helps?

It’s interesting, and I’m someone who doesn’t like exercise. Research shows that exercise is obviously a very good physical health factor, but it also confers mental health benefits. For example, studies show that regular exercise is, on average, roughly as effective for mild depression as medication.

People who are depressed, mildly to moderately depressed, are still having positive experiences, but they’re not changing from them; they’re not learning from them. One of the theories about why exercise seems to have such a powerful effect on depression in terms of lifting the mood, is that exercise promotes the growth of new neurons in the hippocampus, which is involved with learning—both learning from specific life experiences, as well as learning how to put things into context, see things in the bigger picture. It’s possible that as exercise promotes the growth of neurons in the hippocampus, people become more able to cope with life and make use of positive experiences.

Taking in the good seemed like something you started to do on your own in college, and then later you found that research supported the practice, is that right?

A lot of people stumble upon something that works for them, and then later on they find out there’s a lot of research that’s related to it. For me, the research that’s relevant is on learning, both cognitive learning and especially emotional learning. How do people grow psychologically? The research on that shows that it’s a two-stage process of activation and installation. Also as a long-time clinician, I began to think about how relatively good we are as clinicians at activating positive mental states, but how bad we generally are at helping people actually install those activated states into neural structure. That was a real wake-up call for me, as a therapist.

You include a lot of testimonials, examples from people in the book. Is this something you do in your work with your patients?

Yeah, definitely. It’s changed the way I do therapy and more generally it’s changed the way I talk with people in life in general. Let me turn it around, to go back to your question about modernity. On the one hand, due to modernity, many people report that moment to moment, they’re having fairly positive experiences, they’re not being chased by lions, they’re not in a war zone, they’re not in agonizing pain, they have decent medical care. And yet on the other hand, many people today would report that they have a fundamental sense of feeling stressed and pressured and disconnected from other people, longing for closeness that they don’t have, frustrated, driven, etc. Why is that? I think one reason is that we’re simply wasting the positive experiences that we’re having, in part due to modernity, because we’re not taking into account that design bug in the Stone Age brain that it doesn’t learn very well.

How Staying Mentally Fit Can Make a Difference


Your brain isn’t a muscle, but you can treat it like one

Many people focus on physical fitness, but few know that brain fitness is also something you can work on. In fact, you can exercise your brain as often as you would your arms or abs – and the results can be positive and empowering.


It’s helpful to think of your brain as you would a muscle. To improve your brain, you can’t simply repeat the same exercises over and over. Just as lifting a two-pound weight will cease to challenge you, so will repetitive exercises such as crosswords or Sudoku. Once you master easy exercises, you must move on to harder ones in order to push your brain—like your muscles—to a new level.

This is based on your brain’s innate neuroplasticity, or its ability to grow and change in response to new challenges. In other words, the right types of stimulating exercises can physically change your brain.

The science behind brain training

Scientists once believed that your mental abilities were fixed in adulthood. Now that studies on neuroplasticity have shown just the opposite, millions of people around the world have adopted the new practice of brain training.

The most popular of these brain training products is made by the San Francisco-based Lumosity, which employs a team of in-house neuroscientists with degrees from Stanford and UC Berkeley.

Realizing that brains need more sophisticated programs and guidance to grow and change, Lumosity’s scientists work with an experienced team of game designers. Together they’ve developed a fun, effective online brain training program that measures, tracks, and adapts to your progress so you’ll always be challenged.

Lumosity’s training algorithm and 40+ games are based on well-studied tests used in clinical neuropsychology research.

Promising studies on the effects of brain training

In a 2013 Stanford study, a treatment group of 21 breast cancer survivors used 12 weeks of Lumosity training to work on processing speed, mental flexibility, and working memory tasks. On average, those who trained improved on tests of these abilities, compared to a group that did not train with Lumosity.

There is even some preliminary evidence suggesting that Lumosity may be beneficial to normal, healthy adults. In a 2011 study by Lumosity and San Francisco State University researchers, 13 people who did Lumosity training over 5 weeks improved on tests of brain performance compared to a group that did not train. On average, those who trained improved working memory scores by 10% and attention scores by 20%.

Brain training is designed to address real-life needs

The goal of brain training is not to improve game scores: it’s to improve the underlying core abilities that those games rely on. Neuroscientists like those at Lumosity design brain games meant to translate into real-life benefits; with continued testing and research, the body of evidence behind brain training continues to grow.

Better attention, for example, can mean greater focus in the classroom or at an important business meeting. With improved processing speed, you might react and adapt faster to the demands of a busy life. And a better memory could mean stronger, longer relationships with the people closest to you.

Brain training is an investment

Training can take just a few minutes a day, but the rewards can make a difference in many aspects of life.

GM yeast brews fuel from rubbish.


US researchers have used genetically modified yeast to enhance the production of biofuels from waste materials.

The new method solves some of the problems in using waste like straw to make bioethanol fuel.

The scientists involved say the development could help overcome reservations about using land for fuel production.

The research is published in the journal Nature Communications.


Many states around the world have plans to replace gasoline with bioethanol, but this has typically meant switching land use from food crops to biofuel crops.


Just this week, a representative of South Africa’s farming community announced that sorghum harvests would need to increase fivefold to meet their government’s commitment to incorporate at least 2% bioethanol in petrol.

Sorghum is South Africa’s second biggest summer crop and is a staple food as well as being used in brewing and livestock feed.

However, scientists are now seeking more sustainable routes to generating biofuel – routes that would have a lighter impact on food prices and production.

Breakdown breakthrough

One is to consider using non-conventional plants such as seaweed. But among the most radical ideas is the suggestion that biowastes should be used to produce bioethanol, which is added to petrol, replacing some fossil fuel.

“Wastes present a major opportunity in this respect. We have to start to think about wastes, such as sewage or landfill waste as resources – not problems to be disposed of,” Dr Gavin Collins, an environmental microbiologist at the National University of Ireland, Galway, told BBC News.

Using microbes to make fuel from biomass involves breaking down large complex biopolymer molecules.

These are indigestible to most bugs, and attempts to incorporate them into fuel production have slowed down the biotechnology, creating bottlenecks.

Biofuel boom


The European Union also has a declared aim that 10% of its transport energy should be from renewable sources, such as biofuels, by 2020.

To help meet this target, Europe’s largest biofuel plant opened this week at Crescentino, Italy.

It is designed to generate 75 million litres of ethanol a year from straw and a crop called Arundo donax, which can be grown on marginal land and does not compete with food production for resources.

One chemical that is produced when processing biowastes is a large sugar molecule called xylose.

When you try to use yeast to ferment xylose, rather than making alcohol for fuel directly, it generates acetic acid – essentially vinegar. This is poisonous to the yeast, and stops the fermentation.

Breaking down xylose and making acetic acid non-toxic are the two major problems that must be solved if biowastes such as straw are to be fermented to make fuel.

Now, US biotechnologists appear to have solved both problems, by developing a genetically engineered strain of yeast that simultaneously breaks down xylose and converts acetic acid to fuel.

“Xylose is a sugar; we can engineer yeast to ferment xylose,” said University of Illinois Prof Yong-Su Jin, one of the authors of the study.

“However, acetic acid is a toxic compound that kills yeast. That is one of the biggest problems in cellulosic ethanol production.”

The yeast digests the sugars in oxygen-poor conditions, making the process more efficient than digesters that rely on active mixing of air into the system.

Microbe driven

A new pathway, not yet discovered in nature, has been genetically engineered in the lab. This breakthrough means yeasts can be used much more efficiently to convert biowaste into biofuel.

“We sort of rebuilt how yeast uses carbon,” said principal investigator Dr Jamie Cate, of the University of California at Berkeley.

One hurdle to implementing the discovery is that the new yeast that has been developed is genetically modified, and it is not yet clear how easily GM yeasts might be accepted for use on an industrial scale.

Dr Gavin Collins, however, remains upbeat about the prospects for biotechnology.

“We probably know the function of only about 0.01% of all living microbes on Earth,” he said.

“It may be that many of them can efficiently degrade even complex plant material and other wastes under anaerobic conditions. They may be present in nature but we haven’t found them yet.

“However, just look at what we have been able to do with the small fraction of microbes we understand – everything from antibiotic production; food and alcohol production; and biofuel production.

“Just think what we could do, or what we might discover, if we understood the function of just another 1%.”

Climate Change and Violence.


An analysis of 60 studies finds that warmer temperatures and extreme rainfall lead to a rise in violence.


In the coming decades, the world’s changing climate could herald a rise in violence at every scale—from individuals to nations, from assault to war—according to a comprehensive new analysis of the link between climate and conflict.

Analyzing data from 60 earlier studies, Solomon Hsiang from the University of California, Berkeley, found that warmer temperatures and extremes in rainfall can substantially increase the risk of many types of conflict. For every standard deviation of change, levels of interpersonal violence, such as domestic violence or rape, rise by some 4 percent, while the frequency of intergroup conflict, from riots to civil wars, rises by 14 percent. Global temperatures are expected to rise by at least two standard deviations by 2050, with even bigger increases in the tropics.
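As a rough illustration, the per-standard-deviation effect sizes quoted above can be turned into a back-of-envelope projection. The sketch below assumes, purely for illustration, that each standard deviation of climate change multiplies conflict risk by the same factor (the underlying studies may model the relationship differently), and the function name is hypothetical, not from the paper.

```python
# Back-of-envelope projection using the effect sizes quoted in the article:
# ~4% per standard deviation for interpersonal violence, ~14% per SD for
# intergroup conflict, and a projected rise of at least 2 SDs by 2050.
# Assumes the per-SD effect compounds multiplicatively (an assumption).

def projected_increase(pct_per_sd: float, sds: float) -> float:
    """Total percentage increase if each SD of climate change
    multiplies conflict risk by (1 + pct_per_sd / 100)."""
    return ((1 + pct_per_sd / 100) ** sds - 1) * 100

interpersonal = projected_increase(4, 2)   # about 8.2%
intergroup = projected_increase(14, 2)     # about 30.0%
print(f"interpersonal: +{interpersonal:.1f}%, intergroup: +{intergroup:.1f}%")
```

Under that multiplicative reading, a two-standard-deviation shift implies roughly an 8 percent rise in interpersonal violence and a 30 percent rise in intergroup conflict, which conveys the scale of the figures the article reports.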

“The paper is remarkably strong,” said Thomas Homer-Dixon, an environmental and political scientist at the University of Waterloo in Ontario, who was not involved in the study. “[It means] the world will be a very violent place by mid-century if climate change continues as projected.”

Hsiang, together with UC Berkeley colleagues Marshall Burke and Edward Miguel, focused only on studies that provided the strongest evidence for a causal connection. “The ideal thing would be to take two identical Earths, heat one up and watch how conflict evolves,” said Hsiang dryly. “We can’t do that, so we looked for these natural experiments.”

The researchers ignored any studies that compared levels of conflict between different countries, which also differ in their history, culture, and politics. Instead, they focused on data that revealed how violence rises and falls in a single place as climate changes.

For example, crime statistics in the United States reveal that the number of rapes, murders, or assaults increases on a hot day. Civil conflicts in the tropics become twice as frequent during the hot and dry years caused by El Niño events. Farmers in Brazil are more likely to invade each other’s land if they have a particularly wet or dry year. And Chinese dynasties all collapsed during long dry periods.

The team analyzed these studies and more using a common statistical framework to control for any biases on the part of the individual authors. Together, the data sets stretch back to 10,000 BC, and cover all major world regions. They represent the collective efforts of more than 190 researchers working across varied disciplines, from psychologists looking at the effects of temperature on aggressive behavior to archaeologists studying levels of violence in ancient civilizations.

Despite this diversity, “we were shocked at how well the results from all these fields lined up,” said Hsiang. “Given how some people had been talking, we thought they’d be all over the map,” but the data consistently showed that temperature and rainfall affect violence, across locations, times and disciplines.

“This is a contested area and the convergence of results in this meta-analysis represents a significant step forward,” said Neil Adger, an environmental geographer at the University of Exeter, who was not involved in the study. He notes that responses to climate change, such as “widespread growing of biofuels or displacing populations to build dams” could exacerbate any increased propensity for conflict. “The impacts of climate change will factor large in the future, especially if the world is already fractured and unequal,” he said.

David Zhang, a geographer from the University of Hong Kong who was not involved in the study, said the results were robust and important. However, he noted that most of the data sets cover the last century, and the effects of climate on conflict may have differed across centuries of human history. Hsiang acknowledges this gap, but said that a few century-spanning data sets have found similar trends.

“Another obvious criticism is that older societies may not be a good analogue for modern ones,” Hsiang added. For example, technological advances might help us to adapt to changing climate more effectively than past generations.

Hsiang also emphasized that climate is just one of many factors that influence the frequency of conflict, and that his study does not address why such a link between conflict and climate exists. Shifts in climate could change the availability of important resources like water or crops, leading to failing economies, weaker governments, and more incentives to fight or rebel. They could also lead to mass migration, rapid urbanization, or growing inequalities.

“We now want to understand what the underlying mechanisms are,” said Hsiang. “If we understand them, we could come up with policies that could decouple climate and conflict, which might help society to adapt. That’s a good reason to push the research ahead.”

Source: http://www.the-scientist.com

Scientists Capture First Images of Molecules Before and After Reaction.


Every chemist’s dream — to snap an atomic-scale picture of a chemical before and after it reacts — has now come true, thanks to a new technique developed by chemists and physicists at the University of California, Berkeley.

Using a state-of-the-art atomic force microscope, the scientists have taken the first atom-by-atom pictures, including images of the chemical bonds between atoms, clearly depicting how a molecule’s structure changed during a reaction. Until now, scientists have only been able to infer this type of information from spectroscopic analysis.

“Even though I use these molecules on a day to day basis, actually being able to see these pictures blew me away. Wow!” said lead researcher Felix Fischer, UC Berkeley assistant professor of chemistry. “This was what my teachers used to say that you would never be able to actually see, and now we have it here.”

The ability to image molecular reactions in this way will help not only chemistry students as they study chemical structures and reactions, but will also show chemists for the first time the products of their reactions and help them fine-tune the reactions to get the products they want. Fischer, along with collaborator Michael Crommie, a UC Berkeley professor of physics, captured these images with the goal of building new graphene nanostructures, a hot area of research today for materials scientists because of their potential application in next-generation computers.

“However, the implications go far beyond just graphene,” Fischer said. “This technique will find application in the study of heterogeneous catalysis, for example,” which is used widely in the oil and chemical industries. Heterogeneous catalysis involves the use of metal catalysts like platinum to speed reactions, as in the catalytic converter of a car.

“To understand the chemistry that is actually happening on a catalytic surface, we need a tool that is very selective and tells us which bonds have actually formed and which ones have been broken,” he added. “This technique is unique out there right now for the accuracy with which it gives you structural information. I think it’s groundbreaking.”

“The atomic force microscope gives us new information about the chemical bond, which is incredibly useful for understanding how different molecular structures connect up and how you can convert from one shape into another shape,” said Crommie. “This should help us to create new engineered nanostructures, such as bonded networks of atoms that have a particular shape and structure for use in electronic devices. This points the way forward.”

Fischer and Crommie, along with other colleagues at UC Berkeley, in Spain and at the Lawrence Berkeley National Laboratory (LBNL), published their findings online May 30 in the journal Science Express.

From shadow to snapshot

Traditionally, Fischer and other chemists conduct detailed analyses to determine the products of a chemical reaction, and even then, the actual three-dimensional arrangement of atoms in these products can be ambiguous.

“In chemistry you throw stuff into a flask and something else comes out, but you typically only get very indirect information about what you have,” Fischer said. “You have to deduce that by taking nuclear magnetic resonance, infrared or ultraviolet spectra. It is more like a puzzle, putting all the information together and then nailing down what the structure likely is. But it is just a shadow. Here we actually have a technique at hand where we can look at it and say this is exactly the molecule. It’s like taking a snapshot of it.”

Fischer is developing new techniques for making graphene nanostructures that display unusual quantum properties that could make them useful in nano-scale electronic devices. The carbon atoms are in a hexagonal arrangement like chicken wire. Rather than cutting up a sheet of pure carbon — graphene — he hopes to place a bunch of smaller molecules onto a surface and induce them to zip together into desired architectures. The problem, he said, is how to determine what has actually been made.

That’s when he approached Crommie, who uses atomic force microscopes to probe the surfaces of materials with atomic resolution and even move atoms around individually on a surface. Working together, they devised a way to chill the reaction surface and molecules to the temperature of liquid helium — about 4 Kelvin, or roughly 269 degrees Celsius below zero — which stops the molecules from jiggling around. They then used a scanning tunneling microscope to locate all the molecules on the surface, and zeroed in on several to probe more finely with the atomic force microscope. To enhance the spatial resolution of their microscope they put a single carbon monoxide molecule on the tip, a technique called non-contact AFM first used by Gerhard Meyer and collaborators at IBM Zurich to image molecules several years ago.

After imaging the molecule — a “cyclic” structure with several hexagonal rings of carbon that Fischer created especially for this experiment — Fischer, Crommie and their colleagues heated the surface until the molecule reacted, and then again chilled the surface to 4 Kelvin and imaged the reaction products.

“By doing this on a surface, you limit the reactivity but you have the advantage that you can actually look at a single molecule, give that molecule a name or number, and later look at what it turns into in the products,” he said.

“Ultimately, we are trying to develop new surface chemistry that allows us to build higher ordered architectures on surfaces, and these might lead into applications such as building electronic devices, data storage devices or logic gates out of carbon materials.”

The research is coauthored by Dimas G. de Oteyza, Yen-Chia Chen, Sebastian Wickenburg, Alexander Riss, Zahra Pedramrazi and Hsin-Zon Tsai of UC Berkeley’s Department of Physics; Patrick Gorman and Grisha Etkin of the Department of Chemistry; and Duncan J. Mowbray and Angel Rubio from research centers in San Sebastián, Spain. Crommie, Fischer, Chen and Wickenburg also have appointments at Lawrence Berkeley National Laboratory.

The work is sponsored by the Office of Naval Research, the Department of Energy and the National Science Foundation.

Source: physics.org

Tiny robot flies like a fly.


Engineers create first device able to mimic full range of insect flight.

A robot as small as a housefly has managed the delicate task of flying and hovering the way the actual insects do.

“This is a major engineering breakthrough, 15 years in the making,” says electrical engineer Ronald Fearing, who works on robotic flies at the University of California, Berkeley. The device uses layers of ultrathin materials that can make its wings flap 120 times a second, which is on a par with a housefly’s flapping rate. This “required tremendous innovation in design and fabrication techniques”, he adds.

The robot’s wings are composed of thin polyester films reinforced with carbon fibre ribs and its ‘muscles’ are made from piezoelectric crystals, which shrink or stretch depending on the voltage applied to them.

Kevin Ma and his colleagues, all based at Harvard University in Cambridge, Massachusetts, describe their design today in Science.

The tiny components, some of which are just micrometres across, are extremely difficult to make using conventional manufacturing technologies, so the researchers came up with a folding process similar to that used in a pop-up book. They created layers of flat, bendable materials with flexible hinges that enabled the three-dimensional structure to emerge in one fell swoop. “It is easier to make two-dimensional structures and fold them into three dimensions than it is to make three dimensional structures directly,” explains Ma.

Manufacturing marvel

“The ability to manufacture these little flexure joints is going to have implications for a lot of aspects of robotics that have nothing to do with making a robotic fly,” notes Michael Dickinson, a neuroscientist at the University of Washington in Seattle.

The work “will also lead to better understanding of insect flapping wing aerodynamics and control strategies” because it uses an engineering system “that can be more easily modified or controlled than an animal”, Fearing adds.

Weighing in at just 80 milligrams, the tiny drone cannot carry its own power source, so has to stay tethered to the ground. It also relies on a computer to monitor its motion and adjust its attitude. Still, it is the first robot to deploy a fly’s full range of aerial motion, including hovering.

The biggest technical obstacle to independent flight is building a battery that is small enough to be carried by the robotic fly, says Fearing. At present, the smallest batteries with enough power weigh about half a gram — ten times more than what the robotic fly can support. Ma says he believes that the battery obstacle might be overcome in 5-10 years.

If researchers can come up with such a battery, and with lightweight onboard sensors, Ma says that the robots could be useful in applications such as search and rescue missions in collapsed buildings, or as ways to pollinate crops amid dwindling bee populations.

Source: Nature