Chin strap turns chewing into power


 

Could you power a hearing aid by chewing gum?

Engineers in Canada have built a chin strap that harnesses energy from chewing and turns it into electricity.

They say the device could one day take the place of batteries in hearing aids, earpieces and other small gadgets.

Made from a “smart” material that becomes electrically charged when stretched, the prototype needs to be made 20 times more efficient in order to generate useful amounts of power.

The researchers claim they can achieve this by adding layers of the material.

Their work appears in the Institute of Physics journal Smart Materials and Structures.

Dr Aidin Delnavaz and Dr Jeremie Voix, mechanical engineers at the École de Technologie Supérieure in Montreal, Canada, suggest that jaw movements are a promising candidate for harvesting natural energy.

The pair, who work on auditory technology like powered ear-muffs and cochlear implants, are keen to put that energy to work, and decrease reliance on disposable batteries.

“We went through all the available power sources that are there,” Dr Voix told the BBC. These included the heat found inside the ear canal, and the overall movement of the head, which might have been used in a similar way to the wrist movements that power automatic watches.

In other experiments, they also explored the movements that the jaw creates inside the ear canal itself.

“But on the way, we realised that when you’re moving your jaw, the chin is really moving the furthest,” said Dr Voix. “And if you happen to be wearing some safety gear… then obviously the chin strap could be actually harvesting a lot of energy.”

For testing, the strap was fitted snugly to Dr Delnavaz’s chin and attached to a set of earmuffs

So he and Dr Delnavaz decided to try and harvest energy from the chewing chin, using what is called the “piezoelectric effect”: when certain materials are pressed or stretched (“piezo” comes from the Greek word for squeeze), they acquire an electrical charge.

By making a strap from commercially available piezoelectric material, then attaching it to earmuffs and fitting it snugly around Dr Delnavaz’s chin, the researchers built a prototype. And sure enough, when Dr Delnavaz chewed gum for 60 seconds, they measured up to 18 microwatts of generated power.

This might not sound like much – and indeed, even powering something as small as a hearing aid would require perhaps twenty of these straps. But Dr Delnavaz said this could be achieved by bundling up more of the material.

“We can multiply the power output by adding more PFC layers to the chin strap,” he said. (PFC refers to the “piezoelectric fibre composites” used in the strap.)

“For example, 20 PFC layers, with a total thickness of 6mm, would be able to power a 200 microwatt intelligent hearing protector.”
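To make the arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. The linear scaling with layer count is an assumption for illustration (the article gives only the single-layer measurement and the 20-layer projection), and the figures are peak values rather than sustained output:

```python
# Back-of-the-envelope check of the chin-strap scaling claim.
# Assumption: peak power scales roughly linearly with PFC layer count.

SINGLE_LAYER_PEAK_UW = 18        # peak power measured from one layer (microwatts)
TOTAL_THICKNESS_MM = 6           # quoted thickness of a 20-layer strap
TARGET_POWER_UW = 200            # budget for the "intelligent hearing protector"
CHEWING_ENERGY_J_PER_DAY = 580   # article: daily mechanical energy from chewing

def projected_peak_uw(layers: int) -> float:
    """Naive linear projection of peak power for a multi-layer strap."""
    return layers * SINGLE_LAYER_PEAK_UW

layers = 20
print(f"{layers} layers ({TOTAL_THICKNESS_MM} mm): "
      f"~{projected_peak_uw(layers):.0f} uW peak, "
      f"target met: {projected_peak_uw(layers) >= TARGET_POWER_UW}")

# For scale: the total mechanical energy available from a day of chewing,
# averaged over 24 hours, dwarfs these microwatt-level harvests.
print(f"average chewing power: ~{CHEWING_ENERGY_J_PER_DAY / 86_400 * 1e3:.1f} mW")
```

Even this optimistic projection lands comfortably above the 200 microwatt budget, consistent with the researchers' estimate; the gap between milliwatts of available chewing power and microwatts of harvested power is also why they are investigating more efficient materials.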

The strap would still be perfectly comfortable, Dr Delnavaz said. He wore the prototype version “for many hours” to test it out, and never felt his chewing or talking were restricted.

Chewing can produce about 580 joules of energy in a day

“We showed in this paper, it’s not necessary to have a stiff strap,” he explained. “A loose strap is enough to harvest energy.”

The team is also investigating other, more efficient starting materials. But even with these improvements, the idea will probably never be transferred to more power-hungry devices.

“You could power cochlear implants and things like that,” commented Prof Steve Beeby from the University of Southampton. “But it’s not going to be useful for recharging your mobile phone or anything like that.”

Dr Voix agrees: his vision is mostly for situations where people are already wearing a strap, and could plug in a small but essential gadget. People who work with heavy machinery, for example, wear a helmet and also need ear protectors.

He also pointed to military applications, such as soldiers wearing head protection and communicating using earpieces – and even one or two consumer products.

“I cycle to work every day, I wear my helmet… Why not have my bluetooth dongle recharged by that strap?”

These possibilities remain distant for now, although the team has already been approached by companies interested in new charging solutions for bluetooth headsets.

“This is just a proof of concept,” Dr Voix emphasised. “The power is very limited at the moment.”

Blindness, by those in the know


 

A blind person holding a cane, about to cross a road

There are a small number of questions that blind people seem to get asked regularly. But here are five lesser-known things about blindness from those who know.

Can blind people hear better than sighted people?

There is an often-quoted view that a blind person’s remaining four senses are heightened to compensate for their lack of vision. In popular culture, sightless superhero Daredevil makes use of his super senses to save the world, and in the film Scent of a Woman, Al Pacino’s blind character could tell one perfume from another at the drop of a hat.

Many blind people feel their hearing is no better than sighted people’s – it’s just that they have to listen more intently to sounds around them. They gauge distance and direction of traffic by ear to avoid being hit by a car, and will tune into announcements at stations to find out which platform their train is on. Sighted people are more likely to focus on the display boards when travelling.

But there is some evidence to support the heightened senses theory. Research at the University of Montreal in 2012 suggests that a blind person’s brain does re-wire itself to use the visual cortex. Normally preoccupied with seeing, it’s hijacked to improve the processing of other information such as sound and touch.

Many blind people use reflected sound waves to build a mental picture of their surroundings (similar to bats and dolphins) in a process known as echolocation. Most use it all the time without realising, to avoid walking into things. Others claim to be able to tell an object’s distance, size, texture and density by clicking their tongue against the roof of their mouth about three times per second and are able to go hiking and cycling without a white cane or a dog.

Can blind people see in their dreams?

Two people asleep in bed, a man and woman

People who were born blind have no understanding of how to see in their waking lives, so they can’t see in their dreams. But most blind people lose their sight later in life and can dream visually. Danish research in 2014 found that as time passes, a blind person is less likely to dream in pictures.

The same research says that people who are born blind have more nightmares than sighted people. The theory is that nightmares are mental rehearsals of potentially distressing events, and they can help develop coping mechanisms. For example, blind people in the study reported dreaming about getting lost, being hit by a car or losing their guide dog.

How do blind people choose their clothes?

Over time, many blind people will get a feel for the shape and style of clothes that suit them and they will tend to shop with trusted people.

The fashion-conscious blind person puts considerable energy into ensuring that their outfits match, but technology is often needed for differentiating between colours. A colour detector is a talking gadget which, when pressed against a piece of clothing for a second or two, loudly announces “light olive green” or “dark blue” in a posh English accent. These detectors aren’t totally accurate and tend to be used occasionally when sorting laundry and checking items which feel similar.

Blind people have various systems for keeping track of their clothes. Some will sew different shaped buttons on to labels to denote colours. Others might cut the labels in various ways. Some favour the Pen Friend, a barcode reader with labels that can be loaded with information about the clothing, including colour and washing instructions. Others will just try and remember the information or buy clothes that all match.

What do guide dogs actually do?

Contrary to popular belief, guide dogs do not tell their owner when it is time to cross the road and they do not take their owner where they want to go based on an instruction such as “find the shops”.

Guide dogs walk in a straight line, always on the left of a person, and are trained to keep an eye on their owner’s right shoulder to protect against collisions. They avoid obstacles and stop at kerbs. They know their left from right. Sometimes dogs might lead their owner into overhanging branches because it’s trickier for them to judge overhead obstacles. It all takes practice. It’s a partnership, and owners often consider they’re driving the dog rather than being led by it.

A guide dog lying down asleep in the middle of a group of people

Unofficially, guide dogs can provide good companionship for isolated blind people. Their presence can help owners feel safer while out and about. And of course, a dog can be a good ice-breaker in a social situation.

How do blind people use computers and smartphones?

Blind and visually impaired people use computer technology in three ways. Some, who can see a bit, can get software that magnifies everything on the screen to a size they can easily read.

Totally blind people have two options. A Braille display can sit underneath a keyboard and provide a tactile version of words on the screen, one line at a time. But less than 1% of the two million people with vision problems in the UK can read Braille, and anyway, the display can cost thousands of pounds.

A far more popular option is a screenreader – software which reads the screen in an intelligent way, using a synthetic voice. Voices are improving in quality all the time but many old-school blind computer users stick to the one that sounds like Stephen Hawking, because it can be understood at a fast speed and because they’re used to its pronunciation. After a while, users stop noticing what their screenreader sounds like and crank it up to a speed that’s unintelligible to the average person. Some use both Braille and speech together.
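As a toy illustration of that speed preference, here is a minimal sketch using the pyttsx3 Python text-to-speech library. The library choice and the rate values are mine, not anything the article names; real screenreaders such as JAWS or NVDA are full applications, not a few lines of scripting:

```python
# Minimal text-to-speech sketch: the same sentence at a typical rate
# and at the much faster rate experienced screenreader users prefer.
# Requires: pip install pyttsx3
import pyttsx3

engine = pyttsx3.init()
text = "Screenreader users often listen at speeds most people find unintelligible."

for rate in (180, 450):   # words per minute: conversational vs. power-user
    engine.setProperty("rate", rate)
    engine.say(text)      # queues the utterance at the current rate
engine.runAndWait()       # plays both renditions back to back
```

Played back to back, the second rendition gives a sighted listener some sense of how dense the audio stream becomes once a user stops noticing the voice itself.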

The Cancer-Causing Metal Millions Eat, Wear or Have Injected Into Their Kids


Aluminum is considered by most health authorities perfectly acceptable to eat, wear as an antiperspirant, and inject into your body as a vaccine adjuvant, but research indicates it has cancer-causing properties, even at levels 100,000 times lower than those found in certain consumer products.

A concerning study published in the Journal of Inorganic Biochemistry demonstrates clearly that exposure to aluminum can increase the migratory and invasive properties of human breast cancer cells. This has extremely important implications, because mortality from breast cancer is caused by the spread of the tumor, not by the presence of the primary tumor in the breast itself. This profound difference, in fact, is why a groundbreaking National Cancer Institute-commissioned expert panel called for the complete reclassification of some types of non-progressive ‘breast cancer’ and ‘prostate cancer’ as essentially benign lesions: bittersweet news for the millions who were already misdiagnosed/overdiagnosed and mistreated/overtreated for ‘cancer’ over the past 30 years.


Another recent relevant study, also published in the Journal of Inorganic Biochemistry, found increased levels of aluminum in noninvasively collected nipple aspirate fluids from 19 breast cancer patients compared with 16 healthy control subjects. The researchers commented on their findings: “In addition to emerging evidence, our results support the possible involvement of aluminium ions in oxidative and inflammatory status perturbations of breast cancer microenvironment, suggesting aluminium accumulation in breast microenvironment as a possible risk factor for oxidative/inflammatory phenotype of breast cells.”[1]

A key implication of this research is that the common ingestion (food additive), injection (vaccine adjuvant), and topical application (antiperspirant) of forms of aluminum may be contributing to the burgeoning cancer epidemic in exposed populations. Given this possibility, the further use of aluminum in foods, cosmetics and drugs should be halted until adequate risk assessments thoroughly proving its safety can be made. (Since the US does not use the precautionary principle to guide risk assessments and regulation, instead opting for a chemical- and drug-industry-favoring “weight of evidence” standard, this likely will not happen; however, we can use this information to apply the precautionary principle in our own lives.)

When it comes to aluminum’s presence in antiperspirant formulas, a very concerning study published last year in the Journal of Applied Toxicology identified the primary form of aluminum used in underarm cosmetics – aluminum chloride – as capable of altering breast cancer cells in a way indicative of ‘neoplastic transformation,’ or, the transformation of a healthy cell into a cancerous one:

These results suggest that aluminium is not generically mutagenic, but similar to an activated oncogene [cancer-causing gene], it induces proliferation stress, DSBs and senescence in normal mammary epithelial cells; and that long-term exposure to AlCl(3) generates and selects for cells able to bypass p53/p21(Waf1) -mediated cellular senescence. Our observations do not formally identify aluminium as a breast carcinogen, but challenge the safety ascribed to its widespread use in underarm cosmetics.

Even more disturbing was their finding that these changes, which included “contact inhibition and anchorage-independent growth” (two markers of malignancy), were caused by concentrations “…up to 100,000-fold lower than those found in antiperspirants, and in the range of those recently measured in the human breast.”[2]

This study dovetails with recent research demonstrating that aluminum binds to cellular estrogen receptors, indicating it may disrupt and/or drive proliferation within hormone-sensitive tissues. One research team coined a new term – “metalloestrogen” – to describe an entirely new class of metal-based endocrine disrupters, including aluminum, antimony, arsenite, barium, cadmium, chromium (Cr(II)), cobalt, copper, lead, mercury, nickel, selenite, tin and vanadate. This reclassification of what were formerly perceived to be hormonally inert substances should help to alert consumers to the significant health risk associated with the use of ‘unnatural’ products containing these elements.

While there is little extant animal research demonstrating aluminum’s cancer-causing properties, which is why the metal itself has not yet been classified with respect to carcinogenicity, “aluminum production” has been classified as carcinogenic to humans by the International Agency for Research on Cancer (IARC).[3] There is also a 2011 study published in the Journal of Applied Toxicology that found aluminum content is higher in the nipple aspirate fluid of breast cancer-affected women versus healthy controls.

Aluminum, of course, is widely distributed within our environment (reaching, at present, the highest level in documented history), and has even been implicated in atmospheric aerosols (i.e. geoengineering/‘chemtrails’), which, incidentally, may be one reason why our soils are becoming saturated with the metal to levels toxic to plants, and why biotech corporations are presently working on developing aluminum-tolerant GM plants.

Because our regulators consider aluminum perfectly safe to eat, apply topically, and inject into our bodies to “improve natural immunity,” the emerging view of aluminum as possessing cancer-causing effects will put additional responsibility on consumers to educate themselves and make choices to protect themselves from avoidable exposure.

Education equals empowerment. Learn more about how to protect yourself against aluminum by reading the following articles:

Is Eating and Injecting Aluminum Safe As Our Regulators Say?

Can We Continue To Justify Injecting Aluminum Into Our Children?

Article Resources

[1] F Mannello, D Ligi, M Canale. Aluminium, carbonyls and cytokines in human nipple aspirate fluids: Possible relationship between inflammation, oxidative stress and breast cancer microenvironment. J Inorg Biochem. 2013 Jul 12. Epub 2013 Jul 12. PMID: 23916117

[2] André-Pascal Sappino, Raphaële Buser, Laurence Lesne, Stefania Gimelli, Frédérique Béna, Dominique Belin, Stefano J Mandriota. Aluminium chloride promotes anchorage-independent growth in human mammary epithelial cells. J Appl Toxicol. 2012 Jan 6. Epub 2012 Jan 6. PMID: 22223356

[3] Daniel Krewski, Robert A Yokel, Evert Nieboer, David Borchelt, Joshua Cohen, Jean Harry, Sam Kacew, Joan Lindsay, Amal M Mahfouz, Virginie Rondeau. Human health risk assessment for aluminium, aluminium oxide, and aluminium hydroxide. J Toxicol Environ Health B Crit Rev. 2007 ;10 Suppl 1:1-269. PMID: 18085482

Research Proves Wheat Can Cause Harm To Everyone’s Intestines


The myth that you need to have ‘bad genes’ to experience intestinal damage from consuming wheat was disproved years ago.

It is a common myth that wheat causes immune-mediated intestinal damage only in those with a rare, genetically based aberration called celiac disease. Research from 2007 that is still relatively unknown clearly demonstrated that everyone’s body likely experiences adverse intestinal effects from gluten (gliadin) exposure.

As far as celiac disease, the specific mechanisms by which wheat causes damage are well-known, and they go like this…


In celiac disease, an alcohol-soluble wheat storage protein known as gliadin is partially degraded (i.e. deamidated) by the enzyme tissue transglutaminase, the effect of which is to activate susceptible host T-cells to mistakenly identify and attack intestinal villi as if they were ‘foreign’ invaders. This highly destructive autoimmune process can be verified through blood tests, or through the so-called “gold standard” of an intestinal biopsy that clearly reveals destroyed villi and/or flattened intestinal surfaces, the hallmark pathology of celiac disease.

The reality, however, is that one does not need to be celiac, or have a particular genetic mutation, in order to experience damage associated with exposure to wheat gliadin.

In a study published in the journal Gut in 2007, a group of researchers asked the question: “Is gliadin really safe for non-coeliac individuals?” To test their hypothesis that an innate immune response to gliadin is common to patients both with and without celiac disease, intestinal biopsy cultures were taken from both groups and challenged with crude gliadin, the synthetic 19-mer gliadin peptide (19 amino acids long) and the deamidated 33-mer peptide.

Results showed that all patients with or without celiac disease, when challenged with the various forms of gliadin, produced an interleukin-15-mediated response. The researchers concluded:

The data obtained in this pilot study supports the hypothesis that gluten elicits its harmful effect, throughout an IL15 innate immune response, on all individuals [my italics].

The primary difference between the two groups is that the celiac disease patients experienced both an innate and an adaptive immune response to the gliadin, whereas the non-celiacs experienced only the innate response.

The researchers hypothesized that the difference between the two groups may be attributable to greater genetic susceptibility at the HLA-DQ gene locus (on chromosome 6) for triggering an adaptive immune response, higher levels of immune mediators or receptors, or perhaps greater permeability in the celiac intestine.

It is also possible that, over and above any greater genetic susceptibility, most of the differences stem from epigenetic factors: the presence or absence of certain nutrients in the diet, the bacterial strains within the gut flora, and environmental exposures. The last category includes NSAID drugs like naproxen or aspirin, which can profoundly increase intestinal permeability in the non-celiac, rendering them susceptible to gliadin’s potential for activating secondary adaptive immune responses.

This may explain why, in up to 5% of all cases of classically defined celiac disease, the typical HLA-DQ haplotypes are not found. However, determining the factors associated with greater or lesser degrees of susceptibility to gliadin’s intrinsically toxic effect should be secondary to the fact that it has been demonstrated to be toxic to both non-celiacs and celiacs.[1]

In other words, rather than looking at the adverse gut responses associated with wheat, and particularly wheat gliadin, as a rare genetically based aberration, we may want to reconsider the common, culturally reinforced view that wheat is an intrinsically healthy food to which only an ‘abnormal’ subset of the human population has an ‘unhealthy’ response. To the contrary, perhaps the immunoreactive effects that wheat gliadin induces indicate a human species-specific intolerance to this ‘food.’ Rather than viewing these adverse effects as ‘unhealthy reactions to a healthy food,’ perhaps we should view them as ‘healthy reactions to an intrinsically unhealthy (or metabolically incompatible) food.’

Ultimately, intestinal damage is only the tip of the so-called “celiac” or “non-celiac gluten sensitivity” icebergs. GreenMedInfo.com has indexed research from the National Library of Medicine on over 300 adverse health effects associated with wheat and/or wheat components. You can view the first-hand research here: http://www.greenmedinfo.com/toxic-ingredient/wheat

Also, learn more about wheat’s adverse effects to gastrointestinal health by reading our recent article: Wheat As A Common Cause of Dyspepsia and IBS, and a broader perspective on the dangers of wheat in our essay ‘The Dark Side of Wheat.’

Article Reference

[1] Mustalahti, K., P. Holopainen, K. Karell, M. Maki, J. Partanen,  “Genetic Dissection Between Silent and Clinically Diagnosed Symptomatic Forms of Coeliac Disease in Multiplex Families”, Digestive and Liver Disease,  Amsterdam: Elsevier BV, 2002, http://www.sciencedirect.com, accessed December 2007.

Researchers create materials that reproduce cephalopods’ ability to quickly change colors and textures


Cephalopods, which include octopuses, squid, and cuttlefish, are among nature’s most skillful camouflage artists, able to change both the color and texture of their skin within seconds to blend into their surroundings—a capability that engineers have long struggled to duplicate in synthetic materials. Now a team of researchers has come closer than ever to achieving that goal, creating a flexible material that can change its color or fluorescence and its texture at the same time, on demand, by remote control.

The results of their research have been published in the journal Nature Communications, in a paper by a team led by MIT Assistant Professor of Mechanical Engineering Xuanhe Zhao and Duke University Professor of Chemistry Stephen Craig.

Zhao, who joined the MIT faculty from Duke this month and holds a joint appointment with the Department of Civil and Environmental Engineering, says the new material is essentially a layer of electro-active elastomer that could be quite easily adapted to standard manufacturing processes and uses readily available materials. This could make it a more economical dynamic camouflage material than others that are assembled from individually manufactured electronic modules.

While its most immediate applications are likely to be military, Zhao says the same basic approach could eventually lead to production of large, flexible display screens and anti-fouling coatings for ships.

In its initial proof-of-concept demonstrations, the material can be configured to respond with changes in both texture and fluorescence, or texture and color. In addition, while the present version can produce a limited range of colors, there is no reason that the range of the palette cannot be increased, Craig says.

Learning from nature

Cephalopods achieve their remarkable color changes using muscles that can alter the shapes of tiny pigment sacs within the skin—for example, contracting to change a barely visible round blob of color into a wide, flattened shape that is clearly seen. “In a relaxed state, it is very small,” Zhao says, but when the muscles contract, “they stretch that ball into a pancake, and use that to change color. The muscle contraction also varies skin textures, for example, from smooth to bumpy.” Octopuses use this mechanism both for camouflage and for signaling, he says, adding, “We got inspired by this idea, from this wonderful creature.”

The new synthetic material is a form of elastomer, a flexible, stretchable polymer. “It changes its fluorescence and texture together, in response to a change in voltage applied to it—essentially, changing at the flip of a switch,” says Qiming Wang, an MIT postdoc and the first author of the paper.

“We harnessed a physical phenomenon that we discovered in 2011, that applying voltage can dynamically change surface textures of elastomers,” Zhao says.

“The texturing and deformation of the elastomer further activates special mechanically responsive molecules embedded in the elastomer, which causes it to fluoresce or change color in response to voltage changes,” Craig adds. “Once you release the voltage, both the elastomer and the molecules return to their relaxed state—like the cephalopod skin with muscles relaxed.”

Multiple uses for quick changes

While troops and vehicles often move from one environment to another, they are presently limited to fixed camouflage patterns that might be effective in one environment but stick out like a sore thumb in another. Using a system like this new elastomer, Zhao suggests, either on uniforms or on vehicles, could allow the camouflage patterns to constantly change in response to the surroundings.

“The U.S. military spends millions developing different kinds of camouflage patterns, but they are all static,” Zhao says. “Modern warfare requires troops to deploy in many different environments during single missions. This system could potentially allow dynamic camouflage in different environments.”

Another important potential application, Zhao says, is for an anti-fouling coating on the hulls of ships, where microbes and creatures such as barnacles can accumulate and significantly degrade the efficiency of the ship’s propulsion. Earlier experiments have shown that even a brief change in the surface texture, from the smooth surface needed for fast movement to a rough, bumpy texture, can quickly remove more than 90 percent of the biological fouling.

Zhenan Bao, a professor of chemical engineering at Stanford University who was not involved in this research, says this is “inspiring work” and a “clever idea.” She adds, “I think the significant part is to combine the ability of mechanochemical response with electrical addressing so that they can induce fluorescence patterns by demand, reversibly.” Bao cautions that the researchers still face one significant challenge: “Currently they can only induce one kind of pattern in each type of material. It will be important to be able to change the patterns.”

How To Develop Mental Toughness


When I was a young man, training to become an Army officer, we did exercises in some of the most miserable conditions and places. As with most things, exposure leads to tolerance – so, over time, we became better able to ignore the conditions and get on with the job.

There was a running ‘joke’ we’d use to keep our chins up: The basic concept was, whenever you saw a mate doing it tough, you’d say: “Things could always be worse. (Then you’d insert whatever unpleasant thing was about to happen).”

So you’d hear or make comments like “It could be worse – the extraction could be cancelled and we’ll have to walk out” or “It could be worse – the resupply might not come through” or, commonly, “It could be worse – it could be raining.”

For extra impact, you’d try to time the comment so it was said in the moments before the reality presented itself. You’d be shivering, hungry and less-than-impressed on a cold, windswept hilltop and you’d see the wall of rain approaching. Just before it drenched everyone, you’d say “Cheer up guys, it could be worse. It could be raining.” Cue downpour.

Army humour.

Point being, we all came to believe in this principle. Things CAN always be worse. No matter how sad, sorry or dire the situation might seem … things could still be a whole lot less satisfactory.

Thinking this way has an advantage. It helps reduce the intensity of our current pain, stress, discomfort, difficulties or unpleasantness – by reminding us we’re actually lucky it isn’t worse. In a perverse way, ‘the joke’ forced us to recognise the relatively positive things we were ‘enjoying’.

The joke never ended – because there was always something potentially worse. When it started pouring down rain, someone would point out ‘It could be worse. It could be snowing.’ I can even remember a warning about sharks being topped by “sharks with freaking lasers”.

The thing is, most of us are fortunate in so many ways. The country we live in, the roof over our heads, the food we get to eat, the health we enjoy, the friends we have, the perfect sunset we saw … and much more.

It’s easy to take these things for granted. It’s easy to overlook the many things we have going for us and instead focus on the negatives.

Which is usually counter-productive. Fact is, negatives are an unavoidable part of life. Things won’t always work out. We’ll experience difficulties in different areas at different times. But we can choose to dilute the distress those problems cause by reminding ourselves how well off we still are (especially relative to how things could be).

As the old adage says, ‘I complained about the hole in my shoe until I saw someone with no feet.’

Try this idea for yourself. Next time you’re having a hard time and things are grim, just say to yourself … ‘It could be worse. It could be raining’. (Feel free to insert your own ‘worse thing’.) Then do it again with something even worse. Then think of something worse again. Then think of something much worse, ‘with freaking lasers!’

Then take a quick look around and find 5 things to be grateful for.

Do it often enough and it becomes an automatic response. The tougher things get, the more grateful you are for what you’ve got – which gives you the will to carry on.

And that’s what ‘tough’ is all about.

Scientists twist radio beams to send data: Transmissions reach speeds of 32 gigabits per second


Researchers twist four radio beams together to achieve high data transmission speeds. The researchers reached data transmission rates of 32 gigabits per second across 2.5 meters of free space in a basement lab. For reference, 32 gigabits per second is fast enough to transmit more than 10 hour-and-a-half-long HD movies in one second and is 30 times faster than LTE wireless.
Graphic showing the intensity of the radio beams after twisting.

Building on previous research that twisted light to send data at unheard-of speeds, scientists at USC have developed a similar technique with radio waves, reaching high speeds without some of the hassles that can go with optical systems.

The researchers, led by electrical engineering professor Alan Willner of the USC Viterbi School of Engineering, reached data transmission rates of 32 gigabits per second across 2.5 meters of free space in a basement lab at USC.

For reference, 32 gigabits per second is fast enough to transmit more than 10 hour-and-a-half-long HD movies in one second and is 30 times faster than LTE wireless.

“Not only is this a way to transmit multiple spatially collocated radio data streams through a single aperture, it is also one of the fastest data transmission via radio waves that has been demonstrated,” Willner said.

Faster data transmission rates have been achieved — Willner himself led a team two years ago that twisted light beams to transmit data at a blistering 2.56 terabits per second — but methods to do so rely on light to carry the data.

“The advantage of radio is that it uses wider, more robust beams. Wider beams are better able to cope with obstacles between the transmitter and the receiver, and radio is not as affected by atmospheric turbulence as optics,” Willner said.

Willner is the corresponding author of an article about the research that will be published in Nature Communications on Sept. 16. The study’s co-lead authors Yan Yan and Guodong Xie are both graduate students at USC Viterbi, and other contributors came from USC, the University of Glasgow, and Tel Aviv University.

To achieve the high transmission rates, the team took a page from Willner’s previous work and twisted radio beams together. They passed each beam — which carried its own independent stream of data — through a “spiral phase plate” that twisted each radio beam into a unique and orthogonal DNA-like helical shape. A receiver at the other end of the room then untwisted and recovered the different data streams.
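For a feel for the numbers, here is a minimal sketch of the multiplexing arithmetic. The even split across beams and the per-channel figures are assumptions for illustration; the article itself gives only the four-beam count, the 32 gigabits per second aggregate, and the comparisons to LTE and HD movies:

```python
# Rough arithmetic for the OAM-multiplexed radio link described above.
# Assumption: the aggregate rate divides evenly across the twisted beams.

NUM_BEAMS = 4            # independently twisted, orthogonal radio beams
AGGREGATE_GBPS = 32.0    # demonstrated aggregate rate over 2.5 m (Gbit/s)

per_beam_gbps = AGGREGATE_GBPS / NUM_BEAMS
print(f"per-beam rate (assumed even split): {per_beam_gbps:.0f} Gbit/s")

# Sanity-checking the article's comparisons:
lte_gbps = AGGREGATE_GBPS / 30           # "30 times faster than LTE"
movie_gb = (AGGREGATE_GBPS / 8) / 10     # "10 HD movies in one second"
print(f"implied LTE baseline: ~{lte_gbps:.1f} Gbit/s")
print(f"implied size per 90-minute HD movie: ~{movie_gb:.1f} GB")
```

The orthogonality of the helical modes is what lets the receiver untwist and cleanly separate the co-located streams, much as orthogonal polarizations can be separated without mutual interference.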

“This technology could have very important applications in ultra-high-speed links for the wireless ‘backhaul’ that connects base stations of next-generation cellular systems,” said Andy Molisch of USC Viterbi. Molisch, whose research focuses on wireless systems, co-designed and co-supervised the study with Willner.

Future research will focus on attempting to extend the transmission’s range and capabilities.

The work was supported by Intel Labs University Research Office and the DARPA InPho (Information in a Photon) Program.


Story Source:

The above story is based on materials provided by University of Southern California. Note: Materials may be edited for content and length.


Journal Reference:

  1. Yan Yan, Guodong Xie, Martin P. J. Lavery, Hao Huang, Nisar Ahmed, Changjing Bao, Yongxiong Ren, Yinwen Cao, Long Li, Zhe Zhao, Andreas F. Molisch, Moshe Tur, Miles J. Padgett, Alan E. Willner. High-capacity millimetre-wave communications with orbital angular momentum multiplexing. Nature Communications, 2014; 5: 4876 DOI: 10.1038/ncomms5876

Human brain gene turns mice into fast-learners


Scientists have spliced a key human brain gene into mice, which then demonstrated accelerated learning as a result.

In the first study designed to assess how partially ‘humanising’ brains of a different species affects key cognitive functions, scientists report that mice carrying Foxp2 – a human gene associated with language – learned new ways to find food in mazes faster than normal mice.


By isolating the effects of one gene, the research sheds light on its function and hints at the evolutionary changes that led to the unique capabilities of the human brain, the scientists say.

The findings were published in the Proceedings of the National Academy of Sciences.

“No one knows how the brain makes transitions from thinking about something consciously to doing it unconsciously,” says Ann Graybiel of the Massachusetts Institute of Technology, one of the study authors. “But mice with the human form of Foxp2 did much better.”

In a 2009 study, mice carrying human Foxp2 developed more complex neurons and more efficient brain circuits.

Building on that, Graybiel led a team that took hundreds of mice genetically engineered to carry the human version of Foxp2 and trained them to find chocolate in a maze.

The animals had two options: navigate by landmarks like lab equipment and furniture visible from the maze (“at the T-intersection, turn toward the chair”) or by the feel of the floor (“smooth, turn right”; “nubby, turn left”).

Mice with the human gene learned the route by day seven as well as regular mice did by day 11, the scientists report.

Surprisingly, however, when the scientists removed all the landmarks in the room so the mice could only learn by the feel-of-the-floor rule, the regular rodents did as well as the humanised ones. They also did just as well when the landmarks were present but the floor textures were removed.

It was only when mice could use both learning techniques that those with the human brain gene excelled.

That suggested, Graybiel says, that what the human gene does is increase cognitive flexibility – it lets the brain segue from remembering consciously in what’s called declarative learning (“turn left at the petrol station”) to remembering unconsciously (“take a right once the floor turns from tile to carpet”).

Unconscious, or procedural, learning is the kind the feel-of-the-floor cue produced – the mice didn’t have to consciously think about the meaning of rough or smooth. They felt, they turned – much as people stop consciously thinking about directions on a regular route and navigate automatically.

If Foxp2 produces the cognitive flexibility to switch between forms of learning, that may help explain its role in speech and language.

When children learn to speak, they transition from consciously mimicking words they hear to speaking automatically. That suggests that switching from declarative to procedural memory, as the humanized mice did so well thanks to Foxp2, “is a crucial part of the process,” Graybiel says.

Depression Blood Test Designed To Help Improve Treatment And Diagnosis


Depression is a mind-altering mental disorder that steals life away from the untreated and misdiagnosed — imagine if one blood test could change all of that through a screening process. Researchers from Northwestern University have developed the first blood test designed to detect clinical depression in adults, and published their groundbreaking findings in the journal Translational Psychiatry.

Blood Test Can Detect Signs Of Depression

Doctors may soon be able to check for depression right alongside your cholesterol count with a blood test. The sooner a person is screened and diagnosed for depression, the sooner they can be helped. Nearly 40,000 Americans take their lives every year through an act of suicide, and many who attempt never seek professional care.

Previously, researchers had trialled blood tests to screen for depression in teenagers, who are especially vulnerable. First, they pinpointed genetic and environmental predispositions in rats in order to sort out 26 markers for major depression. Then they looked for those markers in the blood of 28 human teenagers between the ages of 15 and 19, half of them diagnosed with depression and the other half without. The results were remarkable: the researchers announced in 2012 that 11 of their markers showed up in all of the depressed teens, but none in the teens without depression.

This new test takes those markers to the next level in adults. After 18 weeks of cognitive behavior therapy with 32 adults between the ages of 21 and 79, the research team was able to highlight markers in the patients and determine which ones were responding well to therapy by seeing actual physical changes in their blood tests. The test focuses on nine blood markers that are different in those who are depressed compared to those who aren’t depressed. By watching for the markers that indicate depression, researchers can essentially scan blood samples to determine who is suffering from clinical depression.

Considering there are more than 18 million adults in the United States who battle clinical depression, researchers have made leaps and bounds toward understanding the ins and outs of their internal fight. Currently, depression is diagnosed only through a number of subjective one-on-one observations of a patient’s behavior and mood with a qualified therapist, along with self-reported events and feelings. A blood test may eventually be able to gauge severity or guide treatment depending on the types of biomarkers highlighted on the test.

For those who seek treatment, 80 percent are treated successfully, according to Suicide Awareness Voices of Education. If primary care doctors provided routine testing for depression, maybe suicide wouldn’t be the second leading cause of death for young people between the ages of 15 and 24. Depression isn’t just being in a bad mood or feeling sad every once in a while; it’s a life-altering and damaging mental disorder. The earlier the detection, the earlier prevention can begin.

Source: Redei EE, Ho J, Cai X, Seok J, Kwasny MJ, and Mohr DC. Blood transcriptomic biomarkers in adult primary care patients with major depressive disorder undergoing cognitive behavioral therapy. Translational Psychiatry. 2014.