New software turns smartphones into versatile keys.


http://www.newstrackindia.com/newsdetails/2013/01/06/21–New-software-turns-smartphones-into-versatile-keys-.html

Normalization of Vital Signs Does Not Reduce Risk for Acute Pulmonary Embolism.


Up to one third of patients whose abnormal triage vital signs reverted to normal values had PE.

In a prospective single-center study, researchers evaluated whether normalization of vital signs in patients who present with symptoms of pulmonary embolism (PE) reduces the probability of the disease. Patients at an urban academic emergency department (ED) in North Carolina were enrolled if they were older than 17 years and had at least one predefined sign or symptom and one risk factor for PE.

Of 192 patients, 35 (18%) were diagnosed with PE by computed tomography in the ED. In patients whose abnormal triage vital signs normalized at any time during their ED visit, incidence of PE was not lower than for patients whose vital signs did not normalize. The incidence of PE for patients with abnormal pulse rate, respiratory rate, shock index, or pulse oximetry at triage that subsequently normalized was 18%, 14%, 19%, and 33%, respectively.

Comment: Just as a normal blood gas value does not rule out acute pulmonary embolism, this study shows that normalization of initially abnormal vital signs also does not reduce the likelihood of PE. The best approach for patients presenting with signs and symptoms of PE is to note abnormal vital signs occurring at any time after symptom onset and use these vital sign numbers in the determination of pretest probability of PE, even if they subsequently normalized.

Source: Journal Watch Emergency Medicine


Septic Shock? Reach for Norepinephrine After Fluid Resuscitation.


A meta-analysis shows that dopamine is associated with increased risk for death and arrhythmic events compared with norepinephrine.

Although current guidelines recommend either dopamine or norepinephrine as the vasopressors of choice for septic shock, a recent meta-analysis of six interventional studies suggested that norepinephrine is the superior agent (JW Emerg Med Apr 22 2011). Investigators conducted a meta-analysis of the same six interventional studies — excluding patients with nonseptic shock — and five observational studies; the analysis involved a total of 2768 adult patients.

Observational studies showed significant heterogeneity in results overall and no difference in mortality between patients treated with dopamine and those treated with norepinephrine. After exclusion of one trial that accounted for the heterogeneity, dopamine was associated with an increased risk for death at 28 days over norepinephrine (relative risk, 1.23). Interventional trials were homogeneous and likewise showed a significantly increased risk for death with dopamine use (RR, 1.12). Two interventional studies that reported arrhythmic events showed a significant increase in these events in the dopamine groups (RR, 2.34).
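For readers curious how such figures arise, here is a minimal, self-contained sketch of the inverse-variance (fixed-effect) pooling commonly used in meta-analyses of relative risks. The event counts below are invented for illustration; they are not the data from the studies summarized above.

```python
import math

def relative_risk(events_a, total_a, events_b, total_b):
    """Relative risk of group A vs. group B, with the standard error of its log."""
    rr = (events_a / total_a) / (events_b / total_b)
    se_log_rr = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    return rr, se_log_rr

# Hypothetical 28-day deaths per study: (dopamine deaths, dopamine n,
# norepinephrine deaths, norepinephrine n) -- illustrative numbers only
studies = [(52, 100, 45, 100), (130, 250, 118, 250)]

weights, weighted_log_rrs = [], []
for d_events, d_total, n_events, n_total in studies:
    rr, se = relative_risk(d_events, d_total, n_events, n_total)
    weight = 1 / se**2                      # inverse-variance weight
    weights.append(weight)
    weighted_log_rrs.append(weight * math.log(rr))

pooled_rr = math.exp(sum(weighted_log_rrs) / sum(weights))
print(f"Pooled relative risk: {pooled_rr:.2f}")  # values above 1 favor norepinephrine
```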

An editorialist suggests that dopamine, a more powerful β-agonist than norepinephrine, could still be considered in patients with septic shock, hypotension (systolic blood pressure <90 mm Hg), and either a low cardiac index (<2.5 L/minute/m²) or low heart rate (<90 beats per minute).

Comment: This study supports norepinephrine as the vasopressor of choice for adult patients with septic shock. Dopamine is relegated to a secondary role, perhaps to be used when cardiac output is insufficient despite optimal use of norepinephrine.

Source: Journal Watch Emergency Medicine


Ablative Therapy for Barrett Esophagus: Caveat Emptor.


Cancer can still occur after successful eradication of dysplasia with radiofrequency ablation.

Radiofrequency ablation (RFA) for patients with Barrett esophagus with high-grade dysplasia (HGD) has been clearly established as an acceptable and preferred treatment option for the majority of these patients. In the initial multicenter trial, RFA completely eradicated dysplasia in 91% of patients with HGD (JW Gastroenterol May 27 2009) and in 95% who were followed up for 2 years. Repeat RFA was performed in 55% of patients after the 1-year primary end point — mostly based on the discretion of the endoscopist rather than biopsy indication (JW Gastroenterol Nov 4 2011). No cancers were reported. The inference by some clinicians is that patients who have had successful ablative therapy can be considered cured and can be discontinued from surveillance. However, a new case report provides contrary evidence.

Three patients underwent successful RFA treatment of Barrett esophagus with HGD at tertiary academic centers; procedures were performed by nationally recognized experts in RFA. Two patients underwent endoscopic mucosal resection before RFA. The first patient had five post-RFA surveillance endoscopies during 2 years before subsquamous HGD was detected. The second patient had normal neosquamous epithelium at 3 months but subsquamous esophageal adenocarcinoma detected at 6 months. The third patient underwent two endoscopies at 3-month intervals, and at 9 months, a nodular area was noted and a subsquamous esophageal adenocarcinoma was detected.

Comment: This report emphasizes the ongoing risk for cancer following successful RFA treatment in patients with Barrett esophagus and HGD. These cases clearly demonstrate the need for meticulous surveillance. However, until the optimal surveillance schedule after ablative therapy is defined in national guidelines, experts currently recommend surveillance intervals of 3 months in year 1, 6 months in year 2, and 1 year thereafter. Quadrant biopsies should be taken every 1 cm in addition to separate biopsies of any visible lesions. Although RFA poses less risk than surgery, it is far from a cure.

Source: Journal Watch Gastroenterology


Might Probiotics Lessen Infant Skin Problems?


Beneficial bacteria such as those found in fermented foods and probiotics thrive in your intestines, forming a magnificent symbiotic relationship with you that improves not only your overall health but even your skin.

Signals from these gut microorganisms are known to interact with organisms on your skin, and research suggests these interactions, or another as-yet-unknown probiotic-skin connection, can help with skin conditions, including eczema.

Beneficial Bacteria Halve Infants’ Eczema Risk

Eczema, also known as atopic dermatitis, is very common in infants and young children. According to the American Academy of Dermatology, it affects between 10 percent and 20 percent of all infants, resulting in red, itchy patches or rash on the skin (eczema is often known as “the itch that rashes,” meaning there’s really no rash until you start scratching the itchy area).

Eczema is more than just a skin problem, however, as it is an indication that there is a problem with your immune system. In fact, eczema is said to be one of the first signs of allergy during the first days of life, and about three out of four children with eczema later go on to develop asthma or hay fever.

What does this have to do with the beneficial bacteria in your gut?

Most people, including many physicians, do not realize that 80 percent of your immune system is located in your digestive tract, making a healthy gut a major focal point in your efforts to achieve optimal health. In fact, the root of many health problems is related to an imbalance of intestinal bacteria.

You may be surprised to learn that the bacteria in your gut outnumber the cells in your body by a factor of ten to one — you have approximately 100 trillion bacteria living in your GI tract, comprising as many as 500 different species and 7,000 different strains. Collectively, each of us carries around several pounds of bacteria inside us!

The beneficial bacteria in your gut have actually been found to help prevent allergies by training your immune system to distinguish between pathogens and non-harmful antigens and to respond appropriately – and this may be one reason why they also appear so beneficial for eczema.

According to the latest research, a review of 21 studies that included 11,000 participants, supplementing children at risk for developing eczema with a type of beneficial bacteria called Lactobacillus rhamnosus GG or Lactobacillus rhamnosus strain HN001 cut their risk of developing eczema in half compared with those taking a placebo.1 Children who took various other mixtures of probiotics also had their risk of eczema at least halved.

Please note that this does not mean these strains of beneficial bacteria are the only ones that provide the benefit. They happen to be the ones that were studied. These studies are not free, and someone has to pay for them. But it is likely that most beneficial bacteria, especially Lactobacillus strains, provide similar benefits.

A Simple Way to Lower Your Child’s Risk of Eczema

That probiotics are beneficial for preventing eczema in infants is not a new finding, but rather one that I’ve been reporting on since at least 2001, when researchers also found infants receiving probiotic supplements were half as likely to develop the skin condition.2

In 2008, another study found that children with only a limited variety of bacteria in their intestines one week after birth were more likely to develop eczema by the age of 18 months.3 Still more research, published in 2009, also found that daily supplements of probiotic foods may reduce the risk of eczema in children by 58 percent.4

It’s thought that one reason giving an infant probiotics helps to stave off eczema and other allergic diseases is that it beneficially alters the early colonization of bacteria in the gut, which may help the child’s immune system to develop and mature. At birth the human gastrointestinal tract is sterile, but in the first days, months and years of life a rapid colonization of bacteria occurs until a stable indigenous gut microflora is established.

Babies that are given the best start nutritionally by being breastfed (breast milk is a major source of immune-building good bacteria, following their initial implantation through the birth canal) also tend to have intestinal microflora in which beneficial bacteria predominate over potentially harmful bacteria. So, the best way you can encourage your newborn’s gut health to flourish is by breastfeeding.

The most benefit from probiotics, at least in terms of eczema, may occur very early in life. The 2009 study above found no difference in the incidence or severity of eczema between the probiotic and placebo groups after three months of life, noting that the preventive effect appeared to be established within the first 3 months, although it appeared to be sustained during the first two years.

What this means is that it is essential for your baby to receive plenty of beneficial bacteria in the first few months of life, continuing through childhood and adulthood.

Your baby gets his or her first “inoculation” of gut flora from your birth canal during childbirth. If your flora is abnormal, your baby’s flora will also be abnormal; whatever organisms live in your vagina end up coating your baby’s body and lining his or her intestinal tract.

Many infants are challenged because their mother previously took birth control pills, was on antibiotics or was a typical American who ate 150 pounds or more of sugar a year. Any mother with any or all of these risk factors is likely to start her infant’s life out on shaky ground, as she is unable to provide the optimal gut flora that will nourish the child’s health. So any mother in this group needs to be especially conscious of this information and these recommendations.

Studies show that a growing number of women have unknown vaginal infections at childbirth, which can result in the passage of abnormal microflora to their babies. This introduction of unfriendly flora, combined with antibiotic use, can predispose a baby to Gut and Psychology Syndrome (GAPS). GAPS can have very damaging long-term effects on a child’s health, including such conditions as autism, ADHD, learning disabilities and a number of other psychological, neurological, digestive and immunological problems.

Dr. Natasha Campbell-McBride is a neurologist and neurosurgeon who has devoted years of her career to studying this phenomenon, and how to treat and prevent it. Pathogenic microbes in your baby’s digestive tract damage the integrity of his or her gut wall, allowing all sorts of toxins, microbes and macromolecules from undigested food to flood his or her bloodstream, and then enter the brain and disrupt its development.

Breastfeeding protects your baby from this abnormal gut flora, which is why breastfeeding is so crucial to your child’s health. No infant formulas can do this.

Any time your baby is given a broad-spectrum antibiotic, his or her beneficial flora are wiped out, giving pathogenic flora (including antibiotic-resistant bacteria) a window of opportunity to overgrow and wreak havoc. It takes the “friendly flora” two weeks to two months to recover, but by then, some not-so-friendly ones have found a niche. The first symptoms you typically see are colic, loose stools, constipation, eczema or respiratory infections.

Adding a vaccine that further stresses your baby’s immature immune system is like adding fuel to a fire, creating conditions that raise your child’s risk for a major adverse vaccine reaction. In other words, a vaccine could be the proverbial “final straw” if your baby has GAPS. But all of this may be corrected, or even averted, by the addition of some natural probiotics.

Fermented Foods are Important for Babies, Infants and Children Too

Before you give your child fermented foods or probiotics, it is especially important to recognize that they are not magic bullets or cure-alls. They need to be integrated with a healthy diet. If your child is consuming loads of sugar, grains and fruit juices, those sugars will rapidly break down in the intestine and feed the pathogenic bacteria, which competitively inhibit the beneficial bacteria you are supplementing, rendering them virtually ineffective.

Once you have the diet optimized, providing abundant probiotics in the form of fermented foods is one of the most powerful ways to restore your baby’s beneficial gut flora. Oftentimes, a commercial probiotic supplement won’t even be needed.

Apart from breastfeeding, the first fermented food Dr. Campbell-McBride recommends for your infant is raw organic grass-fed yogurt (not commercial yogurt from the grocery store), because it’s well tolerated by most infants and children. It’s best to make your own yogurt at home from raw organic milk, and start with a very tiny amount. Once yogurt is well tolerated by your baby, then start introducing kefir. If you have any problems with cow’s milk dairy, you can try goat’s milk dairy as an alternative or substitute vegetables fermented with yogurt culture or kefir culture.

If your baby has a severe condition, such as necrotizing enterocolitis (NEC), then the addition of a high-quality probiotic supplement may be needed.

You can ferment virtually any food, and every traditional culture has fermented its foods to prevent spoilage. There are also many fermented beverages and yogurts. Historically, quite a large percentage of the foods that people consumed on a daily basis were fermented, and each mouthful provides trillions of beneficial bacteria — far more than you can get from a probiotic supplement.

Here’s a case in point: It’s unusual to find a probiotic supplement containing more than 10 billion colony-forming units. But when my team actually tested fermented vegetables produced by probiotic starter cultures, they had 10 trillion colony-forming units of bacteria. Literally, one serving of vegetables was equal to an entire bottle of a high potency probiotic! Fermented foods also give you a wider variety of beneficial bacteria, so all in all, it’s your most cost effective alternative.
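As a back-of-the-envelope check on that comparison, here is the arithmetic. Only the 10 billion and 10 trillion figures come from the text above; the per-capsule dose and bottle size are assumptions added for illustration.

```python
# Rough CFU (colony-forming unit) arithmetic for the comparison above.
cfu_per_capsule = 10_000_000_000       # 10 billion CFU: an assumed high-potency capsule
capsules_per_bottle = 100              # assumed bottle size, not stated in the text
cfu_per_serving = 10_000_000_000_000   # 10 trillion CFU measured in one serving of vegetables

print(f"One serving = {cfu_per_serving / cfu_per_capsule:,.0f} capsules' worth")
print(f"One serving = {cfu_per_serving / (cfu_per_capsule * capsules_per_bottle):,.0f} bottles' worth")
```

Under these assumed numbers, a single serving matches roughly a thousand capsules, i.e., at least a full bottle, which is the comparison being made.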

Fermenting your own foods is a fairly straightforward and simple process, which is described in detail here. Remember, in addition to protecting your child from developing eczema, research shows giving pregnant women and newborns doses of good bacteria can:

  • Help prevent childhood allergies by training infants’ immune systems to resist allergic reactions5
  • Help optimize your baby’s weight later in life6
  • Improve the symptoms of colic, decreasing average crying times by about 75 percent7
  • Reduce your risk of premature labor

Consuming fermented foods is, again, the best way to optimize your, and your children’s, beneficial gut flora. To learn more, please listen to my interview with Caroline Barringer, a Nutritional Therapy Practitioner (NTP) who has been involved with nutrition for about 20 years. She’s now one of Dr. Campbell-McBride’s chief training partners, helping people understand the food preparation process.

Virtual architecture, intruding on the real thing?


Architecture tells stories.

That’s the idea University of Michigan architecture prof Sophia Psarra lays out in her recent book, Architecture and Narrative. The way forms and space are arranged tells a story. One’s movement through a building has its own unique plot line, too.

Plus we all know that the best pitchmen build the most. Architects who are the most convincing communicators, who spin a world-class yarn, get more commissions.


Online architecture for the masses: A tower rendered in Minecraft. Image courtesy Extreme-Games.net

Yet lately, this architectural ability for storytelling is increasingly a feature of the virtual world. For at least 15 years, we’ve seen “massively multiplayer” games like Doom, SimCity and, lately, Minecraft create entirely new and surprising structures and urban environments online. Real architects increasingly use BIM, building information modeling, to construct and explore a virtual replica of what they build.

Immersive visualization platforms have gone from the military and amusement parks to valuable commercial uses. The most futuristic include the RealityCave in Waterloo, Ontario, where real estate developers and project designers don their 3-D goggles to “walk through” fancy new buildings long before they’re built.

At RealityCave, no talking is required: The animation does all the work, and the design speaks for itself.


Inside Canada’s RealityCave, architects and their clients explore unbuilt projects. Photo: Scott McQuarrie, re:actionphotography

At an upcoming seminar in Los Angeles on the topic, “The City and The Book,” panelists including architects like Greg Lynn will sit down with publishers and animation studio executives to discuss the storytelling opportunities of tomorrow. In the same breath, they’ll extol the virtues of interactive publishing and virtual architecture — techniques now rapidly converging on each other.

Game designers will be at the table, too. Many of them graduated from America’s best architecture schools only to be lured away to Hollywood to fabricate backdrops for shoot-’em-up films and Xbox games.

There are indications the brain drain might turn around.

I’ve seen it firsthand. Eleven-year-old Myles Platt of Montclair, N.J., got hooked on Minecraft and decided he wanted to become an architect, which his father ascribes in part to his ability to build online. This summer, Myles went to an architecture summer camp in New York City, where he designed a complex skyscraper form inspired by Freedom Tower.


The torqued tower by 11-year-old Myles Platt, who got interested in architecture after playing hours of Minecraft.

If you haven’t seen Minecraft in action, have a look on YouTube. Harry Allen, an expert on virtual online architectural worlds used for games and eye candy, says, “Minecraft isn’t a game. It’s a movement.”

True enough. Since Minecraft launched last November, it has registered at least 36 million users. You can play it on your smartphone or computer to explore limitless virtual worlds, build your own homes, temples, rollercoasters and cities. It’s all low-res, and there are lots of silly games and adventures to keep kids occupied.

The best feature about Minecraft is that you can collaborate on projects and earn special status by logging experience and achievements.


Canada’s minister of natural resources, Joe Oliver, inside the RealityCave. (By Anthony Reinhart)

Some educators are exploiting Minecraft to teach kids like little Myles Platt. At New York City’s Columbia Grammar and Prep School, the second-grade teacher Joel Levin uses Minecraft as a virtual classroom. The kids join together online, working together to build their very own cityscape.

This is a long way from the first-person shooter games that first sucked millions of users into the virtual world. Those hyper-violent games created beautiful virtual architecture, however — gorgeous and often baroque backdrops to color the shooter’s unique narrative.

In the future, perhaps we’ll all have our own personal architecture, a backdrop and cloak for our very personal stories.

Source: Smart Planet

Next-Generation Protease Inhibitor Effective for HCV Infection.


Patients who received vaniprevir achieved higher rapid virologic response rates than those who received placebo.

Adding telaprevir and boceprevir to standard peginterferon and ribavirin therapy has been shown to significantly improve virologic response rates for patients with genotype 1 hepatitis C virus (HCV) infection (JW Gastroenterol Jul 1 2011 and JW Gastroenterol Mar 30 2011). However, these first-generation HCV nonstructural protein (NS)3/4A protease inhibitors require a complex administration schedule and are associated with additional adverse effects.

To evaluate the efficacy and safety of vaniprevir (MK-7009) — a macrocyclic next-generation HCV NS3/4A protease inhibitor that is administered once or twice daily — investigators conducted an industry-funded, phase II, multicenter, randomized, double-blind, placebo-controlled, dose-ranging study involving 94 treatment-naive adults with chronic HCV genotype 1 infection. Patients were assigned to vaniprevir (300 mg twice daily, 600 mg twice daily, 600 mg daily, or 800 mg daily) or matched placebo in combination with peginterferon (180 μg weekly) and ribavirin (1000–1200 mg daily) for 4 weeks. Thereafter, all patients continued peginterferon and ribavirin for 44 weeks. The primary endpoint was rapid virologic response (RVR); exploratory endpoints included sustained virologic response (SVR).

All 94 patients completed the 4-week triple-dosing regimen. Of these, 78 completed 48 weeks of peginterferon and ribavirin treatment, and 84 completed a 6-month post-therapy follow-up. The rate of viral decline by week 4 was at least 3 log10 IU/mL greater in the vaniprevir groups than in the placebo group. Rates of RVR were significantly higher in all vaniprevir groups versus the placebo group (68.8%–83.3% vs. 5.6%; P<0.001). SVR rates were higher in the vaniprevir groups than in the placebo group (61.1%–84.2% vs. 63.2%), but the difference was not statistically significant, likely because of the small sample size. Safety profiles were similar between the vaniprevir and placebo groups, except that vomiting occurred more often in the vaniprevir groups. HCV resistance variants were noted in three patients receiving vaniprevir.
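For context on the “3 log10” phrasing: a decline of n log10 units corresponds to a 10^n-fold drop in viral load. A minimal sketch, with a made-up baseline value (not from the trial):

```python
baseline = 2_000_000      # hypothetical HCV RNA in IU/mL, for illustration only
log10_decline = 3.0       # the minimum extra decline reported vs. placebo

after = baseline / 10**log10_decline
print(f"{baseline:,} IU/mL -> {after:,.0f} IU/mL "
      f"(a {10**log10_decline:,.0f}-fold reduction)")
```

So a 3 log10 IU/mL greater decline means the vaniprevir groups cleared at least a thousandfold more virus by week 4 than placebo recipients did.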

Comment: This phase II study of vaniprevir shows early promise for a next-generation protease inhibitor–based triple therapy that is easy to administer in a daily or twice-daily dosing schedule. Subsequent vaniprevir studies are needed to identify the optimal dose and duration of therapy to maximize SVR and maintain an excellent safety profile.

Source: Journal Watch Gastroenterology


What Neandertal DNA can teach about race, autism, and more.


Paleoanthropologists used to pray that they would unearth big troves of intact Neandertal skeletons and well-preserved artifacts that they could comb for clues to the origins of the human race. But these days, they can often get as much or more information straight from the DNA in bone fragments.

Case in point: the newly published genome study in Science from Matthias Meyer and Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology and their international team of colleagues. Using a novel DNA sequencing technique that works particularly well with degraded specimens, they examined the genome of a seven-year-old girl who died more than 74,000 years ago, using a surviving sliver from one of her finger bones. That girl’s bone fragment was one of the few pieces of evidence that in 2010 revealed the existence of the ancient Denisovan people — contemporaries of the Neandertals who overlapped with them in eastern Asia.


Matthias Meyer at work in the laboratory. (Credit: Max Planck Inst. for Evol. Anthro.)

Yet from that extraordinarily humble source, the Max Planck scientists have drawn a wealth of insights. They learned, for instance, that the Denisovans were probably dark-skinned, unlike the pale Neandertals. Because the girl had two X chromosomes, one from each parent, the scientists were able to infer that the Denisovan population had relatively little genetic diversity. Living natives of Papua New Guinea, Australia, and some southeast Asian islands derived about 6 percent of their genes from the Denisovans, yet the Denisovans seem to have contributed nothing of lasting value to the DNA of people in other parts of the world. Comparison with the Denisovan DNA also allowed the researchers to recognize that Europeans carry somewhat fewer genes from Neandertals than do East Asians and Native Americans.

Such discoveries are endlessly fascinating to some of us. But I can also understand that many people might reasonably question why any of these details matter. After all, Neandertals and our other ancient ancestors have been extinct for 30,000 years or longer. Why should we care so much about their DNA? Is there any practical value to be had from these studies?

I’ll argue that there is, and that it might be especially useful in helping us to develop more enlightened attitudes about racial differences and autism. To explain why, it may be useful to start by reviewing some of the major current ideas about how humans evolved in the first place.

Overview of our origins

Fifteen or 20 years ago, it might have been easier to find a rough consensus among paleoanthropologists about this topic than it is today precisely because of the recent bounty of fossil and DNA discoveries. All that information has answered some important questions and filled in a level of detail that might once have seemed inconceivable, but curiously enough, some of the broad strokes in the big picture have become less clear.

Roughly speaking, in Africa 1.7–2 million years ago, the earliest primitive members of the genus Homo appeared. They were small, hairy people who might look a bit apelike by our standards of beauty, but they had bigger brains and more tools than the upright Australopithecus species before them. The Homo erectus people were successful enough to spread out of Africa and migrate across Asia, and are responsible for some of the ancient fossils given names such as “Peking man.” Nevertheless, they were probably something of a false start for the spread of humanity as we know it.


The more relevant development came between 400,000 and 800,000 years ago, with a new wave of African emigration into the Middle East and Asia by a group of people with even bigger brains and better tools. They gave rise to the brawny, brow-ridged Neandertal people, Europe’s first inhabitants. Yet they also spawned at least one other Asian group, the Denisovans. (It wouldn’t be too surprising anymore if still more sibling groups contemporary to the Neandertals and Denisovans turned up elsewhere in Asia.) Meanwhile, humans also continued to prosper and evolve in Africa, and by 80,000 years ago, ones with a fully modern appearance had appeared and started their own exodus into the rest of the Old World.

What happened next is the stuff of archaeologists’ heated arguments. The oldest theory is the multiregional hypothesis strongly advocated by Milford Wolpoff of the University of Michigan in Ann Arbor. It claims that as different in appearance as moderns, Neandertals, Denisovans, and even the early Homo erectus might seem, they were all still members of the same human species. Over time, the modern traits predominated but some of the traits in local populations that had adaptive value (such as shorter, thicker bodies in cold climates) were retained and might bear some connection to physical differences seen in populations around the world today.

In the 1980s, however, a starkly opposing theory emerged largely, though not exclusively, from studies of mitochondrial DNA in living populations. (Mitochondria, the organelles in animal cells that create chemical energy, carry their own unique sets of genes, completely separate from the nuclear DNA that holds the rest of the cell’s genes.) Those analyses suggested that the maternal bloodlines of everyone alive today converged back on Africa less than 100,000 years ago, with no trace of a genetic contribution from local groups elsewhere. That conclusion spawned the “out of Africa” model, according to which scientists such as Chris Stringer of the Natural History Museum in London argued that when the anatomically modern humans colonized Asia and Europe, they displaced the Neandertals and other ancient residents without breeding with them. Whether the moderns had directly exterminated the ancients or simply outcompeted them for resources was anybody’s guess, but interbreeding was effectively nonexistent.

The out-of-Africa model and its mitochondrial DNA evidence proved highly persuasive to many anthropologists. Disagreements remained fierce, but during the 1990s it was often presented as the default explanation for human origins, even though almost everyone acknowledged how counterintuitive it seemed that modern humans would so completely refrain from mixing with creatures that looked so much like them. Mostly, scientists chalked it up to some obscure biological or behavioral speciation barrier.

DNA twists the plot

Ironically, one type of DNA evidence helped put the out-of-Africa model on top, but later DNA evidence helped knock it back down. In brief, when Svante Pääbo and other researchers began the painstaking work of recovering nuclear DNA from Neandertal bones and sequencing it, they discovered that on average about 4 percent of living people’s genes are derived from Neandertals. (The telling exception was in people of modern African descent, whose genes were generally less than 1 percent Neandertal, which is what one might expect if the mixing had occurred primarily outside Africa.)

Four percent might not sound like much, but it is substantially more than an out-of-Africa scenario with strict replacement rather than interbreeding would seem to allow. It’s remotely possible that this mixture is an artifact of old, unequal mixing of what became Neandertal genes within the ancestral African population (although anthropologist John Hawks has explained on his blog why that situation seems unlikely). The more likely explanation, though, is that some level of interbreeding did occur. For that reason, Stringer and other defenders of the concept now refer to a modified “mostly out of Africa” model that acknowledges some interbreeding but considers it largely trivial in extent and consequences.

That same evidence has, of course, only reinvigorated the multiregional hypothesis (though one might wonder why the percentage of ancient humans’ genes in us isn’t then higher). It has also nourished a popular new “assimilationist” school of thought that pragmatically splits the difference between multiregionalism and out-of-Africanism. The assimilationist model says that when the anatomically modern humans left Africa 80,000 years ago, they retained their own identity but also mixed to a degree with the older human populations they encountered. Both the modern and ancient groups became locally varying patchworks of physical traits and technologies. In the end, the ancients’ societies were too disrupted to survive but some of their genes persist in us.

The question of when and how humans emerged over the past few hundred thousand years is therefore considerably more complicated and less settled than it might have seemed a couple of decades ago. The same can be said for the closely related question of whether Neandertals, for example, represent their own species (Homo neanderthalensis) or just a subspecies (Homo sapiens neanderthalensis) alongside our own (Homo sapiens sapiens) – or whether, as Wolpoff would have it, virtually all of Homo has been one big species that has varied over time.

Why we should care

Even if the science of human origins is still a work in progress, the accumulating information about how we got here and indeed what constitutes a member of the human race offers some useful perspectives on matters of scientific and ethical importance.

Perspective on the age of humanity. One small point that studies of the DNA of Neandertals and other ancient people illuminate is just how old or young we humans are as a species. The paleontological record indicates that the mean survival time for a mammalian species is about a million years, though some have lasted ten times that long. If we emerged only within the past 100,000 years or less, then Homo sapiens is indeed an amazingly young and precocious lot. And a loose, handwaving argument might therefore be made that we also probably have a commensurately long future ahead of us.

On the other hand, if Wolpoff is right and we are part of a species that has been around for two million years, then we are much more senior. It might make us look at the extinction rates with a little more sense of urgency.

Perspective on our nonprogressive evolution. The molecular study of our evolution also helps to drive home how unexceptional our biological history has been. Many icons of human evolution unintentionally reinforce a misleading sense of progress — witness the classic March of Progress illustration by Rudolph Zallinger that shows a modern human leading a Neandertal and other “less evolved” ancestors.

But that sense really changes if we and Neandertals are seen as sibling groups, diverging but also sometimes re-merging throughout history. Our evolutionary history looks much less progressive and more like that of other species.

Perspective on race. For centuries (at least), arguments over race have invoked inappropriate biological concepts to make or defend distinctions among peoples — and distinctions in how they should be treated. They have likened races to subspecies to justify their inherent biological reality, along with some allegedly biological superiority, inferiority, or “otherness.”

A simple refutation of that idea has been the proof that the diversity of genetic characteristics within racial groups is greater than the diversity separating them: human races are not well enough defined and different enough to be meaningful biological groups. For that reason, many scientists now argue that race is not a biological concept but rather a social concept that sometimes carries biomedical consequences.

(Here’s what that means, if it isn’t immediately clear: In a society that mistreats the dark-skinned in general, for instance, black people may be at higher risk for diseases of poverty without having an intrinsic susceptibility to them. But an example that is perhaps less obvious is that of sickle-cell anemia, which is more common in those of black African descent than in those of white European descent. That’s because many people whose ancestors lived in regions where malaria was prevalent carry mutations for sickle-cell anemia that offer some protection from the parasite. But not all of those people are racially black and not all blacks carry the mutation. Sickle-cell information campaigns target predominantly black populations because society doesn’t accurately group people in terms of “ones whose ancestors had a lot of malaria.” In this case, race is a flawed but useful proxy for that nonexistent classification — but not because of the biological characteristics of the race as such.)

The foregoing is all true only in terms of race as we understand the concept today, however. If the multiregionalists and the assimilationists are right, then the Neandertals, Denisovans, and other ancient people we displaced may not have been separate species of person at all. They may instead have been races so different from modern humanity that they really were akin to other subspecies. Differences in their anatomical, genetic, behavioral, and intellectual traits would surely dwarf any seen in the world today among Homo sapiens. Color me naïve, but I would like to think that these insights might help to strengthen the spirit of color-blind brotherhood we ought to feel for one another.

(And in anticipation of a query I can feel coming: no, hypothetically, I would not be in favor of summarily treating Neandertals as second-class citizens if ever we could use technology to clone one. Neandertals were people and therefore, in my opinion, would deserve to be fully enfranchised. However, the question shows how ethically fraught such high-tech resurrections could be.)

Perspective on neurodiversity. In the course of their recent analysis of the Denisovan DNA, Meyer and Pääbo identified 23 highly conserved areas of the human genome that seem to be unique to our kind. Eight of those contain genes that previous studies have tied to nerve growth and other aspects of brain function. And three of the conserved genes — ADSL, CBTNAP2, and CNTNAP2 — have been implicated in some forms of autism.

Those correlations are not entirely surprising. Looking at the artwork and artifacts left by Neandertals, some archaeologists have argued that they seemed to lack a capacity for symbolic thought. Others, such as John J. Shea, disagree and suggest that the differences between modern and ancient thinking may have been exaggerated. Nevertheless, whatever evolutionary changes marked the emergence of modern humans, it’s likely they involved at least some important changes to our cognitive, linguistic, and social abilities. One might expect genes for those traits to be altered or absent in older types of humans.

I want to be perfectly clear on this point: this discovery absolutely does not mean that the Denisovans, Neandertals, and other ancients were autistic. Nor does it mean that autistic people exhibit prehistoric thinking. Rather, what it underscores is that normal modes of human thought occupy a broad continuum.

The “neurotypical” way in which most people see the world today is only one way of doing it. As enlightened studies of autism repeatedly drive home, we need to appreciate those variations as part of our human spectrum rather than just labeling them defective or abnormal.

With or without all our cognitive abilities, the Neandertals and Denisovans survived under amazingly hostile conditions for hundreds of thousands of years. Their different ways of thinking may have been dominant throughout long stretches of the past, and might even have had advantages over our own under their circumstances. The lesson that these ancients offer is that we should broaden our minds about how broad minds can be.

Source: Smart Planet