The Secret of Dino Success.


Stretching 22 meters from snout to tail, Apatosaurus was one of the largest creatures to ever walk the planet. At 10 meters long with razor-sharp teeth, Allosaurus was one of the most fearsome. How did such animals come to dominate the planet? A new study suggests it was nothing more than dumb luck.

The predecessors of dinosaurs rose from the ashes of Earth’s worst extinction. Prior to 251 million years ago, the dominant, large animals on land were the therapsids, early forerunners to mammals. These shrew- to hippo-size creatures came in a variety of forms, from tubby, tusked herbivores to agile, saber-toothed predators. At the end of the Permian period about 250 million years ago, however, rampant global warming and drastic drops in the atmosphere’s oxygen content wiped out almost 90% of the planet’s species. Many therapsids disappeared, and the few, diminutive lineages that survived faced competition from a different sort of creature—the archosauromorphs. These reptiles, the precursors of dinosaurs, crocodiles, and their closest relatives, would quickly rise to dominance.

To figure out how the archosauromorphs came to dominate other species, graduate student Roland Sookias of Ludwig Maximilian University of Munich in Germany and colleagues traced the evolution of body size in therapsids and archosauromorphs. They used femur length of over 400 species of fossil creatures spanning 100 million years to estimate body mass and tracked how body size changed in the two groups from the time of the Permian mass extinction to the heyday of the biggest Jurassic dinosaurs about 150 million years ago.
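The femur-to-mass step described above is typically an allometric power law fitted on a log-log scale. A minimal sketch of the idea, with placeholder coefficients rather than the regression values the study actually fitted:

```python
import math

def estimate_body_mass(femur_length_cm, a=-2.0, b=2.7):
    """Allometric estimate of body mass (kg) from femur length (cm):
    log10(mass) = a + b * log10(length).
    The coefficients a and b here are illustrative placeholders,
    not the values fitted by Sookias and colleagues."""
    return 10 ** (a + b * math.log10(femur_length_cm))

# With any exponent b > 1, mass grows much faster than femur length,
# so a modest spread in femur lengths implies a huge spread in body sizes.
therapsid_mass = estimate_body_mass(25)   # a small, dog-sized animal
sauropod_mass = estimate_body_mass(180)   # a very large archosauromorph
```

Because the relationship is a power law, a tenfold increase in femur length corresponds to a roughly 500-fold increase in estimated mass under these placeholder coefficients, which is why femur length alone can separate shrew-sized therapsids from sauropod-sized archosauromorphs.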

Sookias and collaborators confirmed that the archosauromorphs grew into a wider range of sizes—including the largest animals on land—while the therapsids remained small. One reason may be that archosauromorphs simply grew and bred faster. The quicker growth rate of the archosauromorphs, which had been discovered in previous studies of dinosaurs and their relatives, meant that these animals reached sexual maturity relatively earlier than the therapsids. Faster growth and breeding meant that the archosauromorphs quickly spread and adapted to overtake available habitats and the ecological roles of large herbivores and big predators before the smaller, slower-growing therapsids had a chance to put up a fight, Sookias says.

Not all of the archosauromorphs were large—over time, they diversified into a variety of body sizes throughout the Triassic and Jurassic, from the relatively tiny feathered dinosaur Anchiornis to Apatosaurus and Diplodocus, some of the biggest creatures to ever walk the planet. A key to evolving into such a wide range of sizes, Sookias says, may be in specialized features such as air pockets inside dinosaur bones that reduced the weight of their skeletons and opened up a wider range of possible sizes.

Thus, it appears that the forerunners of dinosaurs and mammals did not fiercely compete for space but rather used available space differently. After the Permian extinction, the archosauromorphs grew faster than the therapsids and effectively shut out the mammal precursors. The archosauromorphs filled up the ecological space so quickly that the therapsids were forced to stay small and use what was left over, the team reports online today in the Proceedings of the Royal Society B.

Paleontologist Jessica Theodor of the University of Calgary in Canada calls the research “intriguing” but notes a caveat about the methodology. While the new study used the femur lengths of therapsids and archosauromorphs to estimate body mass, Theodor says that the absence of modern species closely related to either group means that the body mass estimates will be rough and will depend on how the posture of these ancient animals is reconstructed.

Still, Theodor points out that the new study indicates how archosauromorphs may have become dominant because they simply “prevented the remaining, smaller therapsids from evolving larger size.” Instead of going head to head with dinosaurs, mammals and their ancestors may have simply been left in the shadows until they got their chance 65 million years ago, when another great extinction ended the reign of the dinosaurs and allowed mammals to flourish.

Source: ScienceNOW


A novel technique of multiple-site epidural blood patch administration for the treatment of cerebrospinal fluid hypovolemia.


An epidural blood patch (EBP) is a widely accepted standard procedure to treat CSF hypovolemia, especially when the epidural CSF leak is detected by spinal MRI or CT myelography (CTM). In quite a few cases, however, the leaked CSF is spread over a large area along the spinal epidural space, making it difficult for the surgeon to clearly identify the true leakage points. In such cases, autologous blood can be infused at multiple spinal levels with multiple entries. In this paper, the authors have devised a new multiple-site EBP method with a single lumbar entry point, using an intravenous catheter as a slidable device for continuous infusion. In this report, they introduce this new, single-entry, continuous multiple-site EBP administration technique and report some of the results that they have obtained.

Methods

An EBP was applied via an epidural catheter in 5 patients with spontaneous CSF hypovolemia (3 men and 2 women; mean age 47.2 years, range 34–65 years). The detection of an epidural CSF leak was based on MRI and/or CTM findings. In all cases, however, the leakage sites could not be identified clearly. The main symptoms of these patients were recurrent spontaneous chronic subdural hematoma with orthostatic headache (3 patients) and orthostatic headache only (2 patients). All patients underwent surgery in the prone position on an angiography table, and biplane fluoroscopy was used for accurate manipulation. After administration of a local anesthetic, the authors inserted a 4-Fr short sheath (which is standard in angiography) through the lumbar interlaminar window and placed it in the dorsal epidural space. They then introduced a 4.2-Fr straight catheter through the sheath and navigated it upward along a 0.035-inch guidewire whose tip was moved upward beyond the cranial end of the detected CSF leakage. Blood was obtained from each patient from a previously secured venous entry on the forearm, and it was injected slowly into the epidural catheter. Each time, the authors tried to infuse as much autologous blood as possible into the epidural space, while moving the catheter gradually in the caudal direction in response to the patient’s expression of pain.

Results

In all 3 cases of chronic subdural hematoma, recurrence was prevented. The orthostatic headache disappeared completely in 1 patient and was relieved in the other 4.

Conclusions

An efficient treatment option for CSF hypovolemia is provided by the new application method of EBP with the aid of an intravenous catheter as a slidable device, which enables infusion of a sufficient amount of autologous blood into multiple epidural areas with a single lumbar entry point.

Source: Journal of Neurosurgery


Antibiotic-Free Meat Not Free of Drug-Resistant Bacteria.


If you’re paying premium prices for pesticide- and antibiotic-free meat, you might expect that it’s also free of antibiotic-resistant bacteria. Not so, according to a new study. The prevalence of one of the world’s most dangerous drug-resistant microbe strains is similar in retail pork products labeled “raised without antibiotics” and in meat from conventionally raised pigs, researchers have found.

Methicillin-resistant Staphylococcus aureus (MRSA), a drug-resistant form of the normally harmless S. aureus bacterium, kills 18,000 people in the United States every year and sickens 76,000 more. The majority of cases are linked to a hospital stay, where the combination of other sick people and surgical procedures puts patients at risk. But transmission also can happen in schools, jails, and locker rooms, and an estimated 1.5% of Americans carry MRSA in their noses. All of this has led to a growing concern about antibiotic use in agriculture, which may be creating a reservoir of drug-resistant organisms in billions of food animals around the world.

Tara Smith, an epidemiologist at the University of Iowa College of Public Health in Iowa City who studies the movement of staph bacteria between animals and people, wondered whether meat products might be another mode of transmission. For the new study, published this month in PLoS ONE, she and colleagues bought a variety of pork products—395 packages in all—from 36 different stores in two big pig farming states, Iowa and Minnesota, and one of the most densely populated, New Jersey.

In the laboratory, the team mixed meat samples “vigorously” with a bacterial growth medium and allowed any microbes present to grow. MRSA, which appears as mauve-colored colonies on agar plates, was genetically typed and tested for antibiotic susceptibility.

The researchers found that 64.8% of the samples were positive for staph bacteria and 6.6% were positive for MRSA. Rates of contamination were similar for conventionally raised pigs (19 of 300 samples) and those labeled antibiotic-free (7 of 95 samples). Results of genetic typing identified several well-known strains, including the so-called livestock-associated MRSA (ST398) as well as common human strains; all were found in conventional and antibiotic-free meat. (The label “antibiotic-free” is not regulated, and the products were not “certified organic.”)
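The percentages above can be checked directly from the reported counts; a quick arithmetic sketch:

```python
# MRSA-positive counts reported in the study
mrsa_conventional, n_conventional = 19, 300
mrsa_antibiotic_free, n_antibiotic_free = 7, 95

rate_conventional = mrsa_conventional / n_conventional           # about 6.3%
rate_antibiotic_free = mrsa_antibiotic_free / n_antibiotic_free  # about 7.4%

# Pooled across all 395 packages, this reproduces the overall 6.6% MRSA figure:
overall_rate = (mrsa_conventional + mrsa_antibiotic_free) / (
    n_conventional + n_antibiotic_free
)
```

The two rates differ by about one percentage point, which is what the study means by “similar,” and pooling the counts reproduces the overall 6.6% MRSA figure quoted above.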

Smith says she was surprised by the results. In a related investigation, which has not been published, her group tested pigs living on farms and found that antibiotic-free pigs were free from MRSA, whereas the resistant bug is often found on conventional pig farms.

The study reveals an important data point on the path from farm to fork, yet the source of the MRSA on meat products is unknown, Smith says. “It’s difficult to figure out.” Transmission of resistant bugs might occur between antibiotic-using and antibiotic-free operations, especially if they’re near each other, or it could come from farm workers themselves. Another possibility is that contamination occurs at processing plants. “Processing plants are supposed to be cleaned between conventional and organic animals,” she says. “But how well does that actually happen?”

In another recent study, researchers from Purdue University in West Lafayette, Indiana, found that beef products from conventionally raised and grass-fed animals were equally likely to be contaminated by antibiotic-resistant Escherichia coli. In a second study by the same group, poultry products labeled “no antibiotics added” carried antibiotic-resistant E. coli and Enterococcus (another bacterium that causes invasive disease in humans), although the microbes were less prevalent than on conventionally raised birds.

“The real question is, where is it coming from, on the farm or post-farm?” says Paul Ebner, a food safety expert who led the Purdue studies. And the biggest question of all, he says, “Is it impacting human health?”

“There’s a tremendous amount of interest in this issue—feeding antibiotics to food animals,” says Ellen Silbergeld, an expert on health and environmental impacts of industrial food animal production at the Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland. “Thus, determining when amending that practice makes a difference is important.”

“The definitive study would take every bacterium and follow that along until it gets in humans—from food supply to causing a certain disease,” Smith says. “It would be a huge and costly study that no one’s going to do, but that’s what the meat producers say is missing.” Meanwhile, Smith says she will continue her investigations of MRSA, one potential transmission point at a time.

This item has been corrected. All original references to “organic” have been replaced by “antibiotic-free” because the meat used in this study was not certified organic.

Source: ScienceNOW


How to Build a Hardy Web.


Flies caught in webs with hungry spiders bearing down on them don’t have the time to appreciate good engineering. Luckily, researchers aren’t so constrained. A new analysis reveals the intricacies of spider web design, showing how the unique properties of its silk turn webs into flexible yet strong traps.

“This is very innovative,” says Joyce Wong, a biomaterials researcher at Boston University, who was not involved with the study.

Spider silk is a remarkable material, says study co-author Markus Buehler, a materials scientist at the Massachusetts Institute of Technology in Cambridge. Take the single thread that suspends a dropping spider. When pulled, the thread will begin to stretch, sometimes to twice its original length, but eventually it will stiffen again.

Buehler and colleagues previously discovered that, during the elastic stage, the proteins in spider silk are scrunched into intricately folded structures. Pulling disentangles these convolutions, he says. When there are no more knots to untie, the proteins reconfigure into tough structures called beta-sheet nanocrystals.

Spiders do more than just dangle, however, and Buehler and colleagues wanted to see how these molecular properties impact their entire web. So they sought out a wide, radial spider web outdoors. With the spider still on board, the team hung tiny weights made from metal wires at various places, mimicking a fly pinging into the trap. When tugged, individual silk spokes would stretch and snap, but other threads wouldn’t break with them, Buehler says. And spider webs can stand to lose a few threads. These traps seem to retain their original strength even if 10% of the spokes at various locales are snipped, the group reports today in Nature.

To test if spider silk’s unique stretchability might be responsible for this structural feat, the team turned to computer simulations. Here, bending reality is easy. The team designed generic-looking webs that were constructed, for instance, of a type of material called dragline silk that was modified to be entirely stretchy. In other words, no beta-sheets. When an imaginary finger pulled on these simulations, whole portions of the web bulged out then eventually ruptured. Buehler explains that totally elastic spider silk would distribute weight widely across the net, which means that pulling on one thread can damage many others. Real orb weaver’s silk, however, can be either stretchy or stiff at different times, which produces threads that flex and then snap in just the right way to avoid wrecking nearby spokes.

The findings highlight the unique ecology of spiders. In fact, such self-sacrifice by individual components is highly unusual among natural materials, Buehler notes. Other silk spinners, such as silk worms, produce more elastic silk, which dissipates forces over an entire structure such as a cocoon, making it difficult for predators to bite through.

Philip LeDuc, a mechanical engineer at Carnegie Mellon University in Pittsburgh, Pennsylvania, is impressed: “It’s just fantastic work.” The molecular explanations for why webs stretch and snap will not only help engineers mimic these materials but also, potentially, make them better, he says.

Buehler is already working with biologists to genetically engineer arachnids that can spin threads with properties not seen in nature. Now that’s something that should scare a fly.

Source: ScienceNOW


Human Brains Wire Up Slowly but Surely.


As the father-to-son exchange in the old Cat Stevens song advised, “take your time, think a lot, … think of everything you’ve got.” Turns out the mellow ’70s folkie had stumbled upon what may explain a key feature of our brains that sets us apart from our closest relatives: We unhurriedly make synaptic connections through much of our early childhoods, and this plasticity enables us to slowly wire our brains based on our experiences. Given that humans and chimpanzees share 98.8% of the same genes, researchers have long wondered what drives our unique cognitive and social skills. Yes, chimpanzees are smart and cooperative to a degree, but we clearly outshine them when it comes to abstract thinking, self-regulation, assimilation of cultural knowledge, and reasoning abilities. Now a study that looks at postmortem brain samples from humans, chimpanzees, and macaques collected from before birth to up to the end of the life span for each of these species has found a key difference in the expression of genes that control the development and function of synapses, the connections among neurons through which information flows.

As researchers describe in a report published online today in Genome Research, they analyzed the expression of some 12,000 genes—part of the so-called transcriptome—from each species. They found 702 genes in the prefrontal cortex (PFC) of humans that had a pattern of expression over time that differed from the two other species. (The PFC plays a central role in social behavior, working toward goals, and reasoning.) By comparison, genes in the chimpanzee PFC at various life stages had only 55 unique expression patterns—12-fold fewer than found in humans.

The genes the researchers analyzed have myriad functions. But when the researchers created five modules that lumped together genes that were co-expressed, they found that the module in humans that’s most closely tied to synapse formation and function had a “drastically” different developmental trajectory. These genes were turned on high from just after birth until about 5 years of age; the same genes in chimpanzees and macaques began to stop expressing themselves shortly after birth. “We might have discovered one of the differences that makes human brains work differently from chimpanzees and macaques,” says lead researcher Philipp Khaitovich, an evolutionary biologist who works at both the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and the Chinese Academy of Sciences (CAS) in Shanghai, China.

The researchers, including Svante Pääbo of the Leipzig institute and Xiling Liu of CAS, went a step further and actually counted more than 7000 synapses visible in electron micrographs from the three species at different ages. They found that the number of synapses in macaques and chimpanzees skyrocketed shortly after birth but did not peak in humans until about 4 years of age. “Humans have much more time to form synaptic connections,” Khaitovich concludes.

In their analyses, the researchers factored in that humans have much longer life spans than the other species and develop and mature more slowly in general. Their findings still stood out, even when adjusting for this developmental delay.

The work builds on behavioral evidence that showed the advantages of a prolonged childhood, as well as several other studies that have found differences in chimpanzee and human genes involved with synapse formation and function. But no group has ever done such a thorough comparative, longitudinal analysis of the brain transcriptomes of these three species, says Todd Preuss, a neuroscientist at the Yerkes National Primate Research Center in Atlanta. “The whole thing is a technical tour de force,” Preuss says.

Nenad Sestan, a neurobiologist at Yale University who published a comprehensive analysis of the transcriptome of human brains from embryos to late adulthood in the 27 October 2011 issue of Nature, says the new work “is novel and provocative.” Sestan says to clarify differences between the species, the field now needs to examine more brain regions “to have a clearer idea of how specific this may be to the dorsolateral prefrontal cortex.”

The findings from Khaitovich and colleagues promise to spark future studies that address profound questions about everything from evolution to gene regulation. For example, they suggest in their report that the differences they found may also separate us from Neandertals, as evidence suggests that these extinct humans had faster cranial and dental development than modern humans.

Neurologist Eric Courchesne of the University of California, San Diego, says the new findings also mesh with his own studies of autism and brain overgrowth. Courchesne has found that the brains of autistic children grow more quickly than normal, which he theorizes prevents them from having enough experiences to properly wire neurons. “This is an absolutely fascinating study that will have great importance for advancing understanding of human disorders of early brain development as well as illuminating the evolutionary changes in neural development,” Courchesne says.

Source: ScienceNOW


Crocodile tears syndrome after vestibular schwannoma surgery.


Crocodile tears syndrome (CTS) is a lacrimal hypersecretion disorder characterized by excessive tearing with gustatory stimulation while eating, drinking, or smelling food. Surgeons tend to overlook CTS after vestibular schwannoma (VS) surgery because its symptoms are less obvious compared with facial paralysis. The authors aim to elucidate the precise incidence and the detailed natural course of CTS after VS surgery.

Methods

This study included 128 consecutive patients with unilateral VSs resected via a retrosigmoid, lateral suboccipital approach. Clinical information on the patients was obtained by retrospective chart review. The presence, time of onset, and recovery from CTS were obtained from the chart or evaluated at the most recent outpatient visit.

Results

A total of 14 patients (10.9%) developed CTS. Motor function of the facial nerve at discharge was statistically related to the occurrence of CTS (p < 0.001). The odds ratio of House-Brackmann Grade 4 compared with Grade 1 was 86.4 (p < 0.001). A bimodal distribution of CTS onset was observed, with a mean onset of 6.1 ± 4.0 months after resection. The CTS improved in 10 patients (71%) at various intervals, whereas CTS resolved in only 7 patients (50%) at a mean interval of 10.9 ± 7.9 months. The mean interval to recovery was 9.7 ± 7.9 months in the early-onset group and 18 months in the late-onset group (means are given ± SD throughout).
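For readers unfamiliar with the statistic, an odds ratio such as the 86.4 above is computed from a 2 × 2 table of events and non-events. A minimal sketch with hypothetical counts, not the study's actual data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = events in the exposed group,  b = non-events in the exposed group,
    c = events in the control group,  d = non-events in the control group."""
    return (a / b) / (c / d)

# Hypothetical counts chosen only to illustrate the arithmetic:
# e.g., 6 of 8 high-grade patients vs. 2 of 60 low-grade patients with CTS
large_or = odds_ratio(6, 2, 2, 58)
```

An odds ratio far above 1, as in this hypothetical example, flags a strong association between facial nerve dysfunction and developing CTS.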

Conclusions

The occurrence of CTS following VS surgery was more common than expected; however, surgery that preserves facial nerve function appears to reduce the occurrence of CTS. To reduce the distress caused by CTS, all patients should be given sufficient information and provide informed consent prior to surgery.

Source: Journal of Neurosurgery

Utility of presurgical navigated transcranial magnetic brain stimulation for the resection of tumors in eloquent motor areas.


Navigated transcranial magnetic stimulation (nTMS) is a newly evolving technique. Despite its supposed purpose (for example, preoperative central region mapping), little is known about its accuracy compared with established modalities like direct cortical stimulation (DCS) and functional MR (fMR) imaging. Against this background, the authors performed the current study to compare the accuracy of nTMS with DCS and fMR imaging.

Methods

Fourteen patients with tumors in or close to the precentral gyrus were examined using nTMS for motor cortex mapping, as were 12 patients with lesions in the subcortical white matter motor tract. Moreover, preoperative fMR imaging and intraoperative mapping of the motor cortex were performed via DCS, and the outlining of the motor cortex was compared.

Results

In the 14 cases of lesions affecting the precentral gyrus, the primary motor cortex as outlined by nTMS correlated well with that delineated by intraoperative DCS mapping, with a deviation of 4.4 ± 3.4 mm between the two methods. In comparing nTMS with fMR imaging, the deviation between the two methods was much larger: 9.8 ± 8.5 mm for the upper extremity and 14.7 ± 12.4 mm for the lower extremity. In 13 of 14 cases, the surgeon admitted easier identification of the central region because of nTMS. The procedure had a subjectively positive influence on the operative results in 5 cases and was responsible for a changed resection strategy in 2 cases. One of 26 patients experienced nTMS as unpleasant; none found it painful.

Conclusions

Navigated TMS correlates well with DCS as a gold standard despite factors that are supposed to contribute to the inaccuracy of nTMS. Moreover, surgeons have found nTMS to be an additional and helpful modality during the resection of tumors affecting eloquent motor areas, as well as during preoperative planning.

Source: Journal of Neurosurgery

Low versus high haemoglobin concentration threshold for blood transfusion for preventing morbidity and mortality in very low birth weight infants.


Infants of very low birth weight often receive multiple transfusions of red blood cells, usually in response to predetermined haemoglobin or haematocrit thresholds. In the absence of better indices, haemoglobin levels are imperfect but necessary guides to the need for transfusion. Chronic anaemia in premature infants may, if severe, cause apnoea, poor neurodevelopmental outcomes or poor weight gain. On the other hand, red blood cell transfusion may result in transmission of infections, circulatory or iron overload, or dysfunctional oxygen carriage and delivery.

Objectives

To determine if erythrocyte transfusion administered to maintain low as compared to high haemoglobin thresholds reduces mortality or morbidity in very low birth weight infants enrolled within three days of birth.

Search strategy

Two review authors independently searched the Cochrane Central Register of Controlled Trials (The Cochrane Library), MEDLINE, EMBASE, and conference proceedings through June 2010.

Selection criteria

We selected randomised controlled trials (RCTs) comparing the effects of early versus late, or restrictive versus liberal erythrocyte transfusion regimes in low birth weight infants applied within three days of birth, with mortality or major morbidity as outcomes.

Data collection and analysis

Two review authors independently selected the trials.

Main results

Four trials, enrolling a total of 614 infants, compared low (restrictive) to high (liberal) haemoglobin thresholds. Restrictive thresholds tended to be similar, but one trial used liberal thresholds much higher than the other three. There were no statistically significant differences in the combined outcomes of death or serious morbidity at first hospital discharge (typical risk ratio (RR) 1.19; 95% confidence interval (CI) 0.95 to 1.49) or in component outcomes. Only the largest trial reported follow-up at 18 to 21 months corrected gestational age; in this study there was no statistically significant difference in a composite of death or adverse neurodevelopmental outcome (RR 1.06; 95% CI 0.95 to 1.19). One additional trial comparing transfusion for clinical signs of anaemia versus transfusion at a set level of haemoglobin or haematocrit, reported no deaths and did not address disability.
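A pooled risk ratio such as the reported RR 1.19 (95% CI 0.95 to 1.49) is computed from event counts in the two arms, with the confidence interval taken on the log scale. The sketch below uses hypothetical counts, not the trial data, and the standard Katz-method interval; a CI that crosses 1, as here, indicates no statistically significant difference.

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs. group B with a 95% CI computed
    on the log scale (Katz method)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts for illustration only (not the pooled trial data):
rr, lower, upper = risk_ratio_ci(60, 300, 50, 300)
# Here lower < 1 < upper, so this hypothetical difference would not be
# statistically significant, mirroring the review's main finding.
```
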

Authors’ conclusions

The use of restrictive as compared to liberal haemoglobin thresholds in infants of very low birth weight results in modest reductions in exposure to transfusion and in haemoglobin levels. Restrictive practice does not appear to have a significant impact on death or major morbidities at first hospital discharge or at follow-up. However, given the uncertainties of these conclusions, it would be prudent to avoid haemoglobin levels below the lower limits tested here. Further trials are required to clarify the impact of transfusion practice on long term outcome.


Plain language summary

Low versus high haemoglobin concentration threshold for blood transfusion for preventing morbidity and mortality in very low birth weight infants

Very premature infants are extremely vulnerable and often require intensive care to survive. Anaemia is a condition in which the blood does not contain enough haemoglobin, the component of red blood cells which carries oxygen around the body. These babies become anaemic very quickly due to blood sampling, and because they are unable to make blood cells quickly, the haemoglobin level in the blood falls rapidly in the weeks after birth. Generally, the treatment for anaemia is blood transfusion, and many of these babies receive multiple transfusions of blood. The decision to give a transfusion usually depends on the measured amount of haemoglobin in the blood.

Physicians looking after very premature infants are unsure as to the level of haemoglobin at which they should give a transfusion. As transfusion is the introduction of another person’s blood cells into the blood stream, there is a risk of infection and a risk of reaction to foreign blood components; the process requires careful monitoring and supervision to ensure safety. Some people find blood transfusion offensive or contrary to their religious values. Giving few or no transfusions reduces the risks of transfusion, but may result in low levels of haemoglobin and consequently a reduced supply of oxygen to the body which could have effects on survival, growth or development.

This review of five studies compares the effects of blood transfusion at low levels of haemoglobin to transfusion at high levels. Within the levels tested, there were no differences seen in survival, in the serious complications of prematurity, or in longer term development as measured at 18 to 21 months past the baby’s due date. Allowing the baby to become a little more anaemic did not affect the baby’s weight gain or breathing patterns. These conclusions are not firm, because too few babies have been studied. Our overall recommendation is not to exceed the higher levels of haemoglobin used in these trials, and thus diminish the risks of over-transfusion, but not to allow the level of haemoglobin to fall below the lower limits tested in these studies until further studies are completed.

Source: Cochrane Library


Earthquake-Proof Engineering for Skyscrapers.


A fun engineering endeavor from Science Buddies.

Have you ever wondered how tall skyscrapers can stand up so impressively to the force of gravity? But what about more violent forces, such as those produced by earthquakes? A well-planned and tested design, when combined with the right materials, can keep a building intact through all sorts of shakes and quakes.

Once the tallest buildings in the world, the Petronas Towers in Kuala Lumpur, the capital of Malaysia, stand at an amazing 452 meters tall. Because Malaysia is in an area that experiences frequent earthquake activity, the towers had to be designed to withstand the lateral shaking force that is experienced during a quake. Can you, as an amateur architect and engineer, design a structure that can withstand lateral movements similar to that of an earthquake?

Background
Architects and engineers must design buildings to withstand a variety of forces, some stronger than others, from many sources: gravity, people inside, weight of building materials, weather and environmental impacts. If the design is stable, these forces will not weaken the structure or cause it to collapse.

Lateral shaking is the force that can cause the most damage to a building during an earthquake. As you might have guessed from its name, this force usually acts in a direction parallel to the ground. Designing a building for lateral resistance helps not only against quake damage but also against other lateral forces, such as wind. Engineers can test how well a building will hold up to lateral force by placing a model of it on a “shake table,” which moves horizontally to replicate the stresses created by an earthquake.

Materials
•    LEGO bricks
•    Large, flat LEGO baseplate
•    Metric ruler
•    Three-ring binder (an old one that is okay to take apart)
•    Scissors
•    Four small rubber balls (each the same size, about 2.5 centimeters in diameter)
•    Two rubber bands (each about eight centimeters or longer when flattened and doubled on itself)

Preparation
•    Carefully cut the front and back covers off of the three-ring binder with scissors. (This might be a good task for an adult.)
•    Place the two binder covers on top of one another and “rubber band” them together by stretching a rubber band around each end, about 2.5 centimeters from the edge of the boards.
•    Insert the rubber balls between the boards at each corner, placing them about five centimeters in from the edges.
•    Attach a large, flat LEGO plate to the top by slipping the plate underneath the rubber bands. Your “shake table” is now ready to shake some towers!

Procedure
•    Practice creating a lateral shaking movement with the shake table by pulling its top layer horizontally out of alignment and then letting it go.
•    Gently try pulling the top layer as far out of alignment as you feel comfortable with (and without damaging the shake table) then measure the distance of displacement, which is the horizontal distance between the top and bottom layers.
•    Build four or more LEGO towers of increasing height on a nearby surface. Use the same base size and shape for each tower, so that the towers’ footprints are the same size and only their heights vary. What are the heights of the different towers?
•    One at a time, starting with the shortest tower and progressing to the taller ones, secure each LEGO tower in the center of the shake table’s top surface. To test each structure, create a lateral shaking movement using the same distance of displacement you previously measured. Did all, none or some of the towers fall? If some fell and others did not, what were the differences in height between these towers? In general, did the taller towers fall more frequently than the shorter ones?
•    Tip: If none of the towers fell, try testing this activity with taller towers and/or a smaller base size. If all of the towers fell, try testing this activity with shorter towers, a larger base that takes up more space and/or a smaller distance of shake-table displacement.
•    Extra: In this activity, you kept the footprint of each tower the same and only changed the height. Try testing LEGO towers with different-size bases. Do you think that by changing the footprint you could make taller buildings more stable? You could also calculate the area of the base (by multiplying its length times its width in centimeters) and divide this by the height for each tower to get the ratio of base to height. How do the towers’ base-to-height ratios compare with how they perform on your shake table?
•    Extra: Try building towers out of a different material that allows you to test different structural designs. Good materials are straws, popsicle sticks or toothpicks and marshmallows. Try comparing square designs to triangular or hexagonal designs. Try adding extra structural elements to your designs. Can you design a stabler tower? How tall can you build it before it loses stability?
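The base-to-height ratio described in the Extra step above can also be worked out with a short script. This is just a sketch: the tower dimensions below are made-up examples, not measurements from the activity.

```python
# Sketch: base-to-height ratios for hypothetical LEGO towers.
# All dimensions are invented example values, in centimeters.

def base_to_height_ratio(length, width, height):
    """Base area (length x width) divided by height, as described in the activity."""
    return (length * width) / height

towers = [
    {"name": "short tower",  "length": 8, "width": 8, "height": 10},
    {"name": "medium tower", "length": 8, "width": 8, "height": 20},
    {"name": "tall tower",   "length": 8, "width": 8, "height": 30},
]

for t in towers:
    ratio = base_to_height_ratio(t["length"], t["width"], t["height"])
    print(f"{t['name']}: base-to-height ratio = {ratio:.2f}")
```

A larger ratio means a wider base relative to height, which you would expect to correlate with better performance on the shake table.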

Observations and results
In general, did the taller towers fall whereas the shorter towers remained standing? If you varied the footprint of the towers, were the ones with larger footprints generally more stable than the ones with smaller footprints?

Structures that are tall or skinny are generally less stable, making them more likely to fall when exposed to lateral forces, whereas ones that are shorter or wider (at the base) are generally more stable. Architects and engineers use all kinds of innovative techniques along with these basic principles to build amazing skyscrapers. Building heights keep creeping upward as technology allows engineers to safely build higher.

One major technological breakthrough that allowed for the creation of skyscrapers in the late 1800s was the development of a material that was lighter and stronger than previously used materials: steel. Before this, buildings were mostly made of brick and stone. Architects and engineers designed skyscrapers with a steel framework that supported the building’s weight, which meant that the walls no longer had to be load-bearing (as they had previously been). This development, along with other innovative ideas and materials, allowed for the creation of skyscrapers—and as our technologies continue to improve, we are able to reach ever closer to the sky.

Source: Scientific American.


Admission and Repeat Head CT for Patients on Warfarin with Minor Head Injury?


This approach identifies most delayed intracranial bleeds, but whether it changes patient outcomes and justifies the increased resources is not clear.

Current U.S. guidelines recommend head computed tomography (CT) for all patients taking warfarin who experience minor head injury. In a prospective study at a single emergency department in Italy, researchers evaluated outcomes in 97 adults with minor head injury who were taking warfarin, had normal initial head CT scans, and were admitted for 24-hour observation and repeat head CT (consistent with European guidelines). Minor head injury was defined as any head trauma other than superficial facial injuries, with a Glasgow Coma Scale score of 14–15, regardless of the presence or absence of loss of consciousness.

Ten patients declined the repeat CT scan (all were asymptomatic at 30 days). Of the 87 remaining patients, 5 (6%) had lesions detected on repeat CT. Of these five patients, two were discharged because their lesions were considered inconsequential, and three were admitted, with one undergoing craniotomy for subdural hematoma. Only one of the five patients with positive repeat CT scans demonstrated new neurological symptoms during the observation period. Two patients who were discharged after normal repeat CT scans were readmitted with symptomatic (confusion, headache) delayed subdural hematoma; neither patient required surgery.

Four of the five patients with positive repeat CT scans and both patients who were readmitted with delayed subdural hematoma had international normalized ratios (INRs) >3; however, the study was not powered to determine the statistical significance of predictors of intracranial hemorrhage (ICH).

Comment: Although the authors recommend that all patients who suffer minor head injury while taking warfarin be admitted and undergo repeat scanning, they did not demonstrate that doing so improves outcomes. If we assume that the 10 asymptomatic patients who declined the repeat CT scan did not have delayed consequential ICH, then the rate of delayed bleeding requiring surgery is 1% (1 of 97) and does not justify a change in practice. The compelling finding that most patients with delayed ICH had supratherapeutic INRs suggests that such patients warrant more observation and repeat imaging.
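The rates quoted in this comment follow directly from the study's counts; a quick sketch reproduces the arithmetic (the variable names below are ours, the counts are from the study as reported above).

```python
# Reproduce the rates cited in the comment from the study's raw counts.
enrolled = 97          # adults on warfarin with minor head injury and normal initial CT
repeat_ct = 87         # patients who actually underwent the repeat CT (10 declined)
positive_repeat = 5    # lesions detected on repeat CT
needed_surgery = 1     # craniotomy for subdural hematoma

positive_rate = positive_repeat / repeat_ct    # ~0.057, reported as 6%
surgery_rate = needed_surgery / enrolled       # ~0.010, reported as 1%

print(f"Positive repeat CT: {positive_rate:.0%}")
print(f"Delayed bleed requiring surgery: {surgery_rate:.0%}")
```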

Source: Journal Watch Emergency Medicine