Benefits of Early Sepsis Interventions.


Use of the 6-hour sepsis resuscitation bundle was associated with reductions in mortality and hospital costs.

As part of a continuous quality improvement initiative, investigators analyzed data from 5 community and 6 tertiary hospitals in the U.S. that implemented an evidence-based, 6-hour sepsis resuscitation bundle for patients with suspected sepsis (a schematic sketch of the bundle's decision logic follows the list):

1) Measure serum lactate

2) Obtain blood cultures before antibiotic administration

3) Administer broad-spectrum antibiotic within 3 hours of emergency department (ED) admission and within 1 hour of non-ED intensive care unit admission

4) In the event of hypotension, serum lactate ≥4 mmol/L, or both:

  • Administer an initial minimum of 20 mL/kg of crystalloid or an equivalent
  • If hypotension does not respond to initial fluid resuscitation, administer vasopressors to maintain mean arterial pressure >65 mm Hg

5) In the event of persistent hypotension despite fluid resuscitation (septic shock), serum lactate ≥4 mmol/L, or both:

  • Achieve central venous pressure ≥8 mm Hg
  • Achieve central venous oxygen saturation ≥70%
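Read as a decision procedure, the bundle amounts to a few threshold checks. The sketch below is only an illustrative encoding of the steps listed above, with invented function and parameter names; it is a schematic aid for reading the bundle, not clinical guidance.

    # Illustrative encoding of the 6-hour sepsis resuscitation bundle described above.
    # Schematic only: names and structure are invented for clarity, not clinical guidance.
    def sepsis_bundle_actions(lactate_mmol_per_l, hypotensive,
                              hypotension_persists_after_fluids=False,
                              cvp_mm_hg=None, scvo2_percent=None):
        """Return the bundle steps indicated by the supplied measurements (illustrative)."""
        actions = [
            "Measure serum lactate",
            "Obtain blood cultures before giving antibiotics",
            "Give broad-spectrum antibiotics (within 3 h of ED admission, 1 h of non-ED ICU admission)",
        ]
        if hypotensive or lactate_mmol_per_l >= 4.0:
            actions.append("Give an initial minimum of 20 mL/kg of crystalloid (or equivalent)")
            if hypotension_persists_after_fluids:
                actions.append("Start vasopressors to keep mean arterial pressure > 65 mm Hg")
        if hypotension_persists_after_fluids or lactate_mmol_per_l >= 4.0:
            if cvp_mm_hg is None or cvp_mm_hg < 8:
                actions.append("Achieve central venous pressure >= 8 mm Hg")
            if scvo2_percent is None or scvo2_percent < 70:
                actions.append("Achieve central venous oxygen saturation >= 70%")
        return actions

    # Example: hypotensive patient, lactate 5.2 mmol/L, not responding to the initial fluid bolus.
    for step in sepsis_bundle_actions(5.2, hypotensive=True,
                                      hypotension_persists_after_fluids=True,
                                      cvp_mm_hg=5, scvo2_percent=60):
        print(step)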

In an analysis of 952 patients treated before and 4109 patients treated after implementation of the bundle at 8 hospitals, in-hospital mortality was significantly higher in the before group (43% vs. 29%). In an analysis of 1294 patients treated at 3 different hospitals after implementation of the bundle, mortality was significantly higher among the 602 patients who did not receive the full bundle than among the 692 patients who did (42% vs. 27%). In a combined analysis, hospital stay was 5.1 days shorter and 24-hour APACHE-II and SOFA scores were significantly lower in the treatment group (the 4109 patients treated after implementation and the 692 patients who received the full bundle) than in the control group (the 952 patients treated before implementation and the 602 patients who did not receive the full bundle).

The authors conclude that use of the sepsis resuscitation bundle is associated with significant reductions in hospital length of stay and mortality, with one life saved for every seven patients treated.
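The "one life saved for every seven patients treated" figure is the number needed to treat (NNT), the reciprocal of the absolute risk reduction. A rough back-of-the-envelope check against the group sizes and rounded mortality percentages quoted above (not the authors' exact calculation):

    # Pooled mortality in the control group (before implementation + incomplete bundle)
    control_deaths = 0.43 * 952 + 0.42 * 602
    control_rate = control_deaths / (952 + 602)      # about 0.43

    # Pooled mortality in the treatment group (after implementation + full bundle)
    treated_deaths = 0.29 * 4109 + 0.27 * 692
    treated_rate = treated_deaths / (4109 + 692)     # about 0.29

    arr = control_rate - treated_rate                # absolute risk reduction, about 0.14
    nnt = 1 / arr                                    # number needed to treat, about 7
    print(f"ARR = {arr:.2f}, NNT = {nnt:.1f}")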

Comment: This observational study supports implementation of a sepsis resuscitation bundle in both community and tertiary hospitals and broad application of the bundle to all patients with suspected sepsis.

Source: Journal Watch Emergency Medicine

 

Glyphosate in Monsanto’s Roundup Found in All Urine Samples Tested.


A recent study conducted by a German university found very high concentrations of glyphosate, a carcinogenic chemical found in herbicides like Monsanto’s Roundup, in all urine samples tested. The amounts were staggering: every sample contained glyphosate at 5 to 20 times the limit established for drinking water. This is just one more piece of evidence that herbicides are, at the very least, being sprayed out of control.

Glyphosate in Monsanto’s Roundup Impacting Global Health

This news comes only one month after it was found that glyphosate, contained in Monsanto’s Roundup, is contaminating the groundwater in the areas in which it is used. What does this mean? It means that toxic glyphosate is now polluting the world’s drinking water through the widespread contamination of aquifers, wells and springs. The recent reports of glyphosate showing up in all urine samples only enhances these past findings.

Monsanto continues to claim that its Roundup products are completely safe for both animals and humans. However, many environmentalists, scientists, activists, and even doctors say otherwise. Glyphosate radically disrupts plant metabolism: it is a systemic poison that prevents the formation of essential amino acids, leaving plants weakened until they ultimately die.

A formula seems to have been made that not only ruins the agricultural system but also compromises the health of millions of people worldwide. With the advent of Monsanto’s Roundup Ready crops, resistant superweeds are taking over farmland and public health is under attack. As it turns out, glyphosate also leaves residue behind on Roundup Ready crops, causing further potential concern for public health. Glyphosate is even contributing to escalating rates of mental illness and obesity by depleting the beneficial gut flora that directly regulates these functions. But it certainly doesn’t stop there.

Researchers tested Roundup on mature male rats at concentrations ranging from 1 to 10,000 parts per million (ppm) and found that, within 1 to 48 hours of exposure, testicular cells of the mature rats were either damaged or killed. Even at a concentration of 1 ppm, Roundup decreased the test subjects’ testosterone concentrations by as much as 35%.


 

Explore More:

  1. Monsanto’s Roundup is Causing DNA Damage
  2. Monsanto’s Carcinogenic Roundup Herbicide Contaminating Water Supply
  3. Monsanto’s Roundup Shown to be Ravaging Butterfly Population
  4. Causes of Water Pollution – GMO Farming, Glyphosate Big Contributors
  5. Monsanto’s Best-Selling Herbicide Roundup Linked to Infertility
  6. Monsanto’s Roundup Continuously Shown to Cause Birth Defects

 

Source: http://naturalsociety.com

 

 

 

 

Ten Inventions Inspired by Science Fiction.


The innovators behind objects like the cellphone or the helicopter took inspiration from works like “Star Trek” and War of the Worlds

 

Submarine

Known as the father of the modern submarine, American inventor Simon Lake had been captivated by the idea of undersea travel and exploration ever since he read Jules Verne’s Twenty Thousand Leagues Under the Sea in 1870. Lake’s innovations included ballast tanks, divers’ compartments and the periscope. His company built the Argonaut—the first submarine to operate successfully in the open ocean, in 1898—earning him a congratulatory note from Verne.
Helicopter

While Jules Verne is perhaps most famous for his fictional submarine, the Nautilus, the French author also envisioned the future of flight. Igor Sikorsky, inventor of the modern helicopter, was inspired by a Verne book, Clipper of the Clouds, which he had read as a young boy. Sikorsky often quoted Jules Verne, saying “Anything that one man can imagine, another man can make real.”
Rocket

Robert H. Goddard, the American scientist who built the first liquid-fueled rocket—which he successfully launched on March 16, 1926—became fascinated with spaceflight after reading an 1898 newspaper serialization of H.G. Wells’ classic novel about a Martian invasion, War of the Worlds. As Goddard would recall later, the concept of interplanetary flight “gripped my imagination tremendously.”
Atomic Power

In 1914, H.G. Wells published a novel, The World Set Free, imagining the emergence of “artificial” atomic energy by 1933, followed by a devastating world war and the eventual emergence of a peaceful global government. Physicist Leo Szilard read the book in 1932, which inspired him to solve the problem of creating a nuclear chain reaction—in 1933. The same book would inspire Szilard to campaign for arms control and the peaceful, international use of nuclear power after World War II.
Combat Information Center

In the 1930s and ’40s, E.E. “Doc” Smith delighted readers with his “Lensman” novels, chronicling the adventures of a futuristic Galactic Patrol. In a 1947 letter, sci-fi editor John W. Campbell informed Smith that the Directrix—a command ship featured in his series—had inspired a U.S. naval officer to introduce the concept of combat information centers aboard warships.
The Waldo

In 1942, famed sci-fi author Robert Heinlein published a short story about a physically infirm inventor, Waldo F. Jones, who created a remotely operated mechanical hand. Real-life manipulator arms that were developed for the nuclear industry in the mid-1940s were named “waldos,” in recognition of Heinlein’s innovative idea.
Cellphone

Martin Cooper, the director of research and development at Motorola, credited the “Star Trek” communicator as his inspiration for the design of the first mobile phone in the early 1970s. “That was not fantasy to us,” Cooper said, “that was an objective.”
Taser

One of the most famous literary characters of the early 20th century was Tom Swift, a genius inventor who was the protagonist in a series of juvenile science fiction books. NASA physicist Jack Cover, who invented the Taser, was a fan—“Taser” is an acronym for one of Swift’s fictional inventions, the “Thomas A. Swift’s Electric Rifle.”
QuickTime

Apple scientist Steve Perlman says that he got the idea for the groundbreaking multimedia program QuickTime after watching an episode of “Star Trek: The Next Generation,” wherein one of the characters is listening to multiple music tracks on his computer.

Second Life

Neal Stephenson’s 1992 novel Snow Crash describes a fully immersive online “Metaverse” where people interact with one another through representations called “avatars.” Philip Rosedale, the inventor of the once popular online community Second Life, had been toying with the idea of virtual worlds since college, but credits Snow Crash for painting “a compelling picture of what such a virtual world could look like in the near future, and I found that inspiring.”

Source: http://www.smithsonianmag.com

 

 

 

Tuberculosis, Drug Resistance, and the History of Modern Medicine.


Tuberculosis is a treatable airborne infectious disease that kills almost 2 million people every year. Multidrug-resistant (MDR) tuberculosis — by convention, a disease caused by strains of Mycobacterium tuberculosis that are resistant to isoniazid and rifampin, the backbone of first-line antituberculosis treatment — afflicts an estimated 500,000 new patients annually. Resistance to antituberculosis agents has been studied since the 1940s; blueprints for containing MDR tuberculosis were laid out in the clinical literature and in practice, in several settings, more than 20 years ago.1,2 Yet today, barely 0.5% of persons with newly diagnosed MDR tuberculosis worldwide receive treatment that is considered the standard of care in the United States.3 Those who have not received appropriate treatment continue to fuel a global pandemic that now includes strains resistant to most — and by some accounts all — classes of drugs tested.4,5 Despite the enormity of the threat, investments to contain the epidemic and to cure infected patients have been halting and meager when compared, for example, with those made to address the acquired immunodeficiency syndrome (AIDS) pandemic. In this essay we seek to elucidate the reasons for the anemic response to drug-resistant tuberculosis by examining the recent history of tuberculosis policy.

Research in Tuberculosis — Midwife of Modern Biomedicine

On the evening of March 24, 1882, when Robert Koch completed his presentation on the infectious cause of tuberculosis, silence enveloped the crowded room at the Berlin Physiological Society.6 A means of combating tuberculosis — a disease that in the 19th century caused, by some accounts, about 25% of all deaths in Massachusetts and New York and claimed the lives of one fourth of Europe’s population — was now within reach.7 Koch summarized the importance of his findings, for which he received the 1905 Nobel Prize, in a manuscript published in the Berliner Klinische Wochenschrift shortly after his announcement: “In the future the fight against this terrible plague of mankind will deal no longer with an undetermined something, but with a tangible parasite, whose living conditions are for the most part known and can be investigated further.”8

But therapy lagged. It was not until 60 years later, in 1943, that the first effective antituberculosis agent, streptomycin, was isolated in the laboratory of Selman Waksman at Rutgers University (see timeline, available with the full text of this article at NEJM.org). In November 1944, a patient with tuberculosis received streptomycin and was declared cured of the disease.6 Other cases of successful treatment soon followed.9,10 The British Medical Research Council conducted the first large-scale clinical trial of streptomycin in 1948.11 This study, said to be the world’s first published drug trial that involved the randomization of participants, set the methodologic standard for modern randomized, controlled trials. Although many patients were cured, a substantial proportion had a relapse; mycobacterial isolates cultured from the latter patients showed resistance to streptomycin.12 That same year, two new antituberculosis agents, thiacetazone and para-aminosalicylic acid, came on the market. When either of these agents was administered with streptomycin, cure rates rose and acquired antibiotic resistance declined.13 In 1951, isonicotinic acid hydrazide (isoniazid) was tested at Sea View Hospital in New York; it dramatically improved clinical outcomes and was soon introduced for wider use.14 Isoniazid was followed by the development of pyrazinamide (1952), cycloserine (1952), ethionamide (1956), rifampin (1957), and ethambutol (1962).

With its high level of efficacy and ease of administration, rifampin revolutionized the treatment of tuberculosis.15-17 But the advent of every new drug led to the selection of mutations conferring resistance to it. Resistance to rifampin was observed soon after it was first administered.18 Laboratory data from trials revealed the rapid onset of isoniazid resistance among patients receiving monotherapy and the suppression of resistance when isoniazid was given in combination with streptomycin or para-aminosalicylic acid.19 These observations led to the use of multidrug treatment regimens — a strategy widely used today to treat a variety of infectious diseases and cancers. Ultimately, through a series of multicountry clinical trials led by the British Medical Research Council, a four-drug regimen was recommended for use in patients with newly diagnosed tuberculosis. The backbone of such empirical regimens was the combination of isoniazid and rifampin, the most effective and reasonably well-tolerated oral agents, given for 6 to 8 months. Thus, short-course chemotherapy was born.19

Drug resistance, however, has remained a challenge. The early hypothesis that resistance always conferred a loss of bacterial fitness, and hence led to lower case fatality rates and decreased transmission of such strains, had been disproved by the 1950s.19 The first national drug-resistance survey in the world, which involved 974 clinical isolates cultured from newly diagnosed cases of tuberculosis in Britain (1955–1956), showed strains that were resistant to streptomycin (2.5%), para-aminosalicylic acid (2.6%), and isoniazid (1.3%).20 Similarly, data from the United States showed that isoniazid resistance increased from 6.3% (between 1961 and 1964) to 9.7% (between 1965 and 1968) among patients with newly diagnosed tuberculosis.21 Between 1970 and 1990, there were numerous outbreaks of drug-resistant tuberculosis involving strains resistant to two or more drugs.17,22,23 As early as 1970, an outbreak in New York City of highly virulent tuberculosis that was resistant to multiple drugs proved to be a grim reminder that resistance did not necessarily reduce a microbe’s fitness: the index patient died; 23 of 28 close contacts had evidence of new infection, and active, drug-resistant disease developed in 6 of these 23 contacts, 5 of whom were children.21

Tuberculosis, whether caused by drug-susceptible or drug-resistant strains, rarely made even medical headlines, in part because its importance as a cause of death continued to decline in areas in which headlines are written. In such settings, where many of the social determinants of tuberculosis — extreme poverty, severe malnutrition, and overcrowded living conditions — became the exception rather than the norm, some public health experts declared that “virtual elimination of the disease as a public health problem” was in sight.24 In the United States, federal funding for tuberculosis research was cut; consequently, drug discovery, development of diagnostics, and vaccine research ground almost to a halt.17

The Great Divergence in Tuberculosis Policy

Optimism that tuberculosis would soon be eliminated was not restricted to wealthy countries. At the 1978 International Conference on Primary Health Care in Alma-Ata (now called Almaty), Kazakhstan, delegates from around the world endorsed the goal of “health for all by the year 2000.” The eradication of smallpox had been announced the previous year, and the future of international public health looked promising to many who were gathered there.

But it was not to be. By the mid-20th century, tuberculosis outcomes had diverged along the fault lines of the global economy: while tuberculosis became rare in countries where income was high, epidemics of the disease raged on in low-income settings. In 1982, the Mexican government defaulted on many of its loan payments, triggering a debt crisis in many countries with weak economies. Increasing numbers of international health donors and policymakers, slow to contribute resources toward the ambitious Alma-Ata agenda, embraced the idea of selective primary health care: discrete, targeted, and inexpensive interventions.25,26 Bilateral assistance withered, and poor countries became increasingly reliant on loans from international financial institutions such as the World Bank, which based its health agenda on the principles of “cost-effectiveness” and “affordable health for all” — the latter concept a nod to the Alma-Ata Declaration.27

Selective primary health care offered clear targets, measurable outcomes, and a high return on health investments, all of which appealed to donors worried about investing in countries that were on the brink of default.28,29 But several leading causes of disability and death, including tuberculosis, were deemed too costly and complex to address in resource-poor settings and were largely excluded from the emerging, constricted agenda for effective health investments. “Leprosy and tuberculosis require years of drug therapy and even longer follow-up periods to ensure cure,” wrote two of the architects of selective primary health care in 1979. “Instead of attempting immediate, large-scale treatment programs for these infections, the most efficient approach may be to invest in research and development of less costly and more efficacious means of prevention and therapy.”25

But tuberculosis, which persisted in settings of poverty, could not be hidden away for long. In 1993, the World Bank began to use disability-adjusted life-years — a means of measuring the “cost-effectiveness” of a given health intervention that took into account morbidity, mortality, and age — to determine which health interventions to support.30 As a result of this new economic calculus, short-course chemotherapy for tuberculosis was declared a highly “cost-effective” intervention and gained momentum.31 Seizing the opportunity, the World Health Organization (WHO) shaped and promoted the DOTS (directly observed therapy, short-course) strategy, an approach that conformed to the selective primary health care agenda: simple to treat, algorithmic, and requiring no expensive inputs. According to this strategy, the diagnosis was to be made with the use of smear microscopy alone — in spite of the insensitivity and inability of this technique to detect drug resistance — and the treatment approach was to be based on the empirical use of first-line antituberculosis agents only.32 Facility-based infection control was not part of the DOTS strategy. Despite these exclusions, DOTS was an important development in global tuberculosis policy. Increasingly, poor countries began implementing the DOTS approach; many lives were saved and many new cases averted. However, for children with tuberculosis, people with both tuberculosis and advanced disease from the human immunodeficiency virus (HIV), and the increasing proportion of patients infected with strains of tuberculosis that were already drug-resistant, the DOTS strategy provided limited options for prompt diagnosis and cure.

The Emergence of MDR Tuberculosis Globally

These shifts in tuberculosis policy — linked to the reconceptualization of this leading infectious killer of young adults and children from a disease deemed to be costly and difficult to treat to a disease deemed to be “cost-effective” to treat and slated for eradication — convey precisely what is meant by the “social construction of disease.”33 M. tuberculosis did not conform to the regnant disease-control strategy, and resistant strains continued to emerge and to be transmitted because empirical treatment with first-line antituberculosis drugs was ineffective for those sick with strains resistant to these drugs. HIV infection fanned epidemics of tuberculosis. In the late 1980s and early 1990s, outbreaks of MDR tuberculosis were again reported in the United States.17 Genetic analysis of drug-resistant strains showed that airborne transmission of undetected and untreated strains played a major role in these outbreaks, disabusing practitioners of the notion that resistance stemmed solely from “sporadic pill taking.”17,34 Public health officials developed a national action plan to combat drug-resistant tuberculosis and to increase funding for relevant research.17,35-37 The experience in New York City offered a blueprint that was quite different from the DOTS strategy; it consisted of diagnosis with the use of mycobacterial culture and fast-track drug-susceptibility testing, access to second-line antituberculosis medications, proper infection control, and delivery of medications under direct observation.1

Outbreaks of MDR tuberculosis in the United States were a harbinger of the coming global pandemic. By the early-to-mid-1990s, MDR tuberculosis had been found wherever the diagnostic capacity existed to reveal it. But in contrast to the U.S. strategy, the WHO — the principal standard-setting body for many countries — continued to advocate the use of sputum-smear microscopy and first-line antituberculosis treatment alone for combating epidemics in resource-poor settings. Some international policymakers thought that treating MDR tuberculosis would be too expensive and complex — claims similar to those made about treating drug-susceptible tuberculosis before this approach was found to be “cost-effective” — and would distract attention from the newly branded (and often successful) DOTS strategy.38 Contemporaneous experience in the United States and in several countries in the former Soviet Union suggested, however, that short-course chemotherapy was ineffective against strains shown to be resistant to precisely those drugs on which such therapy was based.1,17,39,40

The Limits of Short-Course Chemotherapy

The failure of short-course chemotherapy against MDR tuberculosis, though unsurprising clinically, was difficult politically. In Peru, for example, a campaign to promote the DOTS strategy had been so successful in making short-course chemotherapy available that the country’s leaders elevated it as a point of national pride. Peru emerged as a crucible for debates about the treatment and management of MDR tuberculosis in poor countries.2 In 1995, an outbreak in a shantytown in the northern reaches of Lima was identified.41 Many patients were infected with strains found to have broad-spectrum resistance to first-line drugs. Nongovernmental organizations worked with the Peruvian Health Ministry to apply the standard-of-care treatment used in New York City and elsewhere in the United States. The strategy was modified to provide community-based care, with good results.42 After arguing that the DOTS strategy alone could rein in the mutant bacteria, the WHO and other international public health authorities advised the Peruvian government to adopt a low-cost, standardized regimen for the treatment of MDR tuberculosis rather than protocols based on the results of drug-susceptibility testing. In the absence of tailored therapy, many hundreds of deaths occurred among some of Lima’s poorest people.43 As expected, amplification of drug resistance was documented.44,45

By the end of the 1990s, facing mounting evidence that MDR tuberculosis could be treated effectively in resource-poor settings,46,47 a multi-institutional mechanism — the Green Light Committee — was created to encourage and learn from pilot projects for treating MDR tuberculosis.2,17,48 This coincided with a grant from the Bill and Melinda Gates Foundation to scale up treatment of MDR tuberculosis in Peru and elsewhere and to change global policy.

Tuberculosis Policy and Global Health Equity

Drug resistance is well established as an inevitable outcome of antibiotic use; the fault lines of the MDR tuberculosis pandemic are largely man-made. The contours of global efforts against tuberculosis have always been mediated by both biologic and social determinants, and the reasons for the divergence in the rates of tuberculosis and drug resistance between rich and poor countries are biosocial.49 As case rates dropped in wealthy countries, funding for research and implementation programs dried up, even though tuberculosis remained the world’s leading infectious killer of young adults throughout the 20th century. Tuberculosis “control” in the 1990s was defined by the legacy of selective primary health care: targeted, “cost-effective” interventions packaged together, in the case of tuberculosis, as the DOTS strategy. Such protocols helped standardize tuberculosis treatment around the world — a process that was sorely needed — but they hamstrung practitioners wishing to address diagnostic and therapeutic complexities that could not be addressed by the use of sputum-smear microscopy and short-course chemotherapy or other one-size-fits-all approaches. These complexities, which now range from pan-resistant tuberculosis to undiagnosed pediatric disease, account for more than a trivial fraction of the 9 million new cases of tuberculosis and the almost 2 million deaths from this disease that occur around the globe each year.

The history of divergent policies for combating drug-resistant tuberculosis shows that decades of clinical research and effective programs in high-income settings did not lead to the deployment of similar approaches in settings of poverty. Achieving that goal demands a commitment to equity and to health care delivery.50 The U.S. response to the outbreaks of MDR tuberculosis in New York City and elsewhere was bold and comprehensive; it was designed to halt the epidemic.1,17 A similar response has not yet been attempted in low- and middle-income countries. Instead, selective primary health care and “cost-effectiveness” have shaped an anemic response to the ongoing global pandemic.

New diagnostics and therapeutics are urgently needed; most of the methods used currently were developed decades ago. Today, we have rapid nucleic acid–based tests for drug-resistant tuberculosis, sound models for laboratory expansion and for treatment delivery, and several drug candidates in the pipeline. To tackle tuberculosis, we also need an equity plan that takes seriously the biosocial complexity of a lethal airborne infection that has stalked us for centuries. The global AIDS effort of the past decade has shown how much can be accomplished in global health when effective diagnosis and care are matched with funding and political will. Stinting on investments or on bold action against tuberculosis — in all its forms — will ensure that it remains a leading killer of people living in poverty in this decade and the next.

Source Information

From the Program in Infectious Disease and Social Change, Department of Global Health and Social Medicine, Harvard Medical School; the Division of Global Health Equity, Brigham and Women’s Hospital; and Partners in Health — all in Boston.

Source: NEJM

 

 

Brain Amyloid Imaging — FDA Approval of Florbetapir F18 Injection.


The Centers for Disease Control and Prevention recently estimated that more than 16 million Americans are living with cognitive impairment.1 Cognitive impairment can be ascribed to a variety of disorders, some of which can be treated (e.g., severe depression or effects of medications) but others of which may signal the development of incurable dementias, such as Alzheimer’s disease. For these reasons, the development and improvement of diagnostic procedures — and neuroimaging procedures, in particular — that aid in characterizing cognitive impairment is a health care priority. Improved diagnostic evaluation of patients with cognitive impairment may also enhance the development of therapies, since reliable diagnoses are critical to the success of clinical trials.

Recently, the Food and Drug Administration (FDA) approved a new radiopharmaceutical agent to assist clinicians in detecting causes of cognitive impairment other than Alzheimer’s disease. Florbetapir F18 injection (Amyvid, Eli Lilly) is indicated for positron-emission tomographic (PET) imaging of the brain in cognitively impaired adults undergoing evaluation for Alzheimer’s disease and other causes of cognitive decline.2 Florbetapir binds to amyloid aggregates in the brain, and the florbetapir PET image is used to estimate the density of β-amyloid neuritic plaque. As a component of a comprehensive diagnostic evaluation, the finding of a “negative” florbetapir scan (as qualified below) should intensify efforts to find a non–Alzheimer’s disease cause of cognitive decline. Florbetapir brain imaging is a new type of nuclear medicine imaging, and the interpretation of the image requires special training. The unique features of the imaging information also require careful consideration when the scan results are integrated into a diagnostic evaluation.

Although the pathophysiological consequences of accumulation of β-amyloid in the brain are uncertain, neuropathological identification of amyloid plaques, typically at autopsy, has long been recognized as essential to confirming the diagnosis of Alzheimer’s disease. Because β-amyloid plaques in the brain have been described as a “hallmark” of Alzheimer’s disease, some clinicians may regard the florbetapir scan as a new test for the disease.3 But the drug was developed exclusively to estimate the density of β-amyloid neuritic plaque in the brain, and these plaques have been detected in patients with a variety of neurologic disorders, as well as in older people with normal cognition (see Florbetapir F18 Scan Usage: Information Summary).

Florbetapir is an 18F-labeled ligand that, in nonclinical studies, was shown to bind to β-amyloid aggregates in postmortem sections of human brains and in brain homogenates.4 In the main clinical studies supporting FDA approval, the accuracy of florbetapir scans was assessed in the brains of terminally ill patients who participated in a brain-donation program. The patients, who had a range of underlying cognitive function, underwent florbetapir scans and were followed until they died. The premortem scan results were subsequently compared with the brain autopsy findings. In all the clinical studies, the florbetapir scans were independently interpreted by multiple readers who had completed training in interpreting florbetapir images.

A binary method of interpretation was developed for relating “positive” or “negative” florbetapir scans to neuropathologically defined categories of density of β-amyloid neuritic plaque. The method designated a positive florbetapir scan as categorically indicative of “moderate to frequent” β-amyloid neuritic plaques, as defined by the consensus criteria for Alzheimer’s disease neuropathology established by the National Institute on Aging. In 59 patients who underwent florbetapir scans and autopsy, scan sensitivity for the detection of moderate to frequent β-amyloid neuritic plaques was 92% (range, 69 to 95), and scan specificity was 95% (range, 90 to 100), on the basis of the median assessment among five readers (ClinicalTrials.gov number, NCT01447719).
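For readers less familiar with these measures, sensitivity and specificity come from a two-by-two comparison of scan reads against the autopsy reference standard. The sketch below uses hypothetical counts chosen only to illustrate the definitions; they are not the actual data from the 59-patient florbetapir autopsy cohort.

    # Sensitivity and specificity from a hypothetical 2x2 table of scan reads vs. autopsy findings.
    # These counts are illustrative only, not data from the florbetapir studies.
    true_positive = 46    # positive scan, moderate-to-frequent plaques at autopsy
    false_negative = 4    # negative scan, moderate-to-frequent plaques at autopsy
    true_negative = 19    # negative scan, sparse-to-no plaques at autopsy
    false_positive = 1    # positive scan, sparse-to-no plaques at autopsy

    sensitivity = true_positive / (true_positive + false_negative)   # fraction of plaque-positive brains detected
    specificity = true_negative / (true_negative + false_positive)   # fraction of plaque-negative brains read negative
    print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")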

One of the challenges of the florbetapir clinical development program was that terminally ill patients are not representative of the population that is likely to undergo florbetapir scanning in medical practice. In addition, β-amyloid content could change between the time of live brain imaging and the time of autopsy. More than 20% of autopsies in the main clinical studies were performed more than a year after the live brain imaging (NCT01447719 and NCT01550549).

To evaluate scan reliability in a wider population, a clinical study had new readers examine images from non–terminally ill patients with Alzheimer’s disease or mild cognitive impairment, as well as persons with normal cognition. The previously obtained images from autopsied patients were also included in the study (NCT01550549). Among five readers who interpreted images from the 151 subjects, the kappa score for interrater reliability was 0.83 (95% confidence interval, 0.78 to 0.88), with the lower bound of the 95% confidence interval exceeding the prespecified reliability success criterion of 0.58. For the autopsy subgroup of 59 subjects, the median scan sensitivity was 82% (range, 69 to 92), and the median scan specificity was 95% (range, 90 to 95) for the five new readers.
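For reference, kappa quantifies agreement beyond what would be expected by chance. In the two-rater (Cohen) form, with observed agreement p_o and chance-expected agreement p_e,

    \kappa = \frac{p_o - p_e}{1 - p_e}

so a value of 1 indicates perfect agreement and 0 indicates agreement no better than chance. With five readers, a multi-rater generalization (such as Fleiss’ kappa) would typically be used, but the interpretation of the 0.83 reported above is the same: agreement well beyond chance.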

Clinical and nonclinical studies verified that florbetapir scans can provide neuropathologically accurate and reliable estimations of the density of β-amyloid neuritic plaque in the brain. Nevertheless, as with other imaging methods, there is potential for clinical interpretive error. In the studies of scan accuracy, such errors were uncommon but when present were due mainly to false negative results, as determined by the density of β-amyloid neuritic plaque at autopsy.

Reader training was an especially important element in the clinical development of florbetapir, because the image-interpretation process differs markedly from that typically used in nuclear medicine. For example, the image reader must be proficient in distinguishing white from gray matter, a distinction that may be particularly challenging in patients with cortical atrophy. Unique “gray–white contrast” characteristics of florbetapir images must be recognized as signals of normal or abnormal isotope distribution (see figure: Typical Negative and Positive Florbetapir Scans). In addition, cognitive status and other clinical or diagnostic information are not considered during the interpretation of florbetapir images. The sole goal of the reader is to determine whether a scan is negative or positive, and this determination should be made only by readers who have completed the sponsoring company’s dedicated training program. The success of the reader-training process will be further evaluated in a postmarketing study of image interpretations performed under the typical conditions of clinical practice.

In approving florbetapir, the FDA did not require clinical data assessing the effect of florbetapir imaging on clinical management or patients’ health. The FDA code of regulations (in 21 CFR 315.5[a]) mandates that the effectiveness of a diagnostic radiopharmaceutical agent should be determined by an evaluation of the ability of the agent to provide useful clinical information related to the proposed indications for use. FDA guidance further recognizes that imaging information may in some instances “speak for itself” with respect to clinical value5 and that diagnostic approval may therefore not require assessment of the effects on clinical management or health outcomes. Two FDA advisory committees endorsed the implicit clinical value of information obtained from brain β-amyloid imaging. Florbetapir approval was based on this endorsement and on clinical data showing sufficient scan reliability and performance characteristics.2

The ultimate clinical value of florbetapir imaging awaits further studies to assess the role, if any, that it plays in providing prognostic and predictive information. For example, the prognostic usefulness of florbetapir imaging in identifying persons with mild cognitive impairment or cognitive symptoms who may be at risk for progression to dementia has not been determined. Nor are data available to determine whether florbetapir imaging could prove useful for predicting responses to medication. These concerns prompted the FDA to require a specific “Limitations of Use” section in the florbetapir label.

The FDA approval of florbetapir F18 injection sets the stage for future studies that increase the value of the technique in addressing the diagnostic challenges associated with cognitive impairment. Further investigation of the drug in the postmarketing context is consistent with the commitment of the FDA to the development of imaging products that aid in the diagnostic evaluation of cognitively impaired patients.

Florbetapir F18 Scan Usage: Information Summary.

A negative florbetapir scan:

• indicates sparse to no neuritic plaques.

• is inconsistent with a neuropathological diagnosis of Alzheimer’s disease at the time of image acquisition.

• reduces the likelihood that a patient’s cognitive impairment is due to Alzheimer’s disease.

A positive florbetapir scan:

• indicates moderate to frequent amyloid neuritic plaques.

• may be observed in older people with normal cognition and in patients with various neurologic conditions, including Alzheimer’s disease.

Important florbetapir scan limitations:

• A positive scan does not establish a diagnosis of Alzheimer’s disease or other cognitive disorder.

• The scan has not been shown to be useful in predicting the development of dementia or any other neurologic condition, nor has usefulness been shown for monitoring responses to therapies.

References

  1. Promoting brain health. Atlanta: Centers for Disease Control and Prevention, 2011 (http://www.cdc.gov/aging/pdf/cognitive_impairment/cogImp_genAud_final.pdf).

  2. Highlights of prescribing information: Amyvid (florbetapir F18 injection). Silver Spring, MD: Food and Drug Administration (http://www.accessdata.fda.gov/drugsatfda_docs/label/2012/202008s000lbl.pdf).

  3. Okie S. Confronting Alzheimer’s disease. N Engl J Med 2011;365:1069-1072.

  4. Lister-James J, Pontecorvo MJ, Clark C, et al. Florbetapir F-18: a histopathologically validated beta-amyloid positron emission tomography imaging agent. Semin Nucl Med 2011;41:300-304.

  5. Guidance for industry: developing medical imaging drug and biological products. Part 2: clinical indications. Washington, DC: Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Center for Biologics Evaluation and Research, 2004 (http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM071603.pdf).

 

Source: NEJM

 

 

 

 

Vomiting Shrimp and Other Deep-Sea Creatures Light Up the Ocean Floor.


More than a kilometer below the ocean’s surface, where the sunless water is inky black, scientists have documented one of nature’s most spectacular living light shows. An underwater survey has found that roughly 20% of bottom-dwelling organisms in the Bahamas produce light. Moreover, all of the organisms surveyed by the researchers proved to have visual senses tuned to the wavelengths of light generated by this bioluminescence. The work speaks to the important role self-generated light plays in deep-sea communities, marine biologists say.

Bioluminescence has evolved many times in marine species and may help organisms find mates and food or avoid predators. In the middle depths of the ocean—the mesopelagic zone that is located 200 to 1000 meters below the surface—the vast majority of organisms can bioluminesce. Much less was known about bioluminescence in organisms living close to the sea floor. Such benthic organisms are harder to visit or sample and therefore study, says Sönke Johnsen, a marine biologist at Duke University in Durham, North Carolina.

With Tamara Frank, a marine biologist at Nova Southeastern University in Florida, and colleagues, Johnsen recently explored four sites in the northern Bahamas in a submersible. The researchers collected the benthic organisms by suctioning them gently into a lightproof box with a vacuum hose. Once back in their shipboard labs, they stimulated bioluminescence in the captured organisms by softly prodding the animals. Those that glowed were tested further to determine the exact wavelength of light emitted.

As the survey team reported online on 5 September in The Journal of Experimental Biology, about 20% of the species they gathered were capable of producing bioluminescence when touched, including several species of coral, sea anemones, and an unusual species of shrimp that vomited bioluminescent chemicals into the water surrounding it. Most of the organisms glowed blue, except for a family of corals known as pennatulaceans, which produced green light.

Although fewer benthic species produce light than do species in the middle depths of the ocean, where approximately three-quarters of organisms glow when touched, the sea floor itself appeared much brighter than the waters above. “It was like glowing rain,” Johnsen says. “We saw big flashes, then little ones, then streaks of larger gelatinous animals being squished against the windshield.”

This paradox—that fewer benthic organisms bioluminesce, but they do so more frequently—may be explained by how the phenomenon is triggered. A sea-dwelling species doesn’t produce light constantly; it typically does so only when touched by another object. Mesopelagic organisms float freely in the ocean and infrequently encounter other plants and animals. Benthic species, on the other hand, are constantly bumping up against corals or being jostled by microscopic plankton.

In a second study, Frank and Johnsen determined the wavelengths of light to which the captured organisms are most sensitive by placing a tiny electrode on each animal’s cornea or light-capturing organ. When they recorded a tiny jolt of electricity, it meant that the animal had detected the light. Most of the benthic organisms were most sensitive to blue-green light between 470 and 497 nanometers, Frank and her colleagues reported in a second article in the same issue of the journal.

Frank also discovered that two species of crab (Eumunida picta and Gastroptychus spinifer) were sensitive to UV light as well, a surprising find because no UV light reaches that deep in the ocean. Johnsen thinks this additional sensitivity may help the crabs avoid toxic corals, which produce a greenish glow, and home in on the edible organisms that emit blue light. Frank and Johnsen say they will conduct behavioral experiments to see whether their hypothesis about the color-coded benthic buffet is correct.

“It’s a splendid piece of work,” says Peter Herring, a retired bioluminescence expert at the National Oceanography Centre, Southampton, in the United Kingdom. “It’s a good collection of data from benthic animals at this location, and it uses the best technology as far as imaging is concerned.”

Source: Science Now

 

Smoking is injurious. If you don’t smoke, GREAT!!!


Smoking is injurious. If you don’t smoke, GREAT!!! If you do, BECOME AWARE of the risks, more for your loved ones if not for yourself. Take care!