Pregnancy and Breast Cancer.


Chemotherapy was safe for the fetus when administered in the second and third trimesters.

One of the most clinically challenging and personally heart-wrenching situations occurs when a pregnant woman receives a new diagnosis of cancer. Though uncommon, breast cancer in pregnancy accounts for up to 3% of all new breast cancer diagnoses. Optimal management can be complicated by limitations on the types of imaging, systemic therapy, and radiation therapy that can be administered without teratogenic effects on the fetus. Consequently, some patients consider terminating the pregnancy. However, with a team of experienced breast cancer specialists (medical oncologist, surgeon, and radiation oncologist), most pregnant patients with early-stage disease can be managed without compromising either the pregnancy or the prognosis of the breast cancer. Even so, relatively few data are available to reassure clinicians and patients.

To provide such data, investigators conducted a multicenter registry study in which 311 pregnant breast cancer patients (median age, 33), mostly from Belgium and Germany, were compared with 865 nonpregnant breast cancer patients (median age, 41). Either neoadjuvant or adjuvant chemotherapy was administered to 200 patients during the second or third trimester, with actual body weight used to calculate body surface area (BSA) and, in turn, chemotherapy doses. The analysis was adjusted for age, disease stage, tumor grade, hormone receptor status, human epidermal growth factor receptor 2 (HER2) status, and type of systemic therapy received.
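The dosing detail is worth unpacking: BSA-normalized regimens convert a per-square-metre dose into an absolute dose, and in this study BSA was calculated from actual (rather than prepregnancy or idealized) body weight. Below is a minimal sketch of that arithmetic, assuming the commonly used Mosteller formula and hypothetical drug and patient values, neither of which is specified in the summary above.

```python
import math

def mosteller_bsa(height_cm: float, weight_kg: float) -> float:
    """Body surface area in m^2 by the Mosteller formula (an assumed choice)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def bsa_dose_mg(dose_mg_per_m2: float, height_cm: float, weight_kg: float) -> float:
    """Absolute dose for a BSA-normalized regimen, using actual body weight."""
    return dose_mg_per_m2 * mosteller_bsa(height_cm, weight_kg)

# Hypothetical example: a 60 mg/m^2 dose for a 165 cm, 70 kg patient
print(round(bsa_dose_mg(60, 165, 70)))  # -> 107 mg
```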

At a median follow-up of 61 months, the estimated 5-year disease-free survival rate was similar in pregnant versus nonpregnant patients (65% and 71%, respectively; hazard ratio, 1.34; P=0.14), as was the 5-year overall survival rate (78% and 81%; HR, 1.19; P=0.51). Delivery occurred earlier than normal, but no fetal malformations were reported, reaffirming that chemotherapy is safe if administered in the second and third trimesters.

COMMENT

This report adds to the body of evidence, largely retrospective in nature, that breast cancer can be managed successfully during pregnancy without compromising the outcome of either the pregnancy or the cancer. These data reassure patients and clinicians that systemic therapy administered during the second and third trimesters of pregnancy does not seem to affect the health of the fetus. Pregnant women diagnosed with breast cancer should be referred to centers with a multidisciplinary team experienced in the management of such patients.

Source: NEJM

 

Mapping the Journey to an HIV Vaccine

“Universal” vaccines that elicit cross-reactive and broadly neutralizing antibodies (bNAbs) are the ultimate goal of efforts to provide protective immunity against both the influenza virus and the human immunodeficiency virus (HIV). Infection with either virus leads to the induction of abundant strain-specific antibodies that are easily evaded by subsequent viral variants. However, the circulating diversity of HIV is greater than that of influenza by orders of magnitude, posing a tremendous challenge to the achievement of vaccine-mediated protection.

New hope for a universal sterilizing HIV vaccine arose several years ago with the evidence that bNAbs emerge in 10 to 30% of infected persons.1 Because these bNAb responses typically appear only after 2 to 3 years of infection, they fail to control established infection: the kinetics of the evolving B-cell response lag behind the rapidly diversifying virus and cannot “catch up.” However, these bNAbs have provided protection from infection at remarkably low doses in animals, suggesting that vaccine-induced bNAbs could provide sterilizing immunity if they were present before infection. Translating our current knowledge of bNAbs into a vaccine remains a daunting challenge, since the mechanism by which such antibodies are induced remains enigmatic.

As compared with other antibodies, bNAbs have unusual characteristics, including odd physical structures (e.g., elongated antigen-binding loops) and remarkably high levels of mutation that affect both antigen-binding and structural domains. These changes accumulate over years of infection as exposure to diverse viral variants drives antibody evolution, resulting in a set of antibodies that bear little similarity to their original antigen-naive B-cell ancestors (i.e., germline sequences).

Liao et al.3 have recently described the path along which bNAbs develop. They tracked the evolution of a single bNAb and the counter-evolution of the virus in one infected person, starting in the first weeks of infection. Their findings offer a roadmap for the induction of bNAbs through vaccination. Two key events distinguished the interaction of B cell and virus during the developing natural history of this bNAb. First, whereas in most scenarios the naive B-cell population cannot bind to HIV, the naive B-cell repertoire in this infected person bound to the earliest incoming virus (the transmitted virus), which suggests that rapid diversification of the B-cell response was initiated very soon after infection. Second, the rapid accumulation of antibody mutations that is required for potent neutralization occurred simultaneously with the rapid diversification of the virus in the first few months of infection. This finding suggests that the timing of exposure to diverse viral variants may be crucial to the induction of protective antibody immunity.

Although the early evolution of the antibody response predominantly occurred within the antigen-recognition site, Liao et al. found that later evolutionary changes in the antibody occurred in structural regions, which are thought to have a limited role in antigen recognition. However, in a recent publication by Klein et al.,4 the authors report that mutations affecting these structural regions can potentiate antibody function. The authors found that among a set of diverse bNAbs, mutations affecting the structural regions are not just incidental to extensive mutation but are actually critical to neutralization, providing breadth and potency through multiple mechanisms — by expanding the antigen-recognition footprint, by subtly altering binding-loop positioning, and perhaps by changing the conformational dynamics of antibody–antigen binding.

Together, these studies highlight key features of the immune system’s natural induction of bNAbs. First, effective initiation of the antibody response depends on the early interactions between the virus and the naive B-cell repertoire. Second, an explosion of viral diversity can drive the molecular evolution of a bNAb. Finally, neutralization potency arises in an unanticipated way — by means of mutations affecting structural regions of the antibody.

Although we encode a finite number of B-cell–receptor sequences within our naive antibody repertoire, these sequences can become hugely diversified after initial selection and driven in specific directions by subsequent antigen exposure, in a process called affinity maturation, permitting nearly infinite exploration of antibody-recognition space. This idea raises the following questions: How can this finite set of naive sequences be effectively recruited initially, and how can the evolution of the antibody response be constrained to recognize HIV in a way that leads to neutralization? These studies suggest that the rational design of an effective HIV vaccine will require directed-antigen evolution to generate HIV-envelope immunogens that will robustly bind and trigger the germline-naive B-cell repertoire5.

Despite the 200 years that have elapsed since Edward Jenner’s smallpox vaccine, the development of vaccines has remained, for the most part, an empirical process. The studies by Liao et al. and Klein et al. outline the evolution of bNAb activity and may therefore enable the design of a universally protective vaccine against HIV and possibly other viruses.

Source: NEJM


The Past 200 Years in Diabetes.


Diabetes was first recognized around 1500 B.C.E. by the ancient Egyptians, who considered it a rare condition in which a person urinated excessively and lost weight. The term diabetes mellitus, reflecting the fact that the urine of those affected had a sweet taste, was first used by the Greek physician Aretaeus, who lived from about 80 to 138 C.E. It was not until 1776, however, that Matthew Dobson actually measured the concentration of glucose in the urine of such patients and found it to be increased.1

Diabetes was a recognized clinical entity when the New England Journal of Medicine and Surgery was founded in 1812. Its prevalence at the time was not documented, and essentially nothing was known about the mechanisms responsible for the disease. No effective treatment was available, and diabetes was uniformly fatal within weeks to months after its diagnosis owing to insulin deficiency. In the intervening 200 years, major fundamental advances have been made in our understanding of the underlying causes of diabetes and the approach to its prevention and treatment (see timeline, available with the full text of this article at NEJM.org). Although diabetes is still associated with a reduced life expectancy, the outlook for patients with this disease has improved dramatically, and patients usually lead active and productive lives for many decades after the diagnosis has been made. Many effective therapies are available for treating hyperglycemia and its complications. The study of diabetes and related aspects of glucose metabolism has been such fertile ground for scientific inquiry that 10 scientists have received the Nobel Prize for diabetes-related investigations since 1923. Thus, as a result of the efforts of the past 200 years, there is much good news to report regarding diabetes.

Ironically, although scientific advances have led to effective strategies for preventing diabetes, the pathway to cure has remained elusive. In fact, if one views diabetes from a public health and overall societal standpoint, little progress has been made toward conquering the disease during the past 200 years, and we are arguably worse off now than we were in 1812. Two centuries ago, severe insulin deficiency dominated the clinical presentation of diabetes. Although it is possible that some people had milder forms of hyperglycemia at that time, they largely escaped clinical detection. In 2012, the commonly encountered spectrum of diabetes is quite different. Although severe insulin deficiency still occurs, it now accounts for only about 10% of cases overall and can be readily treated with insulin. The vast majority of patients with diabetes are overweight and have a combination of insulin resistance and impaired insulin secretion. The prevalence of this form of diabetes has been increasing dramatically, particularly in the past three to four decades, resulting in a worldwide epidemic that has made diabetes one of the most common and most serious medical conditions humankind has had to face.

THE SCIENTIFIC BASIS OF CURRENT TREATMENT APPROACHES

Studies of Glucose Metabolism

In the past 200 years, we have made dramatic advances in our understanding of the regulation of normal glucose metabolism. Beginning in the mid-19th century, Claude Bernard showed that blood glucose levels are regulated not just by the absorption of dietary carbohydrate but also by the liver, which plays a central role in producing glucose from nonglucose precursors.2 Other investigators built on this discovery to identify the enzymes responsible for the synthesis and breakdown of glycogen,3 the role of anterior pituitary hormones in glucose metabolism and the onset of diabetes,4 the role of reversible protein phosphorylation by a protein kinase,5 and the discovery of cyclic AMP and its role in hormonal action, particularly that of epinephrine and glucagon, both of which elevate the blood glucose concentration and contribute to diabetic hyperglycemia.6

The Role of the Pancreas and the Discovery of Insulin

In 1889, Joseph von Mering and Oskar Minkowski found that removing the pancreas from dogs resulted in fatal diabetes, providing the first clue that the pancreas plays a key role in regulating glucose concentrations.7,8 In 1910, Edward Albert Sharpey-Schafer hypothesized that diabetes was due to the deficiency of a single chemical produced by the pancreas; he called this chemical insulin, from the Latin word insula, meaning island, in reference to the pancreatic islets of Langerhans. In 1921, Frederick Banting and Charles Best actually discovered insulin when they reversed diabetes that had been induced in dogs with an extract from the pancreatic islet cells of healthy dogs. Together with James Collip and John Macleod, they purified the hormone insulin from bovine pancreases and were the first to use it to treat a patient with diabetes. The production of insulin and its therapeutic use quickly spread around the world. This series of events may be the most dramatic example of the rapid translation of a discovery in basic science into a benefit for patients. Once insulin injections became available, young people with insulin deficiency who had previously faced almost certain, painful death within weeks to months were able to survive for prolonged periods of time. Figure 1 (Effects of Insulin Therapy) shows a patient before and after she was treated successfully with insulin in 1922.11

Insulin Chemistry, Biology, and Physiology

The dramatic discovery of insulin and the rapid demonstration that it is essential for human health stimulated intense interest in its chemistry and biology. A number of landmark discoveries resulted, some of which reached beyond diabetes research. For example, Frederick Sanger was awarded the Nobel Prize in Chemistry for developing methods to sequence the amino acids of proteins, and he used insulin as an example of his approaches.12 Insulin was the first hormone for which the three-dimensional crystal structure was determined (by Dorothy Hodgkin, who had previously received the Nobel Prize in Chemistry for determining the structure of vitamin B12). Donald Steiner’s demonstration in 1967 that the two-polypeptide insulin molecule is derived from a single-chain precursor proinsulin13 was important not only for our understanding of the biochemistry of insulin but also because it applies to other peptide hormones that are transcribed as single-chain precursors. Insulin was the first hormone to be cloned14 and then produced for therapeutic use by means of recombinant DNA technology, which provided an unlimited supply of this important molecule and laid the foundation for the biotechnology industry. The accompanying figure (Structure of Human Proinsulin) shows the structure of insulin.

The development of the radioimmunoassay for insulin by Rosalyn Yalow and Solomon Berson in 1959 permitted the quantitative measurement of pancreatic beta-cell function in animals and humans and established the radioimmunoassay as a powerful tool for measuring proteins, metabolites, and other chemicals present in very low concentrations.15 Much of our current understanding of diabetes has resulted from the ability to measure serum insulin levels.

PATHOGENESIS OF DIABETES

Insulin Resistance and Insulin Deficiency

Over the past two centuries, we have learned that diabetes is a complex, heterogeneous disorder. Type 1 diabetes occurs predominantly in young people and is due to selective autoimmune destruction of the pancreatic beta cell, leading to insulin deficiency. Type 2 diabetes is much more common, and the vast majority of people with this disorder are overweight. The increase in body weight in the general population, a result of high-fat, high-calorie diets and a sedentary lifestyle, is the most important factor associated with the increased prevalence of type 2 diabetes. Older adults are most likely to have type 2 diabetes, although the age at onset has been falling in recent years, and type 2 diabetes is now common among teenagers and young adults.

Harold Himsworth first proposed in 1936 that many patients with diabetes have insulin resistance rather than insulin deficiency.16 We now know that insulin resistance is essential in the pathogenesis of type 2 diabetes and that the disease results from both insulin resistance and impaired beta-cell function.17 A clinical phenotype widely called the metabolic syndrome, which includes insulin resistance, upper-body obesity, hypertension, hypertriglyceridemia, and low levels of high-density lipoprotein cholesterol,18 identifies persons at high risk for glucose intolerance and diabetes. Such persons are also at high risk for cardiovascular disease and should be targeted for preventive strategies.

Genetic Factors

Genetic factors play an important role in the development of diabetes. Type 1 and type 2 diabetes are polygenic disorders, and multiple genes and environmental factors contribute to the development of the disease. A few forms of diabetes (e.g., maturity-onset diabetes of the young and neonatal diabetes) are single-gene disorders that affect the pancreatic beta cell19,20 but account for only 1 to 2% of cases. In type 1 diabetes, alleles at the human leukocyte antigen locus on the short arm of chromosome 6 appear to explain up to 50% of the cases of familial clustering.21,22 In contrast, a predominant genetic susceptibility locus for type 2 diabetes has not been found. Genetic studies have identified over 40 genetic variants that increase the risk of type 2 diabetes, but in the aggregate these variants account for only about 10% of the heritability of the disorder.23,24 Individually, persons with these variants have an increased risk of diabetes of 10 to 15%, as compared with persons without the variants. The multiplicity of genes that contribute to the risk of type 2 diabetes makes it difficult to determine this risk precisely or to develop selective preventive or therapeutic strategies based on the genetic profile.

PREVENTION AND TREATMENT OF DIABETES

The approach to the prevention and treatment of diabetes has been transformed since the discovery of insulin, which led to the rapid development of a widely available and lifesaving new treatment and initiated a series of advances that have fundamentally enhanced the daily lives of patients with diabetes and dramatically extended their life expectancy. Many advances have resulted from important clinical trials that were reported in the Journal and elsewhere.25-49 Some highlights of these studies include the use of biosynthetic human insulin, which has virtually eliminated local reactions at the injection site; insulin syringes and needles that are small and convenient to use and have reduced the pain of injections; home glucose monitoring,25 which together with measurements of glycated hemoglobin,26 allows therapy to be altered on the basis of accurate assessments of glucose control; and insulin pumps27 driven by computer algorithms28 that adjust insulin doses on the basis of the continuous measurement of glucose levels to achieve glucose concentrations within the physiologic range. Preventive strategies and treatments for diabetic complications have undergone impressive improvements. The beneficial effects of angiotensin-receptor blockade, angiotensin-converting–enzyme inhibition, and protein restriction in preventing diabetic nephropathy have been shown.29-34 Advances in kidney transplantation have extended the lives of patients with advanced diabetic kidney disease, and laser photocoagulation has preserved the vision of millions of patients with diabetic retinopathy.35 Advances in islet-cell and pancreas transplantation have also been impressive.36,37 Recent evidence exemplified by the results of two randomized, controlled clinical trials reported this past spring in the Journal suggests that bariatric surgery to induce weight loss in patients with type 2 diabetes is much more effective than either standard or intensive medical therapy alone in lowering glucose levels and even in achieving disease remission.38,39 Advances in technology have thus profoundly improved our ability to monitor diabetic control (from urine testing to home glucose meters to continuous glucose monitoring) and to treat this disease and its complications (laser therapy for diabetic retinopathy, kidney transplantation for diabetic renal disease, and bariatric surgery to induce disease remission).

Diabetes care has been at the forefront of efforts to develop team-based approaches to patient care that involve physicians, nurses, nutritionists, social workers, podiatrists, and others and to develop models of care delivery for chronic illness. Using such an approach, the Diabetes Prevention Program showed that physical activity and weight loss can reduce the risk of diabetes in predisposed persons by 58%.40 Major effects are also seen after treatment with metformin40 or pioglitazone.41 The Diabetes Control and Complications Trial showed that improved glucose control reduces microvascular complications in type 1 diabetes,42 and the United Kingdom Prospective Diabetes Study showed the same for type 2 diabetes.43 Intensive insulin therapy to prevent hyperglycemia improves outcomes in critically ill patients.44,45

The effect of diabetes treatment on cardiovascular outcomes and mortality is a critical issue. The Steno-2 Study showed that a multifactorial intervention aimed at improving control of glucose levels, lipid levels, and blood pressure led to a 50% reduction in cardiovascular mortality among patients with type 2 diabetes.46,47 Among patients with type 1 diabetes, improved glucose control leads to a reduction in macrovascular disease, an effect that becomes apparent only many years after the improvement has been achieved.48 The recent Action to Control Cardiovascular Risk in Diabetes (ACCORD) trial showed that aggressive glycemic control of type 2 diabetes reduced the risk of nonfatal myocardial infarction but increased overall mortality.49 The reasons for these differences between studies are not clear, but in type 2 diabetes, multiple factors increase the predisposition to cardiovascular disease. Indeed, treatment of hyperlipidemia and hypertension appears to be more effective in reducing cardiovascular events than does treatment to lower glucose levels. As a result of these and other findings, the treatments available for patients with diabetes have improved dramatically, particularly over the past 30 to 40 years.

PREVALENCE OF DIABETES — A WORLDWIDE EPIDEMIC

Unfortunately, the improvement in outcomes for individual patients with diabetes has not resulted in similar improvements from the public health perspective. The worldwide prevalence of diabetes has continued to increase dramatically. The difficulty in applying the principles of diabetes care from the individual patient to the population reflects the unique challenges of implementing research findings and effecting behavioral change. Figure 4 (Number of Persons and Percentages of the Population with Diagnosed Diabetes in the United States, 1980–2010) shows the number and percentage of persons in the U.S. population with diagnosed diabetes between 1980 and 2010. During this period, the number of diagnosed cases of diabetes increased from 5.6 million to 20.9 million, representing 2.5% and 6.9% of the population, respectively. Nearly 27% of persons over 65 years of age have diabetes. If current trends continue, 1 in 3 U.S. adults could have diabetes by 2050. The American Diabetes Association estimated that the cost of diagnosed diabetes in the United States was $174 billion in 2007,50 and efforts to prevent and treat diabetes threaten to overwhelm health systems throughout the world.
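As a rough consistency check, the case counts and percentages quoted above imply a total U.S. population of roughly 224 million in 1980 and 303 million in 2010, in line with census figures for those years. A minimal sketch of that arithmetic (the derived totals are ours, not from the article):

```python
# Diagnosed cases (millions) and share of the population, as quoted above
figures = {1980: (5.6, 0.025), 2010: (20.9, 0.069)}

for year, (diagnosed_millions, share) in figures.items():
    implied_population = diagnosed_millions / share
    print(year, f"~{implied_population:.0f} million people")  # ~224 and ~303 million
```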

FUTURE CHALLENGES

Given the surge in the prevalence of diabetes, timely prevention of this disease at the population level is essential. Opportunities abound for the implementation of preventive public policies. Rigorous scientific methods will be needed to evaluate the effects of policy and legislative initiatives to eliminate trans fat from the diet; require restaurants to provide the caloric content of items on their menus; reduce the availability of high-calorie, high-fat foods in school cafeterias; and impose a tax on sugar-sweetened beverages. Lifestyle modification will undoubtedly play a key role in the ultimate solution to the problem of diabetes, but the necessary modifications have not been easy to implement, and more definitive solutions will depend on the ability of basic science to point prevention and treatment in new directions. Advances in basic immunology offer promise for the prevention and treatment of autoimmunity in patients with type 1 diabetes, and advances in transforming primitive stem cells into pancreatic beta cells offer promise for replacing the insulin-producing cells that are destroyed. Advances in the identification of diabetes-susceptibility genes should clarify the relative role of insulin resistance and beta-cell dysfunction and identify molecular pathways and new drug targets, leading to more effective approaches to the prevention and treatment of type 2 diabetes. Although the challenges are still substantial, if we build on past accomplishments, there is every reason for optimism that another breakthrough as dramatic as the discovery of insulin will occur in the foreseeable future, with a similarly dramatic impact.

Source: NEJM

Rapid Blood-Pressure Lowering in Patients with Acute Intracerebral Hemorrhage.


BACKGROUND

Whether rapid lowering of elevated blood pressure would improve the outcome in patients with intracerebral hemorrhage is not known.

METHODS

We randomly assigned 2839 patients who had had a spontaneous intracerebral hemorrhage within the previous 6 hours and who had elevated systolic blood pressure to receive intensive treatment to lower their blood pressure (with a target systolic level of <140 mm Hg within 1 hour) or guideline-recommended treatment (with a target systolic level of <180 mm Hg) with the use of agents of the physician’s choosing. The primary outcome was death or major disability, which was defined as a score of 3 to 6 on the modified Rankin scale (in which a score of 0 indicates no symptoms, a score of 5 indicates severe disability, and a score of 6 indicates death) at 90 days. A prespecified ordinal analysis of the modified Rankin score was also performed. The rate of serious adverse events was compared between the two groups.

RESULTS

Among the 2794 participants for whom the primary outcome could be determined, 719 of 1382 participants (52.0%) receiving intensive treatment, as compared with 785 of 1412 (55.6%) receiving guideline-recommended treatment, had a primary outcome event (odds ratio with intensive treatment, 0.87; 95% confidence interval [CI], 0.75 to 1.01; P=0.06). The ordinal analysis showed significantly lower modified Rankin scores with intensive treatment (odds ratio for greater disability, 0.87; 95% CI, 0.77 to 1.00; P=0.04). Mortality was 11.9% in the group receiving intensive treatment and 12.0% in the group receiving guideline-recommended treatment. Nonfatal serious adverse events occurred in 23.3% and 23.6% of the patients in the two groups, respectively.
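The reported odds ratio and confidence interval can be reproduced, to rounding, from the raw counts in the abstract. A minimal sketch using an unadjusted Wald calculation (the trial's own estimate comes from its prespecified analysis; this is only an illustrative cross-check):

```python
from math import exp, log, sqrt

def odds_ratio_wald_ci(events1, n1, events2, n2, z=1.96):
    """Unadjusted odds ratio (group 1 vs group 2) with a Wald 95% CI."""
    a, c = events1, n1 - events1          # events / non-events, intensive group
    b, d = events2, n2 - events2          # events / non-events, guideline group
    or_ = (a / c) / (b / d)
    se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = exp(log(or_) - z * se_log_or), exp(log(or_) + z * se_log_or)
    return round(or_, 2), round(lo, 2), round(hi, 2)

print(odds_ratio_wald_ci(719, 1382, 785, 1412))  # -> (0.87, 0.75, 1.01)
```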

CONCLUSIONS

In patients with intracerebral hemorrhage, intensive lowering of blood pressure did not result in a significant reduction in the rate of the primary outcome of death or severe disability. An ordinal analysis of modified Rankin scores indicated improved functional outcomes with intensive lowering of blood pressure.

Source: NEJM

 

 

Clopidogrel with Aspirin in Acute Minor Stroke or Transient Ischemic Attack.


BACKGROUND

Stroke is common during the first few weeks after a transient ischemic attack (TIA) or minor ischemic stroke. Combination therapy with clopidogrel and aspirin may provide greater protection against subsequent stroke than aspirin alone.

METHODS

In a randomized, double-blind, placebo-controlled trial conducted at 114 centers in China, we randomly assigned 5170 patients within 24 hours after the onset of minor ischemic stroke or high-risk TIA to combination therapy with clopidogrel and aspirin (clopidogrel at an initial dose of 300 mg, followed by 75 mg per day for 90 days, plus aspirin at a dose of 75 mg per day for the first 21 days) or to placebo plus aspirin (75 mg per day for 90 days). All participants received open-label aspirin at a clinician-determined dose of 75 to 300 mg on day 1. The primary outcome was stroke (ischemic or hemorrhagic) during 90 days of follow-up in an intention-to-treat analysis. Treatment differences were assessed with the use of a Cox proportional-hazards model, with study center as a random effect.

RESULTS

Stroke occurred in 8.2% of patients in the clopidogrel–aspirin group, as compared with 11.7% of those in the aspirin group (hazard ratio, 0.68; 95% confidence interval, 0.57 to 0.81; P<0.001). Moderate or severe hemorrhage occurred in seven patients (0.3%) in the clopidogrel–aspirin group and in eight (0.3%) in the aspirin group (P=0.73); the rate of hemorrhagic stroke was 0.3% in each group.
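Another way to read the 8.2% versus 11.7% event rates is as an absolute risk reduction and a number needed to treat. These derived figures are ours, not quoted in the abstract, and they ignore censoring, which the trial's Cox model accounts for. A minimal sketch:

```python
def arr_and_nnt(risk_control: float, risk_treated: float):
    """Absolute risk reduction and number needed to treat from two event rates."""
    arr = risk_control - risk_treated
    return arr, 1.0 / arr

arr, nnt = arr_and_nnt(0.117, 0.082)  # aspirin alone vs clopidogrel-aspirin, 90-day stroke
print(f"ARR = {arr:.1%}, NNT ~ {round(nnt)}")  # ARR = 3.5%, NNT ~ 29
```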

CONCLUSIONS

Among patients with TIA or minor stroke who can be treated within 24 hours after the onset of symptoms, the combination of clopidogrel and aspirin is superior to aspirin alone for reducing the risk of stroke in the first 90 days and does not increase the risk of hemorrhage.

Source: NEJM

Risks (and Benefits) in Comparative Effectiveness Research Trials.


Comparative effectiveness research (CER) aims to provide high-quality evidence to help patients and clinicians make informed clinical decisions and to assist health systems in improving the quality and cost-effectiveness of clinical care.1 Recently, the Department of Health and Human Services indicated that the regulatory framework for protecting human subjects is inadequate to evaluate the multifaceted risks of CER randomized, controlled trials (RCTs).2 As the federal Common Rule states, risks to subjects must be “reasonable in relation to anticipated benefit.” Institutional review boards (IRBs) are directed to “consider only those risks and benefits that may result from the research (as distinguished from risks and benefits of therapies subjects would receive even if not participating in the research).” Furthermore, unless the requirement for informed consent is waived by the IRB, subjects must be informed of “any reasonably foreseeable risks or discomforts” associated with participation. The enmeshment of research and standard clinical care makes evaluation of the risks posed by a CER RCT complex. In order to provide ethically appropriate oversight and informed consent, investigators should consider, manage, and communicate with potential participants about at least nine different types of potential risk — some unique to CER RCTs, some common to all RCTs.

1. Risks associated with the standard of care. All patients, when receiving the standard of care, are at risk for both the ills of the underlying disease processes and iatrogenic harm. Patients should be informed about undesired events or outcomes that are likely to occur with some frequency or that would be severe. Patients who are not participating in research studies may not be as thoroughly informed about the absolute risks associated with the proposed treatment or the relative risks of alternative treatments. A collateral benefit of trial participation is access to better information.

2. Risks (and benefits) of intervention A as compared with intervention B. CER studies are warranted when, within the range of the standard of care, more than one intervention is in common use for the same diagnostic, therapeutic, or other core clinical purpose, when there is debate among clinicians about which intervention is superior, and when evidence from a clinical trial could resolve the dispute and improve outcomes. In such situations, the relative risks associated with interventions A and B may be unknown, or one intervention may be known to be more risky or more costly but may have the potential to offer compensatory benefits. CER measures the difference in the marginal risks and benefits of A and B relative to each other.

3. Risks due to randomization. In CER RCTs, randomization dictates which intervention a participant will receive — nothing more, nothing less. If common treatments A and B were identical, the risk difference would be zero. In RCTs of two different interventions, when the differential risks of A as compared with B are unknown — hovering in a state of so-called clinical equipoise — the risks posed by assigning patients to one of the two interventions either by randomization or by uncertain clinician preference are not marginally different.

4. Risks due to experimental assignment versus practice variation. Patients receive care at particular sites, where the staff often prefers and offers one treatment over another. Thus, from the patient’s perspective, trial enrollment entails a describable alteration in the likelihood of receiving intervention A or B. If the patient is at a site that has mostly used intervention A, then entering the study entails an increased likelihood of exposure to B, and vice versa. Participants at a specific site should be given this site-specific information so they can evaluate the ways in which their treatment in the study might differ from what they would have received outside the study.

5. Risks due to masking of “standard” interventions. Keeping participants and investigators blind to treatment assignment can reduce outcome-measurement bias but can also introduce risk. In attempting to mask standard interventions, researchers must consider how blinding may affect the overall care of the participant as a patient. Research participants and health care professionals must be made fully aware that they will not be allowed to either choose or know the treatment assignment.

6. Risks due to protocol fidelity. In standard clinical care, disease- or condition-specific pathways or protocols are often suspended or altered when a patient is not doing well. Deviations from routinized care may or may not benefit the patient but are common practice. CER RCTs need to identify those junctures at which a particular intervention should be altered or stopped according to standard-of-care practices and to evaluate and minimize the risk associated with delays in such alterations of standard medical care. Prospective participants need to understand the investigator’s obligation to ensure fidelity to the protocol, the protocol-specified limits to that obligation, and their own right to withdraw from the study.

7. Risks of being assigned to the study group that receives less benefit. In CER RCTs, neither treatment option is universally accepted as the default or control. This symmetry has implications for informing participants about both risks and benefits. Since each intervention is simultaneously presumed to be effective (albeit to a different yet unknown degree), participants may perceive the risks as low. In the end, however, one treatment may yield greater benefit. The participants assigned to the other group may perceive their lower benefit as an actual harm. Potential participants should be informed of this possibility and should understand that the relative difference between the treatments is not currently known.

8. Risks due to acknowledgment of uncertainty. When providing potential participants with information during the consent process, investigators must clarify the existing uncertainty regarding the interventions. This clarification may cause psychological discomfort in patients who find uncertainty disconcerting. Although concealing uncertainty may avert this discomfort, concealment would constitute a failure to respect the patient. Patients need facts in order to make informed decisions. These psychological risks are therefore unavoidable in the ethical conduct of randomized trials. But it is important to recognize that these risks also exist for patients receiving the standard-of-care interventions if they receive appropriately complete information about treatment alternatives.3

9. Risks associated with being in the trial as compared with not being in it. Overall, if participation in clinical trials posed risks to participants, then participants in clinical trials would be observed to experience more harm or poorer outcomes than those receiving care outside clinical trials. There is no evidence that they do. Instead, participants in clinical trials usually have outcomes equivalent to those among similar patients not enrolled in studies who receive the same treatments.4 Unless specific reasons dictate otherwise, participants in CER RCTs should be informed that their outcomes will most likely be the same, but could be better or worse, if they do not participate.

In sum, CER RCTs carry multiple risks and benefits. Analysis of the overall risks and benefits of such studies — by IRBs or federal oversight committees — must take into account each of these domains separately and then integrate them into an assessment of the risks and benefits of the study as a whole. This approach often requires analysts to make judgments when comparing one sort of risk to another. The communication of information on these various forms of risks and benefits to potential study participants requires a balancing act. Detailed explanation of each separate risk may be overwhelming and confusing. Summaries of the risks may oversimplify or underemphasize particular risks.5 Evaluation of the acceptability of studies and of the adequacy of consent forms must reflect consideration and communication about these potential risks and benefits both separately and as a whole.

Source: NEJM

Vedolizumab as Induction and Maintenance Therapy for Ulcerative Colitis.


BACKGROUND

Gut-selective blockade of lymphocyte trafficking by vedolizumab may constitute effective treatment for ulcerative colitis.

METHODS

We conducted two integrated randomized, double-blind, placebo-controlled trials of vedolizumab in patients with active disease. In the trial of induction therapy, 374 patients (cohort 1) received vedolizumab (at a dose of 300 mg) or placebo intravenously at weeks 0 and 2, and 521 patients (cohort 2) received open-label vedolizumab at weeks 0 and 2, with disease evaluation at week 6. In the trial of maintenance therapy, patients in either cohort who had a response to vedolizumab at week 6 were randomly assigned to continue receiving vedolizumab every 8 or 4 weeks or to switch to placebo for up to 52 weeks. A response was defined as a reduction in the Mayo Clinic score (range, 0 to 12, with higher scores indicating more active disease) of at least 3 points and a decrease of at least 30% from baseline, with an accompanying decrease in the rectal bleeding subscore of at least 1 point or an absolute rectal bleeding subscore of 0 or 1.
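The response definition is effectively a small decision rule, so it can be written out directly. A minimal sketch encoding the criteria exactly as stated above (the function name and example values are hypothetical):

```python
def is_week6_response(baseline_mayo: int, week6_mayo: int,
                      baseline_bleeding: int, week6_bleeding: int) -> bool:
    """Response: Mayo score drop of >=3 points AND >=30% from baseline, plus either
    a rectal-bleeding subscore drop of >=1 point or an absolute subscore of 0 or 1."""
    drop = baseline_mayo - week6_mayo
    mayo_ok = drop >= 3 and drop >= 0.3 * baseline_mayo
    bleeding_ok = (baseline_bleeding - week6_bleeding) >= 1 or week6_bleeding <= 1
    return mayo_ok and bleeding_ok

# Hypothetical patient: Mayo score 9 -> 5, rectal-bleeding subscore 2 -> 1
print(is_week6_response(9, 5, 2, 1))  # -> True
```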

RESULTS

Response rates at week 6 were 47.1% and 25.5% among patients in the vedolizumab group and placebo group, respectively (difference with adjustment for stratification factors, 21.7 percentage points; 95% confidence interval [CI], 11.6 to 31.7; P<0.001). At week 52, 41.8% of patients who continued to receive vedolizumab every 8 weeks and 44.8% of patients who continued to receive vedolizumab every 4 weeks were in clinical remission (Mayo Clinic score ≤2 and no subscore >1), as compared with 15.9% of patients who switched to placebo (adjusted difference, 26.1 percentage points for vedolizumab every 8 weeks vs. placebo [95% CI, 14.9 to 37.2; P<0.001] and 29.1 percentage points for vedolizumab every 4 weeks vs. placebo [95% CI, 17.9 to 40.4; P<0.001]). The frequency of adverse events was similar in the vedolizumab and placebo groups.

CONCLUSIONS

Vedolizumab was more effective than placebo as induction and maintenance therapy for ulcerative colitis.

Source: NEJM

Can ‘powdered rain’ make drought a thing of the past?


The lack of water is a growing, global problem that seems intractable.

The UN estimates that a large majority of the water we use goes on irrigation, so researchers have been working on a range of ideas to make the water used in agriculture last longer.


There has been a great deal of excitement and some dramatic headlines in recent weeks about a product that is said to have the potential to overcome the global challenge of growing crops in arid conditions.

“Solid Rain” is a powder that’s capable of absorbing enormous amounts of water and releasing it slowly over a year so that plants can survive and thrive in the middle of a drought.

As little as 10 grams of the material, a type of absorbent polymer originally pioneered by the US Department of Agriculture (USDA), can absorb a litre of water.

Back in the 1970s, USDA developed a super-absorbent product made from a type of starch nicknamed the “super slurper”.

The most widely used, commercial application of this technology has been in disposable nappies, or diapers as they are quaintly termed in the US.

But a Mexican chemical engineer called Sergio Jesus Rico Velasco saw more in the product than dry bottoms.

He developed and patented a different version of the formula that could be mixed in with soil to hold water that could then slowly feed plants.


Ground water

He formed a company to sell Solid Rain and it has quietly been selling the product in Mexico for around 10 years. The company says that the government there tested Solid Rain and found that crop yields could increase by 300% when it was added to the soil.

According to Edwin Gonzalez, a vice president with the Solid Rain company, the product is now attracting wider interest because of growing concerns about the scarcity of water.

“It works by encapsulating the water, and our product lasts 8 to 10 years in the ground, depending on the water quality – if you use pure water, it lasts longer,” he told BBC News.

The company recommends using about 50kg per hectare – though it’s not cheap, at $1,500 (£960) for that amount.
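Taking the company's own figures at face value (10 grams holding a litre, 50kg applied per hectare, $1,500 for that amount), a back-of-the-envelope check of what that application would hold is shown below; the derived numbers are ours, not the company's.

```python
absorbency_l_per_kg = 1 / 0.010     # 10 g holds 1 litre -> 100 L per kg (company figure)
applied_kg_per_ha   = 50            # recommended application rate (company figure)
price_usd           = 1500          # quoted price for 50 kg

capacity_l_per_ha = applied_kg_per_ha * absorbency_l_per_kg   # litres held per hectare
depth_mm          = capacity_l_per_ha / 10_000                # 1 ha = 10,000 m^2; 1 L/m^2 = 1 mm
cost_per_litre    = price_usd / capacity_l_per_ha

print(capacity_l_per_ha, depth_mm, cost_per_litre)  # ~5000 L, ~0.5 mm equivalent, ~$0.30 per litre
```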

Mr Gonzalez was at pains to point out that Solid Rain was all natural and would not damage the land even if it was used for several years.

“Our product is not toxic; it’s made from a bio-acrylamide. After it disintegrates, the powder-like substance becomes part of the plant – it is not toxic,” he said.

Science uncertain

But not everyone is convinced that Solid Rain is a significant solution to the problem of drought.

Dr Linda Chalker-Scott from Washington State University says that these types of products have been known to gardeners for several years now.

“They’re hardly new, and there’s no scientific evidence to suggest that they hold water for a year, or last for 10 years in the soil,” she told BBC News.

“An additional practical problem is that gels can do as much harm as good. As the gels begin to dry out, they soak up surrounding water more vigorously. That means they will start taking water directly from plant roots,” she added.

Dr Chalker-Scott says that research she carried out in Seattle with newly transplanted trees showed that wood chip mulching was just as effective as adding powdered materials and gels to the soil. And it was significantly cheaper.

However, Edwin Gonzalez says Solid Rain is different.

“There are other competitors that last three or four years. The ones that don’t last as long are the ones that have sodium – they don’t absorb as much. The potassium ones, like ours, are seen as the better products,” he said.

Despite the fact that the science may not be entirely certain about the benefits of products like this, Edwin Gonzalez says his company has been inundated with enquiries from dry spots including India and Australia.

And he’s also had several orders from the UK, where the lack of water is usually not a problem.

Source: BBC

5 Easy Ways to Keep Your Actions Aligned with Your Priorities.


“It is not enough to be busy… The question is: what are we busy about?” (Henry David Thoreau)

A busy life is a modern reality. We rush from activity to activity, chore to chore. But at the end of the day, did all of our hustle and bustle actually align with our core values and life goals?

For many of us, the answer is often “no.” I propose that if our days are to be packed from dawn until dusk, we owe it to ourselves to make sure our activities are serving us.

It all starts with identifying your real goals. For me, my real goals are scratched out on a post-it note that hangs on my bedroom mirror. They are: connecting with my family, maintaining my health, and making time for my writing. Once you know what’s really important to you, you’re ready to use the five strategies below to make sure your daily actions align with your priorities.


1. Make a short list of actions every morning

When you wake up, roll over and jot down two or three simple things you can do today that will serve your higher purpose. This is especially important if you’re working a job that doesn’t particularly match your life goals, but is necessary to get your bills paid right now.

2. Read something relevant to your goal every day

During your morning train ride or time spent waiting in line for coffee, read something to either inspire you or help you gain knowledge to get you closer to your goals. Subscribing to blogs in your areas of interest is an easy way to keep relevant reading material close at hand.

3. Listen to something relevant to your goal while doing banal tasks

The dishes still need to be done and the towels still need to be folded; make use of this busy work by listening to an audiobook or podcast that is meaningful to you.

4. Schedule activities that don’t align with your goals; don’t default to them

It’s just too easy to turn on the television and waste a few hours. If there are activities you enjoy but that get in the way of acting in line with your priorities, schedule them infrequently rather than making them a default whenever you don’t know what else to do. For example, save watching the latest episode of Mad Men for Friday nights.

5. Journal your progress at the end of the day

Before you go to sleep, do a quick recap of your day and identify which of your activities were in line with your goals and which were not. This will keep your choices front and center and hold you accountable for acting in ways that align with your priorities.

Follow these five tips for the next thirty days and you’ll be amazed at how keeping your actions in line with your priorities becomes second nature. Your busy life will begin to feel less frantic and more intentional.

Source: purpose fairy