Cases of Rare, Fatal Cancer Linked to Celiac Disease Increasing


VANCOUVER, B.C.—The incidence of enteropathy-associated T-cell lymphoma—a rare and aggressive T-cell, non-Hodgkin lymphoma—is rising, according to data presented at the 2023 annual meeting of the American College of Gastroenterology.

The growing number of EATL cases could be connected to the increase in celiac disease, given the strong association between the two conditions, researchers said.

“Although [EATL is] rare, most of the cases that we see develop in patients with celiac disease,” said lead investigator Isabel Hujoel, MD, the clinic director of the Celiac Disease Center at UW Medical Center, in Seattle. “We don’t know why the incidence of EATL has steadily risen over the past two decades, but we suspect that it’s because celiac disease continues to increase in prevalence.”

The study, which used the Surveillance, Epidemiology, and End Results (SEER) program database, found 463 cases of EATL between 2000 and 2020, with an age-adjusted incidence rate of 0.014 per 100,000 people (abstract P1249). However, the incidence of EATL increased by 2.58% annually over this 20-year period.


Poor Survival Outcomes

Findings from the study, which won Presidential Poster and Outstanding Research Awards, also showed that EATL was more common in men and that the median age at diagnosis was 65 years. The most common treatment approach, used in 42% of cases, was a combination of surgery and chemotherapy.

Older age at the time of diagnosis was associated with a higher risk for death, while factors such as sex, race, year of diagnosis and time to treatment initiation showed no significant impact on survival.

Notably, despite medical advances over the past two decades, the data showed no change in survival over the study period. “Unfortunately, survival outcomes of patients with EATL have not improved over time,” Dr. Hujoel said. “Mean survival for this disease remains approximately six months.”

A separate, retrospective cohort analysis of 259 patients in the SEER database found that the majority of patients underwent surgical resection (69.9%), followed by chemotherapy (47.5%) (abstract P1594).

Investigator Sophia Dar, MD, a gastroenterology fellow at Southern Illinois University School of Medicine, in Springfield, told Gastroenterology & Endoscopy News that treatment with these methods was associated with improved survival rates compared with no treatment. Chemotherapy alone was also associated with a lower hazard of death compared with no treatment.

“Patients in the chemotherapy group marginally outperformed their surgical counterparts, indicating perhaps chemotherapy as a more viable treatment option in certain scenarios,” Dr. Dar said. “Unfortunately, irrespective of treatment, 83.7% of patients died in the five-year follow-up period.”

Researchers from both studies emphasized the need for further study, especially considering the strong association between EATL and celiac disease.

“Better understanding of the factors contributing to the high mortality rate could help medical practitioners develop more efficient treatment plans for EATL,” Dr. Dar concluded.

Screen Only for Refractory Disease

Debra Silberg, MD, PhD, the chief scientific officer of the nonprofit Beyond Celiac, based in Ambler, Pa., told Gastroenterology & Endoscopy News that the research does not address whether the increase in EATL is accounted for by patients with celiac disease, but “since there has also been an annual increase in celiac disease diagnosis, it follows that there would be an increase in EATL. Therefore, the overall [rate] of EATL in patients with celiac disease—which is around 0.22 to 1.9 per 100,000—may not have changed and is still extremely rare,” she added.

Regarding screening, Dr. Silberg referred to the American Gastroenterological Association guidelines for refractory celiac disease (Gastroenterology 2022;163[5]:1461-1469). If a patient is diagnosed with type 2 refractory celiac disease, which, she said, “is associated with clonal T-cell expansion, then small bowel imaging is recommended to exclude EATL and ulcerative jejunoileitis. Since EATL is still rare even with this increased incidence,” she added, “only patients with refractory celiac disease or a suspicion of a complication of celiac disease should be screened.”

Long term gluten consumption in adults without celiac disease and risk of coronary heart disease: prospective cohort study


Abstract

Objective To examine the association of long term intake of gluten with the development of incident coronary heart disease.

Design Prospective cohort study.

Setting and participants 64 714 women in the Nurses’ Health Study and 45 303 men in the Health Professionals Follow-up Study without a history of coronary heart disease who completed a 131 item semiquantitative food frequency questionnaire in 1986 that was updated every four years through 2010.

Exposure Consumption of gluten, estimated from food frequency questionnaires.

Main outcome measure Development of coronary heart disease (fatal or non-fatal myocardial infarction).

Results During 26 years of follow-up encompassing 2 273 931 person years, 2431 women and 4098 men developed coronary heart disease. Compared with participants in the lowest fifth of gluten intake, who had a coronary heart disease incidence rate of 352 per 100 000 person years, those in the highest fifth had a rate of 277 events per 100 000 person years, leading to an unadjusted rate difference of 75 (95% confidence interval 51 to 98) fewer cases of coronary heart disease per 100 000 person years. After adjustment for known risk factors, participants in the highest fifth of estimated gluten intake had a multivariable hazard ratio for coronary heart disease of 0.95 (95% confidence interval 0.88 to 1.02; P for trend=0.29). After additional adjustment for intake of whole grains (leaving the remaining variance of gluten corresponding to refined grains), the multivariate hazard ratio was 1.00 (0.92 to 1.09; P for trend=0.77). In contrast, after additional adjustment for intake of refined grains (leaving the variance of gluten intake correlating with whole grain intake), estimated gluten consumption was associated with a lower risk of coronary heart disease (multivariate hazard ratio 0.85, 0.77 to 0.93; P for trend=0.002).

Conclusion Long term dietary intake of gluten was not associated with risk of coronary heart disease. However, the avoidance of gluten may result in reduced consumption of beneficial whole grains, which may affect cardiovascular risk. The promotion of gluten-free diets among people without celiac disease should not be encouraged.

Introduction

Gluten, a storage protein in wheat, rye, and barley, triggers inflammation and intestinal damage in people with celiac disease.1 People with intestinal or extra-intestinal symptoms triggered by gluten but who do not meet formal criteria for celiac disease may have non-celiac gluten sensitivity, a clinical entity with an as yet uncharacterized biological basis.2 Celiac disease, which is present in 0.7% of the US population,3 is associated with an increased risk of coronary heart disease, which is reduced after treatment with a gluten-free diet.4

On the basis of evidence that gluten may promote inflammation in the absence of celiac disease or non-celiac gluten sensitivity,5 concern has arisen in the medical community and lay public that gluten may increase the risk of obesity, metabolic syndrome, neuropsychiatric symptoms, and cardiovascular risk among healthy people.678910 As a result, diets that limit gluten intake have gained popularity.1112 In an analysis of the National Health and Nutrition Examination Survey (NHANES), most people adhering to a gluten-free diet did not have a diagnosis of celiac disease.3 Moreover, in a follow-up analysis of NHANES, adoption of a gluten-free diet by people without celiac disease rose more than threefold from 2009-10 (prevalence 0.52%) to 2013-14 (prevalence 1.69%).13

Short of strict gluten avoidance, people may reduce gluten in their diet owing to beliefs that this practice carries general health benefits.14 The reasons for gluten reduction likely relate to the perception that gluten carries adverse health effects. One national survey showed a steep rise in interest in this diet in recent years, and by 2013 nearly 30% of adults in the US reported that they were trying to minimize or avoid gluten.15 Concerns exist that a gluten-free or gluten restricted diet may be nutritionally suboptimal,16 and gluten-free substitute foods cost considerably more than their counterparts that contain gluten.1718 Despite the rising trend in gluten restriction, no long term, prospective studies have assessed the relation of dietary gluten with the risk of chronic conditions such as coronary heart disease in people without celiac disease. Thus, using prospective, validated data on dietary intake collected over 20-30 years, we examined the association of estimated long term intake of gluten with the development of incident coronary heart disease (fatal or non-fatal myocardial infarction).

Methods

Study population

The Nurses’ Health Study (NHS) is a prospective cohort of 121 700 female nurses from 11 US states who were enrolled in 1976. The Health Professionals Follow-up Study (HPFS) is a prospective cohort of 51 529 male health professionals from all 50 states who were enrolled in 1986. Participants in NHS and HPFS have been followed via biennial self administered questionnaires on health and lifestyle habits, anthropometrics, environmental exposures, and medical conditions. In 1986, diet in both cohorts was assessed with a validated 131 item semiquantitative food frequency questionnaire. Among the 73 666 women in NHS and 49 934 men in HPFS who completed a food frequency questionnaire in 1986, we excluded participants if they reported implausible daily energy intake (<600 or >3500 kcal/d for women and <800 or >4200 kcal/d for men) or missing gluten data (NHS 48; HPFS 39); a diagnosis of myocardial infarction, angina, or stroke or coronary artery bypass graft surgery (NHS 4015; HPFS 2647); or cancer (NHS 4689; HPFS 1785). Participants were specifically asked about a history of celiac disease in 2014; we excluded from this analysis anyone who reported a previous diagnosis of celiac disease (NHS 200; HPFS 160). After these exclusions, 64 714 women and 45 303 men were available for analysis. Return of the mailed questionnaire was considered to imply informed consent.

Measurement of exposure and outcome

In both cohorts, diet was assessed in 1986, 1990, 1994, 1998, 2002, 2006, and 2010. For each food item, participants were asked about the frequency with which they consumed a commonly used portion size for each food over the previous year; available responses ranged from never or less than once a month to six or more times a day. We calculated nutrients by using the Harvard T. H. Chan School of Public Health nutrient database, which was updated every two to four years during the period of food frequency questionnaire distribution.19 We used year specific nutrient tables for ingredient level foods. Previous validation studies have shown that the derivation of nutrient values correlates highly with nutrient intake as measured by one week food diaries in women and men.2021

For each of these two cohorts, we derived the quantity of gluten consumed. We calculated the quantity of gluten on the basis of the protein content of wheat, rye, and barley based on recipe ingredient lists from product labels provided by manufacturers or cookbooks in the case of home prepared items. Previous studies have used conversion factors of 75% or 80% when calculating the proportion of protein content that comprises gluten; we used the more conservative estimate of 75%.222324 Although gluten’s proportion of total protein may be more variable for rye and barley than for wheat,25 we used the same conversion factor for all three grains, consistent with previous studies.2223 Although trace amounts of gluten can be present in oats and in condiments (for example, soy sauce), we did not calculate gluten on the basis of these items as the quantity of gluten is much lower than that in cereals and grains and the contribution to total gluten intake would be negligible.26
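The conversion described above amounts to simple arithmetic; a minimal sketch follows, with hypothetical per-serving protein values that stand in for the study's nutrient database figures:

```python
# Illustrative sketch of the gluten estimation described above. The
# per-ingredient protein amounts are hypothetical placeholders, not
# values from the Harvard nutrient database used in the study.
GLUTEN_FRACTION = 0.75  # the more conservative of the two published conversion factors

def estimate_gluten(protein_by_grain):
    """Estimate gluten (g) as 75% of the protein contributed by wheat,
    rye, and barley; other ingredients contribute nothing."""
    gluten_grains = {"wheat", "rye", "barley"}
    return sum(protein * GLUTEN_FRACTION
               for grain, protein in protein_by_grain.items()
               if grain in gluten_grains)

# Hypothetical recipe: protein (g) contributed by each ingredient
recipe = {"wheat": 8.0, "rye": 1.2, "barley": 0.4, "soy": 2.0}
print(round(estimate_gluten(recipe), 2))  # 7.2
```

The same 75% factor is applied to all three grains, mirroring the study's choice, even though gluten's share of protein is more variable in rye and barley.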

In 1986 the five largest contributors to gluten in both cohorts were dark bread, pasta, cold cereal, white bread, and pizza (supplementary table A). Previous validation studies within these cohorts found that the Pearson correlation coefficients between the number of servings of these items reported on food frequency questionnaires and that reported on seven day dietary records ranged from 0.35 (pasta) to 0.79 (cold cereal) for women and from 0.37 (dark bread) to 0.86 (cold cereal) for men.2728 A separate validation study of this food frequency questionnaire found that this method of measuring vegetable (that is, plant based) protein intake, of which gluten is the major contributor, correlated highly with that measured in seven day dietary records (Spearman correlation coefficient 0.66).29

We divided cohort participants into fifths of estimated gluten consumption, according to energy adjusted grams of gluten per day. We obtained energy adjusted values by regression using the residual method, as described previously.30 To quantify long term dietary habits, we used cumulative averages through the questionnaires preceding the diagnosis of coronary heart disease, death, or the end of follow-up.31 For example, we calculated cumulative average estimated gluten intake in 1994 by averaging the daily consumption of gluten reported in 1986, 1990, and 1994. We treated cumulative average estimated gluten intake as a time varying covariate. For participants with missing dietary data, we used the most recent previous dietary response on record. Because the development of a significant illness may cause a major change in dietary habits, and so as to reduce the possibility of reverse causality, we suspended updating dietary response data for participants who developed diabetes, cardiovascular disease (including stroke, angioplasty, or coronary artery bypass graft surgery), or cancer. For such participants, the cumulative average dietary gluten value before the development of this diagnosis was carried forward until the end of follow-up.32
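The cumulative averaging step can be sketched as follows; the intake values are hypothetical, and the study additionally energy adjusted each cycle's value before averaging:

```python
# Minimal sketch of the cumulative averaging of questionnaire cycles
# described above (hypothetical g/day values for one participant).
def cumulative_averages(intakes_by_year):
    """Return the cumulative average intake at each questionnaire cycle,
    averaging all responses up to and including that cycle."""
    out, running = {}, []
    for year in sorted(intakes_by_year):
        running.append(intakes_by_year[year])
        out[year] = sum(running) / len(running)
    return out

# Hypothetical participant: gluten (g/day) reported in 1986, 1990, 1994
reported = {1986: 6.0, 1990: 8.0, 1994: 7.0}
print(cumulative_averages(reported))
# 1994 value = (6.0 + 8.0 + 7.0) / 3 = 7.0
```

In the analysis itself this cumulative average enters the Cox model as a time varying covariate, frozen at its last pre-diagnosis value for participants who developed diabetes, cardiovascular disease, or cancer.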

The primary outcome of incident coronary heart disease consisted of a composite outcome of non-fatal myocardial infarction or fatal myocardial infarction. For all participants who recorded such a diagnosis, we requested and reviewed medical records. We classified myocardial infarctions meeting World Health Organization criteria, which require typical symptoms plus either diagnostic electrocardiographic findings or elevated cardiac enzyme concentrations, as definite, and we considered myocardial infarctions requiring hospital admission and corroborated by phone interview or letter only as probable. Deaths were identified from state vital records and the National Death Index or reported by participants’ next of kin. We classified coronary heart disease deaths by examining autopsy reports, hospital records, or death certificates. Fatal coronary heart disease was confirmed via medical records or autopsy reports or if coronary heart disease was listed as the cause of death on the death certificate and there was previous evidence of coronary heart disease in the medical records. We designated as probable those cases in which coronary heart disease was the underlying cause on the death certificate but no previous knowledge of coronary heart disease was indicated and medical records concerning the death were unavailable. We considered definite and probable myocardial infarction together as our primary outcome, as we have previously found that results were similar when probable cases were excluded.33

Statistical analyses

Participants were followed from 1986 until the development of coronary heart disease, death, or the end of follow-up in 2012 (June 2012 for NHS; January 2012 for HPFS). We tested for the association between cumulative average gluten intake and the development of coronary heart disease, comparing each fifth of gluten intake with the lowest fifth. We used Cox proportional hazards models conditioning on age in months and follow-up cycle to calculate age adjusted and multivariable adjusted hazard ratios and 95% confidence intervals. We first generated these estimates in each cohort and tested for heterogeneity of the associations by meta-analysis of aggregate data using the Q statistic. Because we did not observe any significant heterogeneity for the association of gluten with coronary heart disease in the two cohorts (P for heterogeneity>0.10), we then did a pooled analysis combining the participants of NHS and HPFS and estimated the hazard ratios by using Cox modeling stratified by study cohort. We tested the assumption of proportional hazards by testing the interaction term between gluten intake and the period of follow-up and found no violations of this assumption (P>0.05).
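Assuming the Q statistic here is Cochran's Q, the heterogeneity check that precedes pooling can be sketched in a few lines; the cohort-specific hazard ratios and standard errors below are hypothetical, not the study's estimates:

```python
# Sketch of a fixed effect heterogeneity test (Cochran's Q) across two
# cohort-specific estimates, computed on the log hazard ratio scale.
# The HRs and standard errors are hypothetical illustrations.
import math

def cochran_q(log_hrs, ses):
    """Cochran's Q for k estimates: inverse variance weighted squared
    deviations from the pooled estimate; compare with chi-squared, k-1 df."""
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, log_hrs)) / sum(weights)
    return sum(w * (e - pooled)**2 for w, e in zip(weights, log_hrs))

# Hypothetical cohort-specific estimates: HR 0.93 (NHS), HR 0.90 (HPFS)
q = cochran_q([math.log(0.93), math.log(0.90)], [0.05, 0.04])
print(round(q, 3))  # small Q => no evidence of heterogeneity, so pool
```

A non-significant Q (as the study reports, P for heterogeneity>0.10) is what licenses combining the two cohorts into a single stratified Cox model.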

We tested the hypothesis that increasing amounts of energy adjusted dietary gluten is associated with an increased risk of coronary heart disease. Our main model included non-dietary and dietary covariates, constructed a priori. Non-dietary covariates consisted of age, race (white, non-white), body mass index (by fifth), height (in inches), history of diabetes, regular (at least twice weekly) use of aspirin and non-steroidal anti-inflammatory drugs, current use of 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins), current use of a multivitamin, smoking history (pack years), parental history of myocardial infarction, history of hypertension, history of hypercholesterolemia, use of physical activity as measured in metabolic equivalents (METs) per week, and (in NHS) menopausal status and menopausal hormone use. Dietary covariates were energy adjusted and consisted of daily consumption of alcohol (grams), trans fats (grams), red meats (servings), processed meats (servings), polyunsaturated fats (grams), fruits (servings), and vegetables (servings).

We did several secondary analyses, constructed a priori. Firstly, because gluten is a component of both refined grains and whole grains, which are each purported to be associated with coronary heart disease, we used multivariable models examining the association between estimated gluten intake and coronary heart disease with additional adjustment for refined grain consumption and whole grain consumption. Secondly, we did stratified analyses by age (<65 v ≥65 years), body mass index (<25 v ≥25), physical activity (<18 v ≥18 MET-hours/week), and smoking status (current v never v past smoking). Thirdly, we separately considered the outcomes of fatal and non-fatal myocardial infarction. Fourthly, we considered the possibility that an association of estimated gluten intake with coronary heart disease may be evident only when extreme levels of intake are considered; we therefore examined participants according to tenths (instead of fifths) of gluten intake. Fifthly, because identification and treatment of risk factors for coronary heart disease may have changed over time, we repeated the primary analysis, restricting the time period first to 1986-97 and then to 1998-2012. Sixthly, instead of suspending dietary updates on the diagnosis of cardiovascular disease, diabetes, or cancer (as we did for the primary analysis), we repeated the primary analysis, updating dietary responses regardless of the development of these conditions. Finally, in addition to these a priori analyses, we did post-hoc analyses, including each of the following additional dietary variables in our full model: the Alternate Healthy Eating Index score, percentage protein, percentage total fat, and intake of dairy, saturated fatty acids, monounsaturated fatty acids, sodium, and dietary fiber. We used SAS version 9.4 for all analyses and considered two sided P values of <0.05 to be statistically significant.

Patient involvement

No patients were involved in setting the research question or the outcome measures, nor were they involved in developing plans for recruitment, design, or implementation of the study. No patients were asked to advise on interpretation or writing up of results. Although this specific analysis concerning gluten and coronary heart disease was not conceived in direct collaboration with the research participants, they have been actively engaged in the broad research direction of the cohorts. For example, participants are mailed an annual newsletter that communicates results and highlights notable findings. In response, participants return feedback, including suggestions for future studies. Findings are also disseminated on study websites (www.nurseshealthstudy.org and https://www.hsph.harvard.edu/hpfs/index.html).

Results

Among 64 714 women and 45 303 men eligible for analysis, the mean daily estimated intake of gluten at baseline was 7.5 (SD 1.4) g among women and 10.0 (2.0) g among men in the highest fifth and 2.6 (0.6) g among women and 3.3 (0.8) g among men in the lowest fifth. In 2010 the mean daily estimated gluten intake was 7.9 (2.4) g among women and 9.2 (2.8) g among men in the highest fifth and 3.1 (1.2) g among women and 3.7 (1.3) g among men in the lowest fifth. Table 1 shows baseline demographic characteristics according to fifth of gluten intake, and table 2 shows dietary characteristics. Gluten intake correlated inversely with alcohol intake, smoking, total fat intake, and unprocessed red meat intake. Gluten intake correlated positively with whole grain intake (Spearman correlation coefficients NHS 0.37, HPFS 0.42) and refined grain intake (Spearman correlation coefficients NHS 0.66, HPFS 0.65). Gluten did not correlate strongly with sodium intake (Spearman correlation coefficients NHS 0.13, HPFS 0.07).

Table 1 

Age adjusted baseline characteristics of study participants by fifths of energy adjusted gluten intake. Values are numbers (age adjusted percentages) unless stated otherwise

Table 2 

Age adjusted baseline dietary characteristics of study participants by fifths of energy adjusted gluten intake. Values are means (SD) unless stated otherwise and are standardized to age distribution of study population

Over a total of 2 273 931 person years of follow-up, we documented coronary heart disease in 6529 participants (2431 women and 4098 men). Fatal myocardial infarction developed in 2286 participants (540 women and 1746 men), and non-fatal myocardial infarction developed in 4243 participants (1891 women and 2352 men). Table 3 shows the measurements of association between estimated gluten intake and incident coronary heart disease. Compared with participants in the lowest fifth of gluten intake, who had a coronary heart disease incidence rate of 352 per 100 000 person years, those in the highest fifth had a rate of 277 events per 100 000 person years, leading to an unadjusted rate difference of 75 (95% confidence interval 51 to 98) fewer cases of coronary heart disease per 100 000 person years. With adjustment for age only, participants in the highest fifth of gluten intake had a decreased risk of subsequent coronary heart disease compared with those in the lowest fifth in men (hazard ratio 0.88, 95% confidence interval 0.80 to 0.97) and in the pooled analysis (0.87, 0.80 to 0.93). However, after adjustment for race, body mass index, height, diabetes, regular aspirin or non-steroidal anti-inflammatory drug use, statin use, multivitamin use, alcohol, smoking, parental history of coronary heart disease, hypertension, hypercholesterolemia, physical activity, menopausal status, and menopausal hormone use, the association was no longer significant (hazard ratio 0.98, 0.91 to 1.06) in the pooled cohorts.
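The unadjusted rate difference quoted above is straightforward arithmetic on the incidence rates; a quick check follows, where the person-year denominators are hypothetical round numbers chosen only to reproduce the reported rates:

```python
# Back-of-envelope check of the unadjusted rate difference reported above.
# Person years per fifth are hypothetical (the text reports only rates).
def rate_per_100k(events, person_years):
    """Incidence rate per 100 000 person years."""
    return events / person_years * 100_000

lowest_fifth = rate_per_100k(1601, 454_800)   # ~352 per 100 000 person years
highest_fifth = rate_per_100k(1260, 454_800)  # ~277 per 100 000 person years
print(round(lowest_fifth - highest_fifth))    # 75 fewer cases per 100 000
```

This matches the 75 fewer cases per 100 000 person years (95% confidence interval 51 to 98) reported in the abstract and results; the difference attenuates to the null once known risk factors are adjusted for.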

Table 3 

Gluten and risk of coronary heart disease (fatal and non-fatal myocardial infarctions)

Addition of other dietary covariates known or purported to be associated with coronary heart disease yielded a similarly null association when we compared participants in the highest fifth of gluten intake with those in the lowest fifth (hazard ratio 0.95, 0.88 to 1.02). Assessment of gluten intake as a continuous variable yielded a multivariate hazard ratio of 0.99 (0.98 to 1.01) for each 1 g increase in daily intake. In sensitivity analyses, our results were essentially unchanged when we added alternative dietary variables, including Alternate Healthy Eating Index score, percentage protein in the diet, percentage total fat in the diet, and intake of dairy, saturated fatty acids, monounsaturated fatty acids, sodium, or dietary fiber to the model.

Secondary analyses

As gluten is obtained primarily from whole grains and refined grains, we repeated the primary analysis, adding each of these components to the full model (table 4). When further adjusting for refined grains (with the remaining variance of gluten intake correlating with whole grain intake), we found an inverse relation between estimated gluten intake and coronary heart disease; participants in the highest fifth of gluten intake had a lower coronary heart disease risk (hazard ratio 0.85, 0.77 to 0.93). When we instead adjusted for whole grains (leaving the variance of gluten intake correlating with refined grain intake), we found no association between gluten intake and incident coronary heart disease; participants in the highest fifth of gluten intake had a risk of coronary heart disease that was not different from those in the lowest group (hazard ratio 1.00, 0.92 to 1.09).

Table 4 

Hazard ratios for coronary heart disease events by fifths of energy adjusted gluten intake, with additional adjustment for refined grains and whole grains (pooled cohorts)

Table 5 shows results according to subgroups defined by age, body mass index, physical activity, and smoking status. The association between estimated gluten intake and coronary heart disease remained null across all of these strata with the exception of smoking status. Among current smokers, the highest fifth of gluten intake was associated with increased risk of coronary heart disease (hazard ratio 1.34, 1.09 to 1.66; P for trend=0.02). However, when we additionally adjusted for refined grains (leaving the variance of gluten intake correlating with whole grain intake), the association between gluten intake and coronary heart disease was no longer significant (hazard ratio for highest fifth 1.25, 0.95 to 1.64; P for trend=0.21).

Table 5 

Hazard ratios for coronary heart disease events (fatal and non-fatal myocardial infarctions) by fifths of energy adjusted gluten intake, stratified by age, body mass index, physical activity, and smoking

We found no significant association between estimated gluten intake and either fatal myocardial infarction or non-fatal myocardial infarction when considered as separate outcomes (see supplementary table B). We did not observe any significant associations between tenths of gluten intake and risk of coronary heart disease (see supplementary table C). Nor did we find a significant association between gluten intake and coronary heart disease when separately considering the time strata 1986-97 and 1998-2012 or when updating dietary responses regardless of the development of the comorbid conditions of cardiovascular disease, diabetes, or cancer (see supplementary tables D and E).

Discussion

In two prospective cohorts with updated dietary information over 20 years of follow-up, we found no significant association between estimated gluten intake and the risk of subsequent overall coronary heart disease, non-fatal myocardial infarction, and fatal myocardial infarction. The lack of association was consistent in both men and women, as well as among other subgroups defined by cardiovascular risk factors.

Comparison with other studies

Dietary gluten has been the subject of increased attention and concern in recent years. Much of the data on gluten and coronary heart disease are limited to people with celiac disease, in whom gluten elicits an inflammatory response characterized by small intestinal villous atrophy and the development of antibodies to tissue transglutaminase, a ubiquitous enzyme that is present on vascular endothelial cells.3435 Patients with celiac disease may have an increased risk of myocardial infarction and death from cardiovascular disease that is reduced after diagnosis of celiac disease, possibly owing to the beneficial effect of a gluten-free diet,436 although this association is controversial.37 Patients with celiac disease who develop a myocardial infarction are less likely to have classic cardiac risk factors, such as smoking and dyslipidemia, leading to the hypothesis that the pro-inflammatory effect of gluten exerts an independent cardiac risk.38

The popularity of a low gluten or gluten-free diet in the general population has markedly increased in recent years.13 Despite the limited evidence that gluten plays a role in cardiovascular health, this increasing adoption of a gluten-free diet by people without celiac disease has occurred in conjunction with speculation that gluten may have a deleterious role in health outcomes even in the absence of gluten sensitivity. The rationale for this concern includes the observation that foods containing gluten often have a high glycemic index, which has been linked to cardiovascular risk.11 Studies in mice have shown pro-inflammatory effects of gluten administration and a protective association of gluten restriction with the development of diabetes.3940 In one cross sectional study in young adults, gluten intake was correlated with higher plasma concentrations of α2-macroglobulin, an acute phase reactant that is associated with inflammation.5 Gluten has been found to cause gastrointestinal symptoms in patients without celiac disease,41 although the mechanism for this remains uncertain.4243

We found that estimated gluten intake correlated moderately with whole grain and refined grain intake, as expected given the prominence of wheat among dietary grains; gluten also correlated with glycemic index. We noted a significant inverse relation between estimated gluten intake and coronary heart disease when we adjusted for refined grain intake. Although the absolute risk difference was modest (75 coronary heart disease events per 100 000 person years when we compared the highest fifth of gluten intake with the lowest fifth in the pooled analysis), this lower risk likely reflects the fact that adjustment for refined grains leaves the remainder of the variance of gluten intake correlated with whole grain intake. Whole grain intake has been found to be inversely associated with coronary heart disease risk and cardiovascular mortality.4445 These findings underscore the potential that people who severely restrict gluten intake may also significantly limit their intake of whole grains, a restriction that may be associated with adverse cardiovascular outcomes.

Strengths and limitations of study

Strengths of our study include its large sample size, long term follow-up, prospective and repeated assessments of diet with validated questionnaires, and validated outcome measurement. Our study also has several limitations. Unmeasured or residual negative confounding is a possibility, although our main model included multiple dietary and non-dietary covariates. We did not specifically ask about the intake of gluten-free substitute foods, and participants were not asked about whether they specifically adhered to a gluten-free diet. Although we excluded participants who reported a diagnosis of celiac disease, we could not identify which people without celiac disease nonetheless maintained a very low gluten or gluten-free diet. Nevertheless, the observation period (1986-2012) largely preceded the widespread interest in gluten as a health concern that has arisen more recently in the US.111213 We likewise were unable to determine whether gluten was present in trace amounts in certain foods, such as soy sauce or oats that were not harvested on separate fields; therefore, potential exists for misclassification at the low end of gluten intake. Although trace amounts of gluten (such as 50 mg daily) can induce symptoms and inflammation in patients with celiac disease,46 measurement of such gluten exposure would have a small effect on gluten quantity even in the lowest fifth of baseline daily gluten intake in our cohorts (2.6 g daily in women and 3.3 g in men). Therefore, although we were unable to determine the association of a strict gluten-free diet with coronary heart disease, we did not observe any association of very low estimated gluten intake with coronary heart disease, as might be realistically expected among people who maintain a gluten-free diet in usual practice.

Our measurement of gluten intake was based on the assumption that gluten comprises 75% of the protein content of wheat, rye, and barley, following the convention of a single conversion factor for all three grains.22,23 Although this may overestimate the amount of gluten derived from rye and barley, it is unlikely to bias our results given the overall low intake of rye and barley in these cohorts. Although gluten has not been specifically quantified in validation studies of food frequency questionnaires, this instrument has shown good validity, correlating reasonably with seven day dietary recall of foods containing gluten (supplementary table A) and with intake of vegetable protein, to which gluten is a significant contributor.29 In addition, participants with undiagnosed celiac disease were not uniformly identified in these cohorts. However, according to population based estimates, such people would account for less than 1% of the cohort.3 Moreover, inclusion of these participants would be expected to bias the results toward an association of gluten with coronary heart disease, which was not observed.
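The single-conversion-factor convention described above amounts to simple arithmetic. The sketch below illustrates it; the function name and the example protein amounts are illustrative assumptions, not data from the study:

```python
# Sketch of the gluten-estimation convention described above: gluten is
# approximated as 75% of the protein contributed by wheat, rye, and barley.
# The example respondent values below are hypothetical.
GLUTEN_FRACTION_OF_PROTEIN = 0.75
GLUTEN_GRAINS = {"wheat", "rye", "barley"}

def estimate_gluten_grams(grain_protein_g: dict) -> float:
    """Estimate daily gluten (g) from grams of protein per grain source."""
    return GLUTEN_FRACTION_OF_PROTEIN * sum(
        grams for grain, grams in grain_protein_g.items()
        if grain in GLUTEN_GRAINS
    )

# Hypothetical respondent: 6 g of protein from wheat foods, 0.5 g from rye;
# oat protein is excluded by the convention.
print(estimate_gluten_grams({"wheat": 6.0, "rye": 0.5, "oats": 1.0}))  # 4.875
```

Because the same 75% factor is applied to all three grains, the approach slightly overstates gluten from rye and barley, as the authors note.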

In this analysis, we did not examine change in body mass index in relation to gluten ingestion. However, body mass index is unlikely to mediate an association between gluten and coronary heart disease, as the risk estimate did not change materially when we added body mass index to the model. Finally, in secondary subgroup analyses, we observed a higher risk of coronary heart disease among participants in the highest fifth of gluten intake who were current smokers. However, these associations were no longer significant once we adjusted the models for refined grain consumption. Given the relatively small number of cases of coronary heart disease among current smokers in the highest fifth of gluten intake and the lack of a clear mechanistic basis for this heterogeneity, these results should be viewed in the context of multiple testing. When we applied the Bonferroni correction to the smoking categories (which contained three strata), our finding regarding gluten intake and coronary heart disease among current smokers was no longer statistically significant.
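The Bonferroni adjustment mentioned above is a one-line computation: the overall significance level is divided by the number of strata tested. A minimal sketch, in which the subgroup p-value is hypothetical and chosen only to show how a nominally significant finding can fail the corrected threshold:

```python
# Bonferroni correction: with three smoking strata, alpha = 0.05 becomes
# 0.05 / 3 per stratum. The p-value below is invented for illustration.
def bonferroni_threshold(alpha: float, n_tests: int) -> float:
    return alpha / n_tests

threshold = bonferroni_threshold(0.05, 3)   # ~0.0167 per smoking stratum
p_current_smokers = 0.03                    # hypothetical subgroup p-value

print(p_current_smokers < 0.05)        # True: nominally significant
print(p_current_smokers < threshold)   # False: not significant after correction
```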

Conclusion and public health implications

In these two large, prospective cohorts, the consumption of foods containing gluten was not significantly associated with risk of coronary heart disease. Although people with and without celiac disease may avoid gluten owing to a symptomatic response to this dietary protein, these findings do not support the promotion of a gluten-restricted diet with a goal of reducing coronary heart disease risk. In addition, the avoidance of dietary gluten may result in a low intake of whole grains, which are associated with cardiovascular benefits. Gluten-free diets should not be recommended for the purpose of coronary heart disease prevention among asymptomatic people without celiac disease.

What is already known on this topic

  • Dietary gluten causes adverse clinical effects in people with celiac disease
  • Avoidance of gluten among people without celiac disease has increased in recent years, partly owing to the belief that gluten can have harmful health effects

What this study adds

  • Among male and female health professionals followed for more than 25 years, quantity of gluten consumption was not associated with coronary heart disease
  • A reduction in dietary gluten may result in the reduced consumption of whole grains, which are associated with lower cardiovascular risk

Source: BMJ

Blood Test Rules Out Celiac Disease During Gluten-Free Diet


An experimental blood test accurately identifies people who do, or don’t, have celiac disease, even if they are following gluten-free diets, researchers say.

The two main blood tests used to screen for celiac disease rely on detecting an immune response to gluten, but that immune response gradually disappears in people who avoid gluten.

“Unfortunately, many persons with gluten sensitivity go gluten-free without consulting their clinician for exclusion of celiac disease,” said lead study author Dr. Vikas K. Sarna of Oslo University Hospital in Norway. “In such cases, guidelines recommend . . . performing a gluten challenge involving daily consumption of gluten for up to 8 weeks, followed by an endoscopic procedure for a biopsy taken from the small intestine (duodenum). Our blood test may replace such a gluten challenge and duodenal biopsy.”

The new test is designed to detect immune cells in a blood sample that are specifically targeted at gluten proteins, even when the individual hasn’t been recently exposed to gluten.

Sarna’s team tried their test on 62 patients with celiac disease and 19 individuals without celiac disease, all of whom were on gluten-free diets; 10 patients with celiac disease who were eating foods containing gluten; and 52 healthy individuals following normal diets. They also ran the currently available celiac tests on these participants for comparison.

The old tests detected celiac disease in 9 out of 10 patients who weren’t on a gluten-free diet but in only 4 of the 62 patients who’d been following a gluten-free diet.

The new test, by comparison, was 96% accurate in distinguishing celiac disease patients from people who didn’t have celiac disease but were still following gluten-free diets.

It was 95% accurate for distinguishing celiac disease patients who were eating gluten-containing foods from healthy individuals following normal diets, the researchers reported November 13 online in Gastroenterology.
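The gap between the two scenarios for the conventional tests reported above comes down to a difference in sensitivity, which can be checked in a few lines. The counts come from the article; the helper function is my own illustration:

```python
# Sensitivity = true positives / all patients who actually have the disease.
# Per the article, conventional serology detected celiac disease in 9 of 10
# gluten-eating patients but in only 4 of 62 patients on a gluten-free diet.
def sensitivity(true_positives: int, total_with_disease: int) -> float:
    return true_positives / total_with_disease

print(round(100 * sensitivity(9, 10)))  # 90 -> patients eating gluten
print(round(100 * sensitivity(4, 62)))  # 6  -> patients on a gluten-free diet
```

The drop from roughly 90% to roughly 6% sensitivity is why current serology cannot rule celiac disease in or out once a patient has gone gluten-free.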

“We calculated that our test is stronger to exclude rather than confirm the diagnosis of celiac disease in gluten sensitive persons,” Sarna said. “Although we need more research in this field, we propose that the test be used to exclude celiac disease in persons on a gluten-free diet,” he told Reuters Health by email.

“It is important to point out that this test is still not available for commercial use, although there is a huge demand of a test for celiac disease that can be applied for persons that are already on a gluten-free diet,” Sarna said. “I do hope that the promising results from our study can initiate commercial initiatives along with more research, to allow this test to be used in the general public in the near future.”

Several members of the research team have applied for a patent on this testing technology, and some disclosed consulting relationships with companies. The clinical trial in the current study was funded by the Research Council of Norway, the authors note.

“Researchers are actively working to identify tests that may allow for screening for celiac disease in patients on a gluten-free diet,” said Dr. Maureen Leonard from Harvard Medical School’s Center for Celiac Research and Treatment in Boston, who was not involved in the study. “These are not clinically available tests and require further work before they are accurate and available for clinical use,” she said in an email.

“Additionally, these tests may benefit only people with a certain genetic background,” Leonard said. “Therefore, the general public should be aware that before self-imposing a gluten-free diet they must be tested for celiac disease.”

Celiac disease must be confirmed with a duodenal biopsy, Leonard added. “If a patient begins a gluten-free diet prior to being screened for celiac disease, all available blood tests to screen for celiac disease and duodenal biopsies will no longer be accurate.”

The Dark Side of Wheat – New Perspectives On Celiac Disease and Wheat Intolerance


by Sayer Ji

The globe-spanning presence of wheat and its exalted status among secular and sacred institutions alike differentiates this food from all others presently enjoyed by humans. Yet the unparalleled rise of wheat as the very catalyst for the emergence of ancient civilization has not occurred without a great price. While wheat was the engine of civilization’s expansion and was glorified as a “necessary food,” both in the physical (staff of life) and spiritual sense (the body of Christ), those suffering from celiac disease are living testimony to the lesser known dark side of wheat. A study of celiac disease may help unlock the mystery of why modern man, who dines daily at the table of wheat, is the sickest animal yet to have arisen on this strange planet of ours.
The Celiac Iceberg
 
Celiac disease (CD) was once considered an extremely rare affliction, limited to individuals of European descent. Today, however, a growing number of studies indicate that celiac disease is found throughout the world at a rate of up to 1 in every 100 persons, which is several orders of magnitude higher than previously estimated.
These findings have led researchers to visualize CD as an iceberg. The tip of the iceberg represents the relatively small number of the world’s population whose gross presentation of clinical symptoms often leads to the diagnosis of celiac disease. This is the classical case of CD characterized by gastrointestinal symptoms, malabsorption and malnourishment. It is confirmed with the “gold standard” of an intestinal biopsy. The submerged middle portion of the iceberg is largely invisible to classical clinical diagnosis, but not to modern serological screening methods in the form of antibody testing. This middle portion is composed of asymptomatic and latent celiac disease as well as “out of the intestine” varieties of wheat intolerance. Finally, at the base of this massive iceberg sits approximately 20-30% of the world’s population – those who have been found to carry the HLA-DQ locus of genetic susceptibility to celiac disease on chromosome 6.
The “Celiac Iceberg” may not simply illustrate the problems and issues associated with diagnosis and disease prevalence, but may represent the need for a paradigm shift in how we view both CD and wheat consumption among non-CD populations.
First let us address the traditional view of CD as a rare but clinically distinct species of genetically determined disease, which I believe is now running itself aground upon the emerging, post-Genomic perspective, whose implications for understanding and treating disease are Titanic in proportion.
It Is Not In the Genes, But What We Expose Them To
Despite common misconceptions, monogenic diseases, or diseases that result from errors in the nucleotide sequence of a single gene, are exceedingly rare. Perhaps only 1% of all diseases fall within this category, and celiac disease is not one of them. In fact, following the completion of the Human Genome Project (HGP) in 2003, it is no longer accurate to say that our genes “cause” disease, any more than it is accurate to say that DNA alone is sufficient to account for all the proteins in our body. Despite initial expectations, the HGP revealed that there are only 20,000-25,000 genes in human DNA (the genome), rather than the 100,000+ believed necessary to encode the 100,000+ proteins found in the human body (the proteome).
The “blueprint” model of genetics: one gene → one protein → one cellular behavior, which was once the holy grail of biology, has now been supplanted by a model of the cell where epigenetic factors (literally: “beyond the control of the gene”) are primary in determining how DNA will be interpreted, translated and expressed. A single gene can be used by the cell to express a multitude of proteins and it is not the DNA alone that determines how or what genes will be expressed. Rather, we must look to the epigenetic factors to understand what makes a liver cell different from a skin cell or brain cell. All of these cells share the exact same 3 billion base pairs that make up our genome, but it is the epigenetic factors, e.g. regulatory proteins and post-translational modifications, that make the determination as to which genes to turn on and which to silence, resulting in each cell’s unique phenotype. Moreover, epigenetic factors are directly and indirectly influenced by the presence or absence of key nutrients in the diet, as well as exposures to chemicals, pathogens and other environmental influences.
In a nutshell, what we eat and what we are exposed to in our environment directly affects our DNA and its expression.
Within the scope of this new perspective even classical monogenic diseases like cystic fibrosis (CF) can be viewed in a new, more promising light. In CF many of the adverse changes that result from the defective expression of the Cystic Fibrosis Transmembrane Conductance Regulator (CFTR) gene may be preventable or reversible, owing to the fact that the misfolding of the CFTR gene product has been shown to undergo partial or full correction (in the rodent model) when exposed to phytochemicals found in turmeric, cayenne, and soybean. Moreover, nutritional deficiencies of selenium, zinc, riboflavin, vitamin E, etc. in the womb or early in life may “trigger” the faulty expression or folding patterns of the CFTR gene in cystic fibrosis which might otherwise have avoided epigenetic activation. This would explain why it is possible to live into one’s late seventies with this condition, as was the case for Katherine Shores (1925-2004). The implications of these findings are rather extraordinary: epigenetic and not genetic factors are primary in determining disease outcome. Even if we exclude the possibility of reversing certain monogenic diseases, the basic lesson from the post-Genomic era is that we can’t blame our DNA for causing disease. Rather, it may have more to do with what we choose to expose our DNA to.
Celiac Disease Revisited
 
What all of this means for CD is that the genetic susceptibility locus, HLA-DQ, does not by itself determine the exact clinical outcome of the disease. Instead of being ‘the cause,’ the HLA genes may be activated as a consequence of the disease process. Thus, we may need to shift our epidemiological focus from viewing this as a classical “disease” involving a passive subject controlled by aberrant genes, to viewing it as an expression of a natural, protective response to the ingestion of something that the human body was not designed to consume.
If we view celiac disease not as an unhealthy response to a healthy food, but as a healthy response to an unhealthy food, classical CD symptoms like diarrhea may make more sense. Diarrhea can be the body’s way to reduce the duration of exposure to a toxin or pathogen, and villous atrophy can be the body’s way of preventing the absorption and hence, the systemic effects of chronic exposure to wheat.
I believe we would be better served by viewing the symptoms of CD as expressions of bodily intelligence rather than deviance. We must shift the focus back to the disease trigger, which is wheat itself.
People with celiac disease may actually have an advantage over the apparently non-afflicted: those who are “non-symptomatic,” and whose wheat intolerance goes undiagnosed or misdiagnosed because they lack the classical symptoms, may suffer in ways that are equally or more damaging but expressed more subtly, or in distant organs. Within this view celiac disease would be redefined as a protective (healthy?) response to exposure to an inappropriate substance, whereas “asymptomatic” ingestion of the grain, with its concomitant “out of the intestine” and mostly silent symptoms, would be considered the unhealthy response insofar as it does not signal in an obvious and acute manner that there is a problem with consuming wheat.
It is possible that celiac disease represents an extreme reaction to a global, species-wide intolerance to wheat that we all share in varying degrees. CD symptoms may reflect the body’s innate intelligence when faced with the consumption of a substance that is inherently toxic. Let me illustrate this point using wheat germ agglutinin (WGA) as an example.
WGA is classified as a lectin and is known to play a key role in kidney pathologies, such as IgA nephropathy. In the article “Do dietary lectins cause disease?” the allergist David L J Freed points out that WGA binds to “glomerular capillary walls, mesangial cells and tubules of human kidney and (in rodents) binds IgA and induces IgA mesangial deposits,” indicating that wheat consumption may lead to kidney damage in susceptible individuals. Indeed, a study from the Mario Negri Institute for Pharmacological Research in Milan, Italy, published in 2007 in the International Journal of Cancer, looked at bread consumption and the risk of kidney cancer. It found that those who consumed the most bread had a 94% higher risk of developing kidney cancer compared with those who consumed the least. Given the inherently toxic effect that WGA may have on kidney function, it is possible that in certain genetically predisposed individuals (e.g. HLA-DQ2/DQ8 carriers) the body, in its innate intelligence, makes an executive decision: either continue to allow damage to the kidneys (or possibly other organs) until kidney failure and rapid death result, or launch an autoimmune attack on the villi to prevent the absorption of the offending substance, which results in a prolonged though relatively malnourished life. This is the explanation typically given for the body’s reflexive formation of mucous following exposure to certain highly allergenic or potentially toxic foods, e.g. dairy products and sugar: the mucous coats the offending substance, preventing its absorption and facilitating safe elimination via the gastrointestinal tract. From this perspective the HLA-DQ locus of disease susceptibility in the celiac is not simply activated but utilized as a defensive adaptation to continual exposure to a harmful substance.
In those who do not have the HLA-DQ locus, an autoimmune destruction of the villi will not occur as rapidly, and exposure to the universally toxic effects of WGA will likely go unabated until silent damage to distant organs leads to the diagnosis of a disease that is apparently unrelated to wheat consumption.
Loss of kidney function may be only the “tip of the iceberg” when it comes to the possible adverse effects that wheat proteins and wheat lectin can generate in the body. If kidney cancer is a likely possibility, then other cancers may eventually be linked to wheat consumption as well. Such a correlation would fly in the face of globally sanctioned and reified assumptions about the inherent benefits of wheat consumption. It would require that we suspend cultural, socio-economic, political and even religious assumptions about its value. In many ways, the reassessment of wheat as a food requires a William Burroughs-like moment of shocking clarity when we perceive “in a frozen moment . . . what is on the end of every fork.” Let’s take a closer look at what is on the end of our forks.
Our biologically inappropriate diet
 
In a previous article, I discussed the role that wheat plays as an industrial adhesive (e.g. in paints, papier-mâché, and bookbinding glue) in order to illustrate the point that it may not be such a good thing for us to eat. The problem is implicit in the word gluten, which literally means “glue” in Latin, and in words like pastry and pasta, which derive from wheatpaste, the original concoction of wheat flour and water that made such good plaster in ancient times. What gives gluten its adhesive and difficult-to-digest qualities are the high levels of disulfide bonds it contains. These same sulfur-to-sulfur bonds are found in hair and vulcanized rubber products, which we all know are difficult to decompose and are responsible for the sulfurous odor they give off when burned.
There will be 676 million metric tons of wheat produced this year alone, making it the primary cereal of temperate regions and the third most prolific cereal grass on the planet. This global dominance of wheat is signified by the use of a head of wheat as the official symbol of the Food and Agriculture Organization (FAO), the United Nations agency for defeating hunger. Any effort to indict the credibility of this “king of grains” will prove challenging. As Rudolf Hauschka once remarked, wheat is “a kind of earth-spanning organism.” It has vast socio-economic, political, and cultural significance. In the Catholic Church, for example, a wafer made of wheat is considered irreplaceable as the embodiment of Christ.
Our dependence on wheat is matched only by its dependence on us. As Europeans have spread across the planet, so has this grain. We have assumed total responsibility for all phases of the wheat life cycle: from fending off its pests; to providing its ideal growing conditions; to facilitating reproduction and expansion into new territories. We have become so inextricably interdependent that neither species is sustainable at current population levels without this symbiotic relationship.
It is this co-dependence that may explain why our culture has for so long consistently confined wheat intolerance to categorically distinct, “genetically-based” diseases like celiac disease. These categorizations may protect us from the realization that wheat exerts a vast number of deleterious effects on human health, in the same way that “lactose intolerance” distracts attention from the deeper problems associated with the casein protein found in cow’s milk. Rather than see wheat for what it very well may be, a biologically inappropriate food source, we “blame the victim” and look for genetic explanations for what’s wrong with the small subgroups of our population who have the most obvious forms of intolerance to wheat consumption, e.g. celiac disease and dermatitis herpetiformis. The medical justification for these classifications may be secondary to economic and cultural imperatives that require that the inherent problems associated with wheat consumption be minimized or occluded.
In all probability the celiac genotype represents a surviving vestigial branch of a once universal genotype which, through accident or intention, has had only limited exposure to wheat through successive generations. The celiac genotype, no doubt, survived numerous bottlenecks or “die-offs” represented by the dramatic shift from hunted and foraged/gathered foods to gluten-grain consumption, and for whatever reason simply did not have adequate time to adapt or select out the gluten-grain-incompatible genes. The celiac response may indeed reflect a prior, species-wide intolerance to a novel food source: the seed storage form of the monocotyledonous cereal grasses, which our species only began consuming at the advent of the Neolithic transition (10,000-12,000 BC), roughly 500 generations ago. Let us return to the image of the celiac iceberg for greater clarification.
Our Submerged Grain-Free Prehistory
The iceberg metaphor is an excellent way to expand our understanding of what was once considered an extraordinarily rare disease into one that has statistical relevance for us all, but it has a few limitations. For one, it reiterates the commonly held view that celiac disease is a numerically distinct entity or “disease island,” floating alongside other distinct disease “ice cubes” in the vast sea of normal health. Though accurate in describing the sense of social and psychological isolation many of the afflicted feel, the celiac iceberg may not represent a distinct disease entity at all.
Although the HLA-DQ locus of disease susceptibility on chromosome 6 offers us a place to project blame, I believe we need to shift the emphasis of responsibility for the condition back to the disease “trigger” itself: namely, wheat and other prolamine-rich grains, e.g. barley, rye, spelt, and oats. Without these grains the typical afflictions we call celiac would not exist. Within the scope of this view the “celiac iceberg” is not actually free floating but an outcropping of an entire submerged subcontinent, representing our long-forgotten (in cultural time) but relatively recent (in biological time) metabolic prehistory as hunter-gatherers, when grain consumption was, in all likelihood, non-existent except in instances of near-starvation.
The pressure on the celiac patient to be viewed as an exceptional case or deviation may have everything to do with our preconscious belief that wheat, and grains as a whole, are “health foods,” and very little to do with a rigorous investigation of the facts.
Grains have been heralded since time immemorial as the “staff of life,” when in fact they are more accurately described as a cane, precariously propping up a body starved of the nutrient-dense, low-starch vegetables, fruits, edible seeds and meats they have so thoroughly supplanted (cf. the Paleolithic diet). Most of the diseases of affluence, e.g. type 2 diabetes, coronary heart disease and cancer, can be linked to the consumption of a grain-based diet, including secondary “hidden” sources of grain consumption in grain-fed fish, poultry, meat and milk products.
Our modern belief that grains make for good food is simply not supported by the facts. The cereal grasses belong to an entirely different botanical group, the monocotyledons (one-leafed embryo), than the dicotyledons (two-leafed embryo) on which our species sustained itself for millions of years. The preponderance of scientific evidence points to a human origin in the tropical rainforests of Africa, where dicotyledonous fruits would have been available for year-round consumption. It would not have been monocotyledonous plants, but the flesh of hunted animals, that allowed for the migration out of Africa 60,000 years ago into the northern latitudes, where vegetation would have been sparse or non-existent during winter months. Collecting and cooking grains would have been improbable given their low nutrient and caloric content and the inadequate development of pyrotechnology and the cooking utensils necessary to consume them with any efficiency. It was not until the end of the last Ice Age, 20,000 years ago, that our human ancestors would have slowly transitioned to a cereal-grass-based diet, coterminous with the emergence of civilization. Twenty thousand years is probably not enough time to fully adapt to the consumption of grains. Even animals like cows, with a head start of thousands of years, having evolved to graze on monocotyledons and equipped as ruminants with a four-chambered fore-stomach enabling the breakdown of cellulose and anti-nutrient-rich plants, are not designed to consume grains. Cows are designed to consume the sprouted, mature form of the grasses and not their seed storage form. Grains are so acidic/toxic in reaction that exclusively grain-fed cattle are prone to developing severe acidosis and subsequent liver abscesses and infections. Feeding wheat to cattle provides an even greater challenge:
“Beef: Feeding wheat to ruminants requires some caution as it tends to be more apt than other cereal grains to cause acute indigestion in animals which are unadapted to it. The primary problem appears to be the high gluten content of wheat, which in the rumen can result in a ‘pasty’ consistency to the rumen contents and reduced rumen motility.”
(Source: Ontario Ministry of Agriculture, Food and Rural Affairs)
Seeds, after all, are the “babies” of these plants, and are invested with not only the entire hope for continuance of their species, but a vast armory of anti-nutrients to help them accomplish this task: toxic lectins, phytates and oxalates, alpha-amylase and trypsin inhibitors, and endocrine disrupters. These not-so-appetizing phytochemicals enable plants to resist predation of their seeds, or at least ensure that they do not “go out without a punch.”
Wheat: An Exceptionally Unwholesome Grain
Wheat presents a special case insofar as wild hybridization and selective breeding have produced variations which include up to six sets of chromosomes (three times the two sets found in the human genome), capable of generating a massive number of proteins, each with a distinct potential for antigenicity. Common bread wheat (Triticum aestivum), for instance, has over 23,788 proteins cataloged thus far. In fact, the genome of common bread wheat is 6.5 times larger than the human genome.
With up to a 50% increase in the gluten content of some varieties of wheat, it is amazing that we continue to consider “glue-eating” a normal behavior, whereas wheat avoidance is left to the “celiac,” who is still perceived by the majority of health care practitioners as mounting a “freak” reaction to the consumption of something intrinsically wholesome.
Thankfully we don’t need to rely on our intuition, or even (not so) common sense, to draw conclusions about the inherently unhealthy nature of wheat. A wide range of investigations over the past decade has revealed the problems with the alcohol-soluble protein component of wheat known as gliadin, the sugar-binding lectin known as wheat germ agglutinin, the exorphin known as gliadomorphin, and the excitotoxic potential of the high levels of aspartic and glutamic acid found in wheat. Add to these the anti-nutrients found in grains, such as phytates and enzyme inhibitors, and you have a substance that we may more appropriately consider the farthest thing from wholesome.
The remainder of this article will demonstrate the following adverse effects of wheat on both celiac and non-celiac populations:

  1. Wheat causes damage to the intestines
  2. Wheat causes intestinal permeability
  3. Wheat has pharmacologically active properties
  4. Wheat causes damage that is “out of the intestine,” affecting distant organs
  5. Wheat induces molecular mimicry
  6. Wheat contains high concentrations of excitotoxins
1) WHEAT GLIADIN CREATES IMMUNE MEDIATED DAMAGE TO THE INTESTINES
Gliadin is classified as a prolamin: a wheat storage protein that is high in the amino acids proline and glutamine and soluble in strong alcohol solutions. Gliadin, once deamidated by the enzyme tissue transglutaminase, is considered the primary epitope for T-cell activation and subsequent autoimmune destruction of intestinal villi. Yet gliadin does not need to activate an autoimmune response, e.g. celiac disease, in order to have a deleterious effect on intestinal tissue.
In a study published in Gut in 2007, a group of researchers asked the question: “Is gliadin really safe for non-coeliac individuals?” To test the hypothesis that an innate immune response to gliadin is common in patients both with and without celiac disease, intestinal biopsy cultures were taken from both groups and challenged with crude gliadin, the synthetic 19-mer gliadin peptide (19 amino acids long) and the deamidated 33-mer peptide. Results showed that all patients, with or without celiac disease, produced an interleukin-15-mediated response when challenged with the various forms of gliadin. The researchers concluded:
“The data obtained in this pilot study supports the hypothesis that gluten elicits its harmful effect, throughout an IL15 innate immune response, on all individuals [my italics].”
The primary difference between the two groups is that the celiac disease patients experienced both an innate and an adaptive immune response to the gliadin, whereas the non-celiacs experienced only the innate response. The researchers hypothesized that the difference between the two groups may be attributable to greater genetic susceptibility at the HLA-DQ locus for triggering an adaptive immune response, higher levels of immune mediators or receptors, or perhaps greater permeability in the celiac intestine. It is possible that, over and above greater genetic susceptibility, most of the differences arise from epigenetic factors that are influenced by the presence or absence of certain nutrients in the diet. Other factors, such as exposure to NSAIDs like naproxen or aspirin, can profoundly increase intestinal permeability in the non-celiac, rendering them susceptible to gliadin’s potential for activating secondary adaptive immune responses. This may explain why in up to 5% of all cases of classically defined celiac disease the typical HLA-DQ haplotypes are not found. However, determining the factors associated with greater or lesser degrees of susceptibility to gliadin’s intrinsically toxic effect should be secondary to the fact that it has been demonstrated to be toxic to both non-celiacs and celiacs.
2) WHEAT GLIADIN CREATES INTESTINAL PERMEABILITY
Gliadin upregulates the production of a protein known as zonulin, which modulates intestinal permeability. Over-expression of zonulin is involved in a number of autoimmune disorders, including celiac disease and type 1 diabetes. Researchers have studied the effect of gliadin on increased zonulin production and subsequent gut permeability in both celiac and non-celiac intestines, and have found that “gliadin activates zonulin signaling irrespective of the genetic expression of autoimmunity, leading to increased intestinal permeability to macromolecules.”10 These results indicate, once again, that a pathological response to wheat gluten is a normal, species-specific human response, and is not based entirely on genetic susceptibilities. Because intestinal permeability is associated with a wide range of disease states, including cardiovascular illness, liver disease and many autoimmune disorders, I believe this research indicates that gliadin (and therefore wheat) should be avoided as a matter of principle.
3) WHEAT GLIADIN HAS PHARMACOLOGICAL PROPERTIES
Gliadin can be broken down into peptides of various lengths. Gliadorphin is a seven-amino-acid-long peptide (Tyr-Pro-Gln-Pro-Gln-Pro-Phe) which forms when the gastrointestinal system is compromised. When digestive enzymes are insufficient to break gliadorphin down into two- to three-amino-acid lengths, and a compromised intestinal wall allows the entire seven-amino-acid fragment to leak into the blood, gliadorphin can pass into the brain through circumventricular organs and activate opioid receptors, resulting in disrupted brain function.
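As a side note, the seven-residue sequence above can also be written in the standard one-letter amino-acid code. A minimal Python sketch of the conversion; the mapping table is standard biochemical notation, and the function itself is purely illustrative:

```python
# Standard three-letter to one-letter amino acid codes
# (only the residues appearing in gliadorphin are listed here).
THREE_TO_ONE = {
    "Tyr": "Y",  # tyrosine
    "Pro": "P",  # proline
    "Gln": "Q",  # glutamine
    "Phe": "F",  # phenylalanine
}

def to_one_letter(sequence: str) -> str:
    """Convert a dash-separated three-letter peptide sequence to one-letter code."""
    return "".join(THREE_TO_ONE[residue] for residue in sequence.split("-"))

# Gliadorphin, the seven-residue peptide discussed above:
print(to_one_letter("Tyr-Pro-Gln-Pro-Gln-Pro-Phe"))  # YPQPQPF
```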
A number of gluten exorphins have been identified (gluten exorphin A4, A5, B4, B5 and C), and many of them have been hypothesized to play a role in autism, schizophrenia, ADHD and related neurological conditions. In the same way that the celiac iceberg dispelled the illusion that intolerance to wheat is rare, it is possible, even probable, that wheat exerts pharmacological influences on everyone. What distinguishes the schizophrenic or autistic individual from the functional wheat consumer is the degree to which they are affected.
Below the tip of the “Gluten Iceberg,” we might find these opiate-like peptides to be responsible for bread’s general popularity as a “comfort food”, and our use of phrases like “I love bread,” or “this bread is to die for” to be indicative of wheat’s narcotic properties. I believe a strong argument can be made that the agricultural revolution that occurred approximately 10-12,000 years ago as we shifted from the Paleolithic into the Neolithic era was precipitated as much by environmental necessities and human ingenuity, as it was by the addictive qualities of psychoactive peptides in the grains themselves.
The world-historical reorganization of society, culture and consciousness accomplished through the symbiotic relationship with cereal grasses, may have had as much to do with our ability to master agriculture, as to be mastered by it.   The presence of pharmacologically active peptides would have further sweetened the deal, making it hard to distance ourselves from what became a global fascination with wheat.
An interesting example of wheat’s addictive potential pertains to the Roman army. The Roman Empire was once known as the “Wheat Empire,” with soldiers being paid in wheat rations. Rome’s entire war machine, and its vast expansion, was predicated on the availability of wheat. Forts were actually granaries, holding up to a year’s worth of grain in order to endure sieges. Historians record that soldiers were punished by being deprived of their wheat rations and given barley instead. The Roman Empire went on to facilitate the global dissemination of wheat cultivation, which fostered a form of imperialism with biological as well as cultural roots.
The Roman appreciation for wheat, like our own, may have had less to do with its nutritional value as “health food” than its ability to generate a unique narcotic reaction. It may fulfill our hunger while generating a repetitive, ceaseless cycle of craving more of the same, and by doing so, enabling the surreptitious control of human behavior. Other researchers have come to similar conclusions. According to the biologists Greg Wadley & Angus Martin:
 “Cereals have important qualities that differentiate them from most other drugs. They are a food source as well as a drug, and can be stored and transported easily. They are ingested in frequent small doses (not occasional large ones), and do not impede work performance in most people. A desire for the drug, even cravings or withdrawal, can be confused with hunger. These features make cereals the ideal facilitator of civilization (and may also have contributed to the long delay in recognizing their pharmacological properties).”
4) WHEAT LECTIN (WGA) DAMAGES OUR TISSUE.
Wheat contains a lectin known as wheat germ agglutinin (WGA), which causes direct, non-immune-mediated damage to our intestines and, once it enters the bloodstream, damage to distant organs.
Lectins are sugar-binding proteins which are highly selective for their sugar moieties. It is believed that wheat lectin, which binds to the monosaccharide N-acetyl glucosamine (NAG), provides defense against predation from bacteria, insects and animals. Bacteria have NAG in their cell wall, insects have an exoskeleton composed of polymers of NAG called chitin, and the epithelial tissue of mammals, e.g. gastrointestinal tract, have a “sugar coat” called the glycocalyx which is composed, in part, of NAG. The glycocalyx can be found on the outer surface (apical portion) of the microvilli within the small intestine.
There is evidence that WGA, by binding to the surface of the villi, may cause increased shedding of the intestinal brush border membrane, reduction in surface area, acceleration of cell losses and shortening of villi. WGA can mimic the effects of epidermal growth factor (EGF) at the cellular level, indicating that the crypt hyperplasia seen in celiac disease may be due to a mitogenic response induced by WGA. WGA has been implicated in obesity and “leptin resistance” by blocking the hypothalamic receptor for the appetite-satiating hormone leptin. WGA has also been shown to have an insulin-mimetic action, potentially contributing to weight gain and insulin resistance.15 And, as discussed earlier, wheat lectin has been shown to induce IgA-mediated damage to the kidney, indicating that nephropathy and kidney cancer may be associated with wheat consumption.
5) WHEAT PEPTIDES EXHIBIT MOLECULAR MIMICRY
Gliadorphin and the gluten exorphins exhibit a form of molecular mimicry that affects the nervous system, but other wheat proteins affect different organ systems. The digestion of gliadin produces a 33-amino-acid-long peptide known as the 33-mer, which has remarkable homology to the internal sequence of pertactin, the immunodominant protein of the Bordetella pertussis bacterium (whooping cough). Pertactin is considered a highly immunogenic virulence factor, and is used in vaccines to amplify the adaptive immune response. It is possible the immune system may confuse this 33-mer with the pathogen, resulting in a cell-mediated and/or adaptive immune response against self.
6) WHEAT CONTAINS HIGH LEVELS OF EXCITO-TOXINS
John B. Symes, D.V.M., is responsible for drawing attention to the potential excitotoxicity of wheat, dairy, and soy, due to their exceptionally high levels of the non-essential amino acids glutamic and aspartic acid. Excitotoxicity is a pathological process in which glutamic and aspartic acid cause an over-activation of nerve cell receptors (e.g. the NMDA and AMPA receptors), leading to calcium-induced nerve and brain injury. Of all the cereal grasses commonly consumed, wheat contains the highest levels of glutamic and aspartic acid. Glutamic acid is largely responsible for wheat’s exceptional taste. The Japanese coined the word umami to describe the extraordinary “yummy” effect that glutamic acid exerts on the tongue and palate, and invented monosodium glutamate (MSG) to amplify this sensation. Though the Japanese first synthesized MSG from kelp, wheat can also be used due to its high glutamic acid content. It is likely that wheat’s popularity, alongside its opiate-like activity, has everything to do with the natural flavor enhancers already contained within it. These amino acids may contribute to neurodegenerative conditions such as multiple sclerosis, Alzheimer’s disease and Huntington’s disease, and to other nervous disorders such as epilepsy, attention deficit disorder and migraines.
CONCLUSION
In this article I have proposed that celiac disease be viewed not as a rare, “genetically determined” disorder, but as an extreme example of our body communicating to us a once universal, species-specific affliction: severe intolerance to wheat. Celiac disease reflects back to us how profoundly our diet has diverged from what was, until only recently, a grain-free diet, and even more recently, a wheat-free one. We are so profoundly distanced from that dramatic Neolithic transition in cultural time that “missing is any sense that anything is missing.” The body, on the other hand, cannot help but remember a time when cereal grains were alien to the diet, because in biological time it was only moments ago.
Eliminating wheat, if not all members of the cereal grass family, and returning to dicotyledons or pseudo-grains like quinoa, buckwheat and amaranth may help us roll back the hands of biological and cultural time, to a clarity, health and vitality that many of us have never known before. When one eliminates wheat and fills the void left by its absence with fruits, vegetables, high-quality meats and foods consistent with our biological needs, one may begin to feel a sense of vitality that many would find hard to imagine. If wheat really is more like a drug than a food, anesthetizing us to its ill effects on our body, it will be difficult for us to understand its grasp upon us unless and until we eliminate it from our diet. I encourage everyone to see celiac disease not as a condition alien to our own. Rather, the celiac gives us a glimpse of how profoundly wheat may distort and disfigure our health if we continue to expose ourselves to its ill effects. I hope this article will provide inspiration for non-celiacs to try a wheat-free diet and judge for themselves whether it is worth eliminating.

How to Accurately Tell If You’re Sensitive to Gluten, Dairy, or Any Food Without a Blood Test


Gluten-free (GF) is a catchy fad that has become more popular over the last year or so. You’ve probably seen expensive gluten-free options at restaurants, grocery stores and bakeries. But is that all it is? Is it just a fad?

If you don’t have Celiac disease or a known gluten sensitivity you may be scratching your head asking,

“What’s the big deal?”

Gluten is a protein found in grains and grain-based products such as:

  • Wheat
  • Couscous
  • Spelt
  • Rye
  • Durum
  • Udon
  • Barley
  • Graham
  • Semolina
  • Bran
  • Orzo
  • Panko
  • Bulgur
  • Possibly oats (due to cross-contamination)

Gluten intolerance vs. sensitivity

Gluten intolerance affects people whose digestive enzymes cannot adequately break down the gluten protein. Research is still being done to fully understand which enzymes are lacking, and engineered gluten-degrading enzymes such as kumamolisin-As are being investigated as potential treatments.

The most common symptoms are explosive diarrhea, excessive gas, low energy and fatigue, dehydration, and/or malnutrition. If ingested by a person with a gluten sensitivity or intolerance, gluten triggers an inflammatory response that damages the intestinal lining of the gut, leading to malabsorption of other nutrients (aka Leaky Gut Syndrome; see more below).

Gluten sensitivity is a delayed hypersensitivity immune response (IgG) that occurs when a sensitized person repeatedly eats gluten over a short period of time. The effects progress more gradually, are non-specific, and are often dose-dependent.

Symptoms can vary from migraines, to cognitive ‘brain fog’, to behavioural difficulties in children with ADHD, to chronic digestive concerns (constipation, diarrhea, excessive gas, IBS, IBD), to skin issues (acne, eczema, atopic dermatitis), to low energy, weight gain, water retention and joint pain.

Testing for gluten sensitivities measures for an immune system antibody called IgG.

  • In vivo (in/on the body) – muscle testing or energetic tests
  • In vitro (in a lab) – IgG blood sample with finger prick

Muscle testing, applied kinesiology, and energetic tests are controversial. The idea is that the patient holds a vial of a food antigen and the body will weaken in strength if there is a sensitivity. Physical strength is manually tested by the practitioner, which introduces a level of bias (conscious and unconscious). There are also various energetic tests that measure a person’s response to the potential food trigger, gluten.

I also have issues with the current IgG tests and often find them unreliable. I know colleagues who have sent multiple samples of the same blood to different lab companies, and even to the same labs, with varying results. Some patients have even tested as highly sensitive to foods they have never eaten.

Until IgG tests become more accurate and reproducible, I still prefer taking a more practical approach to identifying potential food sensitivities with the hypo-allergenic diet (we’ll take a closer look at this test later in the article).

Celiac disease differs from a gluten sensitivity the way an anaphylactic bee-sting reaction differs from a mild mosquito bite. To diagnose celiac disease, a combination of the following tests is commonly performed:

  • Anti-tissue transglutaminase (tTG- IgA)
  • Anti-endomysial antibody (EMA- IgA)
  • Anti-gliadin antibody (AGA- IgA)
  • Deamidated gliadin peptide antibody (DGP- IgA)
  • with a possible endoscopic biopsy of injured tissues

Celiac disease is an autoimmune condition of the small intestine. In celiac disease, there is both an inflammatory condition and a loss of microvilli in a portion of the small intestine.

The loss of microvilli disrupts the mucosal cells and allows large molecules (i.e. food particles) to pass through the tightly packed cells of the mucosal lining and into the bloodstream, where antibodies will be created against these ‘foreign bodies’. This is the same result as leaky gut syndrome. Now that the body has created ‘food antibodies’, the next time you ingest that food you will have an IgG response.

Sometimes food particles have structures similar to molecules in your body, and the antibodies can begin to attack your own cells. This is called autoimmunity, and it can take many different forms.

Where Else is Gluten Found?

Although usually found in grains, gluten is also used as a “filler” in many processed foods, seasonings, flavourings and products, such as:

  • Ales, beer, brown rice syrup, candies, deli meats, broth, sauces, imitation meats, marinades, lipsticks, and balms.

Malt, a popular substance used in candies and beverages, contains gluten. Caramel colouring and caramel can also contain gluten. Wheat flour (glutinous) is found in many things, from soy sauce and soups to condiments such as mustard, so reading labels is very important.

Supplements also may contain gluten as fillers or in the coating of their capsules. Read all medicinal and non-medicinal labels carefully.

What are Gluten-Free Foods?

The following types of flours are gluten-free:

  • Amaranth
  • Corn meal
  • Quinoa
  • Arrowroot
  • Cornstarch
  • Rice bran
  • Buckwheat
  • Flax
  • Tapioca
  • Corn bran
  • Millet
  • Potatoes
  • Corn flour
  • Soy (but be cautious of wheat additives in soya sauce)
  • Legumes (bean, chickpea, garfava, lentil and pea)

Where to find gluten-free products?

Organic versions of soy sauce and soups (easily found in the grocery store) are usually gluten-free (but read your labels).

Since gluten sensitivity and celiac disease are becoming more common, it is much easier to find gluten-free products. Yes, health food stores have the best variety of products, but they can be expensive.



  • Gluten-free options can be found in the health food aisle or in the frozen food section (as many of the products are frozen) of any supermarket and even in Wal-mart.
  • Bob’s Red Mill products include every type of gluten-free flour and baking mix, and can be found in the baked goods aisle (with all the flour and sugar) of most supermarkets.
  • The Bulk Barn also sells gluten-free flours, baked-good mixes, pancake mixes and even powdered soup mixes.
  • But remember, just because something is gluten-free doesn’t mean it is healthy. These two terms are not synonymous. Many gluten-free products substitute other ingredients that may not agree with your digestive system (i.e. egg, dairy, soy, corn) or be supportive of your health.

How to Cook Gluten-free

Below are some great websites for gluten-free recipes. Also look at paleo meals, which are wheat-free.

For information on which restaurants provide gluten-free options:

Best Options for Gluten-Free Food


Asian Styled Restaurants

  • Sushi – Avoid tempura and bring your own soy sauce.
  • Indian Food – Rice and vegetable dishes (paneer does contain dairy).
  • Korean Barbecue
  • Noodle Houses – Noodle bowls contain broths and rice noodles.
  • Chinese food – Rice and vegetables (note: be wary and ask questions as a lot of thickeners with gluten are used as sauces).

Mediterranean

  • Greek – Souvlaki, rice, Greek salad, potatoes (avoid baklava and spanakopita, as phyllo pastry contains gluten).
  • Fish (any kind)
  • Steak House – Meat, chicken or fish with baked potato or vegetable (avoid mashed potatoes as they are thickened with cream and flour).

Organic Restaurants

Fresh – Contains organic food and a large amount of gluten-free and dairy-free options.

Organic food restaurants are more used to catering to individuals with gluten and dairy sensitivities, which is why they have more options.

How to Test Yourself For Food Sensitivities, at Home and For FREE?

The Hypo-Allergenic Diet is a great tool that I use with my patients to test for all food sensitivities, including gluten.

I completed my first hypo-allergenic diet in my 3rd year of medical school. It was challenging but insightful. Never in a million years would I have guessed a major food sensitivity to be SOY.

Being Chinese, my family is used to eating a lot of soy products: soya sauce, tofu, fermented bean curd, edamame, miso, soya nuts, soya milk. I knew that I was sensitive to cow dairy and when I switched from cow milk to soy milk I was still experiencing bloating, gas, stomach pains, and fatigue. It seems obvious looking back now, but at the time I never imagined someone with a Chinese background could have difficulties ingesting soy.

The hypo-allergenic diet is also known as an oligoantigenic or elimination diet.

This means that we avoid eating the most common ingredients that cause people inflammation and digestive issues. IT IS NOT A DIET TO LOSE WEIGHT. It should be viewed more as a food sensitivity TEST.

The top 5 food offenders include:

  1. Wheat
  2. Dairy
  3. Corn
  4. Soy
  5. Eggs

Note that some of the foods on the list are very nutritious, so if you are not sensitive to the food, bring them back into your diet (ie. eggs).

Also remember, many non-gluten foods may not be healthy. Just because they remove those ingredients does not mean they haven’t replaced them with other poorer quality ingredients.

Try and stay away from packaged, canned, processed and deep fried foods. And be cautious of dehydrated and dried foods for they often contain added sugars and preservatives. Raw and fresh is often your best bet for optimal health.

A thorough elimination and re-introduction is not easy, which makes the IgG blood tests more appealing to many. The hypo-allergenic diet takes 6-8 weeks to complete fully, with lots of meal planning, and requires having the people you live with (and possibly cook for) on board. If a full hypo-allergenic diet is not feasible for you at this time, you can modify the test by eliminating one food at a time, for example gluten.

Watch the video: https://youtu.be/uEM2iDT-VAk

The Dark Side of Wheat – New Perspectives On Celiac Disease and Wheat Intolerance


The globe-spanning presence of wheat and its exalted status among secular and sacred institutions alike differentiates this food from all others presently enjoyed by humans. Yet the unparalleled rise of wheat as the very catalyst for the emergence of ancient civilization has not occurred without a great price. While wheat was the engine of civilization’s expansion and was glorified as a “necessary food,” both in the physical (staff of life) and spiritual sense (the body of Christ), those suffering from celiac disease are living testimony to the lesser-known dark side of wheat. A study of celiac disease may help unlock the mystery of why modern man, who dines daily at the table of wheat, is the sickest animal yet to have arisen on this strange planet of ours.
The Celiac Iceberg
 
Celiac disease (CD) was once considered an extremely rare affliction, limited to individuals of European descent. Today, however, a growing number of studies indicate that celiac disease is found throughout the world at a rate of up to 1 in every 100 persons, which is several orders of magnitude higher than previously estimated.
These findings have led researchers to visualize CD as an iceberg. The tip of the iceberg represents the relatively small number of the world’s population whose gross presentation of clinical symptoms often leads to the diagnosis of celiac disease. This is the classical case of CD characterized by gastrointestinal symptoms, malabsorption and malnourishment. It is confirmed with the “gold standard” of an intestinal biopsy. The submerged middle portion of the iceberg is largely invisible to classical clinical diagnosis, but not to modern serological screening methods in the form of antibody testing. This middle portion is composed of asymptomatic and latent celiac disease as well as “out of the intestine” varieties of wheat intolerance. Finally, at the base of this massive iceberg sits approximately 20-30% of the world’s population – those who have been found to carry the HLA-DQ locus of genetic susceptibility to celiac disease on chromosome 6.*
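Taking the article’s own figures at face value, the gap between the base of the iceberg and its tip can be made concrete with simple arithmetic. This is a rough sketch; the prevalence and carrier-rate values are the approximate ones quoted above, with the carrier rate taken at the midpoint of the 20-30% range:

```python
# Rough arithmetic behind the "Celiac Iceberg" (illustrative figures from the text):
prevalence = 1 / 100    # ~1 in 100 people worldwide has celiac disease
carrier_rate = 0.25     # ~20-30% carry the HLA-DQ susceptibility genes (midpoint)

# Fraction of genetically susceptible people who actually develop the disease:
fraction_of_carriers = prevalence / carrier_rate
print(f"{fraction_of_carriers:.0%} of HLA-DQ carriers develop celiac disease")  # 4%
```

In other words, only a small minority of those carrying the susceptibility genes ever express the disease, which is the point the epigenetic argument below turns on.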
The “Celiac Iceberg” may not simply illustrate the problems and issues associated with diagnosis and disease prevalence, but may represent the need for a paradigm shift in how we view both CD and wheat consumption among non-CD populations.
First let us address the traditional view of CD as a rare, but clinically distinct species of genetically-determined disease, which I believe is now running itself aground upon the emerging, post-Genomic perspective, whose implications for understanding and treating disease are Titanic in proportion.
It Is Not In the Genes, But What We Expose Them To
Despite common misconceptions, monogenic diseases, or diseases that result from errors in the nucleotide sequence of a single gene, are exceedingly rare. Perhaps only 1% of all diseases fall within this category, and celiac disease is not one of them. In fact, following the completion of the Human Genome Project (HGP) in 2003, it is no longer accurate to say that our genes “cause” disease, any more than it is accurate to say that DNA alone is sufficient to account for all the proteins in our body. Despite initial expectations, the HGP revealed that there are only 20,000-25,000 genes in human DNA (the genome), rather than the 100,000+ believed necessary to encode the 100,000+ proteins found in the human body (the proteome).
The “blueprint” model of genetics: one gene → one protein → one cellular behavior, which was once the holy grail of biology, has now been supplanted by a model of the cell where epigenetic factors (literally: “beyond the control of the gene”) are primary in determining how DNA will be interpreted, translated and expressed. A single gene can be used by the cell to express a multitude of proteins and it is not the DNA alone that determines how or what genes will be expressed. Rather, we must look to the epigenetic factors to understand what makes a liver cell different from a skin cell or brain cell. All of these cells share the exact same 3 billion base pairs that make up our genome, but it is the epigenetic factors, e.g. regulatory proteins and post-translational modifications, that make the determination as to which genes to turn on and which to silence, resulting in each cell’s unique phenotype. Moreover, epigenetic factors are directly and indirectly influenced by the presence or absence of key nutrients in the diet, as well as exposures to chemicals, pathogens and other environmental influences.
In a nutshell, what we eat and what we are exposed to in our environment directly affects our DNA and its expression.
Within the scope of this new perspective, even classical monogenic diseases like cystic fibrosis (CF) can be viewed in a new, more promising light. In CF, many of the adverse changes that result from the defective expression of the cystic fibrosis transmembrane conductance regulator (CFTR) gene may be preventable or reversible, owing to the fact that the misfolding of the CFTR gene product has been shown to undergo partial or full correction (in the rodent model) when exposed to phytochemicals found in turmeric, cayenne, and soybean. Moreover, nutritional deficiencies of selenium, zinc, riboflavin, vitamin E, etc. in the womb or early in life may “trigger” the faulty expression or folding patterns of the CFTR gene in cystic fibrosis which might otherwise have avoided epigenetic activation. This would explain why it is possible to live into one’s late seventies with this condition, as was the case for Katherine Shores (1925-2004). The implications of these findings are rather extraordinary: epigenetic and not genetic factors are primary in determining disease outcome. Even if we exclude the possibility of reversing certain monogenic diseases, the basic lesson from the post-genomic era is that we can’t blame our DNA for causing disease. Rather, disease may have more to do with what we choose to expose our DNA to.
Celiac Disease Revisited
 
What all of this means for CD is that the genetic susceptibility locus, HLA-DQ, does not by itself determine the exact clinical outcome of the disease. Instead of being ‘the cause,’ the HLA genes may be activated as a consequence of the disease process. Thus, we may need to shift our epidemiological focus from viewing this as a classical “disease” involving a passive subject controlled by aberrant genes, to viewing it as an expression of a natural, protective response to the ingestion of something that the human body was not designed to consume.
If we view celiac disease not as an unhealthy response to a healthy food, but as a healthy response to an unhealthy food, classical CD symptoms like diarrhea may make more sense. Diarrhea can be the body’s way to reduce the duration of exposure to a toxin or pathogen, and villous atrophy can be the body’s way of preventing the absorption and hence, the systemic effects of chronic exposure to wheat.
I believe we would be better served by viewing the symptoms of CD as expressions of bodily intelligence rather than deviance. We must shift the focus back to the disease trigger, which is wheat itself.
People with celiac disease may actually have an advantage over the apparently non-afflicted: those who are “non-symptomatic,” and whose wheat intolerance goes undiagnosed or misdiagnosed because they lack the classical symptoms, may suffer in ways that are equally or more damaging, but expressed more subtly, or in distant organs. Within this view celiac disease would be redefined as a protective (healthy?) response to exposure to an inappropriate substance, whereas “asymptomatic” ingestion of the grain, with its concomitant “out of the intestine” and mostly silent symptoms, would be considered the unhealthy response insofar as it does not signal in an obvious and acute manner that there is a problem with consuming wheat.
It is possible that celiac disease represents an extreme reaction to a global, species-specific intolerance to wheat that we all share in varying degrees. CD symptoms may reflect the body’s innate intelligence when faced with the consumption of a substance that is inherently toxic. Let me illustrate this point using wheat germ agglutinin (WGA) as an example.
WGA is classified as a lectin and is known to play a key role in kidney pathologies, such as IgA nephropathy. In the article “Do dietary lectins cause disease?”, the allergist David L. J. Freed points out that WGA binds to “glomerular capillary walls, mesangial cells and tubules of human kidney and (in rodents) binds IgA and induces IgA mesangial deposits,” indicating that wheat consumption may lead to kidney damage in susceptible individuals. Indeed, a study from the Mario Negri Institute for Pharmacological Research in Milan, Italy, published in 2007 in the International Journal of Cancer, looked at bread consumption and the risk of kidney cancer. They found that those who consumed the most bread had a 94% higher risk of developing kidney cancer compared to those who consumed the least bread. Given the inherently toxic effect that WGA may have on kidney function, it is possible that in certain genetically predisposed individuals (e.g. HLA-DQ2/DQ8) the body, in its innate intelligence, makes an executive decision: either continue to allow damage to the kidneys (or possibly other organs) until kidney failure and rapid death result, or launch an autoimmune attack on the villi to prevent the absorption of the offending substance, resulting in a prolonged though relatively malnourished life. This is the explanation typically given for the body’s reflexive formation of mucus following exposure to certain highly allergenic or potentially toxic foods, e.g. dairy products, sugar, etc. The mucus coats the offending substance, preventing its absorption and facilitating safe elimination via the gastrointestinal tract. From this perspective, the HLA-DQ locus of disease susceptibility in the celiac is not simply activated but utilized as a defensive adaptation to continual exposure to a harmful substance.
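To put that “94% higher risk” in plain arithmetic: it corresponds to a risk ratio of about 1.94 between the highest and lowest bread-consumption groups. The Python sketch below uses hypothetical cohort counts chosen purely for illustration; they are not the study’s actual data (the original analysis was a case-control study reporting odds ratios):

```python
def relative_risk(cases_exposed: int, n_exposed: int,
                  cases_unexposed: int, n_unexposed: int) -> float:
    """Relative risk = incidence in the exposed group / incidence in the unexposed group."""
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical illustration only -- NOT the study's actual counts:
# 97 cancers among 10,000 high-bread consumers vs 50 among 10,000 low-bread consumers.
rr = relative_risk(97, 10_000, 50, 10_000)
print(f"relative risk = {rr:.2f}")  # relative risk = 1.94, i.e. a 94% higher risk
```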
In those who do not have the HLA-DQ locus, an autoimmune destruction of the villi will not occur as rapidly, and exposure to the universally toxic effects of WGA will likely go unabated until silent damage to distant organs leads to the diagnosis of a disease that is apparently unrelated to wheat consumption.
Loss of kidney function may be only the “tip of the iceberg” when it comes to the possible adverse effects that wheat proteins and wheat lectin can generate in the body. If kidney cancer is a likely possibility, then other cancers may eventually be linked to wheat consumption as well. This correlation would fly in the face of globally sanctioned and reified assumptions about the inherent benefits of wheat consumption. It would require that we suspend cultural, socio-economic, political and even religious assumptions about its inherent benefits. In many ways, the reassessment of the value of wheat as a food requires a William Burroughs-like moment of shocking clarity when we perceive “in a frozen moment… what is on the end of every fork.” Let’s take a closer look at what is on the end of our forks.
Our biologically inappropriate diet
 
In a previous article, I discussed the role that wheat plays as an industrial adhesive (e.g. in paints, papier-mâché, and bookbinding glue) in order to illustrate the point that it may not be such a good thing for us to eat. The problem is implicit in the word gluten, which literally means “glue” in Latin, and in words like pastry and pasta, which derive from wheatpaste, the original concoction of wheat flour and water that made such good plaster in ancient times. What gives gluten its adhesive and difficult-to-digest qualities are the high levels of disulfide bonds it contains. These same sulfur-to-sulfur bonds are found in hair and vulcanized rubber products, which we all know are difficult to decompose and are responsible for the sulfurous odor they give off when burned.
Some 676 million metric tons of wheat will be produced this year alone, making it the primary cereal of temperate regions and the third most prolific cereal grass on the planet. This global dominance is signified by the Food and Agriculture Organization (FAO), the United Nations agency for defeating hunger, using a head of wheat as its official symbol. Any effort to indict the credibility of this “king of grains” will prove challenging. As Rudolf Hauschka once remarked, wheat is “a kind of earth-spanning organism.” It has vast socio-economic, political and cultural significance. In the Catholic Church, for example, a wafer made of wheat is considered irreplaceable as the embodiment of Christ.
Our dependence on wheat is matched only by its dependence on us. As Europeans have spread across the planet, so has this grain. We have assumed total responsibility for all phases of the wheat life cycle: from fending off its pests; to providing its ideal growing conditions; to facilitating reproduction and expansion into new territories. We have become so inextricably interdependent that neither species is sustainable at current population levels without this symbiotic relationship.
It is this co-dependence that may explain why our culture has for so long consistently confined wheat intolerance to categorically distinct, “genetically based” diseases like celiac. These categorizations may protect us from the realization that wheat exerts a vast number of deleterious effects on human health, in the same way that “lactose intolerance” distracts attention from the deeper problems associated with the casein protein found in cow’s milk. Rather than see wheat for what it very well may be, a biologically inappropriate food source, we “blame the victim” and look for genetic explanations for what’s wrong with the small subgroups of our population who have the most obvious forms of intolerance to wheat, e.g. celiac disease and dermatitis herpetiformis. The medical justification for these classifications may be secondary to economic and cultural imperatives that require that the problems inherent in wheat consumption be minimized or occluded.
In all probability the celiac genotype represents a surviving vestigial branch of a once universal genotype which, through accident or intention, has had only limited exposure to wheat through successive generations. The celiac genotype no doubt survived numerous bottlenecks or “die-offs” brought on by the dramatic shift from hunted and foraged foods to gluten-grain consumption, and for whatever reason simply did not have adequate time to adapt or select out the gluten-grain-incompatible genes. The celiac response may indeed reflect a prior, species-wide intolerance to a novel food source: the seed storage form of the monocotyledonous cereal grasses, which our species only began consuming 1-500 generations ago at the advent of the Neolithic transition (10,000-12,000 BC). Let us return to the image of the celiac iceberg for greater clarification.
Our Submerged Grain-Free Prehistory
The iceberg metaphor is an excellent way to expand our understanding of what was once considered an extraordinarily rare disease into one that has statistical relevance for us all, but it has a few limitations. For one, it reiterates the commonly held view that celiac is a categorically distinct disease entity, a “disease island” floating alongside other distinct disease “ice cubes” in the vast sea of normal health. Though accurate in describing the sense of social and psychological isolation many of the afflicted feel, the celiac condition may not be a distinct disease entity at all.
Although the HLA-DQ locus of disease susceptibility on chromosome 6 offers us a place to project blame, I believe we need to shift the emphasis of responsibility for the condition back to the disease “trigger” itself: namely, wheat and other prolamin-rich grains, e.g. barley, rye, spelt and oats. Without these grains the typical afflictions we call celiac would not exist. Within the scope of this view, the “celiac iceberg” is not actually free-floating but an outcropping from an entire submerged subcontinent, representing our long-forgotten (in cultural time) but relatively recent (in biological time) metabolic prehistory as hunter-gatherers, when grain consumption was in all likelihood non-existent, except in instances of near-starvation.
The pressure on the celiac to be viewed as an exceptional case or deviation may have everything to do with our preconscious belief that wheat, and grains as a whole, are “health foods,” and very little to do with a rigorous investigation of the facts.
Grains have been heralded since time immemorial as the “staff of life,” when in fact they are more accurately described as a cane, precariously propping up a body starved of the nutrient-dense, low-starch vegetables, fruits, edible seeds and meats they have so thoroughly supplanted (cf. the Paleolithic diet). Most of the diseases of affluence, e.g. type 2 diabetes, coronary heart disease and cancer, can be linked to the consumption of a grain-based diet, including secondary “hidden” sources of grain in grain-fed fish, poultry, meat and milk products.
Our modern belief that grains make for good food is simply not supported by the facts. The cereal grasses belong to an entirely different family, the monocotyledons (one-leafed embryo), than the dicotyledons (two-leafed embryo) on which our bodies sustained themselves for millions of years. The preponderance of scientific evidence points to a human origin in the tropical rainforests of Africa, where dicotyledonous fruits would have been available for year-round consumption. It would not have been monocotyledonous plants but the flesh of hunted animals that allowed for the migration out of Africa 60,000 years ago into the northern latitudes, where vegetation would have been sparse or non-existent during winter months. Collecting and cooking grains would have been improbable given their low nutrient and caloric content and the inadequate development of pyrotechnology and the cooking utensils necessary to consume them with any efficiency. It was not until the end of the last Ice Age, 20,000 years ago, that our ancestors would have slowly transitioned to a cereal-grass-based diet, coterminous with the emergence of civilization. Twenty thousand years is probably not enough time to fully adapt to the consumption of grains.

Even animals like cows, with a head start of thousands of years, having evolved to graze on monocotyledons and equipped as ruminants with a four-chambered fore-stomach enabling the breakdown of cellulose and anti-nutrient-rich plants, are not designed to consume grains. Cows are designed to consume the sprouted, mature form of the grasses and not their seed storage form. Grains are so acidic in reaction that exclusively grain-fed cattle are prone to developing severe acidosis and subsequent liver abscesses and infections. Feeding wheat to cattle provides an even greater challenge:
“Beef: Feeding wheat to ruminants requires some caution as it tends to be more apt than other cereal grains to cause acute indigestion in animals which are unadapted to it. The primary problem appears to be the high gluten content of wheat, which in the rumen can result in a ‘pasty’ consistency of the rumen contents and reduced rumen motility.”
(source: Ontario Ministry of Agriculture, Food and Rural Affairs)
Seeds, after all, are the “babies” of these plants, invested not only with the entire hope for the continuance of their species but with a vast armory of anti-nutrients to help them accomplish this task: toxic lectins, phytates and oxalates, alpha-amylase and trypsin inhibitors, and endocrine disrupters. These not-so-appetizing phytochemicals enable plants to resist predation of their seeds, or at least to ensure that their seeds do not “go out without a punch.”
Wheat: An Exceptionally Unwholesome Grain
Wheat presents a special case insofar as hybridization and selective breeding have produced variations which include up to six sets of chromosomes (three times the ploidy of the human genome!), capable of generating a massive number of proteins, each with a distinct potential for antigenicity. Common bread wheat (Triticum aestivum), for instance, has 23,788 proteins cataloged thus far. In fact, the genome of common bread wheat is 6.5 times larger than the human genome.
With up to a 50% increase in gluten content of some varieties of wheat, it is amazing that we continue to consider “glue-eating” a normal behavior, whereas wheat-avoidance is left to the “celiac” who is still perceived by the majority of health care practitioners as mounting a “freak” reaction to the consumption of something intrinsically wholesome.
Thankfully we don’t need to rely on our intuition, or even (not so) common sense, to draw conclusions about the inherently unhealthy nature of wheat. A wide range of investigation over the past decade has revealed problems with the alcohol-soluble protein component of wheat known as gliadin, the sugar-binding protein known as wheat germ agglutinin (wheat’s lectin), the exorphin known as gliadorphin, and the excitotoxic potential of the high levels of aspartic and glutamic acid found in wheat. Add to these the anti-nutrients found in grains, such as phytates and enzyme inhibitors, and you have a substance that we may more appropriately consider the farthest thing from wholesome.
The remainder of this article will demonstrate the following adverse effects of wheat on both celiac and non-celiac populations: 1) wheat causes immune-mediated damage to the intestines; 2) wheat causes intestinal permeability; 3) wheat has pharmacologically active properties; 4) wheat causes damage “outside of the intestine,” affecting distant organs; 5) wheat induces molecular mimicry; 6) wheat contains high concentrations of excitotoxins.
1) WHEAT GLIADIN CREATES IMMUNE MEDIATED DAMAGE TO THE INTESTINES
Gliadin is classified as a prolamin: a wheat storage protein high in the amino acids proline and glutamine and soluble in strong alcohol solutions. Gliadin, once deamidated by the enzyme tissue transglutaminase, is considered the primary epitope for T-cell activation and subsequent autoimmune destruction of intestinal villi. Yet gliadin does not need to activate an autoimmune response, e.g. celiac disease, in order to have a deleterious effect on intestinal tissue.
In a study published in Gut in 2007, a group of researchers asked the question: “Is gliadin really safe for non-coeliac individuals?” In order to test the hypothesis that an innate immune response to gliadin is common in patients both with and without celiac disease, intestinal biopsy cultures were taken from both groups and challenged with crude gliadin, the synthetic 19-mer (a 19-amino-acid-long gliadin peptide) and the deamidated 33-mer peptide. Results showed that all patients, with or without celiac disease, produced an interleukin-15-mediated response when challenged with the various forms of gliadin. The researchers concluded:
“The data obtained in this pilot study supports the hypothesis that gluten elicits its harmful effect, through an IL15 innate immune response, on all individuals [my italics].”
The primary difference between the two groups is that the celiac disease patients experienced both an innate and an adaptive immune response to the gliadin, whereas the non-celiacs experienced only the innate response. The researchers hypothesized that the difference between the two groups may be attributable to greater genetic susceptibility at the HLA-DQ locus for triggering an adaptive immune response, higher levels of immune mediators or receptors, or perhaps greater permeability in the celiac intestine. It is possible that, over and above greater genetic susceptibility, most of the differences stem from epigenetic factors influenced by the presence or absence of certain nutrients in the diet. Other factors, such as exposure to NSAIDs like naproxen or aspirin, can profoundly increase intestinal permeability in the non-celiac, rendering them susceptible to gliadin’s potential for activating secondary adaptive immune responses. This may explain why in up to 5% of all cases of classically defined celiac disease the typical HLA-DQ haplotypes are not found. However, determining the factors associated with greater or lesser degrees of susceptibility to gliadin should be secondary to the fact that it has been demonstrated to be toxic to both non-celiacs and celiacs.
2) WHEAT GLIADIN CREATES INTESTINAL PERMEABILITY
Gliadin upregulates the production of a protein known as zonulin, which modulates intestinal permeability. Over-expression of zonulin is involved in a number of autoimmune disorders, including celiac disease and type 1 diabetes. Researchers have studied the effect of gliadin on zonulin production and subsequent gut permeability in both celiac and non-celiac intestines, and have found that “gliadin activates zonulin signaling irrespective of the genetic expression of autoimmunity, leading to increased intestinal permeability to macromolecules.” These results indicate, once again, that a pathological response to wheat gluten is a normal, species-specific human response, and is not based entirely on genetic susceptibilities. Because intestinal permeability is associated with a wide range of disease states, including cardiovascular illness, liver disease and many autoimmune disorders, I believe this research indicates that gliadin (and therefore wheat) should be avoided as a matter of principle.
3) WHEAT GLIADIN HAS PHARMACOLOGICAL PROPERTIES
Gliadin can be broken down into various amino acid lengths, or peptides. Gliadorphin is a seven-amino-acid-long peptide, Tyr-Pro-Gln-Pro-Gln-Pro-Phe, which forms when the gastrointestinal system is compromised. When digestive enzymes are insufficient to break gliadorphin down into two- to three-amino-acid lengths, and a compromised intestinal wall allows the entire seven-amino-acid fragment to leak into the blood, gliadorphin can reach the brain through the circumventricular organs and activate opioid receptors, resulting in disrupted brain function.
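For readers who want to verify the peptide described above, the three-letter sequence can be translated into standard one-letter amino acid code and its length checked with a short script (a minimal sketch; the lookup table below covers only the residues that appear in this particular peptide):

```python
# Translate the gliadorphin sequence from three-letter to one-letter
# amino acid codes (IUPAC convention) and confirm it is 7 residues long.

THREE_TO_ONE = {
    "Tyr": "Y",  # tyrosine
    "Pro": "P",  # proline
    "Gln": "Q",  # glutamine
    "Phe": "F",  # phenylalanine
}

def to_one_letter(sequence: str) -> str:
    """Convert a dash-separated three-letter sequence, e.g. 'Tyr-Pro-...'."""
    return "".join(THREE_TO_ONE[residue] for residue in sequence.split("-"))

gliadorphin = "Tyr-Pro-Gln-Pro-Gln-Pro-Phe"
one_letter = to_one_letter(gliadorphin)
print(one_letter, len(one_letter))  # → YPQPQPF 7
```

This confirms the seven-residue length stated in the text; the high proportion of proline (P) and glutamine (Q) is what makes prolamin fragments like this one so resistant to digestive enzymes.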
There have been a number of gluten exorphins identified, among them gluten exorphin A4, A5, B4, B5 and C, and many of them have been hypothesized to play a role in autism, schizophrenia, ADHD and related neurological conditions. In the same way that the celiac iceberg illustrated the illusion that intolerance to wheat is rare, it is possible, even probable, that wheat exerts pharmacological influences on everyone. What distinguishes the schizophrenic or autistic individual from the functional wheat consumer is the degree to which they are affected.
Below the tip of the “Gluten Iceberg,” we might find these opiate-like peptides to be responsible for bread’s general popularity as a “comfort food”, and our use of phrases like “I love bread,” or “this bread is to die for” to be indicative of wheat’s narcotic properties. I believe a strong argument can be made that the agricultural revolution that occurred approximately 10-12,000 years ago as we shifted from the Paleolithic into the Neolithic era was precipitated as much by environmental necessities and human ingenuity, as it was by the addictive qualities of psychoactive peptides in the grains themselves.
The world-historical reorganization of society, culture and consciousness accomplished through the symbiotic relationship with cereal grasses, may have had as much to do with our ability to master agriculture, as to be mastered by it.   The presence of pharmacologically active peptides would have further sweetened the deal, making it hard to distance ourselves from what became a global fascination with wheat.
An interesting example of wheat’s addictive potential pertains to the Roman army. The Roman Empire was once known as the “Wheat Empire,” with soldiers being paid in wheat rations. Rome’s entire war machine, and its vast expansion, was predicated on the availability of wheat. Forts were actually granaries, holding up to a year’s worth of grain in order to endure sieges. Historians record that soldiers’ punishments included being deprived of wheat rations and given barley instead. The Roman Empire went on to facilitate the global dissemination of wheat cultivation, which fostered a form of imperialism with biological as well as cultural roots.
The Roman appreciation for wheat, like our own, may have had less to do with its nutritional value as “health food” than its ability to generate a unique narcotic reaction. It may fulfill our hunger while generating a repetitive, ceaseless cycle of craving more of the same, and by doing so, enabling the surreptitious control of human behavior. Other researchers have come to similar conclusions. According to the biologists Greg Wadley & Angus Martin:
 “Cereals have important qualities that differentiate them from most other drugs. They are a food source as well as a drug, and can be stored and transported easily. They are ingested in frequent small doses (not occasional large ones), and do not impede work performance in most people. A desire for the drug, even cravings or withdrawal, can be confused with hunger. These features make cereals the ideal facilitator of civilization (and may also have contributed to the long delay in recognizing their pharmacological properties).”
4) WHEAT LECTIN (WGA) DAMAGES OUR TISSUES
Wheat contains a lectin known as Wheat Germ Agglutinin which is responsible for causing direct, non-immune mediated damage to our intestines, and subsequent to entry into the bloodstream, damage to distant organs in our body.
Lectins are sugar-binding proteins which are highly selective for their sugar moieties. It is believed that wheat lectin, which binds to the monosaccharide N-acetylglucosamine (NAG), provides defense against predation by bacteria, insects and animals. Bacteria have NAG in their cell walls; insects have an exoskeleton composed of polymers of NAG called chitin; and the epithelial tissue of mammals, e.g. the gastrointestinal tract, has a “sugar coat” called the glycocalyx which is composed, in part, of NAG. The glycocalyx can be found on the outer (apical) surface of the microvilli within the small intestine.
There is evidence that WGA, by binding to the surface of the villi, may cause increased shedding of the intestinal brush border membrane, reduction in surface area, acceleration of cell losses and shortening of villi. WGA can mimic the effects of epidermal growth factor (EGF) at the cellular level, indicating that the crypt hyperplasia seen in celiac disease may be due to a mitogenic response induced by WGA. WGA has been implicated in obesity and “leptin resistance” by blocking the hypothalamic receptor for the appetite-satiating hormone leptin, and has also been shown to have an insulin-mimetic action, potentially contributing to weight gain and insulin resistance. And, as discussed earlier, wheat lectin has been shown to induce IgA-mediated damage to the kidney, indicating that nephropathy and kidney cancer may be associated with wheat consumption.
5) WHEAT PEPTIDES EXHIBIT MOLECULAR MIMICRY
Gliadorphin and the gluten exorphins exhibit a form of molecular mimicry that affects the nervous system, but other wheat proteins affect different organ systems. The digestion of gliadin produces a peptide that is 33 amino acids long, known as the 33-mer, which has a remarkable homology to an internal sequence of pertactin, the immunodominant protein of the Bordetella pertussis bacterium (whooping cough). Pertactin is considered a highly immunogenic virulence factor and is used in vaccines to amplify the adaptive immune response. It is possible the immune system may confuse this 33-mer with the pathogen, resulting in a cell-mediated and/or adaptive immune response against self.
6) WHEAT CONTAINS HIGH LEVELS OF EXCITO-TOXINS
John B. Symes, D.V.M., is responsible for drawing attention to the potential excitotoxicity of wheat, dairy and soy, due to their exceptionally high levels of the non-essential amino acids glutamic and aspartic acid. Excitotoxicity is a pathological process in which glutamic and aspartic acid cause over-activation of nerve cell receptors (e.g. the NMDA and AMPA receptors), leading to calcium-induced nerve and brain injury. Of all the cereal grasses commonly consumed, wheat contains the highest levels of glutamic and aspartic acid. Glutamic acid is largely responsible for wheat’s exceptional taste: the Japanese coined the word umami to describe the extraordinary “yummy” effect that glutamic acid exerts on the tongue and palate, and invented monosodium glutamate (MSG) to amplify this sensation. Though the Japanese first synthesized MSG from kelp, wheat can also be used due to its high glutamic acid content. It is likely that wheat’s popularity, alongside its opiate-like activity, has everything to do with the natural flavor enhancers already contained within it. These amino acids may contribute to neurodegenerative conditions such as multiple sclerosis, Alzheimer’s disease and Huntington’s disease, and to other nervous disorders such as epilepsy, attention deficit disorder and migraines.
CONCLUSION
In this article I have proposed that celiac disease be viewed not as a rare, “genetically determined” disorder, but as an extreme example of our body communicating to us a once universal, species-specific affliction: severe intolerance to wheat. Celiac disease reflects back to us how profoundly our diet has diverged from what was, until only recently, a grain-free diet, and even more recently, a wheat-free one. We are so profoundly distanced from that dramatic Neolithic transition in cultural time that “missing is any sense that anything is missing.” The body, on the other hand, cannot help but remember a time when cereal grains were alien to the diet, because in biological time it was only moments ago.
Eliminating wheat, if not all members of the cereal grass family, and returning to dicotyledons or pseudo-grains like quinoa, buckwheat and amaranth, may help us roll back the hands of biological and cultural time to a clarity, health and vitality that many of us have never known. When one eliminates wheat and fills the void left by its absence with fruits, vegetables, high-quality meats and other foods consistent with our biological needs, one may begin to feel a sense of vitality that many would find hard to imagine. If wheat really is more like a drug than a food, anesthetizing us to its ill effects on our bodies, it will be difficult for us to understand its grasp upon us unless and until we eliminate it from our diet. I encourage everyone to see celiac disease not as a condition alien to our own; rather, the celiac gives us a glimpse of how profoundly wheat may distort and disfigure our health if we continue to expose ourselves to it. I hope this article provides inspiration for non-celiacs to try a wheat-free diet and judge for themselves whether it is worth eliminating.

People With Celiac Disease Advised to Consider Pneumonia Vaccine


Seniors are advised to get vaccinated against pneumonia, but that precaution is also a good idea for people of any age with celiac disease, according to a U.K. study.

Despite their greater risk for pneumonia, only about one quarter of celiac patients get a pneumonia vaccine after they are diagnosed, the study team writes in the journal Alimentary Pharmacology and Therapeutics, online May 5.

To assess the link between pneumonia and celiac disease, the researchers used data on English patients collected between 1997 and 2011, including 9,803 with celiac disease and a comparison group of 101,755 people without celiac.

Overall, people with and without celiac disease had pneumonia at similar rates: 3.42 cases per 1,000 person-years among those with celiac disease and 3.12 per 1,000 person-years among those without the condition.
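The two crude rates quoted above can be compared directly. The sketch below uses only the published figures; note that it yields an unadjusted ratio, which will not match the study's covariate-adjusted estimates exactly:

```python
# Crude comparison of pneumonia incidence rates (cases per 1,000
# person-years) reported for people with and without celiac disease.

rate_celiac = 3.42   # per 1,000 person-years, celiac group
rate_control = 3.12  # per 1,000 person-years, comparison group

rate_ratio = rate_celiac / rate_control
excess_percent = (rate_ratio - 1) * 100

print(f"Crude rate ratio: {rate_ratio:.2f}")        # ~1.10
print(f"Crude excess risk: {excess_percent:.0f}%")  # ~10%, unadjusted
```

The roughly 10% crude excess here differs from the 7% figure reported later for under-65s because the study's estimate is restricted to that age group and adjusted for other risk factors.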

But among people under age 65 with a celiac diagnosis, those who didn’t get a pneumonia vaccine were 28 percent more likely to get pneumonia than those who were vaccinated, researchers found.

People under 65 with celiac disease were also 7 percent more likely to contract pneumonia than counterparts without celiac.

Only people under 65 years old showed a higher risk of pneumonia, probably because older age is an even greater risk factor for pneumonia than celiac disease, the study team speculates.

Only around 37 percent of celiac patients had ever had a pneumonia vaccine, and only 26 percent got a vaccine after their condition was diagnosed.

In people with celiac disease, the immune system attacks and damages the intestines when the gluten protein in wheat, barley or rye is consumed. The Celiac Disease Foundation estimates that one in 100 people worldwide suffer from the condition.

Colin Crooks, one of the study’s authors, offered a possible explanation for why celiac patients might be at greater risk of pneumonia.

“There is some evidence that in some patients with untreated celiac disease the spleen does not work as well, which is important in fighting certain infections,” he told Reuters Health by email.

Crooks, a researcher at the University of Nottingham, noted that when people with celiac begin a gluten-free diet, their spleen function will begin to improve, which may help them fight off infections.

Celiac disease can cause spleen issues for up to a third of patients and these people may be at greater risk for infections, said Dr. Shamez Ladhani of Public Health England in London, who was not involved in the study.

“It is important that they discuss their risk with their doctor and consider appropriate actions to reduce the risk, which may include vaccination, not only against pneumococcal disease but also other bacteria and viruses, such as influenza vaccination,” Ladhani said by email.

Getting a flu vaccine can also help protect against bacterial infections like pneumonia, he added.

Ladhani recommended that patients with spleen problems should get a flu vaccine every year and the pneumonia vaccine every five years.

“Bacterial pneumonia can be serious, but can be treated with common antibiotics. It is important that individuals with celiac disease seek medical advice early when they become unwell with fever or respiratory symptoms,” Ladhani said.

Roundup herbicide linked to celiac disease and gluten intolerance, new study suggests


A new study has found that the world’s best-selling herbicide is linked to the global rise of celiac disease, gluten intolerance and irritable bowel syndrome.

According to the U.S. peer-reviewed study, the details of which were published by Dr. Anthony Samsel and Dr. Stephanie Seneff, the rise in disease has coincided with the increased use of glyphosate (Roundup) herbicide.

Roundup herbicide

The paper was published in the journal Interdisciplinary Toxicology.

The researchers say that, based on their findings, one in 20 people in North America and Western Europe now suffer from celiac disease, which is essentially gluten intolerance.

“Gluten intolerance is a growing epidemic in the U.S. and, increasingly, worldwide,” said the researchers in their paper.

“All of the known biological effects of glyphosate — cytochrome P450 inhibition, disruption of synthesis of aromatic amino acids, chelation of transition metals, and antibacterial action — contribute to the pathology of celiac disease,” the paper states.

Widespread planting of Roundup Ready genetically modified crops has made Monsanto’s Roundup the number-one selling herbicide. But, according to this latest study, its increased use has come at a price.

As described by Sustainable Pulse:

Celiac disease is a digestive disease that damages the small intestine and interferes with absorption of nutrients from food. People who have celiac disease cannot tolerate gluten, a protein in wheat, rye, and barley. Gluten is found mainly in foods but may also be found in everyday products such as medicines, vitamins, and lip balms.

When people with the condition eat foods that contain gluten, their immune systems respond by destroying or damaging villi — the small, fiber-like protrusions that line the small intestine. Normally, villi allow food nutrients to be absorbed through the walls of the small intestine into the bloodstream, but without healthy villi, celiac sufferers can become malnourished, regardless of the amount of food they ingest.

Celiac Disease: Which Children Should Be Tested?


Serological screening studies indicate that celiac disease (CD) has a prevalence of 1% to 2% in Western populations and that the incidence is increasing across all age groups. Although many individuals develop symptomatic CD, others do not. In fact, researchers estimate that just 10% to 30% of patients with CD are ever diagnosed, in part because they may be asymptomatic.

For children, that lack of diagnosis can be critical: Young children with undiagnosed CD are particularly vulnerable to the effects of malabsorption and failure to thrive.
Because of that danger, some researchers and clinicians are pushing for widespread screening; some experts go even further, suggesting that the entire pediatric population should be routinely screened for CD. Yet the cost and the need for invasive confirmatory biopsies leave others skeptical of that approach unless better, cheaper diagnostics become available.

New data published in the March issue of the American Journal of Gastroenterology by Rok Seon Choung, MD, PhD, from the Division of Gastroenterology and Hepatology, Mayo Clinic, Rochester, Minnesota, and colleagues, show that among a representative sample of nearly 15,000 Americans aged 6 years or older, the prevalence of confirmed CD was 0.8% (95% confidence interval [CI], 0.5% – 1.0%) in 2009 to 2012.

Extrapolating to the population at large, the data suggest that more than 1.5 million people in the United States have CD. Moreover, the analysis showed a more than doubling in prevalence among adults aged 50 years or older, rising from 0.17% (95% CI, 0.03% – 0.33%) in the 1988 to 1994 National Health and Nutrition Examination Survey to 0.44% (95% CI, 0.24% – 0.81%) in the 2009 to 2012 survey. However, the adjusted prevalence is uneven among racial and ethnic groups, ranging from 1.0% of whites to 0.2% of blacks and 0.3% of Hispanics.
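To see how a prevalence estimate like 0.8% translates into the "more than 1.5 million" figure, the rate and its confidence interval can be multiplied by a population denominator. The population figure below is an illustrative assumption (NHANES samples Americans aged 6 years or older), not a number taken from the study itself:

```python
# Extrapolate a prevalence estimate and its 95% CI to a population.
# ASSUMPTION: ~290 million U.S. residents aged 6+ circa 2009-2012;
# this denominator is illustrative only, not from the study.

population = 290_000_000
prevalence, ci_low, ci_high = 0.008, 0.005, 0.010  # 0.8% (0.5%-1.0%)

cases = prevalence * population
cases_low = ci_low * population
cases_high = ci_high * population

print(f"Estimated cases: {cases / 1e6:.1f} million "
      f"(95% CI {cases_low / 1e6:.1f}-{cases_high / 1e6:.1f} million)")
```

Under this assumed denominator the central estimate lands well above 1.5 million, and even the lower confidence bound approaches it, which is consistent with the article's phrasing.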

For Ritu Verma, MBChB, director of the Center for Celiac Disease and the Lustgarten Endowed Chair for Clinical Care of GI Motility Disorders at the Children’s Hospital of Philadelphia in Pennsylvania, those numbers and the consequences of undiagnosed CD lead to a straightforward conclusion: “The biggest plea I have for pediatricians is to just do the panel,” she told Medscape Medical News, referring to an antibody test panel that can help to determine whether a child likely has the disease.

However, other pediatric gastroenterologists, including Saeed Mohammad, MD, from Northwestern University Feinberg School of Medicine in Chicago, Illinois, feel that screening the entire population is just not feasible, and the cost of increased screening may not be justified.
“[CD] is more common in certain ethnic groups and less common in others,” he told Medscape Medical News. “Unless the cost of genetic testing drastically decreases, I believe the current system of screening children with risk factors for [CD] such as diabetes, Down syndrome, or autoimmune diseases, as well as testing for celiac-specific antibodies in children whose growth is faltering [and] those who have diarrhea or chronic abdominal pain, is the most cost-effective option with the greatest sensitivity.”

Moreover, he notes that children younger than 2 years should not be tested without high clinical suspicion, “and [diagnosis] should always be confirmed with endoscopy and biopsy,” he continued.

When and How to Test?

Classic symptoms of CD include prolonged abdominal pain, poor growth, chronic diarrhea, and abdominal distention. When children present with these symptoms, physicians may consider testing for CD, but unfortunately, there is no standardized celiac panel.

The Children’s Hospital of Philadelphia, where Dr Verma practices, recommends a screening panel that includes total immunoglobulin A (IgA), anti-tissue transglutaminase (anti-tTG) antibody, and antiendomysial antibody. Deamidated gliadin peptide testing can be ordered in conjunction with anti-tTG. In addition, for children younger than 3 years, it may be advisable to test for antigliadin antibodies, she says.

Dr Verma would prefer that all children be screened for CD, but if that is not possible, then she encourages pediatricians to screen when they have even the slightest suspicion that the child might have CD.

She notes that many children who are diagnosed with CD in her clinic do not have a classic presentation. Thus, Dr Verma actively educates primary care physicians about CD and encourages them to consider the possibility of CD if they are treating a patient who does not seem to get better. The primary care physician can then order the simple celiac blood test.

Meanwhile, Boston Children’s Hospital in Massachusetts focuses its screening efforts on high-risk children: those with classic symptoms of CD, such as abdominal pain and chronic diarrhea, as well as those at known genetic risk for CD and children with type 1 diabetes.

Boston Children’s Hospital uses a celiac panel similar to the one used by the Children’s Hospital of Philadelphia. “We recommend that ‘high-risk’ children be screened with serologic celiac tests (specifically tissue transglutaminase IgA and a total IgA), even if they have no identifiable symptoms,” Dascha C. Weir, MD, associate director of the Celiac Program at Boston Children’s Hospital, told Medscape Medical News.

A child who is strongly positive for all of these antibody tests likely has CD.

Dr Verma acknowledges that the tests can yield false-negative results, giving parents a false sense of security, as well as false-positive results. This is why the definitive test for CD remains upper endoscopy with biopsy.

Dr Verma explained that physicians also need to take a dietary history. Some children do not eat a lot of gluten between the ages of 2 and 3 years, and therefore may not test positive on the celiac panel. Unfortunately, most pediatricians do not have the time to take a complete dietary history.

Moreover, the United States has a heterogeneous population, and some experts suggest that a one-size-fits-all screening approach may not be practical. That said, “care providers should have a low threshold to order celiac testing for children with gastrointestinal symptoms or nongastrointestinal symptoms,” Dr Weir emphasized.

Classic Symptoms Inadequate Cues

Many primary care physicians have the false impression that children who lack classic symptoms of CD should not be screened for the disease. However, relying on symptoms to trigger CD testing could miss a substantial proportion of affected children, according to recently published data from Sweden.

Daniel Agardh, MD, PhD, from Lund University in Sweden, and colleagues published their subanalysis of data from The Environmental Determinants of Diabetes in the Young (TEDDY) study online March 2 in Pediatrics. Whereas the primary goal of TEDDY is to identify genetic and environmental factors that contribute to the development of type 1 diabetes, Dr Agardh and colleagues used the birth cohort data to compare children who were positive for anti-tTGA with those who were negative for anti-tTGA.

They found that, compared with children who were negative for anti-tTGA, children who were positive were more likely to be symptomatic at 2 and 3 years of age, but not at 4 years of age. In addition, the researchers report that anti-tTGA levels correlated with the severity of mucosal lesions in both symptomatic and asymptomatic children, and that some asymptomatic children had markedly elevated anti-tTGA levels. These children were diagnosed with CD because they were enrolled in the study but likely would have been missed in standard practice.

Results from the study have led the researchers to propose a two-tiered screening system in which all newborns are first screened for genetic susceptibility, such as HLA type, and those who test positive are then screened for celiac-specific antibodies. This protocol would be a departure from that used in the United States and would include much more comprehensive genetic screening.

“With awareness and availability of gluten-free foods increasingly entrenched within the mainstream of the North American lifestyle, the burden lies on the identification of all children who may benefit from treatment. The prospective data from TEDDY effectively demonstrate the utility of 2-tiered screening and constitute a step forward in devising a population-screening strategy that best offers the appropriate treatment at a stage in life where it may yield the most lifelong benefit,” writes Richard J. Noel, MD, PhD, from Duke University Medical Center in Durham, North Carolina, in an accompanying commentary in Pediatrics.

Dr Verma believes the study from Sweden advances the conversation, but does not take the concept of screening far enough. “I am glad that at least someone is thinking about screening people early,” Dr Verma said. “[but] if you are between 2 and 3 years of age, you should just have a celiac panel done.” She emphasized again that there are many atypical symptoms of CD, especially in children.

Genetic Risk Factors

The two-tiered screening approach may not yet be realistic for the United States.

Currently, children in the United States may be targeted for screening because they have a family history of CD or have Down syndrome, Turner syndrome, or Williams syndrome. The 2004 consensus guidelines from the National Institutes of Health indicate that patients with type 1 diabetes should also be screened for CD.

Although individuals who are homozygous for HLA-DR3-DQ2 are at highest genetic risk for CD and tend to develop CD very early in life, HLA tests are not routinely performed in the United States, and therefore HLA status rarely triggers a CD test. According to Dr Weir, this is an evolving issue.

“Screening high-risk groups (such as family members or patients with predisposing conditions) for [CD] with serologic testing (tissue transglutaminase [immunoglobulin G] and total immunoglobulin A) is widely accepted. However, in the USA, the use of HLA typing in screening children to identify who is at higher risk of [CD] has not been fully evaluated and is not currently recommended. However, as our understanding of the HLA typing and [CD] risk deepens, and as HLA testing becomes cheaper and more readily available, this may change,” Dr Weir noted.

The genetic basis of CD does mean that a diagnosis of CD has implications for the entire family. If a child tests positive for CD, the family (siblings, parents, and grandparents) should be tested for CD. In addition, siblings should be periodically retested to look for the development of CD.

Poor Growth May Point to CD

Another newly proposed screening strategy suggests children should be tested for CD if they have poor growth. Although CD does not always affect growth, most children who are diagnosed experience a decrease or slowing of growth before diagnosis. A new study suggests these children could be diagnosed earlier if they were being followed in a well-established growth-monitoring program.

Antti Saari, MD, from the University of Eastern Finland in Kuopio, and colleagues published the results of their longitudinal retrospective study on the relationship between growth and CD online March 2 and in the March issue of JAMA Pediatrics. They propose that screening for CD would include the use of several growth-monitoring parameters in combination. Furthermore, the parameters should be integrated into a computerized screening algorithm associated with an electronic health record system.

“I have sort of mixed feelings about [the study]…. I am personally of the belief that there should be mass screening of people,” stated Dr Verma. She explained that because not all of the children with CD have growth issues, the approach proposed by Dr Saari and colleagues would miss many children with CD.

Atypical Symptoms

Dr Verma described some of the atypical symptoms of CD. For example, anemia may be a sign of CD. As a consequence, Dr Verma recommends that children with unexplained anemia be tested for CD.

She added that chronic headaches, seizures, chronic constipation, elevated liver enzymes, chronic pancreatitis, alopecia, and tiredness can all be signs and symptoms of CD. Moreover, many patients who are originally diagnosed with irritable bowel syndrome are ultimately diagnosed with CD.

Parents should be counseled to have the blood work done before experimenting with a gluten-free diet. A gluten-free diet will eventually lower the levels of antibodies, thereby making the celiac panel unreliable. Moreover, physicians should keep in mind that children who consume low levels of gluten or consume occasional gluten may have lower antibody levels, even though they have CD.

A patient who tests positive on the celiac panel should be referred to a gastroenterologist. The gastroenterologist can decide whether endoscopy is required and can counsel the patient on the importance of a gluten-free diet. Patients must understand that the gluten-free diet is for life, and they need encouragement to adopt a gluten-free lifestyle.

Pediatric Arthritis: Look to the Gut

Alterations in the intestinal microbiota have been identified in children with enthesitis-related arthritis (ERA), suggesting the possibility that the microbiome may play a triggering role in the disease, researchers reported.

Among children with ERA, which is a subtype of juvenile idiopathic arthritis considered to be the pediatric equivalent of axial spondyloarthritis in adults, there were lower levels of organisms of the Faecalibacterium genus compared with controls (5.2% versus 11%, P=0.005), according to Matthew L. Stoll, MD, and colleagues from the University of Alabama at Birmingham.

Specifically, F. prausnitzii constituted 10% of the microbiota of healthy controls compared with only 3.8% in children with ERA (P=0.008).

This decrease has previously been reported among patients with inflammatory bowel disease, and may be relevant because of the potential for this organism to exert anti-inflammatory effects through production of butyrate, the researchers explained online in Arthritis Research & Therapy.

“The human intestine is colonized with an estimated 100 trillion bacteria, a process that begins shortly after birth. It is becoming increasingly clear that these bacteria play important roles in immune function as well as in a variety of autoimmune and inflammatory disorders,” they wrote.

Next-generation sequencing has permitted the full assessment of intestinal microbiota, with distinct alterations identified in a number of inflammatory diseases such as rheumatoid arthritis and celiac disease. This has sparked interest in the microbiome as an environmental trigger for the diseases.

Furthermore, “intestinal bacteria need not be present in abnormal quantities to trigger arthritis; it is also possible that a pathologic immune response to normal resident microbes may result in disease,” Stoll and colleagues wrote.

For instance, many patients with Crohn’s disease develop antibodies against intestinal microbial flagellins, and the presence of those antibodies is associated with stricturing disease.

To explore the possibility of abnormalities in commensal quantities and immune responses in children with ERA, the researchers recruited 25 patients and 13 controls.

They performed sequencing of 16S ribosomal DNA on fecal samples from all participants, and used ELISAs to measure IgG and IgA antibodies against bacteria in order to evaluate antigenic reactivity.

Participants’ mean age was 13 years, and median disease duration was 2.4 years. Among the 25 ERA patients, 11 were female and two had concomitant inflammatory bowel disease. Twenty-one were receiving immunosuppressive therapy.

Sequencing analysis of fecal samples of all 38 children showed that there were notable differences in three taxa other than Faecalibacterium.

As with Faecalibacterium, the ERA patients had lower levels of Lachnospiraceae (7% versus 12%, P=0.020).

However, levels of Bifidobacterium (primarily B. adolescentis) were higher in the ERA group (1.8% versus 0%, P=0.032), as were Bacteroides (21% versus 11%).

“Thus, at the genus and species levels, our data demonstrate some statistically significant differences between patients and controls,” Stoll and colleagues wrote.

Further analysis divided the ERA patients into two clusters. The eight patients in cluster 1 had higher levels of Bacteroides than those in cluster 2 (32% versus 13%, P<0.001), but similar levels of F. prausnitzii (4.7% versus 3.2%, P=0.897).

None of the patients in cluster 1 had high levels of Akkermansia, but 41% of those in cluster 2 had levels of 2% or greater.

“A novel finding of this study is that both Bacteroides species and Akkermansia muciniphila were found to be associated with disease states in largely non-overlapping subsets of ERA patients,” the researchers noted.

Then, to examine the possible influence of immunoreactivity, the researchers conducted ELISA tests against B. fragilis and F. prausnitzii in 31 of the participants, and found that patients in cluster 1, who had higher levels of Bacteroides, also exhibited greater IgA and IgG reactivity against B. fragilis.

Abnormal immune responses to Bacteroides also have been observed in adult patients with ankylosing spondylitis, they pointed out.

The presence of increased levels of A. muciniphila also was of interest. This is a recently identified bacterium that thrives on mucin in the intestine, “suggesting that high quantities could potentially lead to a defect in the intestinal wall barrier function,” they noted.

In addition, the IgA responses to B. fragilis may reflect invasion of the intestinal wall and a subsequent systemic immune response.

The study findings represent “the first comprehensive evaluation of the microbiota in pediatric or adult [spondyloarthritis], confirming a potential role for insufficient protective F. prausnitzii in the pathogenesis of ERA and introducing potential novel bacteria as associative agents,” Stoll and colleagues observed.

“These findings suggest that altering the gut microbiota may be beneficial in children with ERA,” they concluded.

Limitations of the study included its small sample size and a lack of information about diet.