Magnesium: The Missing Link in Mental Health?

by James Greenblatt, MD

Chief Medical Officer at Walden Behavioral Care in Waltham, MA
Assistant Clinical Professor of Psychiatry at Tufts University School of Medicine and Dartmouth College Geisel School of Medicine

Magnesium is a cofactor in more than 325 enzymatic reactions—in the synthesis of DNA and neurotransmitters; in the bones, heart and brain; in every cell of the body. Unfortunately, a deficiency of this crucial mineral is the most common nutritional deficiency I see in my practice as an integrative psychiatrist. Fortunately, supplementation with magnesium is the most impactful integrative treatment I use, particularly in depression and attention deficit hyperactivity disorder (ADHD).

Why is magnesium deficiency so common, and why is restoring the mineral so essential to mental and emotional well-being and behavioral balance? The rest of this article addresses those two questions, and presents aspects of my therapeutic approach.

Magnesium Deficiency

Much of the population is deficient in magnesium—found abundantly in whole grains, beans and legumes, nuts and seeds, and leafy greens, as well as cocoa and molasses—for several reasons.

Soil depletion. Intensive agricultural practices rob the soil of magnesium and don’t replace it. As a result, many core food crops—such as whole grains—are low in magnesium. A recent paper in Crop Journal put it this way: Magnesium’s “importance as a macronutrient ion has been overlooked in recent decades by botanists and agriculturists, who did not regard Mg deficiency in plants as a severe health problem. However, recent studies have shown, surprisingly, that Mg contents in historical cereal seeds have markedly declined over time, and two thirds of people surveyed in developed countries received less than their minimum daily Mg requirement.” [1]  

Food processing. Magnesium is stripped from foods during food processing. For example, refined grains—without magnesium-rich germ and bran—have only 16% of the magnesium of whole grains. [2]

Stress. Physical and emotional stress—a constant reality in our 24/7 society—drain the body of magnesium. In fact, studies show inverse relationships between serum cortisol and magnesium—the higher the magnesium, the lower the cortisol. Stress robs the body of magnesium—but the body must have magnesium to respond effectively to stress.

Other factors. Many medications—such as medications for ADHD—deplete magnesium. So does the intake of alcohol, caffeine and soft drinks.

The result: In 1900, the average intake of magnesium was 475 to 500 mg daily. Today, it's 175 to 225 mg daily. As a result, only one-third of adult Americans get the daily RDA for magnesium: 320 mg for women and 420 mg for men. [3] (And many researchers consider the RDA itself inadequate.) That magnesium deficit causes deficits in health. Magnesium deficiency has been cited as contributing to atherosclerosis, hypertension, type 2 diabetes, obesity, osteoporosis and certain types of cancer. [4] But detecting that deficiency through laboratory testing is difficult, because most magnesium in the body is stored in bone and other tissues. Only 1% is in the blood, so plasma levels are not a reliable indicator; a "normal" magnesium blood level may exist despite a serious magnesium deficit. An effective therapeutic strategy: Assume a deficit is present, and prescribe the mineral along with other appropriate medical and natural treatments. That's particularly true if the patient has symptoms such as anxiety, irritability, insomnia and constipation, all of which can indicate a magnesium deficiency.

The Mind Mineral

Some of the highest levels of magnesium in the body are found in the central nervous system, with studies dating back to the 1920s showing how crucial magnesium is for a balanced brain…

It’s known, for example, that magnesium interacts with GABA receptors, supporting the calming actions of this neurotransmitter. Magnesium also keeps glutamate—an excitatory neurotransmitter—within healthy limits. Patients with higher magnesium levels also have healthy amounts of serotonin in the cerebrospinal fluid. And the synthesis of dopamine requires magnesium.

In summary, the body needs magnesium to create neurotransmitters (biosynthesis) and for those neurotransmitters to actually transmit. Magnesium also acts at both the pituitary and adrenal levels. In the pituitary gland, it modulates the release of ACTH, a hormone that travels to the adrenal glands, stimulating cortisol release. In the adrenal gland, it maintains a healthy response to ACTH, keeping cortisol release within a normal range. As a result, magnesium is a must for maintaining the homeostasis of the HPA axis.

Given all these key mechanisms of action, it's not surprising that a lack of the mineral can produce psychiatric and other types of problems. The patient may have difficulty with memory and concentration; depression, apathy and fatigue; emotional lability; irritability, nervousness and anxiety; insomnia; migraine headaches; constipation; PMS; dysmenorrhea; fibromyalgia; autism; or ADHD.

Fortunately, studies show that magnesium repletion—restoring normal levels of the mineral—produces positive changes in mood and cognition, healthy eating behavior, healthy stress responses, better quality of sleep, and better efficacy of other modalities, such as medications. Let's look at two areas in which magnesium supplementation is particularly effective: depression and ADHD.

Depression

A cross-sectional, population-based data set—the National Health and Nutrition Examination Survey—was used to explore the relationship of magnesium intake and depression in nearly 9,000 US adults. Researchers found a significant association between very low magnesium intake and depression, especially in younger adults. [5] And in a recent meta-analysis of 11 studies on magnesium and depression, people with the lowest intake of magnesium were 81% more likely to be depressed than those with the highest intake. [6] In a clinical study of 23 senior citizens with depression, low blood levels of magnesium and type 2 diabetes, magnesium was compared to the standard antidepressant medication imipramine (Tofranil)—one group received 450 mg of magnesium daily and one group received 50 mg of imipramine. After 12 weeks, depression ratings were equally improved in both groups. [7] In my practice, I nearly always prescribe magnesium to a patient with diagnosed depression. You can read more about the integrative approach to depression in Integrative Therapies for Depression: Redefining Models for Assessment, Treatment and Prevention (CRC Press), which I co-edited, and in Breakthrough Depression Solution: Mastering Your Mood with Nutrition, Diet & Supplementation (Sunrise River Press, 2nd Edition).

Attention Deficit Hyperactivity Disorder

Magnesium deficiency afflicts 90% of all people with ADHD and triggers symptoms like restlessness, poor focus, irritability, sleep problems, and anxiety. These symptoms can lessen or vanish one month after supplementation starts. Magnesium can also prevent or reverse ADHD drug side effects. That's why all of my ADHD patients get a prescription for magnesium. For adolescents, I typically prescribe 200 mg, twice daily. For children 10 to 12, 100 mg, twice daily. For children 6 to 9, 50 mg, twice daily. Typically, I recommend magnesium glycinate, using a powdered product. I describe my entire approach to magnesium and ADHD (and to the disorder's overall integrative treatment) in my book Finally Focused: The Breakthrough Natural Treatment Plan for ADHD That Restores Attention, Minimizes Hyperactivity, and Helps Eliminate Drug Side Effects (forthcoming from Harmony Books in May 2017).

Dosage and Form

I have found that 125 to 300 mg of magnesium glycinate at meals and at bedtime (four times daily) produces clinically significant benefits in mood. (This form of magnesium is gentle on the digestive tract.) 200 to 300 mg of magnesium glycinate or citrate before bed supports sleep onset and duration through the night. You can also find magnesium in powder or liquid form, which are effective alternatives to capsules, particularly for children with ADHD. Ways to increase the bioavailability of magnesium include: supplementing with vitamin D3, which increases cellular uptake of the mineral (vitamin B6 also helps magnesium accumulate in cells); taking the mineral in divided doses instead of a single daily dose; taking it with carbohydrates, which improves absorption from the intestine; and taking an organic form, such as glycinate or citrate, which improves absorption by protecting the mineral from antagonists in the digestive tract. Avoid giving magnesium in enteric-coated capsules, which decreases absorption in the intestine.

Magnesium oxide is poorly absorbed and tends to cause loose stools. Magnesium-l-threonate has been shown to readily cross the blood-brain barrier, and animal studies show that it supports learning ability, short- and long-term memory, and brain function. I don't typically prescribe it, however, because of its higher cost and the clinical effectiveness of other forms. The therapeutic response to magnesium typically takes several weeks, as levels gradually increase in the body.
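For readers who want the dosing scheme above gathered in one place, here is a minimal Python sketch that simply encodes the magnesium glycinate schedule stated in this article; the function name, structure, and the assumed adolescent age range are hypothetical illustrations, not an independent recommendation.

    # A minimal sketch (not a clinical tool): the milligram figures below are
    # the magnesium glycinate amounts stated in this article; the function
    # name and the 13-17 "adolescent" range are illustrative assumptions.
    def magnesium_schedule(age_years):
        """Return the per-dose amount (mg) and doses per day described above."""
        if 6 <= age_years <= 9:          # children 6 to 9: 50 mg twice daily
            return {"mg_per_dose": 50, "doses_per_day": 2}
        if 10 <= age_years <= 12:        # children 10 to 12: 100 mg twice daily
            return {"mg_per_dose": 100, "doses_per_day": 2}
        if 13 <= age_years <= 17:        # adolescents (assumed 13-17): 200 mg twice daily
            return {"mg_per_dose": 200, "doses_per_day": 2}
        # adults (mood support): 125 to 300 mg with meals and at bedtime (4x daily)
        return {"mg_per_dose": (125, 300), "doses_per_day": 4}

    for age in (7, 11, 15, 40):
        print(age, magnesium_schedule(age))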

CITATIONS

[1] Guo W., et al. Magnesium deficiency in plants: An urgent problem. The Crop Journal, Volume 4, Issue 2, April 2016, Pages 83-91.

[2] http://www.ancient-minerals.com/magnesium-sources/dietary/

[3] http://www.prnewswire.com/news-releases/majority-of-americans-not-getting-enough-magnesium-71354497.html

[4] Volpe, SL. Magnesium in Disease Prevention and Overall Health. Advances in Nutrition, 2013 May; 4(3): 378S-383S.

[5] Tarleton EK, et al. Magnesium Intake and Depression in Adults. Journal of the American Board of Family Medicine, 2015 Mar-Apr;28(2):249-56.

[6] Li B, et al. Dietary magnesium and calcium intake and risk of depression in the general population: A meta-analysis. Australian and New Zealand Journal of Psychiatry, 2016 Nov 1. [Epub ahead of print].

[7] Barragan-Rodriguez L, et al. Efficacy and safety of oral magnesium supplementation in the treatment of depression in the elderly with type 2 diabetes: a randomized, equivalent trial. Magnesium Research, 2008 Dec;21(4):218-23.

Examining the Gut-Brain Connection and Its Implications for Trichotillomania Treatment

James Greenblatt, MD

Trichotillomania (TTM) is a psychiatric disorder that causes people to repeatedly pull out their hair, most often from the scalp. It affects about 1-2% of adults and adolescents, and it is ten times more prevalent in women than in men (APA, 2013). The name is Greek in origin: thrix (hair), tillein (to pull), and mania (madness). The first allusion to TTM may have come from the Greek philosopher Epictetus in 101 AD: “Indeed I think that the men who pluck out their hairs do what they do without knowing what they do…Much from his head he tore his rooted hair. And what does he say himself? 'I am perplexed,' he says, 'and disturbed I am,' and 'my heart out of my bosom is leaping.'" (Epictetus, 1891). The first medical case was described in 1889 by French dermatologist Francois Henri Hallopeau, who reported a young man who pulled out his hair in tufts (Parakh & Srivastava, 2010).

The American Psychiatric Association first recognized TTM as a mental disorder in 1987. The DSM-5 classifies TTM under obsessive-compulsive and related disorders, a change from the DSM-IV, where it was classified as an impulse-control disorder (APA, 2013). The cause is complex and unclear. Those with TTM often suffer from other psychiatric conditions such as major depression, generalized anxiety disorder, OCD, eating disorders, substance abuse, and excoriation (skin-picking) disorder (Parakh & Srivastava, 2010).

The last 20 years have begun to shed some light on the disorder with an increase in clinical and research attention; however, there is not yet a consensus on the best treatment. Traditional treatments primarily involve cognitive-behavioral therapy, including habit reversal therapy. Cognitive-behavioral therapy identifies factors triggering hair-pulling behavior and then teaches skills to interrupt the behavior. This includes keeping records of hair pulling, becoming aware of emotional states or environmental cues that trigger the behavior, or bandaging fingers to interfere with hair pulling. Habit reversal therapy is currently the most effective treatment in use, although treatment varies on an individual basis and relapse is common. Medications used to treat TTM include selective serotonin reuptake inhibitors (SSRIs) such as fluoxetine and paroxetine, as well as clomipramine and olanzapine. SSRIs are currently the most commonly used treatment in children and adults (Bruce et al., 2005).

Unfortunately, the effectiveness of these traditional treatments is mixed. One meta-analysis concluded that there was no evidence that SSRIs are more efficacious than placebo in the treatment of trichotillomania (Bloch et al., 2007). According to a Trichotillomania Impact Project survey, treatments for TTM have been successful with only 15% of adult patients and 17% of pediatric patients (Woods et al., 2006). Because of this lack of effective options, individuals struggling with TTM are seeking alternative treatments, such as probiotics, N-acetylcysteine, and inositol, that may be more successful than traditional forms of treatment.

Probiotics are beneficial bacteria that are introduced into the gastrointestinal tract. Interestingly, gut bacteria are able to synthesize the same neurotransmitters that are found in the brain. These gut neurotransmitters have the same structure and are produced via the same biosynthetic pathway as those in the brain. Gut bacteria are able to communicate with the brain through the vagus nerve, a phenomenon known as the “gut-brain connection.” Researchers have found that probiotics can improve many aspects of psychological health including depression and anxiety by modifying the gut microbiome. Probiotics can also directly modulate the immune system (Lyte, 2011).

Success stories attest to the ability of probiotics to offer relief to individuals suffering from TTM. A year after my article “Gut feelings: the future of psychiatry may be inside your stomach” was published on The Verge, I was contacted via email by a gentleman who shared his incredible story of how he was able to cure himself of trichotillomania by using probiotics after reading the article. Here is his story:

“I am a middle aged Caucasian male, and my first history of chronic hair pulling was when I had a very brief episode when I was in the 7th grade. First off, let me explain what I tried to do in the past, all unsuccessfully, to find a solution. I tried pure willpower. I tried discussing my hair pulling with my family doctor. I went to a psychiatrist, one of the most talented psychiatrists in the field, and he told me that there was nothing that psychiatry could do for me. I just happened to stumble onto an article on the web about a psychiatrist that was successfully treating some of his patients with various Obsessive Compulsive disorder symptoms with Probiotics! I started taking two capsules of 30 billion CFU capsules a day on the day after Thanksgiving, 2013. In the last week of January of 2014 I all of a sudden realized, one day, “Hey, wait a minute, I have not pulled my hair for the past 2 weeks now”. It seems I had stopped hair pulling in mid-January and didn’t even notice it until two weeks had gone by. I was hopeful, but skeptical at that point. Over the past 15 years, I have never, ever, had more than a 1 day period of time that I did not pull out my own hair. I continued taking the probiotics every day. As of the day I am writing this, today, July 17, 2014, I have not pulled even one hair since mid-January. Not only have I been symptom free, but I never had to apply any will power or focus on stopping the hair pulling to help me stop. What happened is that the urges did not need to be fought off, they simply dissipated by themselves and have completely disappeared, all by themselves. I did no counseling sessions, no coaching sessions, no group therapy, no psychiatric medications, no psychological treatments of any kind, nothing except the probiotics.”

You can read his full story on https://howicuredmyhairpulling.wordpress.com

N-acetylcysteine (NAC) also shows promise for reducing compulsive behavior. NAC is a derivative of the amino acid cysteine and is converted in the body to the powerful antioxidant glutathione. In a double-blind trial, 50 adults with trichotillomania were randomized to NAC (1,200-2,400 mg/d) or placebo for 12 weeks. Those receiving NAC significantly improved on measures of urges to pull hair, actual amount of pulling, perceived control over the behavior, and distress associated with hair pulling. Of those taking NAC, 56% were “much” or “very much” improved compared with 16% of those taking placebo (Grant et al., 2009).

There is also emerging evidence for inositol as a treatment for TTM. Inositol is a sugar produced by the human body from glucose. The sugar is found in many foods, particularly fruits such as cantaloupe and oranges. Inositol is a signaling molecule involved in many important functions such as nerve guidance and the breakdown of fats. In the past, inositol has been used effectively for depression, anxiety, and OCD. Two case studies have been documented of young women with TTM who benefited from 18 grams per day of inositol (Seedat et al., 2001). Inositol is thought to help regulate serotonin levels, which is particularly relevant for disorders including TTM, OCD, depression, and anxiety that may be caused by low levels of serotonin.

Indeed, disruptions in the gastrointestinal tract or gut microbiota can manifest as physiological and psychological symptoms. Fortunately, several animal studies have found that the introduction of probiotics was effective at modulating the gut microbiota. While the complex connection between the gut and brain continues to be examined, the available research suggests that probiotics may be a promising intervention for several illnesses including depression, anxiety, and compulsive disorders.

REFERENCES

American Psychiatric Association (APA). (2013). Diagnostic and statistical manual of mental disorders (5th ed.). Arlington, VA: American Psychiatric Publishing.

Bloch, M. H., Landeros-Weisenberger, A., Dombrowski, P., Kelmendi, B., Wegner, R., Nudel, J., & ... Coric, V. (2007). Systematic review: Pharmacological and behavioral treatment for trichotillomania. Biological Psychiatry, 62(8), 839-846.

Bruce, T. O., Barwick, L. W., & Wright, H. H. (2005). Diagnosis and management of trichotillomania in children and adolescents. Pediatric Drugs, 7(6), 365.

Epictetus, & Long, G. (Trans.). (1891). The discourses of Epictetus; with the Encheiridion and fragments. Reprinted from the translation of George Long. London: G. Bell and Sons (Chiswick Press).

Grant, J. E., Odlaug, B. L., & Kim, S. W. (2009). N-acetylcysteine, a glutamate modulator, in the treatment of trichotillomania: A double-blind, placebo-controlled study. Archives of General Psychiatry, 66(7), 756-763.

Lyte, M. (2011). Probiotics function mechanistically as delivery vehicles for neuroactive compounds: Microbial endocrinology in the design and use of probiotics. Bioessays, 33(8), 574-581.

Parakh, P., & Srivastava, M. (2010). The Many Faces of Trichotillomania. International Journal of Trichology, 2(1), 50–52.

Seedat, S., Stein, D. J., & Harvey, B. H. (2001). Inositol in the treatment of trichotillomania and compulsive skin picking. The Journal Of Clinical Psychiatry, 62(1), 60-61.

Woods, D. W., Flessner, C. A., Franklin, M. E., Keuthen, N. J., Goodwin, R. D., Stein, D. J., & Walther, M. R. (2006). The trichotillomania impact project (TIP): Exploring phenomenology, functional impairment, and treatment utilization. Journal Of Clinical Psychiatry, 67(12), 1877-1888.

The Future of Depression Treatment (Audio Interview)

JAMES GREENBLATT, MD

Guest: Dr. James Greenblatt
Presenter: Neal Howard
Guest Bio: James M. Greenblatt, MD, is a pioneer in the field of integrative medicine and one of the founders of Integrative Medicine for Mental Health (IMMH). He currently serves as the chief medical officer and vice president of medical services at Walden Behavioral Care in Waltham, Massachusetts. Dr. Greenblatt is also an assistant clinical professor in the Department of Psychiatry at Tufts University School of Medicine in Boston.

Segment overview: Dr. James Greenblatt, MD, author of “Breakthrough Depression Solution: Mastering Your Mood with Nutrition, Diet & Supplementation”, talks about the treatment of depression in the future and how it is not a one-size-fits-all prescription.

Originally published to Health Professional Radio

The Effect of Vitamin D on Psychosis and Schizophrenia

JAMES GREENBLATT, MD

Vitamin D deficiency has been linked to a wide range of major psychiatric illnesses and is an emerging area of interest for researchers. From my experience working with individuals with psychosis and schizophrenia in both inpatient and outpatient settings, I have often found low vitamin D levels in this patient population, with the severity of symptoms inversely correlated to serum vitamin D levels. Most recently, laboratory tests of individuals with schizophrenia, psychosis, selective mutism, and bipolar disorders revealed consistent serum vitamin D levels below 20 ng/ml. As vitamin D levels normalized, symptoms improved. While the mechanism is unclear, recent research suggests that vitamin D’s action on the regulation of inflammatory and immunological processes likely affects the manifestation of clinical symptoms and treatment response in schizophrenic patients (Chiang, Natarajan, & Fan, 2016).

The link between vitamin D deficiency and the development of schizophrenia has been researched among patients of all ages around the globe. One meta-analysis reviewed 19 studies published between 1988 and 2013 and found a strong association between vitamin D deficiency and schizophrenia. Of the 2,804 participants from these studies, over 65% of the participants with schizophrenia were vitamin D deficient. Vitamin D deficient participants were 2.16 times more likely to have schizophrenia than vitamin D sufficient participants (Valipour, Saneei, & Esmaillzadeh, 2014).

The risk of schizophrenia and vitamin D status vary with season of birth, latitude, and skin pigmentation. The UV rays required to make vitamin D are reduced in the months most associated with an increase in the birth of individuals who later develop schizophrenia. One review, including a total of 437,710 individuals with schizophrenia, found that most of these individuals were born in January and February. These newborns were thus exposed to lower levels of UV rays in their prenatal and perinatal periods. An increased rate of schizophrenia is also seen at higher latitudes, especially among immigrants. This may again be related to UV availability and subsequent vitamin D status. At higher latitudes, a dark-skinned individual will also have a more pronounced reduction in vitamin D than a lighter-skinned individual, whose lower melanin content allows the skin to absorb UV rays more effectively. It is estimated that individuals with darker skin at higher latitudes are more likely to develop schizophrenia than the general population (Chiang et al., 2016).

Swedish researchers reviewed medical charts at a psychiatric outpatient department to identify possible predictors of vitamin D deficiency. Over 85% of the 117 psychiatric patients had suboptimal vitamin D levels. Those with schizophrenia and autism had the lowest levels. Middle Eastern, Mediterranean, South-East Asian, or African ethnic origin was a strong predictor of low vitamin D. The patients who received vitamin D supplements to correct their deficiencies achieved considerable improvement of psychosis and depression symptoms (Humble et al., 2010).

Vitamin D concentrations were measured in 50 schizophrenia patients in Israel aged 19-65. Lower mean vitamin D concentrations were detected among patients with schizophrenia (15 ng/ml) compared to controls (20 ng/ml) after adjusting for the impact of sun exposure and supplements (Itzhaky et al., 2012). Likewise, 92% of 102 adult psychiatric inpatients in New Zealand also had suboptimal vitamin D levels, and non-European patients were more than twice as likely as Europeans to have severely deficient levels below 10 ng/ml (Menkes et al., 2012).

In a prospective birth cohort of 3,182 children in England, researchers measured vitamin D levels at age 9.8 years and assessed psychotic experiences at age 12.8 years. Vitamin D concentrations during childhood were associated with psychotic experiences during early adolescence. If psychotic experiences are related to the development of schizophrenia, this supports a possible protective association of higher vitamin D concentrations with schizophrenia (Tolppanen et al., 2012).

Vitamin D deficiency is associated with more severe symptoms. Cross-sectional analyses were carried out on mentally ill adolescents aged 12-18 who required either inpatient or partial hospitalization. Of the 104 patients evaluated, 72% had insufficient vitamin D levels. Vitamin D status was related to mental illness severity. Those with vitamin D deficiency were 3.5 times more likely to have hallucinations, paranoia, or delusions (Gracious et al., 2012). A second study supports this finding. Vitamin D was analyzed from 20 patients with first-episode schizophrenia. Greater severity of negative symptoms (blunted affect, emotional withdrawal, poor rapport, passive-apathetic social withdrawal, difficulty in abstract thinking, and stereotyped thinking) was strongly correlated with lower vitamin D status. Lower vitamin D levels were also associated with more severe overall cognitive deficits (Graham et al., 2015).

McGrath et al. (2010) investigated the relationship between neonatal vitamin D status and later risk of schizophrenia. They identified 424 cases with schizophrenia from the Danish Psychiatric Central Register and analyzed their neonatal dried blood spots. Not surprisingly they found a significant seasonal variation in vitamin D status and significantly lower levels of vitamin D in the offspring of mothers who immigrated to Denmark. They also found that those with lower neonatal concentrations of vitamin D had an increased risk of schizophrenia. The researchers estimated that if all these neonates had optimal vitamin D levels, over 40% of schizophrenia cases could have been averted.

The same group of researchers also discovered that taking vitamin D supplements during the first year of life is associated with a reduced risk of schizophrenia in males. They looked at a Finnish birth cohort and collected data about the frequency and dose of vitamin D supplementation during infancy. Males who regularly took vitamin D supplements had an 88% decreased risk of schizophrenia compared to those who never took supplements (McGrath et al., 2004).

The mechanism underlying this nutrient-illness relationship can only be speculated upon. Those with schizophrenia commonly have elevated markers of inflammation. Cells that are low in vitamin D produce high levels of inflammatory cytokines while cells with adequate vitamin D release significantly less of these cytokines. Thus there may be an anti-inflammatory mechanism (Chiang et al., 2016). Vitamin D regulates the transcription of many genes involved in pathways implicated in schizophrenia, including genes involved in synaptic plasticity, neuronal development, and protection against oxidative stress (Graham et al., 2015). Animal studies show that vitamin D deficiency in the gestational period affects dopamine metabolism and alters the dopamine system in the developing brain. Dopamine has been implicated in the pathogenesis of schizophrenia. Vitamin D deficiency during the gestational period can also affect brain structures that are associated with schizophrenia (Valipour, Saneei, & Esmaillzadeh, 2014).

While there is a lack of trials analyzing vitamin D supplements in the treatment of psychosis and schizophrenia, individuals within this patient population who have low vitamin D levels tend to benefit from supplementation. Based on over 25 years of clinical experience, I have observed significant improvement in treatment outcomes utilizing vitamin D, 5,000 to 10,000 IU once daily, as an adjunct therapy. Serum vitamin D levels should be re-evaluated every two months until optimal levels are achieved.


REFERENCES

  1. Chiang, M., Natarajan, R., & Fan, X. (2016). Vitamin D in schizophrenia: a clinical review. Evidence Based Mental Health, 19(1), 6-9.
  2. Cieslak, K., Feingold, J., Antonius, D., Walsh-Messinger, J., Dracxler, R., Rosedale, M., & ... Malaspina, D. (2014). Low vitamin D levels predict clinical features of schizophrenia. Schizophrenia Research, 159(2/3), 543-545.
  3. Crews, M., Lally, J., Gardner-Sood, P., Howes, O., Bonaccorso, S., Smith, S., & ... Gaughran, F. (2013). Vitamin D deficiency in first episode psychosis: A case–control study. Schizophrenia Research, 150(Special Section: Negative Symptoms), 533-537.
  4. Graham, K., Lieberman, J., Lansing, K., Perkins, D., Calikoglu, A., & Keefe, R. (2015). Relationship of low vitamin D status with positive, negative and cognitive symptom domains in people with first-episode schizophrenia. Early Intervention In Psychiatry, 9(5), 397-405.
  5. Hedelin, M., Löf, M., Olsson, M., Lewander, T., Nilsson, B., Hultman, C. M., & Weiderpass, E. (2010). Dietary intake of fish, omega-3, omega-6 polyunsaturated fatty acids and vitamin D and the prevalence of psychotic-like symptoms in a cohort of 33,000 women from the general population. BMC Psychiatry, 10,38.
  6. Humble, M. B., Gustafsson, S., & Bejerot, S. (2010). Low serum levels of 25-hydroxyvitamin D (25-OHD) among psychiatric out-patients in Sweden: Relations with season, age, ethnic origin and psychiatric diagnosis. Journal Of Steroid Biochemistry And Molecular Biology, 121(Proceedings of the 14th Vitamin D Workshop), 467-470.
  7. Itzhaky, D., Bogomolni, A., Amital, D., Arnson, Y., Amital, H., & Gorden, K. (2012). Low serum Vitamin D concentrations in patients with schizophrenia. Israel Medical Association Journal, 14(2), 88-92.
  8. McGrath, J., Saari, K., Hakko, H., Jokelainen, J., Jones, P., Järvelin, M., & ... Isohanni, M. (2004). Vitamin D supplementation during the first year of life and risk of schizophrenia: a Finnish birth cohort study. Schizophrenia Research, 67, 237-245.
  9. McGrath, J. J., Eyles, D. W., Pedersen, C. B., Anderson, C., Ko, P., Burne, T. H., & ... Mortensen, P. B. (2010). Neonatal vitamin D status and risk of schizophrenia: a population-based case-control study. Archives Of General Psychiatry, 67(9), 889-894.
  10. Menkes, D., Marsh, R., Lancaster, K., Grant, M., Dean, P., & du Toit, S. (2012). Vitamin D status of psychiatric inpatients in New Zealand's Waikato region. BMC Psychiatry, 12, 68.
  11. Shivakumar, V., Kalmady, S. V., Amaresha, A. C., Jose, D., Narayanaswamy, J. C., Agarwal, S. M., & ... Gangadhar, B. N. (2015). Serum vitamin D and hippocampal gray matter volume in schizophrenia. Psychiatry Research, 233(2), 175-179.
  12. Tolppanen, A., Sayers, A., Fraser, W. D., Lewis, G., Zammit, S., McGrath, J., & Lawlor, D. A. (2012). Serum 25-Hydroxyvitamin D3 and D2 and Non-Clinical Psychotic Experiences in Childhood. Plos ONE, 7(7), 1-8.
  13. Valipour, G., Saneei, P., & Esmaillzadeh, A. (2014). Serum vitamin D levels in relation to schizophrenia: a systematic review and meta-analysis of observational studies. The Journal Of Clinical Endocrinology And Metabolism, 99(10), 3863-3872.
  14. Yüksel, R. N., Altunsoy, N., Tikir, B., Cingi Külük, M., Unal, K., Goka, S., … Goka, E. (2014). Correlation between total vitamin D levels and psychotic psychopathology in patients with schizophrenia: therapeutic implications for add-on vitamin D augmentation. Therapeutic Advances in Psychopharmacology, 4(6), 268–275.

Radio Show Interview about Glyphosate - Dr. William Shaw

This is a radio show interview with Dr. William Shaw on local New York station WBAI 99.5 from April 15, 2016. Take Charge of Your Health hosts Corinne Funari, RPA, CCN and Linda Segal interviewed Dr. Shaw about the dangers of glyphosate, the world's most widely used herbicide being sprayed on our crops.  To listen to the show, click here.  Dr. Shaw's interview starts at 13:00.

Response to Article on the Lack of Oxalate Dangers in the Green Smoothie Diet

William Shaw, PhD

In response to the inaccurate, unscientific article by Thomas Lodi, M.D. on oxalates1 in the December 2015 issue of Townsend Letter, I will make the following point by point responses:

(1)Cartoons about Popeye.
I will not use any cartoons in my response. Anyone interested in cartoons should immediately stop reading this article and start reading their local paper’s comic section.

(2)Inaccurate references.
The tone for accuracy of the author is set in the very first paragraph of his article in which his first reference, #23, has nothing to do with my green smoothie article, which is reference #24. A better reference would actually be #2 from my article2.  When the clock strikes 13, the accuracy of the other 12 hours of the clock is in serious question.

(3)Inaccuracy about the contribution of endogenous production to total oxalate load.
Lodi states that 80-90% of oxalates in the body are endogenously produced. Unfortunately, the best scientific study refutes his assertion. According to Holmes et al3, who did extremely well-controlled studies on every aspect of oxalate metabolism and has published forty-one scientific articles on oxalates in the peer-reviewed literature, the mean dietary contribution to total urinary oxalate excretion is 52.6% on a high-oxalate diet, defined as a diet of 250 mg of oxalate per day. The person drinking a green smoothie with 2 cups of raw spinach ingests 1,312 mg of oxalates, or over five times the level of what Holmes considers a high-oxalate diet, just in the spinach consumption alone, and over 26 times the amount of oxalates in a low-oxalate diet (50 mg per day)4. The estimated human endogenous production of oxalates is 40 mg per day3. On a green smoothie diet with two cups of spinach, the diet in normal humans contains 33 times the endogenous human production of oxalates, based on the spinach alone.
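As a quick check on the arithmetic above, the following short Python snippet reproduces the ratios from the figures cited in the text (the variable names are illustrative):

    # Quick arithmetic check of the ratios cited above, using the figures
    # stated in the text.
    spinach_two_cups_mg = 1312    # oxalate in two cups of raw spinach
    high_oxalate_diet_mg = 250    # Holmes' definition of a high-oxalate diet
    low_oxalate_diet_mg = 50      # a low-oxalate diet
    endogenous_mg = 40            # estimated daily endogenous production
    print(spinach_two_cups_mg / high_oxalate_diet_mg)   # about 5.2 times the high-oxalate threshold
    print(spinach_two_cups_mg / low_oxalate_diet_mg)    # about 26 times a low-oxalate diet
    print(spinach_two_cups_mg / endogenous_mg)          # about 33 times endogenous production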

All of Lodi’s assertions about the benefits of a vegetarian diet are meaningless since there is no single vegetarian diet; there are as many vegetarian diets as there are vegetarians.

 (4)Inaccuracy about the availability of calcium and magnesium in spinach.
Lodi states that “every plant, green and otherwise (including spinach) has abundant magnesium and calcium and potassium”. Unfortunately, none of the calcium and magnesium in spinach or other high oxalate plants is bioavailable since it is strongly bound to oxalates. Furthermore, the average oxalate value of spinach is 7.5 times its calcium content, making spinach a very poor choice for someone to maintain adequate calcium stores5. According to Kohmani, who added a good deal of spinach, similar to the diet of a person ingesting a daily green smoothie or a large daily spinach salad, to the diet of rats to determine its effects5:

“If to a diet of meat, peas, carrots and sweet potatoes, relatively low in calcium but permitting good though not maximum growth and bone formation, spinach is added to the extent of about 8% to supply 60% of the calcium, a high percentage of deaths occurs among rats fed between the age of 21 and 90 days. Reproduction is impossible. The bones are extremely low in calcium, tooth structure is disorganized and dentine poorly calcified. Spinach not only supplies no available calcium but renders unavailable considerable of that of the other foods. Considerable of the oxalate appears in the urine, much more in the feces.”

(5)Lodi argues that his patients haven’t complained about kidney stones while drinking a lot of green smoothies so oxalates must not be problematic.
Lodi’s contention that his patients on a high oxalate diet don’t have kidney stones is anecdotal. He presents no data from active chart review of his patients to determine if questions about kidney stones were ever asked. Furthermore, it is doubtful that his patients would even have connected their diet with their kidney stones. I have given numerous seminars on the connection between oxalates and kidney stones, and it is common to get feedback from audience members that they had kidney stones shortly after starting either a diet including a spinach green smoothie or a large spinach salad on a regular basis. Since these comments were not even solicited, it is likely that an even larger number of individuals may have experienced kidney stones but were reluctant to voice their experiences. A neurologist friend attributes his recent severely disabling stroke to the dietary changes encouraged by his wife that placed him on a daily green spinach smoothie for a considerable time.

Furthermore, Lodi seems to think that a lack of kidney stones indicates a lack of oxalate problems. However, oxalates may form in virtually every organ of the body, including the eyes, vulva, lymph nodes, liver, testes, skin, bones, gums, thyroid gland, heart, arteries, and muscles6-7. Oxalates may occur in these other organs without appearing in the urinary tract at all, and in individuals without genetic hyperoxalurias7. Oxalates have been implicated in heart disease7, stroke, vulvodynia, and autism8-10. Women of child-bearing age need to be especially careful of the spinach green smoothie diet because of the autism-oxalate connection and the negative effects of oxalate-containing spinach on fertility5. Prisoners in the state prisons in Illinois were encouraged by the Weston-Price Nutrition Foundation to file a lawsuit against the state because of their deteriorating health due to a high amount of soy protein in the prison diet11. Soy protein is tied with spinach as one of the highest-oxalate foods4. Oxalates are especially toxic to the endothelial cells of the arteries, leading to atherosclerosis12. Oxalate crystals are concentrated in atherosclerotic lesions7. Such lesions have commonly been overlooked because of the use of stains that make the oxalate crystals difficult to visualize. The relatives of people consuming the green smoothie diet would only learn of their loved ones’ oxalate deposits throughout their organs on the day of an autopsy that employed pathological examinations capable of detecting oxalates.

Primary genetic hyperoxaluria is not the major cause of kidney stones in adults, since 80% of individuals with this disorder die before age 20 and it is so rare that it could not possibly be the cause of most cases of oxalate kidney stones13. However, a genetic polymorphism called P11L, present in up to 20% of Caucasian groups, codes for a protein with three times less alanine:glyoxylate aminotransferase (AGT) activity than the predominant normal-activity polymorphism, leading to excessive endogenous production of oxalates14. This substantial group of individuals would be even more susceptible to the harm of a high-oxalate diet. Kidney stones were rampant in the United Kingdom during the World Wars when rhubarb, another high-oxalate food, was recommended as a substitute for other low-oxalate but unavailable vegetables13.

In summary, those who do not care for their health can eat or drink whatever they want. But they should realize that their diets are fad-based and/or based on quasi-religious reasons (“feasts” as part of the “awakening,” according to Lodi), not on hard scientific evidence. Furthermore, they should be aware that their diet may kill them15. The green smoothie fad will go down in medical history, along with the AMA journal allowing cigarette advertising with physician endorsements and the use of mercury-containing teething powder for babies, as one of the greatest health follies in a considerable time.


References

1.       Lodi, T. Green smoothie bliss: Was Popeye secretly on dialysis?  Townsend Letter for Doctors. Dec 2015 pgs 28-39

2.       Shaw, W.  The Green Smoothie Health Fad: This Road to Health Hell is Paved with Toxic   Oxalate Crystals.  Townsend Letter for Doctors. Jan 2015 Available online at: http://www.townsendletter.com/Jan2015/green0115.html

3.       Holmes RP, Goodman HO, and Assimos DG. Contribution of dietary oxalate to urinary oxalate excretion. Kidney International, Vol. 59 (2001), pp. 270–276

4.       Harvard T.H. Chan School of Public Health Nutrition Department's File Download Site on oxalates in the diet. https://regepi.bwh.harvard.edu/health/Oxalate/files Accessed December 1,2015

5.       Kohmani,EF. Oxalic acid in foods and its fate in the diet. Journal of Nutrition 18(3):233-246,1939

6.       Jessica N. Lange, Kyle D.Wood, John Knight, Dean G. Assimos, and Ross P. Holmes. Glyoxal Formation and Its Role in Endogenous Oxalate Synthesis. Advances in Urology Volume 2012, Article ID 819202, 5 pages doi:10.1155/2012/819202

7.       G.A. Fishbein, R. G. Micheletti, J. S. Currier, E. Singer, and M. C. Fishbein, Atherosclerotic oxalosis in coronary arteries, Cardiovascular Pathology, vol. 17, no. 2, pp. 117–123, 2008.

8.       Giuseppe Di Pasquale, , Mariangela Ribani, Alvaro Andreoli, , Gian Angelo Zampa, and Giuseppe Pinelli,  Cardioembolic Stroke in Primary Oxalosis With Cardiac Involvement. Stroke 1989, 20:1403-1406

9.       Solomons CC, Melmed MH, Heitler SM.Calcium citrate for vulvar vestibulitis. A case report. J Reprod Med. 1991 Dec;36(12):879-82.

10.   Konstantynowicz J, Porowski T, Zoch-Zwierz W, Wasilewska J, Kadziela-Olech H, Kulak W, Owens SC, Piotrowska-Jastrzebska J, Kaczmarski M. A potential pathogenic role of oxalate in autism. Eur J Paediatr Neurol. 2012 Sep;16(5):485-91.

11.   Monica Eng, Chicago Tribune reporter. Soy in Illinois prison diets prompts lawsuit over health effects. December 21, 2009. http://articles.chicagotribune.com/2009-12-21/news/0912200121_1_soy-protein-soy-cheeses-soyfoods-association. Accessed December 2,2015

12.   RI Levin, PW Kantoff and EA Jaffe Uremic levels of oxalic acid suppress replication and migration of human endothelial cells. Arterioscler Thromb Vasc Biol 1990, 10:198-207

13.   A. J. Chaplin Histopathological occurrence and characterization of calcium oxalate: a review. J. Clin. Path., 1977, 30, 800-811

14.   Michael J. Lumb and Christopher J. Danpure.  Functional Synergism between the Most Common Polymorphism in Human Alanine:Glyoxylate Aminotransferase and Four of the Most Common Disease-causing Mutations.  Journal of Biological Chemistry Vol. 275, No. 46, November 17, pp. 36415–36422, 2000

15.   Sanz P, Reig R: Clinical and pathological findings in fatal plant oxalosis. Am J Forensic Med Pathol 13:342–345, 1992

Genetic Testing – The Key to Truly Personalized Medicine

Matthew Pratt-Hyatt, PhD

Personalized medicine has been called the future of medicine since the inception of the Human Genome Project (HGP) in the early 90s, a project set up by the United States government to sequence the complete human genome. The HGP was completed in 2003.(1) This new wealth of knowledge allowed scientists to develop tests that sequence the 3 billion base pairs and the 20-25 thousand genes in the human genome.(2) Across those 25 thousand genes there are over 80 million variants.(3) These variations include single nucleotide polymorphisms (SNPs) as well as small deletions and insertions throughout the genome, and many of those variants play a significant role in patient health. The dream of personalized healthcare is to use genetic testing to understand a patient’s predisposition for developing different conditions, and then to use molecular diagnostic tests to determine how the environment is interacting with those genes.

At The Great Plains Laboratory, Inc., we have been primarily focused on looking at the second half of this equation -- finding the root cause of patient symptoms in a wide variety of chronic disorders.  We have developed tests that look at hundreds of different analytes and have worked with doctors to help them interpret how these data can be used to personalize treatment for patients. Even though traditional medicine has mostly followed the philosophy that one size fits most, functional medicine says that each person is unique and deserves unique care.  That is why we have developed our new genetic test, GPL-SNP1000, which now allows us to have a more complete picture of what contributes to a patient’s health status.

The first generation of genetic sequencing was first published in 1977 by Frederick Sanger. This technology first used radiolabeling and later fluorescent labeling for sequencing reactions. It uses these labeled nucleotides and the length of the copied DNA to determine the nucleotide sequence. The Sanger method is good for sequencing short stretches of DNA (300-1000 nucleotides) in a single reaction.(4) There are benefits and drawbacks to this type of sequencing. The Sanger technology allowed scientists to sequence one stretch of DNA, compare it to a database, and look for differences. This was useful if you had a suspected mutation in a known gene, because you could sequence the whole gene in a small number of reactions. However, there are also drawbacks, such as only being able to sequence a low number of both genes and patients at one time.

The next major advance in genotyping technology was the advent of the TaqMan allelic discrimination assay. This assay uses a fluorescent reporter signal that is generated during the polymerase chain reaction (PCR).(5) The TaqMan assay uses DNA probes that differ at the polymorphic SNP site. One probe is complementary to the wild-type allele and another is complementary to the variant allele. These probes bind only to sequences of DNA that are 100% complementary. Each probe carries a fluorescent reporter dye and a quencher dye; the quencher prevents the reporter from fluorescing while both remain attached to the probe. The probes hybridize to their complementary strands, and when the DNA is copied during the PCR reaction by Taq polymerase, the probe is degraded and the dyes are released. The DNA is then genotyped by determining the ratio of signal intensity from the dye on the wild-type probe to that from the dye on the variant probe.(6)
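To make the allelic-discrimination logic concrete, here is a simplified, hypothetical Python sketch of how a genotype call can be derived from the two reporter-dye intensities described above; the thresholds, signal values, and function name are illustrative assumptions, not the actual assay software.

    # Hypothetical sketch of a TaqMan-style genotype call from the two
    # reporter-dye intensities; thresholds and names are illustrative.
    def call_genotype(wild_type_signal, variant_signal):
        total = wild_type_signal + variant_signal
        if total == 0:
            return "no amplification"
        fraction_wt = wild_type_signal / total
        if fraction_wt > 0.8:            # almost all signal from the wild-type probe
            return "homozygous wild-type"
        if fraction_wt < 0.2:            # almost all signal from the variant probe
            return "homozygous variant"
        return "heterozygous"            # both probes released their dyes

    print(call_genotype(950.0, 40.0))    # -> homozygous wild-type
    print(call_genotype(480.0, 510.0))   # -> heterozygous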

The most recent advance in sequencing technology has been the advent of Next Generation Sequencing (NGS). Several companies use different means to accomplish this, but NGS machines are able to monitor which nucleotide is added at each position during the DNA chain elongation reaction. This principle has been labeled “sequencing-by-synthesis.” The new technique moves sequencing from reads of roughly 1000 nucleotides per reaction to about 1000 billion bases per run. This gives researchers the ability to perform a very in-depth sequence for one patient, or to sequence several dozen patients at a time using more pinpointed analysis.(7)

Using NGS, our scientists at The Great Plains Laboratory, Inc., in partnership with the genetic company Courtagen, have developed what we think will be the next great tool for personalized medicine. Our new test, GPL-SNP1000, is a genetic screen that covers 1048 SNPs across 144 different genes. These genes are broken into nine different groups, which include DNA methylation, mental health, drug metabolism/chemical detoxification, autism risk, oxalate metabolism, cholesterol metabolism, acetaminophen toxicity, and the transporter genes.

The GPL-SNP1000 test report (see figure 1) is programmed to depict only the SNPs that are mutated. We include the gene symbol; the RS number (or reference SNP number), which indicates which SNP is mutated (so that you can look up new research on that mutation); a pathogenicity number (we look at all available research on each SNP and predict how severe a mutation at that SNP would be); the genotype (the change in nucleotide); the phenotype (whether the patient is heterozygous or homozygous, i.e., has one or two mutated copies); and the disease(s) associated with that mutation (we have listed the most common conditions associated with every SNP in our assay). The report also has interpretations that are auto-generated for genes found to be mutated in the assay. One additional feature of our report is hyperlinks to the references on PubMed used to make the interpretations. This allows both patients and healthcare practitioners to review the literature about those particular mutations without having to search the Internet for these articles.

Figure 1
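As an illustration of the report fields described above, the following hypothetical Python structure shows how a single mutated-SNP entry might be represented; the rs1801133 gene and conditions echo the MTHFR example discussed below, while the score, phenotype, and link are invented for illustration and do not come from an actual report.

    # Hypothetical representation of one mutated-SNP report entry; values are
    # illustrative, not actual GPL-SNP1000 output.
    example_entry = {
        "gene_symbol": "MTHFR",
        "rs_number": "rs1801133",                      # reference SNP ID for literature lookup
        "pathogenicity": 4,                            # predicted severity of a mutation at this SNP (illustrative)
        "genotype": "C>T",                             # nucleotide change
        "phenotype": "heterozygous",                   # one or two mutated copies
        "associated_conditions": ["vascular disease", "developmental delay"],
        "references": ["https://pubmed.ncbi.nlm.nih.gov/"],   # auto-generated PubMed links
    }
    for field, value in example_entry.items():
        print(field, ":", value)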

We were also very strategic about selecting the nine specific groups of genes and SNPs that our test evaluates. We talked to dozens of functional medicine professionals and asked them what groups of genes would help them the most in their practices. The top answer was the DNA methylation pathway, which was not surprising, because the most utilized genetic tests on the market are currently the MTHFR tests. The MTHFR pathway is a process by which carbons are added onto folic acid from amino acids and redistributed onto other compounds throughout the body. This process is responsible for the formation of methionine, S-adenosylmethionine (SAMe), and thymidylate monophosphate (dTMP). These compounds play critical roles in nucleotide synthesis, neurotransmitter function, detoxification, and numerous other processes.(8) We believed that we could provide better coverage of these genes than other genetic tests have done. We knew that no other test covered more than 35 SNPs for the MTHFR gene, so we redesigned our existing DNA Methylation Profile by increasing the number of SNPs from 32 to 105. One reason this test is so popular is the very common occurrence of one of the more serious SNPs of the MTHFR gene, rs1801133 (C677T). The heterozygous genotype for this mutation occurs at a frequency of 39%, and the homozygous mutant genotype at 17%. The mutation can decrease the enzyme’s functionality by 90%, giving patients an increased risk of developmental delay, mental retardation, vascular disease, and stroke.(9)

Our second most requested group of genes was those that correlate with mental health. Mutations to these genes can predispose patients to a variety of ailments including depression, schizophrenia, anxiety, and bipolar disorder. We designed this group to include the nine genes and 53 SNPs that are most commonly implicated in mental disorders. One of the more important genes in this group is the catechol-O-methyltransferase (COMT) gene. This enzyme is responsible for the degradation of catecholamines, which include dopamine, epinephrine, and norepinephrine. Mutations to COMT can lead to bipolar disorder, anxiety, obsessive-compulsive disorder, and attention deficit disorder. One of the more common mutations of COMT is the Val108Met mutation (rs4680), which can cause a heightened risk of developing anxiety.(10)

The next gene group we focus on is the group for drug metabolism/chemical detoxification. These enzymes include the cytochrome P450s, sulfotransferases, glutathione transferases, and the methyltransferases. The P450s are important for multiple molecular functions including drug metabolism, hormone production, toxicant detoxification, and more. The P450s are expressed throughout the body, but primarily in the liver. There are 57 different genes for the cytochrome P450 enzymes; however, eight are responsible for most of the drug metabolism done by the body, and the P450 enzymes as a whole are responsible for 75% of all drug metabolism.(11) Mutations to P450s can cause changes in the rate of metabolism of some medications, causing decreased effectiveness and other dangerous complications. Medications known to be affected by such mutations include, but are certainly not limited to, warfarin, diazepam, antiarrhythmic drugs, antidepressants, and antipsychotics.(12-13) P450s known to have alleles in the population that dramatically affect drug metabolism include CYP2C9, CYP2C19, and CYP2D6.(14) Besides the P450s, which are considered phase I detoxification enzymes, GPL-SNP1000 covers phase II detoxification enzymes that include glutathione S-transferase, sulfotransferase 1A1, betaine-homocysteine methyltransferase 2, and UDP-glucuronosyltransferase 1A1.

The next group of genes we analyze tells parents if they or their children may have a mutation that is commonly found in autistic patients.  It has been reported that the prevalence of autism has increased dramatically in the last two decades.(15)  We looked at many different studies to determine what mutations are more commonly found in autistic patients, but not found in the neurotypical, non-autistic public.  Three large studies that were done using over 3000 participants were very useful in developing this panel.(16-18)  We selected 252 SNPs that cover 33 genes that were found in these three studies.  These genes cover many different pathways including glucose metabolism, ion and calcium channels, DNA transcription regulation, and nervous system genes.

Next, we included a group of genes that are involved with oxalate metabolism. Oxalate and its acidic form, oxalic acid, are derived from the diet, human metabolism, and yeast/fungal metabolism. Oxalates are known to combine with calcium to form crystals that can cause kidney stones. These crystals may also form in the bones, joints, blood vessels, lungs, and even the brain.(19) The oxalate group in our test analyzes 32 SNPs covering five different genes. One of these genes is alanine-glyoxylate aminotransferase (AGXT). Mutations to AGXT can lead to kidney stones and primary hyperoxaluria.(20)

In addition to these groups of genes, our new test also looks at genes for cholesterol metabolism, as well as transporters.  Both of these pathways are important for the body to regulate itself properly.  Cholesterol is important because it is critical for producing cellular membranes, hormones, and bile acids.  There are numerous recent articles discussing the importance of these cholesterol-produced molecules that regulate sugar metabolism and our metabolic rate.  Transporters are also necessary because they move large molecules and other chemicals into and out of the cell, which are not able to move across cellular membranes without assistance.  Without transporters, cells are not able to attain the proper building blocks necessary for optimum functionality or dispose of toxic cellular waste.

Truly personalized medicine may not be a reality today; however, I believe the recent developments in genetic testing are the biggest leaps we’ve had in a long time. GPL-SNP1000 helps healthcare professionals know what problems their patients may have now or in the future due to genetic mutations, as well as what specific treatments may be beneficial. The Great Plains Laboratory, Inc. offers cutting-edge diagnostic tools that help identify underlying causes of many chronic conditions and provides recommendations for treatment based on test results. In addition to our new genetic test, we offer other comprehensive biomedical testing, including our Organic Acids Test (OAT), IgG Food Allergy Test, GPL-TOX (our Toxic Organic Chemical Profile), and many more. Utilizing a combination of our genetic and molecular diagnostics, we can now see a more complete picture of a patient’s overall health, covering both present status and potential problems for the future, which can all be addressed now. I think the sun is rising on a new horizon of health.


References

1. Biello D, Harmon K. Tools for Life. Sci Am. 2010;303:17-18.

2. Marian AJ. Sequencing your genome: what does it mean? Methodist Debakey Cardiovasc J. 2014;10(1):3-6.

3. McCarthy DJ, Humburg P, Kanapin A, et al. Choice of transcripts and software has a large effect on variant annotation. Genome Med. 2014;6(3):26.

4. Sanger F, Nicklen S, Coulson AR. DNA sequencing with chain-terminating inhibitors. Proc Natl Acad Sci U S A. 1977;74(12):5463-5467.

5. Livak KJ, Flood SJ, Marmaro J, Giusti W, Deetz K. Oligonucleotides with fluorescent dyes at opposite ends provide a quenched probe system useful for detecting PCR product and nucleic acid hybridization. PCR Methods Appl. 1995;4(6):357-362.

6. Shi MM, Myrand SP, Bleavins MR, de la Iglesia FA. High throughput genotyping for the detection of a single nucleotide polymorphism in NAD(P)H quinone oxidoreductase (DT diaphorase) using TaqMan probes. Mol Pathol. 1999;52(5):295-299.

7. Lin B, Wang J, Cheng Y. Recent Patents and Advances in the Next-Generation Sequencing Technologies. Recent Pat Biomed Eng. 2008;2008(1):60-67.

8. Wiemels JL, Smith RN, Taylor GM, et al. Methylenetetrahydrofolate reductase (MTHFR) polymorphisms and risk of molecularly defined subtypes of childhood acute leukemia. Proc Natl Acad Sci U S A. 2001;98(7):4004-4009.

9. Deloughery TG, Evans A, Sadeghi A, et al. Common mutation in methylenetetrahydrofolate reductase. Correlation with homocysteine metabolism and late-onset vascular disease. Circulation. 1996;94(12):3074-3078.

10. Craddock N, Owen MJ, O'Donovan MC. The catechol-O-methyl transferase (COMT) gene as a candidate for psychiatric phenotypes: evidence and lessons. Mol Psychiatry. 2006;11(5):446-458.

11. Guengerich FP. Mechanisms of drug toxicity and relevance to pharmaceutical development. Drug Metab Pharmacokinet. 2011;26(1):3-14.

12. Ingelman-Sundberg M. Genetic polymorphisms of cytochrome P450 2D6 (CYP2D6): clinical consequences, evolutionary aspects and functional diversity. Pharmacogenomics J. 2005;5(1):6-13.

13. Ingelman-Sundberg M. Genetic susceptibility to adverse effects of drugs and environmental toxicants. The role of the CYP family of enzymes. Mutat Res. 2001;482(1-2):11-19.

14. Kalra BS. Cytochrome P450 enzyme isoforms and their therapeutic implications: an update. Indian J Med Sci. 2007;61(2):102-116.

15. Rutter M. Incidence of autism spectrum disorders: changes over time and their meaning. Acta Paediatr. 2005;94(1):2-15.

16. Sanders SJ, He X, Willsey AJ, et al. Insights into Autism Spectrum Disorder Genomic Architecture and Biology from 71 Risk Loci. Neuron. 2015;87(6):1215-1233.

17. Iossifov I, O'Roak BJ, Sanders SJ, et al. The contribution of de novo coding mutations to autism spectrum disorder. Nature. 2014;515(7526):216-221.

18. De Rubeis S, He X, Goldberg AP, et al. Synaptic, transcriptional and chromatin genes disrupted in autism. Nature. 2014;515(7526):209-215.

19. Hall BM, Walsh JC, Horvath JS, Lytton DG. Peripheral neuropathy complicating primary hyperoxaluria. J Neurol Sci. 1976;29(2-4):343-349.

20. Poore RE, Hurst CH, Assimos DG, Holmes RP. Pathways of hepatic oxalate synthesis and their regulation. Am J Physiol. 1997;272(1 Pt 1):C289-294.

Urine Calcium and Magnesium in Adults: Recommended Test for Nutritional Adequacy

William Shaw, PhD

Calcium
Calcium is one of the most tightly regulated substances in the body. In addition to the role of calcium as a structural element in bones and teeth (99% of the body’s calcium is in the bones), calcium is critically needed for nerve function. When calcium in the plasma drops by about 30%, a person may develop tetany, a condition that is often fatal due to overstimulation of the nerves in both the central and peripheral nervous systems, leading to tetanic contraction of the skeletal muscles. The concentration of calcium in the plasma is one of the most constant laboratory values ever measured. In the great majority of normal people, calcium varies only from 9-11 mg per dL, regardless of the diet (1). The reason is a complex hormonal system that utilizes the bones as a source of calcium. This regulatory system employs the parathyroid glands, which secrete parathyroid hormone (parathormone) to digest bone and release calcium whenever there is even a small decrease in plasma calcium. Parathormone also increases the absorption of calcium from the gastrointestinal tract and the kidney tubules. When calcium rises in the plasma, parathormone secretion decreases, depositing more calcium in the bones while renal and gastrointestinal absorption are decreased. Calcitonin, a polypeptide hormone produced by the thyroid gland, opposes the effects of parathyroid hormone. In addition, vitamin D increases the absorption of calcium from the gastrointestinal tract and the kidney tubules, like parathyroid hormone, but has little effect on digesting bone to release calcium. One of the most controversial and misunderstood topics is the optimum nutritional intake of calcium and vitamin D. At the center of the controversy is the role of calcium in the initiation of plaque in the arteries, leading to atherosclerosis and cardiovascular disease.

An average adult ingests about 750 mg per day of calcium and secretes about 625 mg of calcium into the intestinal juices. If all the ingested calcium were absorbed, there would be a net absorption of 125 mg per day of calcium. Since the average person excretes about 125 mg of calcium per day in the urine, the average person is in zero net calcium balance except when bone is being deposited. If bone is being deposited due to the stress of exercise or following a fracture, the regulation of urinary calcium excretion is the major factor that allows for bone growth. One of the major factors that prevents calcium absorption is the presence of high amounts of oxalates in the diet. The human body has the ability to make some oxalate endogenously, perhaps about 40 mg per day in individuals with a favorable genetic makeup. A low oxalate diet contains less than 50 mg per day of oxalates, while a high oxalate diet with two cups or more of spinach, nuts, and berries in a smoothie or salad per day could easily contain 1500 mg per day of oxalates. Such high amounts of oxalates readily use up the 125 mg of available calcium, forming insoluble calcium oxalate salts that can deposit in every organ of the body. These deposits can easily initiate endothelial damage that can lead to strokes and myocardial infarctions (heart attacks), and such oxalate deposits have been detected in atherosclerotic lesions. The person on a high oxalate diet will have a much greater need for calcium and/or magnesium than the person on a low oxalate diet.
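To make the balance arithmetic above concrete, here is a minimal illustrative sketch in Python using the figures cited in the paragraph. The 1:1 molar binding of calcium by oxalate is a simplifying assumption, and the numbers are for illustration, not clinical use.

    # Illustrative calcium-balance arithmetic using the figures cited above.
    # Assumes 1:1 molar binding of calcium by oxalate to form calcium oxalate.
    CA_MOLAR_MASS = 40.1   # g/mol, calcium
    OX_MOLAR_MASS = 88.0   # g/mol, oxalate ion

    def net_calcium_balance(intake_mg=750, intestinal_secretion_mg=625,
                            urinary_excretion_mg=125):
        """Net daily calcium balance (mg) if all ingested calcium were absorbed."""
        net_absorbed = intake_mg - intestinal_secretion_mg     # 125 mg
        return net_absorbed - urinary_excretion_mg             # 0 mg at steady state

    def calcium_bound_by_oxalate(oxalate_mg):
        """Calcium (mg) that a given amount of dietary oxalate could bind."""
        return oxalate_mg / OX_MOLAR_MASS * CA_MOLAR_MASS

    print(net_calcium_balance())                  # 0 -> zero net balance
    print(round(calcium_bound_by_oxalate(50)))    # low-oxalate day: ~23 mg
    print(round(calcium_bound_by_oxalate(1500)))  # high-oxalate day: ~684 mg, far more
                                                  # than the ~125 mg of available calcium

On these assumptions, even a fraction of a high-oxalate day's intake is enough to tie up all of the calcium available for absorption.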

Since urinary excretion is the major controlling element for maintaining calcium balance and is under tight hormonal control, it appears to me that urine calcium is the best indicator of adequate dietary calcium. The most common reasons for low urine calcium are inadequate dietary calcium and/or a high oxalate diet. Other reasons for calcium deficiency include hypoparathyroidism, pseudohypoparathyroidism, vitamin D deficiency, nephrosis, nephritis, bone cancer, hypothyroidism, celiac disease, and malabsorption disorders.

The most common reason for high urine calcium is a diet high in calcium. Other reasons for calcium excess are vitamin D intoxication, hyperparathyroidism, osteolytic bone metastases, myeloma, excessive immobilization, Cushing’s syndrome, acromegaly, distal renal tubular acidosis, thyrotoxicosis, Paget’s disease, Fanconi’s syndrome, schistosomiasis, breast and bladder cancers, and sarcoidosis.

Magnesium
Magnesium, like calcium, is an essential element that is stored largely in the bones (66% of the body’s magnesium is in the bones). It is a cofactor in many enzymatic reactions, especially those requiring vitamin B6. Like extremely low calcium, extremely low magnesium can also cause tetany of the muscles.

Low magnesium
The most common reason for low urine magnesium is low magnesium in the diet. Low magnesium in the diet may increase the incidence of oxalate crystal formation in the tissues and kidney stones. Less common causes of low magnesium include celiac disease, other malabsorption disorders, dysbiosis, vitamin D deficiency, pancreatic insufficiency, and hypothyroidism. Early signs of magnesium deficiency include loss of appetite, nausea, vomiting, migraine headaches, fatigue, and weakness. As magnesium deficiency worsens, numbness, tingling, muscle contractions and cramps, seizures, personality changes, anxiety, depression, attention deficit, abnormal heart rhythms, and coronary spasms can occur. Low urinary magnesium for long time periods is associated with increased risk of ischemic heart disease.

High magnesium
The most common reason for high urine magnesium is high magnesium in the diet. Less common causes of high urine magnesium include alcoholism, diuretic use, primary aldosteronism, hyperthyroidism, vitamin D excess, gentamicin toxicity, and cis-platinum toxicity.  Increased urinary magnesium excretion can occur in people with insulin resistance and/or type 2 diabetes. Symptoms of marked magnesium excess can include diarrhea, hypotension, nausea, vomiting, facial flushing, retention of urine, ileus, depression, lethargy before progressing to muscle weakness, difficulty breathing, extreme hypotension, irregular heartbeat, and cardiac arrest.


REFERENCES

  1. Guyton A. Textbook of Medical Physiology. 3rd edition. Philadelphia: WB Saunders Co; 1966: 1100-1118.
  2. Fleming CR, et al. The importance of urinary magnesium values in patients with gut failure. Mayo Clin Proc. 1996;71(1):21-24.

The Role of Probiotics in Candida and Clostridia Treatment: What Does the Evidence Say?

Jessica Bonovich, RN, BSN

Trillions of friendly microbes are living symbiotically within each of us right now (give or take a few billion). In fact, there are more of “them” than there are of “us.” While this phenomenon has been known for quite some time, only recently has modern medicine started to examine this relationship and how it affects human health. Studies on the use of probiotics have been performed in a wide range of populations. Despite the seemingly obvious importance of these microbes, there is still some confusion about the role of probiotics in reducing the colonization of opportunistic pathogens like Clostridia and Candida. For example, some people believe that probiotics alone can treat an infection. Others believe that they “compete” for resources and “crowd out” the bad guys. So, what does the evidence say?

Promising data from several studies have demonstrated that probiotics are effective against numerous pathological conditions caused by Candida. In these studies, Lactobacillus GG, L. acidophilus, and Saccharomyces boulardii were the predominant probiotics shown to be effective, with L. GG demonstrating the ability to induce antibody formation against Candida (PMID 15932169). Supplementation with probiotics has been shown to accelerate the healing of various pathological conditions in the gastrointestinal tract when Candida is present (PMID 17242486, 17251510). The use of probiotics has also been shown to accelerate the immune response to Candida in both human and murine models (PMID 17242486, 15813696). Probiotics have been shown to decrease the occurrence of Candida overgrowth in the elderly population (PMID 17251510). In preterm infants, probiotics were shown to prevent colonization by Candida, a common problem in this patient population (PMID 16705580). According to these data, probiotics promote and stimulate the host immune response against intestinal Candida overgrowth and accelerate healing in the intestinal mucosa. In most of the studies, probiotics were used alongside antifungals, not instead of them.

So can probiotics help reduce oxalates? Since Candida can produce oxalates (PMID 11452311), many people are interested in using probiotics therapeutically to minimize the oxalate load. Many studies have demonstrated that the bacterium Oxalobacter formigenes can reduce oxalate stone formation (hence the name) (PMID 16284877). As of right now, testing for Oxalobacter formigenes is available primarily in research settings, and supplements are not widely available to the public (though I expect that they will be soon). Fortunately, there are other beneficial bacterial species shown to reduce oxalic acid. Many of these are already available in probiotic form. These include Lactobacillus acidophilus, Lactobacillus casei, Bifidobacterium breve, and Bifidobacterium lactis, all of which are available in the Lactoprime probiotic formula (PMID 17953571, 19214493, 15345383, 20602988, 20601517).

Studies on the use of probiotics in the prevention and treatment of C. difficile infections have been promising, with many well-respected institutions incorporating them into protocols, particularly for patients with recurring C. difficile infections. A meta-analysis was recently conducted that reviewed several randomized controlled trials investigating the use of probiotics against C. difficile in human subjects. The results demonstrated a reduction in the recurrence of infection in patients with recurring C. difficile infection when probiotic strains of Lactobacillus or Saccharomyces boulardii were used in combination with antibiotic treatment (PMID 19324296). A separate study indicated that S. boulardii inhibits toxins associated with C. difficile and mitigates the inflammation associated with infection (PMID 9864230). Restoration of the intestinal microbial balance is thought to be an important aspect of preventing recurring infections (PMID 18199029). Patients receiving treatment with vancomycin have better outcomes when the treatment is combined with probiotic supplementation (PMID 11049785). Here, probiotics can potentially prevent recurring infection associated with Clostridia species and reduce the inflammation associated with the toxins produced by Clostridia. Here again, probiotics are used in conjunction with antibiotic therapy, not alone, to fight infection.

The majority of these studies look at the efficacy of specific strains. Doing so helps to control the variables of the study but ignores the more clinically relevant question of multiple species. The broader question remains: what is the appropriate ratio of beneficial bacteria to prevent disease states and elicit an appropriate humoral immune response in vulnerable versus healthy populations? As challenging as this question is to answer, it is one worth pursuing. In the meantime, a healthy dose of probiotics is likely to be a good choice for individuals concerned with combating or preventing opportunistic pathogens.

Lithium: The Cinderella Story about a Mineral That May Prevent Alzheimer’s Disease

James Greenblatt, MD and Kayla Grossman, RN

*Originally published in the December 2015 issue of The Neuropsychotherapist

Every four seconds, someone in the world develops dementia. Worldwide, an estimated 35.6 million people already live with a form of this neurodegenerative disorder, and these numbers are rising at a staggering rate. The World Health Organization has projected that the number of cases of dementia will double by 2030 (65.7 million) and triple by the year 2050 (115.4 million). Already in America the most common type of dementia, Alzheimer’s disease, is the sixth leading cause of death; one in three seniors dies with this type of crippling memory loss. (WHO, 2015)

Progressive memory loss that interferes with activities of daily living is not a normal part of aging. In fact, research is showing that cognitive decline is the result of pathophysiological processes deep within the brain beginning many years, even decades, before dementia symptoms start.

This knowledge is frightening. It brings attention to the pervasive and silent nature of these diseases. Neurodegenerative disorders have become an international public health issue with devastating medical, social and economic consequences. And yet, from the perspective of conventional medicine, relatively little is known about how to treat or stop them.

In the midst of a harrowing race to find answers, one unassuming prevention strategy has shown promise above the rest. This remedy is none other than the simple, brain-protecting mineral: lithium.

Understanding Alzheimer’s Disease
Chances are you’ve heard of Alzheimer’s disease before, or may even know someone who has suffered from it. Alzheimer’s disease is a tragic neurological malady characterized by a progressive and irreparable shrinking of brain tissue. The result is a devastating decline in memory, social abilities and communication skills in sufferers, leading, eventually, to death.

Less than 5 percent of the time, Alzheimer’s disease results from a specific genetic combination that essentially guarantees a person will develop the disease. More commonly it is the result of a complex combination of subtle genetic, lifestyle and environmental factors that affect the brain over a lifetime. Scientists believe that Alzheimer’s disease is not an acute condition, but rather the result of cumulative damage that occurs over the years. This slow, cumulative pattern helps to explain why most patients with Alzheimer’s disease don’t present with symptoms until over the age of 65.

Pathologically, Alzheimer’s disease is the result of two trademark injuries or lesions that occur at the cellular level: plaques and tangles. Plaques are formed by deposits of small protein fragments called amyloid-β (beta-amyloid) peptides. Clumps of these proteins block the synapses, the spaces between brain cells, or neurons. With the synapses barricaded, normal cell-to-cell signaling cannot occur and communication is essentially stopped in certain regions of the brain. Meanwhile, other lesions, called neurofibrillary tangles, develop within the neurons themselves. These tangles result from a disruption in the production of a different type of protein, called tau. Normally, tau protein filaments help to circulate nutrients and other essential supplies throughout the cell. In Alzheimer’s disease, however, the strands destabilize, becoming twisted or “tangled.” Without this system to circulate vital compounds, neurons “starve” or die. The physiological processes required for memory and learning are halted, and symptoms begin to arise.

There is now evidence to show that these damaging beta-amyloid plaques and neurofibrillary tangles may actually be a relatively common malformation in the aging human brain. New research is revealing that plaques can appear a full 30-40 years before symptoms of cognitive decline even begin to show. (Langbaum et al., 2013) One recent study published in the Journal of the American Medical Association turned up the following statistics: 10% of healthy 50-year-olds have amyloid deposits. This figure swells to 33% by age 80, and 44% at age 90. (Visser et al., 2015) Individuals with a mental illness, specifically patients with depression or bipolar disorder, are at an even greater risk for developing these dementia precursors in the brain. (Da Silva et al., 2013)

Nutrition and Brain Health
Currently there are no widely accepted preventative, or even ameliorative, treatments for most dementias, including Alzheimer’s disease. A swarm of clinical trials has been launched in recent years, all with the goal of finding effective pharmaceutical interventions to stop or slow the progression of neurodegenerative disorders like Alzheimer’s disease. However, between 2002 and 2012, 99.6% of drug studies aimed at preventing, curing or improving Alzheimer’s symptoms were either halted or discontinued. (Devlin, 2015) Most of the tested drugs were making patients sicker, not better, and came with appalling side effects.

With pharmaceutical approaches failing, many clinicians and researchers are turning to nutrition to find their answers. Accumulating study results show that nutrition has profound effects on brain health. The brain functions at a high metabolic rate and uses a substantial portion of total nutrient intake. It relies on amino acids, fats, vitamins, minerals, and trace elements. These influence both brain structure and function. Nutrition also contributes to neuron plasticity and repair, key functions for mental health and well-being over the long term.

A collaborative research project funded by the National Institute on Aging recently found that individuals on a whole foods diet, rich in items like berries, leafy greens and fish, are at less of a risk for Alzheimer’s disease. (RUMC, 2015) Essential fatty acids such as omega-3s are also being studied at several large universities for their role in supporting brain health. Other experts have called Alzheimer’s disease “type 3 diabetes,” pointing to excess sugar intake as a major contributor to the disorder. The overlap between nutrition and cognitive function is becoming more widely accepted in the world of neurology.

Lithium: The Unlikely Treatment
One mineral that has shown great promise in the treatment of Alzheimer’s disease is lithium, a nutrient with established benefits for the treatment of mental health disorders.

Lithium salts have been used for centuries as a popular health tonic. Over the course of history this simple mineral has been applied to heal ailments as wide-ranging as asthma, gout, and migraines. Lithium springs were once sought-after health destinations, visited by authors, political figures and celebrities. Throughout the 19th and into the 20th century, lithium was used as a mineral supplement to fortify a variety of foods and beverages. The Sears, Roebuck & Company Catalogue of 1908 advertised Schieffelin’s Effervescent Lithia Tablets for a variety of afflictions. By 1907, The Merck Index listed 43 different medicinal preparations containing lithium. In 1929, a soft drink inventor named Charles Leiper Grigg even created a new lithiated beverage he called Bib-Label Lithiated Lemon-Lime Soda, now known as “7-Up.” The beverage contained lithium citrate until 1950, and was originally known and marketed for its potential to cure hang-overs after a night of drinking alcohol, and to lift mood.

Today lithium is still found naturally in food and water. The U.S. Environmental Protection Agency has estimated that the daily lithium intake of an average adult ranges from about 0.65 mg to 3 mg. Grains and vegetables serve as the primary sources of lithium in a standard diet, with animal byproducts like egg and milk providing the rest. Lithium has even been officially added to the World Health Organization’s list of nutritionally essential trace elements alongside zinc, iodine and others. 

In modern medicine, lithium is most widely acknowledged for its ability to encourage mood stability in patients with affective disorders. With years of research and clinical use to back it, a substantial body of evidence now exists to show that high-dose lithium restores brain and nervous system function, right down to the molecular level. This incredible mineral is now being considered for the treatment of cognitive decline.

Scientists first became interested in the use of lithium for treating neurodegenerative disorders when they observed that bipolar patients using lithium therapy seemed to have lower rates of cognitive decline than peers on other medications. In an attempt to figure out the legitimacy of this observation, one study compared the rates of Alzheimer's disease in 66 elderly patients with bipolar disorder and chronic lithium therapy, with the occurrence in 48 similar patients who were not prescribed the mineral. Findings in favor of lithium were staggering: patients receiving continuous lithium showed a decreased prevalence of Alzheimer’s disease (5%) as compared with those in the non-lithium group (33%). (Nunes et al., 2007) Two further studies in Denmark confirmed this phenomenon using different study designs, but achieving strikingly similar results. In this study series, investigators surveyed the records of over 21,000 patients who had received lithium treatment, and found that therapy was associated with decreased levels of both dementia and Alzheimer’s. (Kessing et al., 2008, 2010)

Unfortunately, the first clinical trials testing lithium with dementia patients proved disappointing. Researchers attempted to fit lithium into the same diagnostic treatment framework used by drug companies in the beginning: testing the therapy on patients who already had fully developed Alzheimer’s. At this point, the damages to the brain were simply too great to turn around.

One small, open-label study looked at low-dose lithium use in 22 Alzheimer’s disease patients over the course of one year. (MacDonald et al., 2008) While researchers concluded that prescription lithium salts were relatively safe in this population, there were no observed cognitive benefits. The small baseline sample size, coupled with a high discontinuation rate, may have been to blame for these discouraging results. It may also have been too late for lithium to make a difference in these advanced stages of illness.

Another multi-center, single-blind study looked at the use of lithium sulfate in participants with mild Alzheimer’s disease over a 10-week period. (Hampel et al., 2009) They too failed to find significant effects of lithium treatment on cognitive performance or related biomarkers. One major issue with this trial, however, was the length of observation. It likely takes months, not weeks, to see substantial cognitive shifts in patients.

A group led by Forlenza et al. (2011) sought to correct for these initial design flaws. Focus was shifted away from the post-diagnosis period and settled on prevention. This unique study attempted to determine whether long-term lithium treatment could stop Alzheimer’s disease from occurring in high-risk individuals. Forty-five participants with mild cognitive impairment (MCI), a precursor to Alzheimer’s, were randomized to receive lithium or a placebo. Over the 12-month trial, lithium dosages were kept at sub-therapeutic levels (150 mg to 600 mg daily) to minimize potential side effects. At the conclusion of the study, researchers discovered that those in the lithium group had a decreased presence of destructive tau proteins when compared to pre-study levels. This finding came in stark contrast to the tau levels of the placebo group, which had increased steadily over the course of the study. What’s more, the lithium group showed improved performance on multiple cognitive scales. Overall tolerability of lithium was deemed good, as patients reported limited side effects and the adherence rate to treatment was an impressive 91%. Researchers concluded that lithium had a significant disease-modifying impact on preventing dementia and Alzheimer’s disease when initiated early in the disease progression.

The Promise of Low-Dose Lithium
Additional testing has found that lithium can be effective when used at low doses or supplemental levels, similar to those found naturally in water and foods. Studies are beginning to show that the benefits of pharmaceutical lithium (used at an average of 600-1200 mg daily) can be achieved with much smaller and safer doses (between 1 and 20 mg). When lithium is used at these low or nutritional doses, the risks of side effects plummet.

Evidence pointing to the usefulness of low-dose lithium has come primarily from epidemiological studies conducted by geology specialists and other professionals. Eleven different studies have looked at lithium levels in the drinking water from various regions throughout the globe. Two dozen counties in Texas, the 100 largest American cities, and 99 districts in Austria have been considered, alongside other locations in Greece and Japan. (Dawson, 1970; Schrauzer & Shrestha, 1990; Kapusta et al., 2011; Kabacs et al., 2011; Giotakos et al., 2015; Sugawara et al., 2013) Lithium levels in the water have been compared to rates of behavioral issues (including psychiatric admissions, suicide, homicide, and crime), medical illnesses and overall mortality in these areas. Collectively the studies have analyzed outcomes in well over 10 million subjects. In 9 of the 11 studies, a positive association between higher lithium levels and beneficial behavioral, legal and medical outcomes was observed. In each of the negative studies, levels of lithium were likely too low to yield any significant health effects. (Mauer et al., 2014)
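These reports are, in essence, ecological correlation studies: region-level lithium concentrations in drinking water are compared against region-level outcome rates. A minimal sketch of that style of analysis in Python is shown below; the regional values are purely hypothetical placeholders, not data from any of the studies cited above.

    # Toy ecological correlation between drinking-water lithium and an outcome rate.
    # All values are hypothetical placeholders for illustration only.
    from scipy.stats import spearmanr

    water_lithium_ug_per_L = [3, 11, 27, 40, 80, 120, 170]         # hypothetical regions
    suicide_rate_per_100k  = [14.2, 13.1, 12.5, 11.8, 10.9, 10.2, 9.7]

    rho, p_value = spearmanr(water_lithium_ug_per_L, suicide_rate_per_100k)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
    # A negative rho would be consistent with the inverse association reported in most
    # of the eleven studies; real analyses also adjust for confounders (income,
    # population density, altitude, and so on), which this toy example does not.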

These studies have spurred interest in the clinical applications of low-dose lithium, although trials have been slow in coming. Because lithium is a naturally occurring mineral and is not patentable (and therefore not profitable), little financial backing has been put towards the cause. In one highly regarded study published in Current Alzheimer Research, however, a scant 0.3 mg of lithium was administered once daily to Alzheimer’s patients for 15 months. (Nunes, 2013) Those receiving lithium demonstrated stable cognitive performance scores throughout the duration of the study, while those in the control group suffered progressive declines.  Moreover, three months into the study, the seemingly impossible happened: the lithium treatment cohort began showing increasing mini-mental status scores.

Additional, high-quality trials using low-dose lithium are essential, especially in the realm of dementia and cognitive decline.

Key Neuroprotective Mechanisms
There is now clear scientific evidence to suggest not only that lithium protects the brain, but also how it does so. Lithium ions (at both high and low concentrations) have been shown to modify key cellular cascades that increase neuronal viability and resilience. Most prominently, lithium disrupts the key enzyme responsible for the development of the amyloid plaques and neurofibrillary tangles associated with Alzheimer’s disease. This enzyme is called glycogen synthase kinase-3 (GSK-3), a serine/threonine protein kinase that normally plays a major role in neural growth and development. In the healthy brain, GSK-3 is very important; it helps to carry out the synaptic remodeling that drives memory formation.

In Alzheimer’s disease, however, GSK-3 becomes hyperactive in the areas of the brain that control cognition and behavior, including the hippocampus and frontal cortex. When “revved up” in this way, GSK-3 phosphorylates, or activates, amyloid-β and tau proteins within the neurons. Eventually, these proteins accumulate and create the signature plaques and neurofibrillary tangles that disrupt brain function and result in symptoms of cognitive decline. Lithium works as a direct GSK-3 inhibitor to prevent this over-expression, halting inappropriate amyloid production and the hyper-phosphorylation of tau proteins before they become problematic. (Hooper et al., 2008; Wada, 2009)

In addition to protecting the brain from the development of plaques and tangles, lithium has been shown to repair existing damage brought on by the Alzheimer’s disease pathogenesis. Lithium ions, for example, encourage the synthesis and release of key neurotrophic factors such as brain-derived neurotrophic factor (BDNF) and neurotrophin-3 (NT-3), which in turn stimulate the growth and repair of neurons. (Leyhe et al., 2009) Patients on lithium have been found to have significantly higher gray matter volumes in the brain, hinting that lithium has powerful stimulatory effects on neurogenesis. One study has even directly demonstrated that damaged nerve cells exposed to lithium respond with increases in dendritic number and length. (Dwivedi & Zhang, 2014)

Conclusion
Alzheimer’s and dementia have become modern health problems of epidemic proportions. Nonetheless, relatively few pharmacological solutions have been discovered for preventing, treating and reversing associated cognitive decline. As conventional treatment approaches falter, clinicians and researchers have been turning more and more to natural alternatives. It has become increasingly evident that nutrition is a key factor when it comes to brain health.

Evidence suggests that the mineral lithium in particular, may play a major role in shifting the pathophysiological cascade associated with dementia and Alzheimer’s disease. In clinical studies, long-term lithium therapy has been found to decrease the problematic plaques and tangles leading to symptoms of cognitive decline. This powerful mineral acts by inhibiting damaging enzymes and stimulating the release of protective neurotrophic factors in the brain.

Lithium ions have been found to operate efficiently at low doses mimicking those found in nutritional sources. At these sub-pharmaceutical levels, lithium has been shown to be a beneficial and safe neuroprotective therapy across age groups and with minimal side effects.

The safety profile of low-dose lithium is particularly attractive, as prevention strategies for dementia are most effective when started early and continued for long periods of time. The dangerous plaques and tangles involved in Alzheimer’s disease start up to 40 years before the appearance of symptoms. What’s more, 10% of healthy 50 year olds already have amyloid deposits developing in the brain tissues. Thus, for optimal effectiveness, steps to protect the brain must be taken at a much younger age than previously thought.

When started early, low-dose lithium may be the key intervention to prevent cognitive decline. But first, we must move past the stigma that surrounds it. As psychiatrist Anna Fels wrote in her recent article for The New York Times, “[o]ne could make a case that lithium is the Cinderella of psychotropic medications, neglected and ill used.” Lithium is the single most proven substance for keeping neurons alive, and yet it continues to be viewed in the public mind as a dangerous and scary drug. Lithium is found readily in our environment, food, water and each and every cell in the human body. It is time we change the conversation around one of nature’s most effective and powerful neuroprotective remedies.


References

  1. Da Silva, J., et al. 2013. Affective disorders and risk of developing dementia: systematic review. Br J Psychiatry 202:177-186
  2. Dawson, E.P., Moore, T.D. & McGanity, W.J. 1970.  The mathematical relationship of drinking water lithium and rainfall on mental hospital admission. Dis Nerv Syst 31:1–10.
  3. Devlin, H. 2015. Scientists find first drug that appears to slow Alzheimer’s Disease. The Guardian. Retrieved from: http://www.theguardian.com/science/2015/jul/22/scientists-find-first-drug-slow-alzheimers-disease.
  4. Dwivedi, T. & Zhang, H. 2014. Lithium-induced neuroprotection is associated with epigenetic modification of specific BDNF gene promoter and altered apoptotic-regulatory proteins. Front Neurosci 8:1-8.
  5. Fels, A. 2014. Should we all take a bit of lithium? The New York Times. Retrieved from: http://www.nytimes.com/2014/09/14/opinion/sunday/should-we-all-take-a-bit-of-lithium.html
  6. Forlenza, O. V., et al. 2011. Disease-modifying properties of long-term lithium treatment for amnestic mild cognitive impairment: randomized controlled trial. Br J Psychiatry 198:351-365.
  7. Giotakos, O., et al. 2015.  Lithium in the public water supply and suicide mortality in Greece. Biol Trace Elem Res 156(1–3):376–379.
  8. Hampel, H., et al. 2009. Lithium trial in Alzheimer’s disease: a randomized, single-blind, placebo- controlled, multicenter 10-week study. J Clin Psychiatry 70 (6): 922-31.
  9. Hooper, C., Killick, R., Lovestone, S. 2008. The GSK3 hypothesis of Alzheimer’s Disease. J Neurochem 104, 1433-1439.
  10. Kapusta N.D., et al. 2011. Lithium in drinking water and suicide mortality. Br J Psychiatry.198(5):346–350.
  11. Kessing, L. V., et al. 2008. Lithium treatment and risk of dementia. Arch Gen Psychiatry 65(11):1331-1335.
  12. Kessing, L.V., Forman, J.L., & Andersen, P.K. 2010. Does lithium protect against dementia? Bipolar Disord 12(1): 87-94.
  13. Langbaum, J.B.S., et al. 2013. Ushering in the study and treatment of preclinical Alzheimer’s Disease. Nat Rev Neurol 9(7): 371-381.
  14. Leyhe, T., et al. 2009. Increase of BDNF serum concentration in lithium treated patients with early Alzheimer’s Disease. J Alzheimers Dis 16:649-656.
  15. Macdonald, A., et al. 2008. A feasibility and tolerability study of lithium in Alzheimer’s disease. Int J Geriatr Psychiatry 23 (7): 704-11.
  16. Mauer, S., Vergne, D.,  & Ghaemi, S. N. 2014. Standard and trace-dose lithium: A systematic review of dementia prevention and other behavioral benefits. Aust NZ J Psychiatry Retrieved from: http://anp.sagepub.com/content/early/2014/06/10/0004867414536932
  17. Nunes, M. A., Viel, T. A., Buck, H. S. 2013. Microdose lithium treatment stabilized cognitive impairment in patients with Alzheimer’s Disease. Curr Alzheimer Res 10, 104-107.
  18. Nunes, P. V., Forlenza, O. V., Gattaz, W. F. 2007. Lithium and risk for Alzheimer’s disease in elderly patients with bipolar disorder. Br J Psychiatry 190:359-60.
  19. Rush University Medical Center. 2015. Diet may help prevent Alzheimer’s: MIND diet rich in vegetables, berries, whole grains, nuts. Retrieved from: https://www.rush.edu/news/diet-may-help-prevent-alzheimers
  20. Schrauzer, G.N., & Shrestha, K.P. 1990. Lithium in drinking water and the incidences of crimes, suicide and arrests related to drug addiction. Biol Trace Elem Res 25: 105-113.
  21. Sugawara, N., et al. 2013. Lithium in tap water and suicide mortality in Japan. Int J Environ Res Public Health. 10(11):6044–6048.
  22. Visser P.J., et al. 2015. Prevalence of cerebral amyloid pathology in persons without dementia. JAMA 313(19):1924-1938.
  23. Wada, A. 2009. Lithium and neuropsychiatric therapeutics: neuroplasticity via glycogen synthase kinase-3β, β-catenin and neurotrophin cascades. J Pharmacol Sci 110, 14-28.
  24. World Health Organization. 2015. Facts and Figures: Dementia. Retrieved from: http://www.who.int/mediacentre/factsheets/fs362/en/.

The Importance of Genetic Testing in Mental Health

Matthew Pratt-Hyatt, PhD
Associate Laboratory Director, The Great Plains Laboratory, Inc.

Personalized medicine has been called the future of medicine since the inception of the Human Genome Project (HGP) in the early 1990s. The dream of personalized healthcare is to use genetic testing to understand a patient’s predisposition for developing different conditions, and then to use molecular diagnostic tests to determine how the environment is interacting with these genes. Genetic testing has become a much more economical tool with the advent of Next Generation Sequencing (NGS) technology. Even though traditional medicine has mostly followed the philosophy that one size fits most, functional medicine says that each person is unique and deserves unique care. That is why The Great Plains Laboratory, Inc. has developed our new genetic test, GPL-SNP1000, which now allows us to have a more complete picture of what contributes to a patient’s health status, including mental health.

GPL-SNP1000 looks for mutations in over 140 genes and over 1,000 different SNPs (single-nucleotide polymorphisms) and is a very useful tool for everyone working in the fields of functional and integrative medicine. GPL-SNP1000 looks at genes and SNPs in nine specific pathways that we believe are most important to integrative medicine. Three of the nine gene groups we analyze (the mental health group, the autism risk group, and the drug metabolism group) are particularly valuable for those working in the mental health field, helping guide practitioners in both diagnoses and more personalized treatment.

For mental health, we analyze 88 SNPs across 14 different genes. The mental health genes include CaMkk, ELOVL6, MAOA, COMT, DAOA, SHMT1, AHCY, GAMT, MAT2B, MAT1A, MTRR, MUT, and MTR. Some of the important mutations in these genes are:

COMT: Catechol-O-methyltransferase (COMT) is present in the body in two different forms. The short form is called soluble catechol-O-methyltransferase (S-COMT). The longer form is called membrane-bound catechol-O-methyltransferase (MB-COMT). MB-COMT is mainly present in the nerves of the brain, while S-COMT is located in the liver, kidney, and blood. In the brain, MB-COMT is responsible for degrading neurotransmitters called catecholamines, which include dopamine, epinephrine, and norepinephrine. GPL-SNP1000 analyzes six different SNPs for COMT. Conditions associated with these mutations include OCD, depression, and schizophrenia.

MAOA: Monoamine oxidase A is important for the metabolism of biogenic amines such as the neurotransmitters dopamine, norepinephrine, and serotonin. Patients with mutations in this gene can have Norrie disease (an eye disease that causes blindness in males at birth or soon after), severe intellectual disability, autistic behaviors, and seizures. Mutations to this gene have also been linked to depression, borderline personality disorder, and bipolar disorder.

The autism risk genes are another group that would be important to practitioners of mental health and integrative medicine, especially those who focus on pediatrics. We looked at many different studies to determine which mutations are more commonly found in autistic patients, but not found in the neurotypical, non-autistic public. We selected 252 SNPs in 33 genes that cover many different pathways including glucose metabolism, ion and calcium channels, DNA transcription regulation, and autoimmune system genes. If a patient has one of these mutations, it does not mean that he/she will develop Autism Spectrum Disorder, but their risk for developing ASD may be higher than that of the general public.

The other group of genes that could be of great use in mental health is the cytochrome P450 drug metabolizers. Even though many functional practitioners are trying to move away from using pharmaceuticals, antidepressants, neuroleptics, and beta-blockers are still some of the most commonly used medications. The P450 enzymes metabolize 75% of all medications. However, many of these enzymes have possible mutations that could affect their efficacy and safety. Over 100,000 hospitalizations occur annually because of adverse drug reactions. GPL-SNP1000 looks at 241 SNPs covering all of the major mutations that could cause a decrease in drug efficacy and safety. A recent study indicated that genetic tests could reduce the adverse drug reactions for some medications by as much as 66%.
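To make the practical use of such P450 results more concrete, the sketch below illustrates, in Python, the general kind of activity-score logic often used to translate a CYP2D6 genotype into a metabolizer phenotype. The allele scores and phenotype cutoffs shown are illustrative approximations only, not the scoring or reporting algorithm used by GPL-SNP1000.

    # Hypothetical CYP2D6 activity-score lookup -- illustrative values only,
    # not the scoring used by GPL-SNP1000 or any particular guideline.
    ALLELE_ACTIVITY = {
        "*1": 1.0,    # normal function
        "*2": 1.0,    # normal function
        "*4": 0.0,    # no function
        "*5": 0.0,    # gene deletion, no function
        "*10": 0.25,  # decreased function
        "*41": 0.5,   # decreased function
    }

    def cyp2d6_phenotype(allele_a, allele_b):
        """Classify a diplotype into a metabolizer phenotype from its activity score."""
        # Gene duplications would add to the score; they are omitted here for brevity.
        score = ALLELE_ACTIVITY[allele_a] + ALLELE_ACTIVITY[allele_b]
        if score == 0:
            return "poor metabolizer"
        if score < 1.25:
            return "intermediate metabolizer"
        if score <= 2.25:
            return "normal metabolizer"
        return "ultrarapid metabolizer"

    print(cyp2d6_phenotype("*4", "*4"))   # -> poor metabolizer
    print(cyp2d6_phenotype("*4", "*10"))  # -> intermediate metabolizer
    print(cyp2d6_phenotype("*1", "*2"))   # -> normal metabolizer

A poor or intermediate metabolizer result for an enzyme that clears a given antidepressant is the kind of finding that would prompt a prescriber to consider a lower dose or an alternative drug.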

The new genetic test from The Great Plains Laboratory, Inc. will be a great tool for all healthcare practitioners, but especially those practicing in the mental health field. We hope that you’ll make great use of it to deliver more personalized diagnoses and treatments for your patients. For more information about GPL-SNP1000, please visit our website or contact us and ask to speak with one of our lab scientists or consultants. www.GreatPlainsLaboratory.com


References:

1. Biello D, Harmon K. Tools for Life. Sci Am. 2010;303:17-18.
2. Marian AJ. Sequencing your genome: what does it mean? Methodist Debakey Cardiovasc J. 2014;10(1):3-6.
3. McCarthy DJ, Humburg P, Kanapin A, et al. Choice of transcripts and software has a large effect on variant annotation. Genome Med. 2014;6(3):26.
4. Sanger F, Nicklen S, Coulson AR. DNA sequencing with chain-terminating inhibitors. Proc Natl Acad Sci U S A. 1977;74(12):5463-5467.
5. Livak KJ, Flood SJ, Marmaro J, Giusti W, Deetz K. Oligonucleotides with fluorescent dyes at opposite ends provide a quenched probe system useful for detecting PCR product and nucleic acid hybridization. PCR Methods Appl. 1995;4(6):357-362.
6. Shi MM, Myrand SP, Bleavins MR, de la Iglesia FA. High throughput genotyping for the detection of a single nucleotide polymorphism in NAD(P)H quinone oxidoreductase (DT diaphorase) using TaqMan probes. Mol Pathol. 1999;52(5):295-299.
7. Lin B, Wang J, Cheng Y. Recent Patents and Advances in the Next-Generation Sequencing Technologies. Recent Pat Biomed Eng. 2008;2008(1):60-67.
8. Wiemels JL, Smith RN, Taylor GM, et al. Methylenetetrahydrofolate reductase (MTHFR) polymorphisms and risk of molecularly defined subtypes of childhood acute leukemia. Proc Natl Acad Sci U S A. 2001;98(7):4004-4009.
9. Deloughery TG, Evans A, Sadeghi A, et al. Common mutation in methylenetetrahydrofolate reductase. Correlation with homocysteine metabolism and late-onset vascular disease. Circulation. 1996;94(12):3074-3078.
10. Craddock N, Owen MJ, O'Donovan MC. The catechol-O-methyl transferase (COMT) gene as a candidate for psychiatric phenotypes: evidence and lessons. Mol Psychiatry. 2006;11(5):446-458.
11. Guengerich FP. Mechanisms of drug toxicity and relevance to pharmaceutical development. Drug Metab Pharmacokinet. 2011;26(1):3-14.
12. Ingelman-Sundberg M. Genetic polymorphisms of cytochrome P450 2D6 (CYP2D6): clinical consequences, evolutionary aspects and functional diversity. Pharmacogenomics J. 2005;5(1):6-13.
13. Ingelman-Sundberg M. Genetic susceptibility to adverse effects of drugs and environmental toxicants. The role of the CYP family of enzymes. Mutat Res. 2001;482(1-2):11-19.
14. Kalra BS. Cytochrome P450 enzyme isoforms and their therapeutic implications: an update. Indian J Med Sci. 2007;61(2):102-116.
15. Rutter M. Incidence of autism spectrum disorders: changes over time and their meaning. Acta Paediatr. 2005;94(1):2-15.
16. Sanders SJ, He X, Willsey AJ, et al. Insights into Autism Spectrum Disorder Genomic Architecture and Biology from 71 Risk Loci. Neuron. 2015;87(6):1215-1233.
17. Iossifov I, O'Roak BJ, Sanders SJ, et al. The contribution of de novo coding mutations to autism spectrum disorder. Nature. 2014;515(7526):216-221.
18. De Rubeis S, He X, Goldberg AP, et al. Synaptic, transcriptional and chromatin genes disrupted in autism. Nature. 2014;515(7526):209-215.
19. Hall BM, Walsh JC, Horvath JS, Lytton DG. Peripheral neuropathy complicating primary hyperoxaluria. J Neurol Sci. 1976;29(2-4):343-349.
20. Poore RE, Hurst CH, Assimos DG, Holmes RP. Pathways of hepatic oxalate synthesis and their regulation. Am J Physiol. 1997;272(1 Pt 1):C289-294.

Clostridium difficile - The Role of Toxin A and B in its Pathogenicity

Kurt Woeller, DO

There are approximately 100 species of clostridia bacteria that can inhabit the gastrointestinal tract of humans. Not all of these clostridia are disease causing, but a certain few can lead to serious illness in susceptible individuals. There are five main species of clostridia known to cause disease: Clostridium botulinum, Clostridium perfringens, Clostridium tetani, Clostridium sordellii, and Clostridium difficile. This article will focus on the role of Clostridium difficile (C. difficile) and its production of various gastrointestinal toxins in human illness.

What Are Clostridia Bacteria?
Clostridium difficile, like all clostridia bacteria, is an obligate anaerobe. This means it is an organism that thrives in an oxygen-devoid environment and is susceptible to being killed by normal atmospheric oxygen. It is unique in its ability to survive in hostile environments in part because of its spore development. Clostridial spores, the dormant and highly resistant form of the bacteria, have thick cell walls which resist heat and antimicrobial compounds. These spores are highly contagious and can be spread person to person, even from individuals without symptoms of clostridia overgrowth.

C. difficile is a complex species of clostridia because of the various toxins it can produce. There are certain strains of C. difficile that produce compounds known to alter mitochondrial function by interfering with various steps in Krebs cycle metabolism, decreasing the amount of nicotinamide adenine dinucleotide (NADH) available to the electron transport chain for adenosine triphosphate (ATP) production (1). Other C. difficile strains create neurochemical compounds that disrupt dopamine production, leading to various issues with regard to mental health. However, the most commonly known toxins produced by C. difficile are those that disrupt gastrointestinal function and, in some individuals, can lead to serious health problems.

The Prevalence of C. difficile Associated Disease
The rates of C. difficile infections leading to serious illness and death have been on the rise. According to the Centers for Disease Control and Prevention, as recently as 2011 there were approximately 450,000+ documented cases of Clostridium difficile infection (CDI) and upwards of 29,000 deaths (2). A large number of individuals who have an initial episode of CDI will develop at least one recurrence of the disease. The recurrence rates for CDI are high, in part because of the complex nature of C. difficile and its spore forms that resist antibiotic intervention.

One of the problems with C. difficile is not only that it can lead to serious illness in susceptible individuals, but that it can be found in people, even children, who may not necessarily be suffering from primary issues related to C. difficile. For example, a 2010 paper out of Poland (3) discussed the prevalence of C. difficile in fecal samples taken from 178 children, ages 2 months to 2 years, who were hospitalized for a variety of reasons. Their stools were examined for the presence of C. difficile Toxin A and B, the two main intestinal toxins known to trigger chronic diarrhea and bowel inflammation. The percentage of children infected with C. difficile was 68.6%, and many of these children were not acutely sick from C. difficile. However, as mentioned previously, toxin A and toxin B from certain strains of C. difficile can lead to serious problems.

The Role of Toxin A and Toxin B
These two toxins are the main virulence factors related to mucosal damage from C. difficile. Toxin A and B are capable of causing mucosal damage resulting in digestive tract inflammation, leading to either Clostridium difficile-associated diarrhea (CDAD) or pseudomembranous colitis (4).

Toxin A is categorized as an enterotoxin, which means it is a toxin released by microorganisms that targets the digestive system. It functions by altering host cell metabolism and tight junction formation. This can lead to mucosal cell damage, fluid accumulation and even cell death. Toxin A is considered to be the main cause of CDAD, as it causes destruction of the intestinal villi and brush border. In severe cases, Toxin A production can lead to the ulceration seen in pseudomembranous colitis.

Pseudomembranous colitis is a type of inflammatory bowel disease of the colon that manifests with various ulcerations from mucosal damage and the development of a “pseudo” membrane (also called an inflammatory membrane) that overlays the site of mucosal injury. This inflammatory membrane is an accumulation of fibrin along with inflammatory and necrotic cells that appears as yellowish globules spread throughout the colon. In the late 1970s it was determined that C. difficile, via the production of various toxins, was the causative organism for pseudomembranous colitis.

Like Toxin A, Toxin B also plays a significant role in damage to the mucosal lining of the digestive system. Toxin B is categorized as a cytotoxin, which means it is toxic to cells. Examples of cytotoxins include chemicals produced by the immune system that damage other cells in the body. Bee venom, as well as venoms from spiders and snakes, are also classified as cytotoxins.

Toxin B causes major cellular disruption by interfering with signaling pathways, tight junction formation, formation of the cytoskeleton of the cell, and overall cell structure. It is a major virulence factor of C. difficile, leading to vascular swelling and hemorrhaging. Also, Toxin B can have not only local inflammatory effects in the digestive system but also systemic effects through the proinflammatory cytokines it induces, such as tumor necrosis factor-alpha.

It was felt for many years that serious bowel inflammation from C. difficile was generated by a single toxin, but both toxins are now known to be capable of causing mucosal damage.

Treatment and Testing
Treatment of C. difficile infections is mostly done with antibiotics. The two most common antibiotics are oral Flagyl (metronidazole) and oral Vancocin (vancomycin). Traditional intervention calls for 7 to 10 days of either antibiotic. As mentioned previously, recurrence rates for C. difficile can be high, primarily because of C. difficile spore formation. There is a trend in C. difficile treatment to use cyclical courses of either antibiotic to aid in reducing recurrence rates and improving clinical outcomes.

Stool testing for C. difficile is effective and is used as a primary diagnostic tool, analyzing for the presence of Toxin A and Toxin B. Prior to stool testing, diagnosis relied on direct visualization of inflammatory membranes through colonoscopy or sigmoidoscopy.

If either Toxin A or B is detected on stool pathogen screening, the practitioner needs to correlate the finding with the clinical presentation of the individual and treat accordingly, or refer to a specialist for further evaluation. Not everyone with C. difficile will be symptomatic of intestinal disease, so each situation needs to be evaluated individually. However, the presence of Toxin A or Toxin B on stool pathogen testing certainly documents the presence of a toxin-producing strain of C. difficile, and the individual should be treated appropriately.

Dr. Kurt N. Woeller is an author, international speaker, practicing clinician and founder of Integrative Medicine Academy (www.IntegrativeMedicineAcademy.com), which is an online training academy that provides various courses for health practitioners interested in integrative medicine.


REFERENCES

1. Frye RE, et al. Gastrointestinal dysfunction in autism spectrum disorders: the role of the mitochondria and the enteric microbiome. Microb Ecol Health Dis. 2015;26.

2. Lessa FC, et al. Burden of Clostridium difficile infection in the United States. N Engl J Med. 2015;372:825-834.

3. Prevalence of Clostridium difficile in the gastrointestinal tract of hospitalized children under two years of age. Med Dosw Mikrobiol. 2010;62(1):77-84. (Poland)

4. Kuehne SA, Cartman ST, Heap JT, Kelly ML, Cockayne A, Minton NP. The role of toxin A and toxin B in Clostridium difficile infection. Nature. 2010;467(7316):711-713.

Rickets and dangerous eye-poking behavior in autism associated with calcium deficiency: Preventing and detecting deficiency with a simple urine test for calcium and magnesium

William Shaw, PhD

Failure to provide adequate calcium to persons on the autistic spectrum is very dangerous and could lead to the loss of the eyes due to severe eye-poking behavior. This is an especially important topic because some individuals, such as Amy Yasko, warn that calcium may cause overstimulation of neurons. Every element in our food and drink, including water, may cause death with excess intake, but you will not find skull-and-crossbones warnings on bottled water at the supermarket. The most relevant question is: how much calcium in the diet and in supplements is excessive?

Calcium deficiency can be a severe problem in normal children on a milk-free and dairy-free diet, since milk is a significant source of protein, vitamin D, and calcium needed for strong bones and teeth. Some physicians have reported that rickets (1), a severe bone deformity, occurred in children with autism on the gluten- and casein-free diet who did not receive added calcium supplements. Calcium and vitamin D supplementation is essential for children on a casein-free diet, since most children with autism do not eat substantial amounts of other calcium-rich foods. Failure to provide adequate calcium to children on casein-free diets leads physicians to view such parents as negligent and ignorant, and leads to skepticism about other nonstandard treatments for autism.

Children with autism may have an even more severe problem with calcium deficiency. Mary Coleman, M.D. (2) reported that children with autism who are calcium deficient are much more likely to poke out their eyes, and a substantial number of children with autism have done so. I have talked to numerous parents whose children with autism began to touch their eyes after starting the casein-free diet. This abnormal behavior is associated with low urine calcium; blood calcium levels were usually normal. Parathyroid hormone, calcitonin, and vitamin D were all normal in these patients with autism, but all of them had low urine calcium. Treatment with calcium supplementation prevents this behavior, but dietary supplementation with high-calcium foods does not. (I suspect that this behavior is due to increased eye pain from high deposits of oxalate crystals in the eye. Oxalates are high in urine samples of children with autism and can deposit in many tissues, including the eyes. Low calcium may act to intensify this pain, and poking out the eye relieves the pain.) Dr. Coleman also found that speech developed very quickly after calcium supplementation in a portion of mute children with autism who had low urine calcium. In one case, according to a parent who contacted me, her child with autism persisted in poking at the eyes even after one eye had been partially poked out and surgically re-implanted. Calcium supplementation stopped this behavior immediately. I am aware of many other children with eye-poking behavior in which calcium supplements stopped the behavior in less than two days. Verbal autistic children say that their eye pain is severe and that calcium supplementation stopped their pain quickly. In Coleman’s study of 78 children with autism, 20% had urine calcium values two standard deviations below the normal range for children of the same age. Clearly, this extremely low group requires supplementation with calcium. I would recommend calcium supplementation for any child whose urine calcium is below the mean value for normal children of the same age.
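A minimal Python sketch of the screening logic just described (flag a child whose urine calcium is more than two standard deviations below the age-matched mean, or simply below the mean) is shown below. The reference mean and standard deviation are hypothetical placeholders, not published norms, and the output is illustrative rather than a clinical recommendation.

    # Hypothetical urine calcium screening logic; reference values are placeholders.
    def urine_calcium_flag(value, ref_mean, ref_sd):
        """Return a supplementation flag based on distance from the age-matched mean."""
        z = (value - ref_mean) / ref_sd
        if z <= -2:
            return "markedly low (>2 SD below mean) -- supplementation clearly indicated"
        if value < ref_mean:
            return "below mean -- consider calcium supplementation"
        return "at or above mean"

    # Example with made-up reference values (units: mg calcium per g creatinine).
    print(urine_calcium_flag(value=40, ref_mean=120, ref_sd=35))   # markedly low
    print(urine_calcium_flag(value=100, ref_mean=120, ref_sd=35))  # below mean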

Magnesium research in autism is often combined with research on vitamin B6, since the two nutrients work together in a host of biochemical reactions. In one study in France (4), children on the autism spectrum were given 6 mg of magnesium per kilogram of body weight per day and 0.6 mg of vitamin B6 per kilogram of body weight per day. This supplementation improved autistic symptoms in the following areas: social interactions (23/33), communication (24/33), stereotyped restricted behavior (18/33), and abnormal/delayed functioning (17/33). When the Mg-B6 treatment was stopped, autistic symptoms reappeared within a few weeks. Low magnesium levels may be associated with restlessness, sensitivity to noise, poor attention span, poor concentration, irritability, aggressiveness, and anxiety.
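To make the weight-based dosing arithmetic of that study concrete, the following sketch simply multiplies body weight by the per-kilogram amounts reported above (6 mg/kg/day of magnesium, 0.6 mg/kg/day of vitamin B6). It illustrates the arithmetic only; actual dosing belongs with the treating clinician.

```python
# Arithmetic illustration of the per-kilogram dosing used in the French
# Mg-B6 study: 6 mg/kg/day of magnesium and 0.6 mg/kg/day of vitamin B6.
# Not dosing guidance.

MAGNESIUM_MG_PER_KG_PER_DAY = 6.0
VITAMIN_B6_MG_PER_KG_PER_DAY = 0.6

def daily_mg_b6_doses(weight_kg: float) -> tuple[float, float]:
    """Return (magnesium_mg_per_day, vitamin_b6_mg_per_day) for a body weight."""
    return (weight_kg * MAGNESIUM_MG_PER_KG_PER_DAY,
            weight_kg * VITAMIN_B6_MG_PER_KG_PER_DAY)

# Example: a 20 kg child corresponds to 120 mg magnesium and 12 mg B6 per day
magnesium_mg, b6_mg = daily_mg_b6_doses(20.0)
print(f"magnesium: {magnesium_mg:.0f} mg/day, vitamin B6: {b6_mg:.0f} mg/day")
```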

From a parent- “Our daughter also used to look in the mirror all the time - really up close and wanting to look at herself and poke her eyes. I was so worried about it that I finally put pepper juice on her fingers so she would stop. I know that sounds awful - but she had really gotten bad. Dr. Shaw said that their eyes are hurting so much from lack of calcium. He recommended 1000mg. daily - our daughter was about 43 pounds at the time. I started giving it to her and her eye poking stopped and I noticed that so many of her other stimming behaviors also decreased.”


REFERENCES

1. Hediger ML, England LJ, Molloy CA, Yu KF, Manning-Courtney P, Mills JL. Reduced bone cortical thickness in boys with autism or autism spectrum disorder. J Autism Dev Disord. 2008;38(5):848-856.

2. Coleman M. Clinical presentations of patients with autism and hypocalcinuria. Develop Brain Dys. 1994;7:63-70.

3. Caudarella R, Vescini F, Buffa A, Stefoni S. Citrate and mineral metabolism: kidney stones and bone disease. Front Biosci. 2003;8:s1084-s1106.

4. Mousain-Bosc M, et al. Improvement of neurobehavioral disorders in children supplemented with magnesium-vitamin B6. II. Pervasive developmental disorder-autism. Magnes Res. 2006;19(1):53-62.

5. Fleming CR, et al. The importance of urinary magnesium values in patients with gut failure. Mayo Clin Proc. 1996;71(1):21-24.

Clinical Usefulness of IgG Food Allergy Testing

William Shaw, PhD

Immunoglobulin G (IgG) food allergy testing has advanced considerably since 2003, when the American Academy of Allergy, Asthma, and Immunology published a statement that "Measurement of specific IgG antibodies to foods is also unproven as a diagnostic tool" (1). Most IgG food allergy testing throughout the world is done using the same immunochemical technique. First, soluble food proteins in solution are reacted with a solid phase that chemically binds a variety of proteins; plastic microtiter trays with one to several hundred wells have become the most common solid phase. These trays are washed, dried, and stored for later use. A sample of diluted serum is then added to each well, and antibodies of all types in the diluted serum bind to the specific food molecules attached to the plastic wells of the tray. Next, the plates are washed to remove nonspecific, unbound antibodies from the diluted serum. At this point, food antibodies from all five major immunoglobulin classes (G, A, M, E, and D) may be attached to the food antigens on the plate. The next step confers specificity on the assay: an antiserum from sheep, goats, rabbits, or other animals that specifically binds IgG is added to the microtiter wells, and it binds only to IgG, not to IgA, IgM, IgE, or IgD. This anti-IgG antibody has previously been modified by the attachment of an enzyme that can be measured conveniently, so the amount of enzyme bound to food antigen-IgG complexes on the plate is directly related to how much IgG antibody is attached to a given food. The overall technique is termed enzyme-linked immunosorbent assay (ELISA). If IgG4 is to be measured, an antiserum specific for IgG4 alone must be used in this final step.
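The article does not describe how the bound enzyme signal is turned into a reported antibody level, but a common approach is to read each well's optical density against a standard curve built from calibrator wells. The sketch below assumes simple linear interpolation between hypothetical calibrator points; real laboratories typically fit a four-parameter logistic curve, and the values shown are illustrative only.

```python
# Minimal sketch of ELISA read-out: converting a well's optical density (OD)
# into an IgG antibody level by interpolating against a standard curve.
# Calibrator ODs and antibody units below are hypothetical; real assays
# usually fit a four-parameter logistic curve rather than interpolating.

# (OD, antibody level in arbitrary units) pairs from calibrator wells,
# sorted by increasing OD
STANDARD_CURVE = [(0.05, 0.0), (0.20, 10.0), (0.60, 50.0), (1.20, 100.0), (2.00, 200.0)]

def od_to_antibody_level(od: float) -> float:
    """Linearly interpolate an antibody level from an OD reading,
    clamping values outside the calibrated range."""
    points = STANDARD_CURVE
    if od <= points[0][0]:
        return points[0][1]
    if od >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= od <= x1:
            return y0 + (y1 - y0) * (od - x0) / (x1 - x0)

# Example: an OD of 0.90 falls between the 50- and 100-unit calibrators
print(od_to_antibody_level(0.90))  # 75.0
```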

The clinical usefulness of IgG testing across an array of illnesses is illustrated in an early article by an otolaryngologist who reported that the majority of his patients had substantial health improvements after eliminating foods positive on IgG food allergy tests (2). The overall results demonstrated a 71% success rate, with success defined as at least a 75% improvement in symptoms. Of particular interest was the group of patients with chronic, disabling symptoms unresponsive to other intensive treatments: 70% of these patients obtained 75% or greater improvement, and 20% obtained 100% relief. Symptoms that most commonly improved 75%-100% on the elimination diets included asthma, coughing, ringing in the ears, chronic fatigue, all types of headaches, gas, bloating, diarrhea, skin rash and itching, and nasal congestion. The most common IgG food allergies were cow's milk, garlic, mustard, egg yolk, tea, and chocolate.

The usefulness of IgG food allergy testing for designing customized elimination diets has now been documented in scientific studies. Irritable bowel syndrome (IBS) is a common, costly, and potentially disabling gastrointestinal (GI) disorder characterized by abdominal pain or discomfort with altered bowel habits (e.g., diarrhea, constipation). The major features of IBS are (1) abnormal bowel movements, (2) reduced bowel sensitivity thresholds, and (3) psychological abnormality; many IBS patients have psychological symptoms including depression, anxiety, tension, insomnia, frustration, and hypochondria, along with other psychosocial factors (3). Atkinson et al (4) evaluated a total of 150 outpatients with IBS who were randomized to receive, for three months, either a diet excluding all foods to which they had raised IgG antibodies (by ELISA testing) or a sham diet excluding the same number of foods but not those to which they had antibodies. Patients on the diet dictated by IgG testing had significantly fewer symptoms than those on the sham diet after 120 days on the diets. Patients who adhered closely to the diet had a marked improvement in symptoms, while those with moderate or low adherence to the IgG-dictated diets had a poorer response.

Similar results were obtained by Drisko et al (5), who used both an elimination diet and probiotic treatment in an open-label study of 20 patients with IBS diagnosed at a medical school gastroenterology department. The most frequent positive IgG food reactions were: baker's yeast, 17 of 20 (85%); onion mix, 13 of 20 (65%); pork, 12 of 20 (60%); peanut, 12 of 20 (60%); corn, 11 of 20 (55%); wheat, 10 of 20 (50%); soybean, 10 of 20 (50%); carrot, 9 of 20 (45%); cheddar cheese, 8 of 20 (40%); and egg white, 8 of 20 (40%). Only 5 of 20 reacted to dairy by IgG antibody production; however, the majority of patients reported eliminating dairy before trial enrollment, presumably clearing antigen-antibody complexes prior to testing. Significant improvements were seen in stool frequency, pain, and IBS quality-of-life scores. Imbalances of beneficial flora and dysbiotic flora were identified in 100% of subjects by comprehensive stool analysis; there was a trend toward improvement in beneficial flora after treatment but no change in dysbiotic flora. The one-year follow-up demonstrated continued adherence to the food rotation diet, minimal symptomatic problems with IBS, and a perception of control over IBS. The continued use of probiotics was considered less helpful.

IgG food allergy testing has also proved useful in Crohn's disease. Bentz et al (6) found that an elimination diet dictated by IgG food allergy testing resulted in a marked reduction in stool frequency in a double-blind crossover study comparing the IgG-dictated diet with a sham diet in 40 patients with Crohn's disease. IgG food allergies were significantly elevated compared with normal controls; cheese and baker's yeast (Saccharomyces cerevisiae) allergies were extremely common, with rates of 83% and 84%, respectively. Main et al (7), focusing on the baker's yeast allergy, also found an extremely high prevalence of IgG allergy in patients with Crohn's disease: titers of both IgG and IgA to S. cerevisiae were significantly higher in patients with Crohn's disease than in controls, whereas antibody titers in patients with ulcerative colitis were not significantly different from those in controls. Among the patients with Crohn's disease there was no significant difference in antibody titers between those with disease of the small or large bowel. Since IgG antibodies to S. cerevisiae cross-react with Candida albicans (8), colonization with Candida species might be a trigger for the development of Crohn's disease.

IgG food allergies to wheat, gluten, gliadin, rye, and barley are prevalent in celiac disease. Virtually all patients with celiac disease have elevated IgG antibodies to gliadin if wheat or related grains are currently in their diet. Celiac disease is confirmed by the presence of flattened mucosa lacking villi when a biopsy sample of the small intestine is examined microscopically; a blood test for IgA transglutaminase antibodies is equal in accuracy to the biopsy, with the exception that individuals with IgA deficiency may have false-negative results. However, I would estimate that only 1% of people with elevated IgG antibodies to gliadin and other wheat-related grains have celiac disease. When the confirmatory tests for celiac disease are negative, patients are frequently, and erroneously, advised that they have no problem with wheat. Hadjivassiliou et al argued that it is a significant clinical error to classify wheat allergy through the filter of celiac disease (9), and that celiac disease is instead a subtype of wheat sensitivity. Many of their patients with wheat allergy but without celiac disease had remission of severe neurological illnesses when they adopted a gluten-free diet; in these patients, they proposed, the gluten molecule provokes an autoimmune reaction in the brain rather than in the intestinal tract, likely against the Purkinje cells that predominate in the cerebellum.

A wide range of additional studies has proven the clinical value of IgG antibodies in autism (10), bipolar depression (11), schizophrenia (12), migraine headaches (13), asthma (14), and obesity (15).

Total IgG Versus IgG4 Food Allergy

Immunoglobulin G (IgG) is classified into four subclasses, termed IgG1, IgG2, IgG3, and IgG4. IgGs are composed of two heavy chain-light chain pairs (half-molecules), which are connected via inter-heavy chain disulfide bonds situated in the hinge region (Figure 1). IgG4 antibodies usually represent less than 6% of total IgG antibodies. IgG4 antibodies differ functionally from the other IgG subclasses in their lack of inflammatory activity, which includes a poor ability to induce complement and immune cell activation because of low affinity for C1q (the q fragment of the first component of complement). Consequently, IgG4 has become the preferred subclass for immunotherapy, in which IgG4 antibodies to antigens are increased to reduce severe antigen reactions mediated by IgE. If antigens preferentially react with IgG4 antibodies, they cannot react with the IgE antibodies that might cause anaphylaxis or other severe reactions; thus, IgG4 antibodies are often termed blocking antibodies. Another property of blood-derived IgG4 is its inability to cross-link identical antigens, which is referred to as "functional monovalency". IgG4 antibodies are dynamic molecules: each can exchange a half-molecule specific for one antigen with a heavy-light chain pair from another molecule specific for a different antigen, resulting in bispecific antibodies that are unable to form the large cross-linked complexes that bind complement and cause subsequent inflammation (16). In specific immunotherapy with allergen in allergic rhinitis, for example, increases in allergen-specific IgG4 levels indeed correlate with improved clinical responses. IgG4 antibodies not only block IgE-mediated food allergies but also block the reactions of food antigens with the other IgG subclasses, reducing the inflammatory reactions those subclasses would otherwise cause.

In IgG-mediated food allergy testing, the goal is to identify foods capable of causing inflammation that can trigger a large number of adverse reactions. IgG1, IgG2, and IgG3 are all capable of causing inflammation because these antibodies do not exchange heavy and light chains with other antibodies to form bispecific antibodies; as a result, IgG1, IgG2, and IgG3 antibodies to food antigens can and do form large immune complexes or lattices that fix complement and increase inflammation. The presence of IgG4 antibodies to food antigens indicates immune reactions against those foods, but reactions that will not usually cause inflammation. Testing only for IgG4 antibodies therefore limits the clinician's ability to identify the foods causing significant clinical reactions in their patients. The importance of measuring the other IgG subclasses is highlighted in an article by Kemeny et al (17), who found that IgG1 antibodies to gluten were elevated in all 20 patients with celiac disease studied, while none of the patients had elevated IgG4 antibodies to gluten.

Clinical References

1. Statement of the AAAAI Work Group Report: Current Approach to the Diagnosis and Management of Adverse Reactions to Foods, October 2003. http://www.aaaai.org/ask-the-expert/usefulness-of-measurements-of-IgG-antibody.aspx (Accessed October 27, 2013).
2. Dixon H. Treatment of delayed food allergy based on specific immunoglobulin G RAST testing. Otolaryngol Head Neck Surg. 2000;123:48-54.
3. Sugaya N, Nomura S. Relationship between cognitive appraisals of symptoms and negative mood for subtypes of irritable bowel syndrome. BioPsychoSocial Medicine. 2008;2:9-14.
4. Atkinson W, et al. Food elimination based on IgG antibodies in irritable bowel syndrome: a randomised controlled trial. Gut. 2004;53:1459-1464.
5. Drisko J, Bischoff B, Hall M, McCallum R. Treating irritable bowel syndrome with a food elimination diet followed by food challenge and probiotics. J Am Coll Nutr. 2006;25:514-522.
6. Bentz S, et al. Clinical relevance of IgG antibodies against food antigens in Crohn's disease: a double-blind cross-over diet intervention study. Digestion. 2010;81:252-264.
7. Main J, McKenzie H, Yeaman GR, Kerr MA, Robson D, Pennington CR, Parratt D. Antibody to Saccharomyces cerevisiae (bakers' yeast) in Crohn's disease. BMJ. 1988;297:1105-1106.
8. Schaffer T, Mueller S, Flogerzi B, Seibold-Schmid B, Schoepfer AM, Seibold F. Anti-Saccharomyces cerevisiae mannan antibodies (ASCA) of Crohn's patients crossreact with mannan from other yeast strains, and murine ASCA IgM can be experimentally induced with Candida albicans. Inflamm Bowel Dis. 2007;13:1339-1346.
9. Hadjivassiliou M, Grünewald RA, Davies-Jones GAB. Gluten sensitivity as a neurological illness. J Neurol Neurosurg Psychiatry. 2002;72:560-563.
10. Vladimir T, et al. Higher plasma concentration of food-specific antibodies in persons with autistic disorder in comparison to their siblings. Focus Autism Other Dev Disabl. 2008;23:176-185.
11. Severance EG, et al. Immune activation by casein dietary antigens in bipolar disorder. Bipolar Disord. 2010;12:834-842.
12. Severance EG, et al. Subunit and whole molecule specificity of the anti-bovine casein immune response in recent onset psychosis and schizophrenia. Schizophr Res. 2010;118:240-247.
13. Huber A, et al. Diet restriction in migraine, based on IgG against foods: a clinical double-blind, randomised, cross-over trial. Int Arch Allergy Immunol. 1998;115:67-72.
14. Vance G, et al. Ovalbumin-specific immunoglobulin G and subclass responses through the first five years of life in relation to duration of sensitization and the development of asthma. Clin Exp Allergy. 2004;34:1452-1459.
15. Wilders-Truschnig M, et al. IgG antibodies against food antigens are correlated with inflammation and intima media thickness in obese juveniles. Exp Clin Endocrinol Diabetes. 2008;116:241-245.
16. van der Neut Kolfschoten M, et al. Anti-inflammatory activity of human IgG4 antibodies by dynamic Fab arm exchange. Science. 2007;317:1554-1555.
17. Kemeny DM, et al. Sub-class of IgG in allergic disease. I. IgG sub-class antibodies in immediate and non-immediate food allergy. Clin Allergy. 1986;16:571-581.