Saturday, July 31, 2010

Irritable Bowel

Let's start with a bit of melancholy. I do so love this piece (right click to open in new tab) by Ralph Vaughan Williams. My 10th grade band director, who died in his early sixties of a heart attack, loved RVW too. There's plenty of tragedy in an ordinary life to go around, even now.

Last week when researching the lithium post I pulled another article from the Journal of Lipid Research that looked interesting. Its title: "Marked elevations in pro-inflammatory polyunsaturated fatty acid metabolites in females with irritable bowel syndrome." This article was produced by an Irish group, including a psychiatrist! Amazing what you find in the Journal of Lipid Research.

I see a lot of patients with IBS - basically, uncomfortable flatulence, bloating, constipation, and diarrhea, but no particular cause found. It is highly associated with depression and anxiety, thus my common involvement. And the researchers call IBS a "disorder of the brain-gut axis." But you may not be surprised that low-grade inflammation is suspected to be a predisposing factor, as evidenced by the presence of mast cell mediators (regulators of the cellular immune system) and lymphocytes in the colorectal mucosa, and elevations of pro-inflammatory cytokines in the serum of people with IBS.

(Yes. Inflammation. Imagine that.)

The study involved some comparison between people with IBS and normal controls. The IBS sufferers were selected from the gastroenterology clinic. The controls were found off the street, or however. Study participants filled out a few questionnaires and then had some blood taken (and centrifuged) for analysis. Some basic confounders were accounted for (such as smoking).

The findings: 41% of the subjects with IBS (17 out of 41 people) met the criteria for major depressive disorder. No one in the control group of 26 was depressed. The IBS group had a mix of constipation-dominant, diarrhea-dominant, and alternating symptoms.

And the fatty acids? IBS sufferers had lower plasma omega 6 and higher plasma omega 3 than the controls (WHA??? I know, sometimes science surprises - though this trend was not statistically significant), but the levels of arachidonic acid (created from omega 6 fatty acids, and the precursor for a ton of inflammatory eicosanoids) were significantly elevated compared to controls (p=0.029). The inflammatory mediators themselves (prostaglandins and leukotrienes, if you must know) were also significantly elevated. There was no correlation between the severity of the symptoms and the elevations of these mediators. Arachidonic acid does seem to play a role in intestinal permeability, so the researchers did not find its elevation surprising. They made note that diet might not be the only factor predisposing people to an inflammatory state.

It might not all be dietary! It might have to do with other stressful factors in life, and such. Those psychotherapy skills might be useful after all.

Thursday, July 29, 2010

Low Cholesterol and Suicide 2

In my last post on the link between low cholesterol and suicide, I made note of some general trends between low cholesterol, suicide (particularly violent suicide), accidents, and violence, and raised some questions about the safety of cholesterol-lowering drugs. I didn't find any researched link between statin therapy and suicide, though one study showed that a statin reduced the ability of a certain serotonin receptor to do its job (linked below). My takeaway point from the post was that, hey, cholesterol is important and needed in the brain. Obliterating the ability of our liver to make cholesterol may have some untoward mental health side effects.

Since then, I've kept an eye out for more information, and a few interesting snippets have come up. Current Psychiatry has a decent article this month, "Cholesterol, mood, and vascular health. Untangling the relationship."

Some interesting facts from the article:

1) 1/4 of the body's free cholesterol is found in the central nervous system
2) Depleting cholesterol impairs the function of the serotonin 1A receptor and the serotonin 7 receptor, and reduces the ability of the membrane serotonin transporter to do its thing. (Serotonin is made within nerve cells and released into the synapse between the nerve cells to work; the membrane transporter then recycles it back into the cell. If the transporter isn't functioning, we have a Big Problem).
3) Cholesterol is also needed for forming a nerve synapse (also Important) and making myelin.
4) Cholesterol may be involved in GABA and NMDA receptor signaling, opioid signaling, and the transport of excitatory amino acids.

Just to be crystal clear - low serotonin is associated with violent suicide, impulsive acts, hostility, and aggression. We need plenty of cholesterol in the brain to have all our serotonin machinery work properly. Low cholesterol is also associated with suicide and violence. If you have low cholesterol, of course it does not mean you will be suicidal. Suicide is, fortunately, rare, and will have multiple predisposing causes.

So the paragraph above, with its caveat, brings up an interesting and actionable hypothetical question - does lowering cholesterol with medication predispose you to suicide or violence? The first cholesterol-lowering drugs were not statins. And an early analysis of the primary prevention trials of the non-statins showed a doubling of the risk of violent death or suicide. Oops. (I also linked the J-LIT trial in my previous post, which showed a 3-fold increase in suicide or accidents with statin therapy, though the increase was not statistically significant).

A later case-control study showed that statin users had a lower risk of depression than patients on non-statin lipid-lowering drugs. The LIPID study followed 1130 patients on pravastatin for 4 years, and found no changes in (self-reported) anger, impulsivity, anxiety, or depression. Pravastatin doesn't cross the blood-brain barrier very well. Simvastatin, a very commonly used statin, crosses it quite readily - but why that difference would matter is interesting. HMG Co-A reductase inhibitors (statins) do most of their work in the liver, after all. But it turns out we have HMG Co-A reductase all sorts of places. These researchers found it in Chinese hamster ovary cells. And in these cells, administration of a statin reduced the ability of the serotonin 1A receptor to work. Getting rid of the statin restored the serotonin 1A receptor function.

But there's another complication in examining the literature for statin side effects. Some studies excluded patients with psychiatric problems (1). And due to the ability of statins to cause birth defects, many trials have excluded any women of childbearing age. Just something to keep in mind.

We are left with... well, a clinical trial is apparently underway to study the effects of pravastatin, simvastatin, or placebo on mood, sleep, and aggression. We still don't know if low cholesterol causes suicide and aggression, or if it is a biomarker of depression. I'm convinced high cholesterol is just a biomarker for heart disease, after all, rather than a cause. Thus the whole question of why treat high cholesterol at all (though the magical anti-inflammatory statin effect may help younger men. With known heart disease.)

My brain needs cholesterol! So does yours.

Tuesday, July 27, 2010

Lithium and Inflammation

Lithium is an interesting sort of mineral salt. It sits on the periodic table right above sodium, and can fool our kidneys into thinking the two are the same ion. Scientists first figured out lithium could help stabilize mood in the late 1800s (when it was also used to treat gout). And, turns out, El Paso, Texas has high levels of lithium in the water, but low rates of violence and mental hospital admissions compared to other cities (1). Lithium was the original "up" ingredient in 7-UP soda (pretty sure lithium is not in there anymore!). The first modern research paper on lithium as a psychiatric treatment didn't appear until 1949, when Australian psychiatrist John Cade made his mark on psychiatric history. However, Greek physicians thousands of years earlier were treating mental disorders with mineral water now thought to be high in lithium.

Before John Cade, mania was treated with electroshock therapy or lobotomy, so lithium was a terrific option - in fact it was the first successful pharmaceutical treatment for mental illness (Thorazine wasn't used for several more years). It has huge downsides - toxic to the thyroid and kidneys (and heart in high amounts), fatal in overdose, and a lot of the time it simply doesn't work. But when lithium does work, it is a wonderful thing. Suicidal depression and mood swings relieved within days. To this day, lithium is one of the few medications proven to decrease the risk of suicide (3).

Despite the fame and long-term, widespread use, no one knew what the heck lithium actually did. In medical school, I was taught that it had some effect on the regulation of second messenger systems within the neurons (4). Meaning, like every other psychotropic medication, it buffs up the communication in the brain, presumably to help it work all the more smoothly. (We psychiatrists have almost no lab tests and no imaging studies to help us - we just have to sit with someone and figure out what might be going on. A handicap which lends itself to the search for holistic, evolutionary solutions - but everyone knows my bias!)

The good Dr. Hale sent me a link to this article in Psychiatric News, which sheds more light on lithium's possible mechanisms of action. The article references this paper in The Journal of Lipid Research, and the story herein involves more unfortunate rats.

We learn first of all that bipolar disorder is a major mental illness worldwide, and is characterized by mood shifts from severe depression to mania. Examination of the post-mortem frontal cortex of those with bipolar disorder shows an increase in neuroinflammatory markers (I'm sure you're not surprised), and an increase in the enzymes that regulate arachidonic acid metabolism. (Arachidonic acid is the highly unsaturated fatty acid (HUFA) made from the omega 6 polyunsaturated fatty acids (omega 6 PUFAs), otherwise known as essential nutrients but Standard American Diet Villain Extraordinaire.)

In this study, rats were given lithium-laced food or lithium-free food for 6 weeks, and then their little brains were examined to see what happened (sorry, they did not use non-invasive methods. The rats were anesthetized, however, before the final insult.)

To summarize the results - lithium decreases arachidonic acid in the brain and increases the concentration of an anti-inflammatory metabolite of DHA (yes! Fish oil!) called 17-OH DHA. 17-OH DHA inhibits all sorts of other inflammatory proteins in the brain, like TNF-alpha. (For the biochemistry nuts - LiCl seems to intervene at the level of cPLA2 and sPLA2 and COX.) Famous inflammatory modulator aspirin has been postulated to help lithium work better in bipolar disorder (5).

Interestingly enough, lithium has been shown to be the only effective drug (at least to slow the progression down) in another inflammatory, progressive, and invariably fatal neurodegenerative disease, ALS (6), and is being studied in HIV dementia and Alzheimer's disease.

Sunday, July 25, 2010

Zinc Clarity

So far we have linked zinc deficiency to anorexia, ADHD, and depression. Meaning that patients with these conditions have a tendency to have lower serum zinc levels than people without them, and in anorexia and depression, human studies (only preliminary in the case of depression) have shown that supplementation can be helpful. The data for anorexia is robust (1), and the recommended daily supplementation for treatment of anorexia nervosa is 14mg a day, which is just a touch above the normal RDA. The "upper tolerable limit" is officially defined at 40mg a day, but low-level toxicity (related to problems with copper absorption and metabolism) has been demonstrated with consistent supplementation at 60mg daily. However, one is more likely to run into problems above 150mg a day. (Eaton et al., via Nora Gedgaudas, estimate a Paleolithic zinc intake of around 43mg daily.)

There are many different formulations of zinc, and there is no consistent evidence at a population level that one formulation is better than another. Some will be more readily absorbed, but fortunately the labeling is supposed to account for that - indicating the amount of elemental zinc you will actually get if you take the supplement. (Though my general mineral supplement contains 7.5mg and says "50% RDA," the actual RDA varies by age and circumstance - 11mg for adult men, 12-14mg for breastfeeding moms.) Since nasal formulations can cause permanent loss of smell, they should be avoided. Vegetarians may need 50% more zinc than meat-eaters, and people on thiazide diuretics (HCTZ, for example) for high blood pressure may need 60% more.

The actual mechanisms of the zinc-related links to psychopathology remain obscure, though looking more closely at the variety of functions zinc modulates in the body, many could be at play. (And to clarify the finding that zinc is necessary for the formation of IL-6, yet zinc deficiency is associated with high levels of IL-6: those levels are from the depression literature. That is, people who are depressed will tend to have low serum zinc and high IL-6.)

I'm not going to leave zinc behind, though, without touching upon one more rather obscure paper published online in "Neurochemical Research" in June, 2010. The title is "Zinc and Fatty Acids in Depression." (You didn't really think I could go another post without mentioning omega 3s and 6s, did you?)

This paper was poorly written, and the experimental methods had some issues, but there is something very likable about it despite the flaws.

These researchers took blood samples to measure zinc and fatty acid levels (the fatty acids obtained by gas chromatography of fats within the serum, rather than of the red blood cell membrane fraction I am more used to seeing - usually I see a centrifuge mentioned in the RBC fraction measurements, and I don't see one used here), along with some other measurements - BMI, standard clinical psychiatric interviews, and a couple of standard rating scales for depression (Hamilton and Beck) - from 88 psychiatric inpatients. Most of these inpatients were on antidepressants, not surprisingly. Oddly enough, the blood was drawn "at 8:00 am, after an overnight fasting with a quasi-empty bowel (1 cup herb tea without sugar and a slice of white bread)."

For the controls, "a convenient group" of 88 volunteers were recruited from the population and matched by age group and gender. "After an overnight fasting period, their blood sample was taken and prepared for chemical analysis under similar conditions as with the patients' samples." I'm not sure if that means they got the herb tea and slice of bread or not. Oh well.

What did they find?

First of all, there wasn't a big difference between the levels of zinc in the patients or in the controls. BUT, among the depressed people, the more depressed they were (by the Beck Depression Inventory), the lower their zinc level. The medication in this study seemed to have no relation to the zinc or fatty acid concentration (in a prior study, antidepressant medication normalized zinc levels (2)). Also, while the actual DHA and EPA (fish oil) levels were about the same between depressed individuals and controls, the ratios between the omega 6-derived arachidonic acid and serum DHA and EPA were significantly correlated with the amount of depression symptoms. Meaning depressed patients had a more out-of-whack ratio between omega 6 and omega 3. Sounding very familiar.

Other nifty findings that make you go "hmmm" - control subjects with higher zinc levels had higher levels of myristic acid (a type of saturated fat), and people who were depressed had higher amounts of stearic acid (another saturated fat). It's an observational study, so you get what you get and then you ponder on it.

And here in the discussion of the results, the researchers throw in something entirely unexpected, which is, really, why I like the paper. They suddenly start rambling on about skeletons. Right out of the blue. "Zn was demonstrated to contribute to stabilize the skeleton."

I'm not even sure what that means. But then - "in depressive disorders, the density of bone minerals was found to be low. Antidepressants... were reported to have a beneficial effect on [bone cells]." Then they go on to talk about how arachidonic acid (the HUFA created from the omega-6 PUFAs) and DHA enriched diets, given in a specific ratio, reduced the zinc content of piglets' femurs. As we've found out already, zinc metabolism is very complicated, but it appears to me we've found a sink where the body can store zinc and sequester it from the circulation - the bones. And a possible signal to get the body to store or release zinc in the bones is the omega 6 to omega 3 ratio.

Osteoporosis - demineralization of the bones, highly linked with depression - is another disease of civilization.

Now let's look at the zinc sink we already saw in a previous post - where inflammatory cytokine IL-6 seemed to help tuck zinc away in the liver. Turns out that in the livers of zinc deficient rats, there is a high amount of myristic acid and changes in the mRNA and protein messaging systems and transporters responsible for fatty acid metabolism. In this study, there was a linear relationship between the amount of zinc in the blood and the amount of myristic acid in the blood. Not sure what it means but definitely something to remember for future reference.

And finally - turns out the enzymes that change omega 6 fatty acids into arachidonic acid, which is then used for complex signaling throughout the body, are zinc-dependent. Absolute zinc deficiency or a high amount of sequestered zinc (due to out-of-control inflammation?) could cause a lot of changes all along the omega 6 fatty acid metabolism pathway.

Let's examine the areas where zinc plays a major role - inflammation and immunity, the brain (particularly the hippocampus, ground zero for depression), the liver (ground zero for metabolism), the pancreatic beta cells, and the bones. We can't get too terribly excited about that - after all, zinc plays a role in so many chemical reactions in the body that of course we will find it in all the important areas related to the diseases of civilization. I mean, zinc even helps hair formation. (I had a few gray hairs prior to starting paleo eating, but they don't seem to be coming back. I read in Evolutionary Medicine Forum about some anecdotal evidence of gray hair reversal.)

There is a lot more to learn, some of which is not even known. Excess insulin has always had a starring role in the whole metabolic picture of the diseases of civilization, but the full screen view involves PUFA ratios and zinc. I've no doubt of it.

Saturday, July 24, 2010

Interlude

I'll get back to zinc tomorrow. But today I heard this piano solo on XM radio (right click to open in a new tab if you want to listen and read), and it got me to thinking.

The song is Traumerei, by Schumann, one of the most famous piano solos ever written. The pianist in the YouTube video is Horowitz, playing an encore to his last concert in Moscow, thus the emotional reaction of the audience and the pianist himself.

Robert Schumann had at least two suicide attempts during his life, and his last two years were spent in a mental asylum. He also had hugely productive periods, including the "Year of Song" in 1840, when he composed 168 pieces. These facts and his documented auditory hallucinations of angelic and demonic voices have led some to speculate that he suffered from bipolar disorder. It seems equally likely that he had tertiary syphilis and mercury poisoning (from the treatment for syphilis) leading to the disintegration and psychosis those last several years. He was only 46 when he died, and his last compositions showed him to have some experimental, evolutionary ideas of music, though they were unappreciated at the time. If we'd discovered the mold-derived penicillin (a surefire treatment for syphilis) 100 years earlier, who knows what more he would have accomplished.

Evolutionary medicine and lifestyle is not really about going back. Sometimes this is misunderstood.

I can't write about Schumann without touching upon Franz Schubert. He was an Austrian composer, also from the Romantic period, who also died young (age 31) of syphilis. It's important to know that when he wrote this piece (right click), he knew that his own death was imminent.

We don't see too much of death in our modern, Western world. We're protected from the feedlots and chicken roosts and for some it is possible to fantasize about a world where we don't even kill animals to stay alive. Our own typical deaths happen in a drugged haze, or attached to a thousand machines in the ICU. Such a death is expensive. And horrific. In medical school and residency, no matter what the specialty, you see this first hand all the time. And you grow acclimated and inured to your own future possibilities. If not some accident, bad luck, or foolishness, you get to pick heart disease or cancer. It is hard, unless you have a sudden infarction and arrhythmia, to die quickly or painlessly. Of course we hide our deaths from the masses now.

Staffan Lindeberg described the natural deaths of the Kitavans - after a long, healthy, hale life, somewhere between age 70-90 or even beyond 100, the elderly will become fatigued for a few days, and pass quietly away. To me that doesn't sound too bad. Sure beats hospice or an ICU.

There are some paleo folks who are looking beyond being flourishing omnivores toward optimal human health. This is where some of the most heated arguments about low-carb vs paleo carb vs low calorie living break out. And there is a bit of theoretical evidence that low carb, low protein may be optimal for longevity.

I'm not going to wait for the 100 year randomized controlled trial. And I do like lots of vegetables, fruit, and even potatoes or rice from time to time. Variety, as they say, is the spice of life. I'll settle for being merely human, and I'll take my penicillin and vaccines, should the need arise.

Friday, July 23, 2010

Zinc, Depression, and Everything

Today I will review more specific and up-to-date information about the interplay between zinc and depressive disorders and inflammation. Let's summarize the human evidence thus far (1):

1) Depressed patients in studies have a lower serum zinc level than normal controls.
2) The more depressed the patients are, the lower the zinc level.
3) Low zinc levels in pregnant women are associated with pre- and postpartum depression.
4) Treatment with antidepressants normalizes zinc levels (I've been a little loose with the terminology, I admit, and this finding helps us keep in mind that zinc level can be just a biomarker for depression, not necessarily a cause or effect per se.)
5) Zinc supplementation plus antidepressant therapy can work better for depression than antidepressants alone.
6) Zinc supplementation alone can have antidepressant effects.

Now let's try to clarify a bit more about zinc and the brain. As I noted in my last post, the hippocampus seems to be the most vulnerable to zinc deficiency. The hippocampus is a center of memory, and it also plays a big role in nerve plasticity and repair. Recall that nitric oxide and antidepressants seem to work by increasing the production of brain derived neurotrophic factor in the hippocampus. BDNF is one of many nerve growth factors in the hippocampus, and is part of several different neurochemical pathways which help in nerve recovery, adaptation, and repair.

Scientists have been able to cobble together the following pathway in rat brains: Zinc deficiency leads to decreased zinc in the synapse, which results in an increase in NMDA receptors (these receptors respond to glutamate, an excitatory neurotransmitter that can be responsible for toxic effects in the brain if there is too much). At the same time, the inhibitory (in this case, neuroprotective) neurotransmitter GABA is decreased, along with BDNF and another nerve growth factor, NGF. The glutamate level in the synapse is higher, so calcium-mediated stimulation of the nerves is primed. Do this too much, and you get "excitotoxicity." This same mechanism is thought (in acute vs. chronic forms, and in differing areas of the brain) to be responsible for seizures, migraines, dementia, anxiety, depression, and bipolar disorder (and is why pharmaceutical GABA receptor modulators can be effective for certain symptoms of any of those conditions).

Getting down to the real nitty-gritty, zinc works in conjunction with nearly all of the different membrane signaling and second messenger systems you might have learned about in molecular biology classes. Membrane-gated ion channels, p53 signaling, G-proteins, zinc fingers (obviously) - the whole lot. This is why, even though a lot of these different nerve chemicals work via different mechanisms, or multiple mechanisms, zinc can have a hand in all of these up-regulating and down-regulating events. Zinc is a cog in the machine all along the way.

So there are clear mechanisms by which absolute zinc deficiency can have a hand in all sorts of bad brain syndromes, and vegetarians, dieters, the elderly, those with malabsorption or intestinal issues, and the two billion people on the planet who (due to poverty) pretty much subsist on grains alone (rich in zinc-binding phytates) are all at risk for absolute zinc deficiency.

But robust, presumably zinc-replete meat-eaters on a Western diet are at risk for depression, diabetes, dementia, and cardiovascular disease, along with the whole panoply of Western chronic diseases. I contend (along with many others) that inflammation is the primary driving mechanism behind the whole shebang. Could there be a mechanism by which inflammation could affect brain zinc levels (or vice versa) as part of the pathway leading from inflammation to bad brain disease?

Wanna put some money on it? Did I mention that pancreatic beta cells in particular run on a lot of zinc-dependent pathways too (2)(3)?

It's common knowledge that zinc supplementation can help ameliorate a cold (at least if you take the zinc within the first day of symptoms (4)), and, as I mentioned in the last post, zinc has a lot to do with mediating our body's immune response. We use zinc to activate the immune pathways that zap viruses (like colds), but zinc can influence the activity of 2000 (yeah, two thousand) different immune transcription factors. The baddest of these factors is NFkappaB. NFkappaB hangs around in the cytoplasm of immune cells until activated, then moves into the nucleus and helps them make all sorts of inflammatory cytokines to fight off the perceived bad guys - good old inflammatory frenemies such as IL-6, IL-2, TNFalpha and many, many more. (Yesterday I noted that zinc deficiency is associated with increased IL-6, and on review of several articles, it seems that both high and low zinc are associated with high IL-6. It is probably a part of what I discuss in the next paragraph, but I'll look into it more, as a lot of the work is done by the Polish group or Maes, and they seem to cite each other all the time). Zinc not only directly promotes the synthesis of NFkappaB, it helps it get into the nucleus where it can work, and it helps it bind to the DNA to promote inflammation.

It isn't so simple as that. Turns out that zinc also has a hand in down-regulating inflammation too! It even activates a protein that helps inactivate NFkappaB. And IL-6, an inflammatory cytokine which needs zinc to be born, will then activate a protein in the liver called metallothionein, a protein that holds on to zinc and keeps it in the liver, so that even if you eat a lot of zinc, it won't be available in your blood or brain for other uses. A lot of biochemical systems are like this - with too little zinc (such as in people born without the ability to absorb it (5)), you get immune dysfunction and vulnerability to infection, as your protective inflammatory response won't work. But if inflammation gets high enough, it has its own down-regulating systems (sequestering zinc via IL-6 and metallothionein, for example) that cool things off.

Our inflammatory and fight-or-flight systems were built for acute insults. Viruses, injury, bacterial invasion, angry lion attacking the camp. When the insults are chronic (unalleviated stress, gallons of inflammation-promoting omega-6 fatty acids, weird glutens and lectins, chronic depression-causing viral infections such as herpes, Borna disease, HIV, or Epstein-Barr), the whole system becomes dysregulated. What should be up is down. So zinc ought to be in the central nervous system, helping out with nerve repair and plasticity, and instead it is crusading with NFkappaB or stuck with metallothionein in the liver, and your poor hippocampus is shorting out on glutamate and calcium. Extra zinc might help. As might antidepressants, GABA receptor modulators, and other neuroprotective chemicals. But those are bailing buckets. What we really need is to correct the problem causing the boat to sink. We need to reduce the inflammatory insults in the first place.
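(For the computationally inclined, here's a minimal toy sketch of that acute-vs-chronic dynamic. The decay and sequestration constants are invented for illustration - real cytokine kinetics are far messier - but it shows the qualitative point: after one insult, serum zinc bounces right back; under a daily drip of insults, zinc stays locked away.)

```python
# Toy model of the zinc/inflammation loop described above: an insult
# raises inflammation (think IL-6), inflammation sequesters serum zinc
# (metallothionein tucking it away in the liver), and zinc recovers once
# the inflammation resolves. All constants are invented for illustration.
def simulate(insult, days=60, decay=0.2, sequester=0.5):
    inflammation, zinc = 0.0, 1.0        # zinc as a fraction of normal
    history = []
    for day in range(days):
        inflammation += insult(day)      # new inflammatory input today
        inflammation *= (1 - decay)      # inflammation resolves over time
        zinc = 1.0 - sequester * min(inflammation, 1.0)  # IL-6 sequesters zinc
        history.append((day, inflammation, zinc))
    return history

acute = simulate(lambda d: 1.0 if d == 0 else 0.0)   # one insult: zinc rebounds
chronic = simulate(lambda d: 0.3)                    # daily insults: zinc stays low
print("day 59, acute insult:   zinc = %.2f" % acute[-1][2])    # ~1.00
print("day 59, chronic insult: zinc = %.2f" % chronic[-1][2])  # stays depressed
```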

There's more. Always more!

Thursday, July 22, 2010

Zinc!

When one makes a study of evolutionary medicine-type issues (that is, all chronic Western disease mediated by inflammation and diet and lifestyle so far removed from the life our bodies were designed for), the same nutrients keep popping up again and again. Fish oil is a good example. Yes, for heaven's sakes. I'm taking my fish oil. Shut up about it already.

When examining the small unexplored niche of nutritional evolutionary psychiatry, however, another trace mineral nutrient keeps bobbing to the top. This warrants a post, of course (or two, or three). Yup, no surprises here - I'm talking about zinc.

Let's start with the basics. This first bit of info comes from a paper rather abruptly titled "Zinc and depression. An update," from Poland in 2005. No cutesy titles in Poland! They get down to business. Good. I'm from Texas. I prefer cute yet vaguely threatening, myself (i.e. "Don't mess with Texas" as an anti-littering campaign - also, driving out in the hill country, a large sign with a scarecrow: "No trespassing. We don't call 911."). Gulp.

Zinc is a trace mineral (like magnesium, iodine, and selenium) that is essential for our continued life. Turns out that 300 or more enzymes in our bodies use zinc as a buddy to help them do their thing. DNA replication, protein synthesis, cell division - basic, mondo important, reliant on the presence of zinc. And guess what - the highest amount of zinc in our bodies is found in the brain - specifically in our hippocampus and cerebral cortex. Zinc deficiency can therefore lead to all sorts of unlovely consequences, such as ADHD, depression, seizures, and alterations in behavior, learning, and mental function.

Turns out scientists of yore did all sorts of horrible tests on rats to figure out how zinc might be related to depression. Antidepressants seem to increase the ability of zinc to work as an anti-inflammatory agent in rat brains, zinc alone seems to be an antidepressant for rats, and the combination of zinc + small amounts of different classes of antidepressants (TCAs and SSRIs) enhanced the ability of the antidepressants to do their thing (helping the rats swim longer in hopeless situations, for example, or endure being held by their tails. Is reality TV really any different?).

Zinc therapy in rats also increases the amount of BDNF in rat hippocampi. Readers of the archives will note I am a big fan of BDNF in the hippocampus. And zinc reduced the fighting behavior of rats (and prisoners) too!

Yes, humans. Turns out Maes (my new hero - he churned out a 62-page article on inflammation and depression this year, which I now have in my hot little hands!) discovered that zinc is low in the serum of humans with depression. Also, that low zinc seems to affect inflammation and immunity. Our T-cells (members of the immune system that hunt and kill infection) don't work well without zinc, and seem to release more inflammatory cytokines (IL-6 and IL-1) when zinc levels are low. Also, one of zinc's special actions is to inhibit the NMDA receptor in the brain. In suicide victims, there seems to be an alteration of zinc's ability to affect the NMDA receptor. (Turns out, BDNF + zinc helps calm down the NMDA receptor, leading to antidepressant effects.) Can zinc supplementation have antidepressant effects in humans? You may not be surprised at this point that the answer is yes (1).

There is more, much, much more to the story of zinc and psychopathology, but for now, let's end with good sources of zinc in our diets. Not surprisingly, the best sources of zinc are protein-rich meats, such as beef, pork, lamb, shellfish (especially oysters), chicken, turkey, etc. Pumpkin seeds are also a good non-meat source, and while grains have zinc, the absorption is strongly affected by the phytic acid in grains (2). Vitamins C, E, and B6 help you absorb zinc also. It seems that people with intestinal problems (celiac disease, inflammatory bowel disease), vegetarians, those with chronic kidney and liver diseases, alcoholics, and the elderly are most likely to suffer from zinc deficiency (3). Intake of more than 50mg a day (both from diet and from supplements) can lead to improper copper metabolism, altered iron function, reduction of HDL, and reduced immune function.

More on zinc to come!

Tuesday, July 20, 2010

A Common Sense Defense of Animal Fat

When one looks in the books and on the internet about a "paleolithic-style" or "evolutionary-based" diet, one will likely be confused at the end of it. Some will focus on what Kurt Harris calls "paleo re-enactment" - meaning we should be hunting wild boar, digging up roots, and honey is okay. (When I first told a friend of mine, who happens to be a gastroenterologist at Johns Hopkins, about my interest in paleolithic diets, she remarked, "The only guy I ever saw on a paleo diet was pooping leaves and blood." Take that as an object lesson, paleo re-enactors, don't forage unless you know what you are doing!). Some paleo folks will be (once again using Kurt Harris' term - he has a knack for them), "pc-paleo" - meaning low fat paleo. Loren Cordain and Boyd Eaton are scientists and paleolithic nutrition pioneers, and their first look at the research focused on the fact that animals in the wild tend to be leaner than our domesticated animals. Cordain's The Paleo Diet: Lose Weight and Get Healthy by Eating the Food You Were Designed to Eat and The Paleo Diet for Athletes: A Nutritional Formula for Peak Athletic Performance will both advocate a lower-fat approach. Notice both were published prior to 2007. And as this is where the research was headed at the time, the "paleo" diets studied in diabetics by Staffan Lindeberg and others were relatively low-fat and sometimes a little weird - not sure how many of us foraged for canola oil and mayonnaise. But there you go. That's where an IRB gets you, I guess.

What's so important about 2007? Well, that's when Good Calories, Bad Calories: Fats, Carbs, and the Controversial Science of Diet and Health (Vintage) came out. I think Taubes' work is incredible and amazing, and required reading for any health professional or nutritionist. However, I do believe he focuses a bit too much on the carbohydrate hypothesis as the cause of Western Disease. He does, however, do a heck of a lot to exonerate fat.

Since 2007 and "Good Calories, Bad Calories", there has been the advent of what many now call "primal" style paleo, exemplified best by Mark Sisson's excellent book, The Primal Blueprint: Reprogram your genes for effortless weight loss, vibrant health, and boundless energy. Like Stephan Guyenet, I do think Mark is a little too hard on the carbohydrates (starch has a lot in common with, but ultimately is very different from, sugar. This could be its own post, but if you have the time, I highly recommend this YouTube video of a lecture by Robert Lustig, a pediatrician and obesity expert. He has a rather strange idea of what a paleo diet is, but hey, doesn't everyone?).

Anyway, the "primal" folks love fat, real food, and allow for some "sensible indulgences" such as high fat dairy, dark chocolate and red wine. And it is rumored that even Loren Cordain is more pro-fat than he used to be. Anthropological reports from all over the world will show us that our ancestors favored the fatty cut of meat, anyway, leaving carcasses and the lean meat to rot, and savoring the glands, liver, brains, and marrow, all high in saturated fat (1)(2)(3). In fact, when our ancestors were forced to survive on lean meat alone, they endured a life-threatening condition called "rabbit starvation." These symptoms of fatigue, weakness, diarrhea, hunger, headache, and low blood pressure and heart rate come from a diet of too much protein, and can be ameliorated by substituting some of the protein with carbohydrates or fat.

I've been told most of my life that animal fat is bad for me. The scientific story of why this is not true is explained in exhaustive detail in a number of different, excellent books (including GCBC, but also Eat Fat, Lose Fat: The Healthy Alternative to Trans Fats). I especially like Mary Enig, as it seems to me she is the researcher primarily responsible for finally getting the FDA to come down against the nutritional horror that is industrial trans fats.

There are two final scientific arguments against saturated ("animal") fats, in this case palmitic acid, that are not addressed in full by any of the sources I've named above. One is that a high amount of free fatty acids (palmitic acid for the most part) in the blood causes insulin resistance. Any erudite fat-lover will say to this, yes, of course it does. Palmitic acid is released into the bloodstream by the metabolic action of our own livers every time we fast (overnight, for example) or lose weight. It has to be. Otherwise we would never burn our fat stores, and we can't store much glycogen. The release of palmitic acid signals our body that we might not have a lot of food around, and we had better preserve our precious glucose stores (the glycogen) for our brain to use. Therefore our muscles become more insulin resistant, and we shift to burning fat as fuel rather than glucose (burning fat sounds good, right?). This mechanism is physiologic, and I would say has very little to do with the full-body insulin resistance of diabetes. Stephan and Peter have their own, brilliant takes on the matter (more required reading in my mind - no one said learning about nutrition didn't take a lot of time).

But let's get down to brass tacks on the insulin resistance/palmitic acid debate. Remember, and this is key, that when we burn fat and lose weight, we release a flood of palmitic acid into the blood stream. Deadly, hard core saturated animal fat. Yes, our own livers are trying to do us in. So why is fat loss, really, the best treatment for diabetes? Why does bariatric surgery, with the resultant forced semi-starvation, fasting, and FAT LOSS often result in the immediate turnaround of diabetes (4)? (Let's see - maximum of 2 pounds of fat loss a week = 7200 calories of fat burned, more or less, which would equal 1029 calories of fat a day = 114 grams of saturated fat (more or less) a day supplied by our own rumps! (or, hopefully, our visceral belly fat)).
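(To spell out that back-of-the-envelope arithmetic - a sketch using the post's own round numbers of 7200 kcal per two pounds of fat and 9 kcal per gram of fat:)

```python
# Back-of-the-envelope check of the fat-loss arithmetic above.
# Assumptions are the post's round numbers: ~3600 kcal per pound of
# body fat (textbook value is ~3500) and 9 kcal per gram of fat.
KCAL_PER_LB_FAT = 3600
KCAL_PER_G_FAT = 9

weekly_loss_lb = 2                               # max sensible fat loss per week
weekly_kcal = weekly_loss_lb * KCAL_PER_LB_FAT   # 7200 kcal of fat burned
daily_kcal = weekly_kcal / 7                     # ~1029 kcal/day
daily_grams = daily_kcal / KCAL_PER_G_FAT        # ~114 g/day of fat

print(f"{weekly_kcal} kcal/week = {daily_kcal:.0f} kcal/day = {daily_grams:.0f} g/day")
# Note: adipose tissue actually releases a mix of fatty acids (palmitic,
# oleic, etc.), so "114 g of saturated fat" is the post's shorthand.
```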

This physiologic insulin resistance in fasting can, in the short term, increase fasting glucose levels in diabetics, it's true, and Gary Taubes' explanation of muscle insulin resistance vs. overall body insulin resistance is a good one. This finding led the American Diabetes Association to recommend high carbohydrate diets to diabetics for years, and, in the long term, that is a huge mistake.

Did you know that feasting on your own stores of animal fat can help heart disease too (5)?

All right, I think I've fairly addressed the insulin resistance. Now let's go to the last, best argument against fat. Lipotoxicity. Perhaps I shouldn't have waited until the end for this one, as it is rather biochemistry-heavy. We'll give it a whirl. I have the full text for this paper (6), but let's summarize with the abstract:

"Insulin resistance is one of the pathophysiological features of obesity and type 2 diabetes. Recent findings have linked insulin resistance to chronic low-grade inflammation in white adipose tissue. Excess storage of saturated fat in white adipose tissue due to a modern life style causes hypertrophy and hyperplasia of adipocytes, which exhibit attenuated insulin signaling due to their production and release of saturated fatty acids. These adipocytes recruit macrophages to white adipose tissue and, together with them, initiate a proinflammatory response. Proinflammatory factors and saturated fatty acids secreted into the bloodstream from white adipose tissue impair insulin signaling in non-adipose tissues, which causes whole-body insulin resistance."

Let's leave off the insulin resistance for a bit and go to the proinflammatory section. Basically, what this paper says is that if we stuff our fat cells full of fat and then release saturated fat, we get inflammation. We hate inflammation here at Evolutionary Psychiatry, and this paper seems to implicate saturated fat as a major cause in a huge part of our bodies, the white adipose tissue (much of our stored body fat). This inflammation is called "lipotoxicity."

Paleo-diet aficionados tend to agree that the following are anti-inflammatory moves in the diet: eating omega-3 fatty acids (and avoiding omega 6), avoiding grains (especially wheat) and casein, eating vegetables and fruits, and avoiding sugar (fructose) and processed food.

Using common sense and biochemistry, saturated fat is anti-inflammatory compared to either cooked monounsaturated or polyunsaturated fat, given that heat and air can make the vulnerable unsaturated bonds oxidize and become rancid, which everyone can agree is highly inflammatory and bad for you. (Cold, fresh virgin-pressed olive oil would escape the oxidation, I hope!)

But here we have some science telling us saturated fat is inflammatory. And I don't think we have a full handle on that. I think what makes the most sense is to speculate that lots of saturated fat is inflammatory when combined with lots of sugar (big thanks to Dr. BG for helping me clarify this in my own mind! *edit* But I've further refined my thinking on this matter since publishing this post - please see the comments! Thank you *end edit*) Burning fat is a signal to our bodies that there might not be a lot of food around. Eating carbohydrates signals summertime! Lots of tubers and fruit and foliage! We stumbled upon a beehive! Yippie! Store it up! There was never a situation in our evolutionary past where we combined the grotesque amount of sugar we consume year-round on the Western diet with all the fats. Dr. BG called this, metabolically speaking, putting our foot on the brakes and the gas at the same time. Not great for the transmission.

I think, overall, the most important anthropological lessons are these - hunter-gatherers are healthy and, excepting infant mortality and accidents, long-lived while eating a wide variety of macronutrient ratios. The Kitavans are high carb; the Inuit, Tokelau, and Masai high fat. Most other folks were likely somewhere in between. But they did not consume tons of sugar. Or industrial seed oils, or wheat.

Again and again I come to that point. Epidemiology and observation won't show us the truth, but they will show us what could possibly be true, and what cannot be. Saturated fat alone does not cause heart disease or diabetes. Neither do starchy carbohydrates alone. Those facts hold unless the Kitavans and the Inuit are blessed with magical pancreases or livers or hearts. My guess is they have the same old hearts you and I have.

Hopefully, science, a holistic approach, and sense will tell us the rest.

Sunday, July 18, 2010

Heart and Soul

UCSF medical school is running an 8 year prospective cohort study following patients with heart disease. The goal of the study is to gain some perspective on how psychological states affect your heart. A number of papers have been published (free full text! Suweet!), and today I'm focusing on one called "Scared to Death? Generalized Anxiety Disorder and Cardiovascular Events in Patients with Stable Coronary Heart Disease."

Anxiety has never had quite as much robust medical research as its big brother depression. Perhaps because depression is easier to quantify and easier to treat. Anxiety tends to start in your youth, as you learn how to cope. By the time you end up getting treatment for it, you've probably had it for many years, even decades, and it becomes a part of who you are.

So it's no surprise that it is already known that depression and heart disease go hand in hand. If you have depression, you are more likely to develop heart disease (1), and if you have depression and heart disease, your heart disease will likely be worse (2).

But what studies there are of anxiety and heart disease show that anxiety is common among those with heart disease, and anxiety symptoms predict the amount of disability you will have. In the Heart and Soul study, 1015 people (mostly veterans from the VA medical centers) were followed for an average of 5.8 years. Generalized Anxiety Disorder (there are always specific criteria for these things, but in general, someone with GAD will worry a lot and have physical symptoms associated with worry, to the point that daily functioning is impaired. Panic attacks can also occur) was tested for via a Diagnostic Interview Schedule (that's a good test - a lot of studies will just use regular old scales, but the diagnostic interview is really the gold standard). In addition, everyone in the study had cholesterol measured, exercise capacity tested, a 24 hour heart monitor, 24 hour urine to measure norepinephrine and cortisol (chemicals associated with stress), C-reactive protein, and red blood cell percentage composition of fatty acids such as omega 3s, saturated fats, and omega 6s (which is the best way to figure out the fatty acid composition of the diet). Other patient data was also taken into account - age, sex, race, education, smoking, exercise, height, weight, and medications. Whew. All in all, pretty comprehensive, and all the latest technology. So far so good.

Each year, the investigators called up the study participants and asked about heart trouble. If anyone had an EKG, a heart attack, or any other "heart event" (specifically stroke, heart failure, MI, TIA, and death), the investigators got hold of a copy of the medical records. Then they subjected the data to a tortuous round with the statisticians to try to sort out any confounding variables, and at the end, we get a bunch of nifty tables of information.

So what did they find?

10.4% of the participants met criteria for Generalized Anxiety Disorder (that's about in line with the literature - about 1/8 of people who visit their primary care doctor have GAD). Those who were anxious were also more likely to be younger, female, depressed, have better heart function on echocardiogram, take antidepressants and anxiety medicines, have lower omega 3 fatty acid levels in the red blood cell membranes, be less likely to exercise, and more likely to smoke. Also, they were less "adherent to medications" (what doctors call "noncompliance").

And the "heart events"? The annual rate of cardiovascular events was 6.6% for the people without generalized anxiety, and 9.6% of the people with GAD (p=0.03). That's annual! Meaning in 5.8 average years followed, there were a lot of medical records for the poor investigators to pore over!

And the confounding variables? (things which might cloud the statistical correlation between anxiety and heart disease) - male sex, heart function itself, exercise capacity, certain medication use, level of physical activity, and heart rate variability. So if those variables are "adjusted" for, you end up with a 62% greater rate of cardiovascular events for someone with generalized anxiety disorder and "stable" coronary heart disease. The raw data leaves you with a 74% greater rate of cardiovascular events. Either way, if you have heart disease, you are better off if you are more or less serene.

What do we take away from this paper? They have a discussion at the end worth reading. There's always a question in the medical literature about patients with psychiatric conditions - maybe they are sicker because they are too depressed or anxious to take care of themselves properly. They eat garbage, smoke more, exercise less. But, time after time, the studies show there is more to the connection than just crappy self-care. Interestingly, this study didn't find any link between physiologic markers of stress (the 24 hour urine measures, CRP, and heart rate variability) and the increased risk of anxiety disorders and heart disease. Smokin' and being lazy didn't explain the correlation either.

The authors postulated that a 24 hour urine wouldn't capture the risky "catecholamine spikes" of stress hormone that would be more likely to precipitate a heart attack. That makes sense. Then they speculate that anxious people are less likely to seek medical care (I sincerely doubt that one!), or are more likely to seek medical care (more realistic), thus the increase in recorded events was due to anxious people being more likely to consult their doctor with symptoms. Except, unfortunately, people with anxiety were more likely to be dead at the end of the 5.8 years of follow up, and that is one condition that isn't likely to be missed or uncounted.

Here we go: "It is also possible that there exists a common background origin to GAD symptoms and risk of cardiovascular disease." Also, GAD was associated with lower omega 3 fatty acid levels and depression, and "there is a clear association between lower omega 3 fatty acid levels and cardiovascular risk."

The conclusion? Take care of your anxiety! And eat some wild-caught salmon tonight.

Saturday, July 17, 2010

A Closer Look at Tartrazine

Tartrazine, also known as Yellow #5, is a coal-tar-derived azo dye found in a lot of processed food, including Kraft Macaroni and Cheese, Doritos, Mountain Dew, Peeps, and many soups, custards, mustards, baked goods, cotton candy, ice cream and tons and tons more. It's also in a million and one other products we may use on a daily basis - lotions, face soaps (including the one I used this morning), body soaps (including the one I used this morning!), cosmetics, shampoos, hand stamps - you name it! Despite this multimodal ingestion and cutaneous exposure, only a very small amount of the dye is used in each product, so total exposure might be around 1 teaspoon in a year (from Wikipedia, so take that number with a grain of salt; the CSPI site says around 12.75mg a day on average based on usage data - but probably more for kids and those who live on Mountain Dew). There's no reason, though, to freak out and empty out the pantry and medicine cabinet. Dose is important. But, as we well know, some of us can bathe in toxic substances and come out smelling like a rose, and others are sensitive to very small amounts.
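(A quick sanity check that those two exposure figures are in the same ballpark - assuming, and this is my guess rather than a sourced figure, that a teaspoon of dye powder weighs around 4 to 5 grams:)

```python
# Does CSPI's ~12.75 mg/day square with Wikipedia's "about a teaspoon a year"?
# Assumes ~4-5 g of powder per teaspoon (my guess at a density, not sourced).
daily_mg = 12.75
yearly_g = daily_mg * 365 / 1000     # ~4.65 g/year
print(f"{yearly_g:.2f} g/year")      # roughly one teaspoon of powder - close enough
```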

The reason I'm looking at it more closely today is because if you hunt around the internet searching for possible creepy things about industrial food dyes, tartrazine has the worst reputation. And, indeed, it was one of the several dyes used in the Southampton Study I discussed earlier this week - and the whole study cocktail of dye and preservative did lead to increased hyperactivity in kids. Its use as a food additive is subject to a ban in the UK and voluntary bans in other European countries, like Norway. In fact, the Center for Science in the Public Interest in the US called for the FDA to ban Yellow Number 5 on June 30, 2010. (Here's a cute PDF from CSPI - Food Dyes, a Rainbow of Risk.)

But how might tartrazine cause problems? Well, some people (around 1/10,000, more or less) are definitely allergic to it. Hives, purpura, anaphylaxis, the real deal. This is why the FDA declared that it has to be on the labels of food if it is used - for people with hypersensitivity, and you can run afoul of the FDA if you include tartrazine in your product and don't label it (1). Not unusual - lots of natural and manmade chemicals cause allergic reactions in some people. It would be an unfortunate allergy to have, as yellow number 5 is in all sorts of things you wouldn't expect. Also, there seems to be a cross-reactivity in some people with asthma between tartrazine and aspirin - people with asthma caused by aspirin also seem to have asthma caused by tartrazine (2), and using desensitization techniques (gradual increasing exposure) to reduce aspirin sensitivity in one case study protected against the effects of tartrazine too. (3). Hmmm.

What about other actions of tartrazine? I couldn't find much. One study showed that application of small amounts of tartrazine caused contraction of intestinal muscle cells in guinea pigs, and the effect was blocked by atropine. That clues us in that tartrazine seems to be able to activate the parasympathetic nervous system, either directly or indirectly, via the muscarinic receptor (4). Now that is quite interesting, as the central nervous system has lots of muscarinic receptors of all types known (M1-M5). Various activating agents of these receptors can cause things such as confusion, convulsions, restlessness, and psychosis - in high enough doses. At lower doses they can sometimes do the opposite, and cholinesterase inhibitors (which increase the CNS activity of acetylcholine, a muscarinic activator) are used to treat dementia. We've taken several big leaps at this point, but it is theoretically possible that if tartrazine gets into the central nervous system, this muscarinic receptor activation might be a mechanism for some sort of psychiatric reaction (like increasing hyperactivity).

The most intriguing information comes from this study from 1990 about how tartrazine influences the zinc status of hyperactive children. Now I'm still trying to get my hands on the full text - the institutional access website has been coy the last few days, and it seems this journal is only available online from 1995 on anyway. But the abstract is telling. 20 hyperactive boys and 20 age-matched controls were tested for zinc levels in their saliva, urine, 24 hour urine, scalp hair, fingernails, and blood. The hyperactive kids had decreased zinc everywhere but the saliva. Then 10 control kids and 10 hyperactive kids were given a tartrazine-containing drink. In the hyperactive kids, the blood levels of zinc went down and the urine levels of zinc went up, and their behavior got worse, suggesting that tartrazine caused them to pee out some much-needed zinc. It's a bit hard to tell from the abstract, but the way I read it, it looks like the control kids' zinc levels didn't change, and neither did their behavior. So that might be the mechanism by which yellow number five influences hyperactivity in certain kids. Ironically enough, Concerta, a formulation of Ritalin, has yellow number 5 as a colorant in the capsule!

The only other dirt I found on yellow number five is that it has also been implicated in reducing the absorption or metabolism of vitamin B6, leading to carpal tunnel syndrome, of all things (at least according to Dr. Murray). (Here's the link between B6 deficiency and carpal tunnel, anyway.) The rumor that the yellow number 5 in Mountain Dew causes your testicles to shrink? Well, that's not true.

My stance is - for most kids there is no need to make a big scene at a birthday party by grabbing the bag of rainbow candy out of your kid's hand. But on a day to day basis, processed food should be avoided in favor of whole, real food anyway. Doing that will reduce exposure to the rainbow soup of chemicals in processed food. Not to mention the mountains of fructose, trans fats, and genetically modified ingredients. Real food is a win/win. Weirded out by yellow number 5 in shampoos and soap? Check the labels if it bothers you. Or use baking soda and apple cider vinegar as cheap alternatives to shampoo and conditioner.

Thursday, July 15, 2010

Hyperactivity and Diet

Here's the title to a June 2009 article in the Harvard Mental Health Letter: "Diet and Attention Deficit Hyperactivity Disorder - Can some food additives or nutrients affect symptoms? The jury is still out."

Hmmm. That sounds very noncommittal. Let's start instead with the 2007 Southampton Study. Published in the Lancet, Britain's premier medical journal, this was a well-designed nutritional study! We're talking a randomized, placebo-controlled, crossover study. That's enough to make any biological scientist's heart go pitter patter. So what did the researchers do? They took 153 3-year-olds and 144 8/9-year-olds recruited from the community, and did a baseline measure of hyperactivity via a questionnaire filled out by teachers. All the kids were put on an additive- and preservative-free diet. At weeks 2, 4, and 6, half the kids (randomized in crossover fashion) were given a study juice drink containing either Mix A (tartrazine (yellow #5), sodium benzoate, sunset yellow, carmoisine, and ponceau 4R) or Mix B (sodium benzoate, sunset yellow, carmoisine, quinoline yellow, and allura red AC). Mix A was supposed to be equal to the additives and preservatives found in 2 bags of candy and was similar to a mix used in previous studies; Mix B was equal to 4 bags of candy - meant to represent the average amount of additives a kid on a normal diet might receive. The other half of the kids got a placebo drink that was the same color and flavor as the test drinks, but had no artificial colors or preservatives (I'm sure they managed it somehow). On weeks 3 and 5, everyone got a placebo (these were "washout" weeks).

Then the researchers asked parents and teachers to assess the children's behavior using standard clinical instruments, and also asked independent reviewers to observe the kids at school. The investigators found a significant increase in hyperactivity during the weeks the kids (both the 3-year-olds and the 8/9-year-olds) consumed the drinks containing the artificial colors. Some kids appeared to be more vulnerable to the effect than others, but the overall effect was a 10% increase in hyperactivity. This was similar to the results of an earlier meta-analysis, and, to put it in perspective, the effect of removing additives from the kids' diet is about 1/3 the effect of giving kids Ritalin to calm them down.

The analysis of the study came down to this - some kids are very sensitive to additives, and their behavior will be significantly impacted. With other kids, it won't matter much. The results of the study, however, were impressive enough that the UK and several other European countries banned the use of the studied food additives, so Skittles sold in the UK have no sunset yellow or tartrazine, though as far as I know, Skittles in the US have the same old recipe of hyperactive family fun. Sodium benzoate was not banned.

So why is the Harvard Mental Health Letter so noncommittal? Well, they cite a well-known study of 35 mother-son pairs where the mothers believed the boys, ages 5 to 7, were sugar sensitive. The researchers told the moms that the boys would be randomly assigned to a dose of high sugar or a dose of aspartame. In reality, all the boys received aspartame. The mothers who thought their sons got sugar reported their child's behavior to be significantly more hyperactive afterward. "The researchers concluded that parental expectation may color perception when it comes to food related behaviors." Really? Well, doesn't a double-blind, prospective, crossover design with independent examiners CONTROL for that sort of thing in the Southampton Study? I would think so.

The medical letter goes on to talk about omega 6 to omega 3 fatty acid ratios (they recommended kids consume 12 ounces of low-mercury shellfish or fish a week) and micronutrients (deficiencies in zinc, iron, magnesium, and vitamin B6 have been documented in children with ADHD, but there is no evidence that supplementation is helpful, and megadoses, which can be toxic, should be avoided). Eventually, they recommend "a healthful diet" for kids and minimizing processed and fast food. Well, I can get behind that. And, frankly, I don't recommend a steady diet of Skittles to anyone, though it would be nice to have the option (as the European moms do) of having candy without the crappy additives. Since hyperactivity affects 10% of children (1), what they eat can have a large effect, overall, on kids and parents alike.

Monday, July 12, 2010

D-D-Depression

I was deficient in vitamin D. Of course. I paid attention to the official word about sunshine - it's bad for you. Ultraviolet radiation chops up your skin cell DNA, and with enough scrambled DNA and a bit of bad luck, you will eventually get cancer. There are several major types of skin cancer, but melanoma is the scariest. Also, sun gives you wrinkles and age spots and... so I've been putting on sunblock and avoiding the beach, except for a few days a year, for at least 10 years.

At the same time the dermatologists and women's magazines were scaring us away from the sun, our own fat phobia and a cultural trend of eating less organ meat scared us away from the best dietary sources of some key fat soluble vitamins (A, D, E, and K). We don't want to be low in these vitamins, as they tend to help orchestrate a lot of functions in the body. Vitamin D (which is found in animal fats, but we tend to get about 90% from the sun) in particular seems to be involved in about 10% of the biochemical soupy stuff our body does every day. It has a lot to do with membrane signaling and scavenging up any screwy cells that are starting to go awry (i.e. cancer), and being low in vitamin D seems to put us at hugely increased risk of cancer, including melanoma. And prostate cancer. And breast cancer. And colon cancer. In fact, women diagnosed with breast cancer in the summer and fall have the best prognosis. There are reports of chemotherapy not working as well in the winter. (1)

There are also links to mental health - depression, bipolar disorder, and psychotic disorders (2) have all increased in populations as people stopped working outdoors and went to work inside. The elderly with low vitamin D also have much higher rates of depression (3). In this study of bone mineral density and depression, the elderly with poorer bone status were also more depressed (though vitamin D was not explicitly named as the possible linking factor for both illnesses).

How would vitamin D affect the brain? Vitamin D is involved in the synthesis of the catecholamines (key players in neurotransmission). Summer sunlight increases brain serotonin levels twice as much as winter sunlight (4). Neurons and glial cells in all kinds of areas of the brain have vitamin D receptors on them, indicating a brain that is hungry to use vitamin D. Some effects in the nervous system include the synthesis of neurotrophic factors (what I call "brain fertilizer"), inhibition of the synthesis of an enzyme that churns out nitric oxide, and increasing glutathione levels. (See my previous posts for a molecular description of how some of these brain chemicals are thought to be involved in depression.) Just as vitamin D in the periphery is associated with scavenging and cleaning up cancer cells, vitamin D in the central nervous system seems to be involved in detoxification and anti-inflammatory pathways (5)(6).

Does supplementation help depression? Well, the first several studies were disappointing. Harris and Dawson-Hughes tried treating Seasonal Affective Disorder with 400 IU vitamin D2 daily. Didn't do squat. Of course, D2 is the plant form of vitamin D (the animal form is D3), and 400 IU is a tiny dose anyway. Lansdowne and buddies gave 400 IU and 800 IU of vitamin D3 to healthy subjects in late winter, and found a lightened mood in those receiving the supplements. Hollis gave people with seasonal affective disorder a single 100,000 IU dose of D3 and found it to be more effective than light therapy; the improvement was statistically correlated with the improvement in serum 25(OH) vitamin D levels. In this intriguing study, young adults were given access to tanning beds on Mondays and Wednesdays. One bed had UV light, and an identical bed didn't. On Fridays, the participants were allowed to choose which bed they wanted. 95% of the time, they chose the UV bed, and participants also reported being more relaxed after a UV tan than after a session in the sham bed.

Unfortunately, there is no large, well-designed study of D3 supplementation for depression that I'm aware of. However, there is enough interesting evidence for such a trial to be done, especially in populations that are more likely to be vitamin D deficient, such as the elderly. Like fish oil, vitamin D3 is cheap (about $10 for a three-month supply) and readily available. And given the links to other diseases (heart disease, stroke, osteoporosis, kidney damage, hypertension, you name it (1)), it would seem prudent (and money-saving from a public health standpoint, if a lot of cancer is really prevented by adequate supplementation) to test for and treat deficiency in people with psychiatric disorders.

Another issue is that the RDA for vitamin D is woefully small - about 400 IU daily. This is an amount that will keep you from getting rickets, but it's certainly not an optimal amount for humans. I've heard murmurings that the official RDA is going to be increased to 1000 IU daily, and most decent multivitamins already have 1000 IU of vitamin D (that's why your multi says "250%" of the RDA of cholecalciferol (vitamin D3) - 1000 is 250% of 400, in case you were wondering). The amount in fortified milk is also small, so one would need to drink a truckload for it to matter much.

So how much vitamin D do we need? And hey, isn't vitamin D a fat soluble vitamin, which means we can store it for a long time, and couldn't we get toxic from high amounts? The answer is - we probably need many times the current RDA of vitamin D to reach reasonable serum levels of the stuff, and yes, we can get toxic, but for most people that is not a realistic worry.

According to the Vitamin D Council, a serum level of 50 ng/ml or higher of 25(OH) vitamin D3 is optimal. This level is not without controversy, and 35 is accepted by most as the minimal acceptable level. One probably doesn't want to go above 100, though toxicity has only been reported at serum levels higher than 150 (6). You can't get too much vitamin D from the sun - our skin actually destroys excess vitamin D made there once you have enough for the day. A cool regulatory mechanism if ever I heard one. You *could* theoretically get toxicity from combining high amounts of supplementation *and* lots of sunshine. There's a description on the Vitamin D Council website of one guy who actually did get toxic from supplements - turned out an industrial accident made his particular variety of vitamins (Prolongevity) contain up to 430 times the amount on the label. This guy was taking between 50,000 IU and 2.6 million IU daily for about two years. He recovered (uneventfully) with some medicine and sunscreen.

So how do you know if you have enough vitamin D? Well, if you are a lifeguard in Miami, you're probably fine. If you have very dark skin, unless you are a lifeguard on the equator, you probably need some supplementation. It can take someone with very dark skin about 5-6 times longer in the sun to make adequate vitamin D compared to someone with very pale skin. If you live north of 40 degrees latitude (above New York City), you have only a few weeks in the summer to expose that skin and get the full amount of vitamin D you need to last you the year, and you may have to supplement (again, there is controversy about this, especially as very pale people of Northern European ancestry lived far north of 40 degrees with only a few days a year they could possibly get adequate vitamin D from the sun). Anyway, to really know your blood levels of vitamin D, you need to get a blood test. The key level you need to know is 25(OH) vitamin D3. If your doctor orders 1,25(OH) or just "total vitamin D" you might not get the right number, so make sure you look at the lab slip. If you don't want to go to the doctor, you can go to this website and pay $65 or so for a home testing kit. Unless you live in New York state, where home testing via mailing bloodspot cards is apparently illegal.

So let's say you ordered a home test kit and stabbed your finger and shipped your spot of blood back to the lab, and your level comes out to be 31 ng/ml. There's a general rule of thumb that 1000 IU of supplementation daily will increase blood levels by 10 ng/ml. (Use geltabs in oil suspension rather than tablets, unless you are always going to be taking the supplement with some oil/fat.) So let's say we are aiming for 50 - then one could take 2000 IU D3 daily in the morning. If you were already supplementing at 1000 IU (in your multivitamin, for example), you could take an additional 2000 IU daily, and you could skip the additional supplementation on days you spent time in the sun (without sunscreen - sunscreen blocks the UVB rays that synthesize vitamin D in the skin). Exposing arms and legs for 20 minutes midday in the summertime in Boston about 3-4 times a week would get you a goodly amount (probably around 10,000-12,000 IU with each exposure) if you have pale skin. That kind of exposure is not such a big deal for skin cancer risk, as long as you avoid burning. The farther south you are (until you get to the equator, then reverse!) and the paler you are, the less time you need.
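
If you like, that rule of thumb is simple enough to turn into a few lines of code. Here's a minimal sketch in Python (the 10 ng/ml per 1000 IU slope and the 50 ng/ml target are just the rough figures from above - real-world response varies person to person, so retesting is the only way to know for sure):

    # Back-of-the-envelope vitamin D3 dosing, per the rule of thumb above:
    # each 1000 IU/day of D3 raises serum 25(OH)D by roughly 10 ng/ml.
    # Illustration only - not medical advice.
    def extra_d3_per_day(current_ng_ml, target_ng_ml=50):
        """Estimate the additional daily IU of D3 to reach the target level."""
        gap_ng_ml = max(target_ng_ml - current_ng_ml, 0)
        return gap_ng_ml * 100  # 1000 IU per 10 ng/ml = 100 IU per ng/ml

    print(extra_d3_per_day(31))  # 1900 - round up to 2000 IU daily, as above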

It is standard practice for physicians to treat vitamin D deficiency with 50,000 IU pills once a week for 8-12 weeks, then recheck. Unfortunately, a recent JAMA study of similar treatment in elderly women (admittedly it was 50,000 IU D3 daily for 10 days) resulted in a marked increase in the number of fractures. The editorial accompanying the study suggested 4000 IU daily as a safer, more physiological amount to treat deficiency, and advised making sure you are getting adequate calcium too. However, if you supplement with calcium and vitamin D3, your absorption of calcium can increase quite a bit as your vitamin D levels become adequate (see slides 18-36). Therefore, you may not need as much calcium if you take vitamin D. The recommendations are not set in stone, though. (Our current RDA for calcium may be high simply because we don't get enough vitamin D!) Also, most of the prescription vitamin D doses are D2, not D3, and D2, the plant form, is probably not nearly as effective as the animal-derived D3.
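
As a quick bit of arithmetic on those two regimens (simple averaging, which admittedly glosses over the pharmacokinetics of big weekly doses): 50,000 IU once a week works out to 50,000 / 7, or roughly 7,100 IU per day - nearly double the 4000 IU daily the editorial considered safer and more physiological.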

Here's yet another thing to watch out for with higher-dose vitamin D3 supplementation: occasionally, you will unmask some hyperparathyroidism. If someone's parathyroid is working in overdrive, he or she will start to have serum calcium levels that are way too high, potentiated by the higher doses of vitamin D3. This can be dangerous if it goes undetected, though high calcium levels tend to announce themselves - they can be very uncomfortable, with symptoms of muscle twitching, cramping, fatigue, insomnia, depression, thinning hair, high blood pressure, bone pain, kidney stones, headaches, and heart palpitations. Since bone pain, fatigue, depression, and insomnia can be symptoms of low vitamin D as well, it is important to realize that if your symptoms get worse with supplementation, you should see your doctor and get calcium and parathyroid hormone levels checked. While I personally don't check calcium levels with the initial vitamin D level, I do check them at follow-up (I tend to recheck after three months or so). While home testing is a neat option for the initial level, seeing your doctor about follow-up and his or her suggestions for supplementation is a good idea if your level is found to be low.

And what about those other fat soluble vitamins: A, E, and K? It is important that you have enough of each of them, or things can get a bit screwy. For example, in order to create bone, you need adequate vitamin D (at least a level of 20-30), adequate calcium, AND vitamin K2. The best sources are animal fats, particularly the fats from animals that ate their natural diet - grass for cows, or grubs and grains and whatnot for chickens. So pastured chicken egg yolks, and butter and liver from pastured cows. Conventionally-raised eggs can have about 1/20th the vitamins of pastured eggs, and butter from grain-fed cows may have as little as 1/200th the K2 of pastured butter, so it really does matter what the animals you eat ate. Vitamin A is also found in multivitamins, and it is important not to have too much vitamin D3 and too little A, so I've recommended a multivitamin in addition to vitamin D3 for people who are deficient in serum 25(OH) vitamin D (and aren't big liver eaters :)).

Strict vegetarians - here's another place you need to be super careful about what you eat, and you might need to choke down some fermented soy products (natto) to get enough vitamin K2. K2 isn't found in a standard multivitamin (though we can make K1 into K2 if our intestinal flora is happy, which it might not be on a standard American diet - no idea about the flora on a vegan diet. Interesting question), and it is vital to bone formation and to keeping our arteries resilient. Vitamin K is what warfarin blocks, so don't take K2 if you are on Coumadin for blood clots. (Though why are you at risk for blood clots in the first place? Maybe too much omega 6 compared to omega 3??)

So, a key part of good, lasting health is either to get plenty of (safe - no burns!) sun as our ancestors did, or use today's science to get your blood levels of vitamin D where they need to be. Chat with your doctor about it - and check out the Vitamin D Council Website for more information.

Saturday, July 10, 2010

Born to Run

Have you read this book: Born to Run: A Hidden Tribe, Superathletes, and the Greatest Race the World Has Never Seen?

I love that book. McDougall manages to populate the narrative with a whole cast of wacky, real-life ultrarunning characters. While doing so, he traces the roots of the human race as persistence hunters. Before we invented guns and spear-throwers and whatnot, we ran our prey down. We have the ability to keep up with and surpass (to the point of exhaustion) a horse, an antelope, any herd animal, over the long haul. The way we look, how we eat, how we breathe, why we think ahead, how we might have begun the rather uniquely human process of mentalizing (being able to understand someone else's point of view) - McDougall uses the paradigm of persistence hunting to tell us who we are.

(Here's a link to a youtube video of a persistence hunt of a kudu, narrated by Attenborough himself - well worth a look!)

After reading Born to Run, I bought myself some Vibram FiveFingers and started running again. I'm not the best runner, but I've noticed that since stopping wheat and milk (except butter), I never get stitches in my side anymore. I used to get them with nearly every run, no matter how hard or long I trained. They tended to go away after about 30 minutes, so as long as I could "power through" the pain at the beginning, I could get in a nice long run. One doesn't get through medical school and residency without a bit of tolerance for unpleasantness and pain. It doesn't necessarily make you wise, but it may make you tough. The other day my husband made pizza, and (moderation being my motto - I have wheat maybe twice a month) I had a few slices. The next morning, I couldn't do my usual sprints - stitches in my side! Remarkable, that. I don't know if it is the wheat or the cheese. Butter seems not to have the same effect. But I've digressed.

I don't do too many long runs anymore. I try to work out most days of the week, but it is mostly weights, sprints, long walks or hikes, and the occasional 5K in the neighborhood. Turns out that long, hard runs (or any long, hard cardio workout, such as a hard-going 115-minute spin class) may be bad for you.

I know. It is hard to fathom. I even looked at the studies a few times before I believed them. Here's one of them. Here's an article about another. Turns out, a lot of marathon runners have crappy, plaque-filled coronary arteries. I'm no radiologist. But Kurt Harris is, and his blog post (the follow-up is here) explains the studies better than I ever could.

My mother had a copy of Aerobics for Women by Kenneth Cooper. Mrs. Cooper writes a lot of the book (I suppose to make it more accessible to women), and she describes how, in the 60s, no one exercised, and her husband Dr. Cooper was the neighborhood freak who loved to jog. The conventional wisdom at the time was that exercise was probably bad for you. (The book also had an exercise program for the "young, dating girl" that counted Friday night's dancing date as part of your points for the week. I should watch more Mad Men).

So is exercise bad for you? Aren't we born to run?

No, and yes - but not born to run long distances very fast. Mark Sisson explains all of this very well in his terrific book, The Primal Blueprint: Reprogram your genes for effortless weight loss, vibrant health, and boundless energy. Chris McDougall in Born to Run reports on a persistence hunt where the runners went 10-minute miles. 10-minute miles aren't too fast for hearty, healthy people who run their whole lives long. Mark Sisson breaks it down so that for any "chronic cardio" the best thing is to keep your heart rate below 75% of maximum (in general, (220 minus your age) x 0.75). For most of us, that means a leisurely bike ride, a brisk walk, or a light jog. Chronic hard cardio raises cortisol, stress, and inflammation. Evolutionary Psychiatry is all about anti-inflammation.
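
Sisson's ceiling is easy to compute for yourself. Here's a minimal sketch in Python (using his rough 220-minus-age estimate of maximum heart rate; actual max heart rate varies quite a bit between individuals, so treat the output as a ballpark):

    # Rough "chronic cardio" ceiling per Sisson: 75% of estimated max heart
    # rate, where max heart rate is approximated as 220 minus your age.
    def cardio_ceiling_bpm(age, fraction=0.75):
        """Estimate the heart rate (beats per minute) to stay under."""
        estimated_max_hr = 220 - age
        return round(estimated_max_hr * fraction)

    print(cardio_ceiling_bpm(40))  # 135 bpm for a 40-year-old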

Exercise is good for the psyche! No question (1). Regular moderate exercise, 5 times a week, led to remission of mild or moderate depressive symptoms in 42% of those who did it for 12 weeks. Here's another well-known study from 2000. From an evolutionary medicine perspective, we are meant to move.

Does exercise help you lose weight? Well, "chronic cardio" probably doesn't (and, again, it's not good for you). We tend to eat more to offset the exercise that we do. High intensity interval training (HIIT) definitely does help fat loss (2), and weight training increases muscle mass (thus raising metabolism and aiding fat loss). Hikes and walks are excellent for the soul, and the better your "base" fitness (the more time you spend walking and hiking, etc.), the harder you can push on those once-or-twice-a-week sprints to maximize fat loss. Just steer clear of the chronic, hard-going cardio. Or keep it to a few times a month. Suffering may be good in moderation.

Friday, July 9, 2010

"Epidemiology is Bogus"

The paleoblogosphere has been humming with excitement the past few days over the glorious work done by raw food blogger Denise Minger in her personal examination of the China Study data, a large set of epidemiologic data used by researcher T. Colin Campbell to formulate his book The China Study: The Most Comprehensive Study of Nutrition Ever Conducted and the Startling Implications for Diet, Weight Loss and Long-term Health. In it, Campbell comes to the conclusion that avoiding animal protein is the best way to avoid all sorts of diseases of civilization.

Denise's post uses the same China data set to implicate wheat as a major factor associated with heart disease and cancer, not animal protein. Kurt Harris of PaNu makes a point in his analysis of Denise's work that is the closest to my own struggles combing the literature - association studies are interesting, but troublesome. Conclusions from such studies should be viewed with a furrowed brow. "Hmmm, that's intriguing. I wonder why red-tailed baboons who eat less algae live longer than the red-tailed baboons who eat more algae? Why don't we do a prospective randomized controlled trial of red-tailed baboons and algae eating to sort that one out?" Because associations always come with confounding factors - turns out red-tailed baboons who eat algae love race car driving, and have you ever seen a baboon wearing a seat belt? Sophisticated statisticians will try to account for all these "confounding factors," but such a task can be simply impossible when examining a complex system such as society, or the biochemistry of the human body.

The psychiatric literature is loaded with brief, often useless short-term randomized controlled studies of varying quality, plus association studies. It seems the nutrition literature is even worse. Large, long prospective trials of good quality are horrendously expensive, and may take decades to do properly.

This is why I feel the healthiest and most sensible way to eat is based on an evolutionary paradigm, and why evolutionary-based lifestyle measures are likely to be most effective for our modern, unsettled minds. (I haven't blogged too specifically about these measures yet, as the nutrition aspect is my primary personal interest, but they would include regular exercise, meditation (as a proxy for the intense in-the-now concentration we used to use for hunting and gathering), proper sleep, working with the hands, and various other social/fun stress-reducing activities.) It is not that there is a huge amount of scientific data backing that up - it is that until we have exhaustively proven otherwise (which so far, in my mind, we haven't), at least scientists can concede that our exceedingly complex human design is based on adaptations to our ancestors' lives.

For public health and sanity, I believe in an evolutionary viewpoint. And animal protein :)

Thursday, July 8, 2010

Vegetarianism and Eating Disorders

A study from the Journal of the American Dietetic Association hit the press in April of 2009 with a rather provocative association - young adult and adolescent vegetarians have an increased risk of eating disorders.

Let's examine the data for a moment. 2516 participants from Minnesota schools (surveyed in 1999 and 2004), aged 15-18 and 19-23, answered questionnaires regarding food, exercise, weight, binge eating behaviors, dieting behaviors, demographics, and substance abuse. A huge cohort was originally recruited in ethnically diverse high schools, but many participants (around 50-60%, depending on the cohort) were lost to follow-up. The "EAT-II" sample (of 2516 participants) who answered both surveys was more likely to be female and less ethnically diverse than the overall study sample, which is important to know, as the EAT-II folks were used for the article I'm reviewing here.

Of the 2516 folks, about 15% labeled themselves as current or former vegetarians, but very few were vegan, as 95% of them consumed milk products, 87% consumed eggs, 46% consumed fish, and 25% consumed chicken. Most were vegetarian because they wanted a more healthful diet; smaller percentages didn't want to kill animals, didn't like meat, wanted to help the environment, wanted to lose or maintain weight, or were vegetarian for family or religious reasons. In the older cohort, vegetarians were thinner (not true of the younger ones), and all of the vegetarians tended to eat more fruits and vegetables and less fat than non-vegetarians. Overall exercise patterns were about the same between vegetarians and non-vegetarians, and younger vegetarians were less likely to drink or smoke cigarettes, but more likely to use "non-marijuana" drugs, than non-vegetarians.

Here's where it gets interesting - vegetarians were more likely to engage in "extreme unhealthful weight control behaviors" (p<0.005) and "binge eating with loss of control" (p<0.001) than the nonvegetarians. The extreme weight control behaviors included vomiting and the use of diet pills, laxatives, and diuretics, and approximately 1 in 4 current adolescent and young adult vegetarians admitted to using one or more of those methods to lose weight. Only 1 in 10 never-vegetarians admitted the same. So what comes first - the decreased consumption of chicken or the semi-avoidance of eggs? Well, this article (or the abstract anyway - the original article is in Hebrew!) suggests that vegetarianism precedes eating disorders, though if one talks to patients with eating disorders, many will explain that they told people they were vegetarians so they could avoid eating in social situations.

The authors of the 2009 study, in an editorial in the same issue, expressed their belief that vegetarianism did not cause eating disorders, but could be used as a marker - a cue for concerned parents and health care practitioners to be more suspicious of an eating disorder in a young adult or teenager who is vegetarian. Lierre Keith (not surprisingly) has a different view in her book, The Vegetarian Myth: Food, Justice, and Sustainability (p230-4). Nutrition therapists at eating disorder clinics in Indiana, Boston, and Los Angeles reported that between 30-50% of their patients were vegetarian. Julia Ross, a nutrition writer, thought the reason might be the lower amount of tryptophan in a vegetarian diet (1). Recall that tryptophan is the precursor to serotonin, so without enough of it, we are vulnerable to anger, anxiety, depression, and addictive behaviors. Zinc is a mineral tough to come by in a vegetarian diet (it is found mostly in egg yolks and red meat), and zinc deficiency is known to cause depression, obsessive and compulsive behavior, and eating-disordered behavior. Supplementation with zinc is a known and extremely helpful treatment for anorexia nervosa, helping sufferers regain weight faster than those without the supplement (2). Binging and vomiting can trigger an endorphin rush, which can temporarily mask anxiety and depression (bulimics given an opiate blocker, naltrexone, report symptoms of opiate withdrawal, according to Ross). Fasting and extreme dieting deplete other vitamins, like thiamine, which can cause loss of appetite, making it easier to fast - on and on and on.

Lierre Keith has her axe to grind, but I do think her book is an important counterpoint to the milder view (and worth a read for vegetarians and meat-eaters alike). Julia Ross's theories and her clinical experience are also intriguing. Vegetarians tend to be healthier and thinner than those who eat a standard American diet (which, I'll say, is not saying much), but there may be a hidden cost, and when a young person is a vegetarian, it seems that one should be somewhat suspicious that there is a darker reason than not wanting to kill animals.