Monthly Archives: March 2014

The Health Benefits of White Chocolate (yes, they exist)

I have a strange chocolate-related pet peeve. When someone tells me “You like white chocolate? But that’s not even real chocolate!”…it makes me want to hurl a block of white chocolate right at their smug face.

White chocolate gets absolutely no respect, while dark chocolate can apparently cure cancer and bring the dead back to life. And the darker the better. If you like 70% dark, your foodie friend will undoubtedly pooh-pooh that in favor of 85% or darker. Well, it’s finally time to stand up, white chocolate lovers! All three of you. Because white chocolate is healthy, and here’s why.

The advantages of white chocolate

First off, dark chocolate is healthy, and is a PHD-recommended “supplemental food”. But just as Arnold Schwarzenegger overshadows Danny DeVito in the movie “Twins”, so goes the relationship between dark and white chocolate. They share the same mother (the cacao tree), and the more popular sibling has desirable traits absent in the less popular sibling (flavonoids and psychoactive compounds). But just like Danny DeVito, white chocolate actually has a heart of gold (or rather, a heart of stable fatty acids).

White chocolate is simply cocoa butter plus some added milk, while dark chocolate is cocoa butter with (dark) cocoa particles added instead of milk. Each can have varying levels of sugar. White chocolate undergoes remarkably little oxidation during storage or cooking, unlike vegetable oils that autoxidize and may become carcinogenic. A quick look at the fatty acid profile of cocoa butter explains its stability. It’s made up of about 60% saturated fat split between stearic and palmitic acids, about 30% monounsaturated fat, and only 3% polyunsaturated fat. The high stearic acid content means that cocoa butter has a smaller effect on cholesterol than do other fats with similar saturated percentages.
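
To put those percentages in gram terms, here is a back-of-envelope sketch (assuming cocoa butter is essentially pure fat and an ounce is about 28 g; the percentages are the ones cited above):

```python
# Back-of-envelope fatty acid breakdown of cocoa butter, per ounce.
# Assumes cocoa butter is essentially pure fat (~28 g per ounce).
ounce_g = 28
profile = {
    "saturated (stearic + palmitic)": 0.60,
    "monounsaturated (mostly oleic)": 0.30,
    "polyunsaturated": 0.03,
}
for fat, fraction in profile.items():
    print(f"{fat}: {ounce_g * fraction:.1f} g per ounce")
# Polyunsaturated fat comes out to under 1 g per ounce -- hence the stability.
```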

[Photo: Cats can’t read nutrition labels…or can they?]

Plus cocoa butter has been shown to improve resistance to oxidation in rats when compared to vegetable oil. Cocoa butter may even help protect against fatty-liver related endotoxemia, via upregulation of an enzyme called ASS1 (argininosuccinate synthase 1, hee hee).

Dark chocolate sometimes contains mycotoxins and aflatoxins (which can be carcinogenic, depending on dose), whereas white chocolate doesn’t. While cocoa butter doesn’t have nearly as many health benefits as dark chocolate, it’s been shown to benefit platelet function and potentially atherogenesis more than dark chocolate (in men, but not in women…why must you be so confusing, science?).

Finally, white chocolate isn’t toxic to dogs and cats due to its low theobromine levels, whereas a high-percentage dark chocolate bar could kill a small dog. White chocolate also has negligible amounts of caffeine and other bioactive compounds, since these are water-soluble and hence removed during processing. For those sensitive to dark chocolate, who exhibit symptoms such as intestinal distress or headaches after consumption, white chocolate may be a good alternative.

Why is white chocolate a black sheep?

So why is cocoa butter rarely eaten (outside of chocolate bars), when other low-PUFA fats such as coconut oil and ghee are prized? Three reasons stick out: price, availability, and stigma. High-quality dark chocolate costs around a buck an ounce or less. It’s hard to even find high-quality white chocolate outside of some internet stores, and the price of edible pure cocoa butter is typically 50% or more above that of dark chocolate. Plus pure cocoa butter doesn’t really come in the form of standard-sized chocolate bars.

If you want bars, you have to jump down from 100% cocoa butter to around 30% or much lower, and these bars typically have almost 50% sugar content. Note that in 2002, the FDA set a minimum cocoa butter percentage of 20% in order for a product to be labeled “white chocolate”. But this definition also included minimum levels of milk solids and milk fat, so those staying away from milk may prefer pure cocoa butter. In my Willy Wonka white chocolate utopia, there would be 70%, 85%, and 92% white chocolate bars available at every grocery store, but as it stands you have to buy sugary white chocolate or make your own lower sugar version using cocoa butter.

“Cocoa Cooking”: say it three times fast

Stigma is a real problem for white chocolate producers. For some reason, the white chocolate lobby simply doesn’t have the power of Big Pharma or Tobacco, so you’ll never see ads encouraging you to try healthy cocoa fat in recipes. But you definitely can. The popularity of mole sauce suggests that even strong chocolate flavors can be handled in savory dishes. One chef says:

“It’s important that people break out of the mindset of chocolate as candy. Chocolate is chocolate. If you let it stand on its own, it becomes an ingredient like cumin or butter. If you let go of what your preconception of chocolate is–which for 99.9 percent of the people is a candy bar–it becomes another culinary weapon in your larder.”

Indeed, for most of its history, chocolate was exclusively used with savory/spicy ingredients. Cooking cocoa butter with meats or veggies may give them a heartier flavor, or you could try something more daring like salmon with white chocolate sauce. Snacks made with cocoa butter may be more practical than snacks high in coconut oil, as cocoa butter doesn’t melt around room temperature like coconut oil does.

Compared to dark chocolate, white chocolate needs much less sugar, since there’s far less bitterness to mask (especially for us supertasters). It also complements delicate flavors, whereas bitter dark chocolate overwhelms them, so my hunch is that white chocolate (or just cocoa butter) could provide a versatile base for things other than just sauces and desserts.

If you cook for someone, and the dish has cocoa butter or white chocolate in it, be prepared to defend your daring choice. Concerning the claim that white chocolate isn’t actually chocolate, I find that a bit ludicrous. When cocoa beans are pressed, they ooze out cocoa butter. It’s very similar to the relationship between olives and olive oil, or coconut flesh and coconut oil. When’s the last time you heard “Olive oil? But that isn’t even real olive!” I’ll chalk the misunderstanding up to the prevalence of fake white chocolates (below 20% cocoa butter or often none at all) combined with the rarity of dishes made with cocoa butter. You can’t like white chocolate if you haven’t had it, or if all you’ve had is an overly-sugared white chocolate bar.

Conclusion

Theobroma, the genus containing the cacao tree, translates to “food of the gods”. Chocolate was used as currency by the ancient Mayans and stored alongside gold and precious stones. While dark chocolate is currently exalted as a king among medicinal foods, white chocolate has become the court jester. This is a shame, as white chocolate holds promise as a healthy and underutilized cooking ingredient.

White chocolate and cocoa butter keep extremely well at room temperature, so don’t be afraid to buy some and do your best Iron Chef impression. If the dish works out, feel free to comment here with the recipe. And if you find a high-quality white chocolate bar low in sugar, let me know. Kamal/Chocomal/Caramal excels in taste-testing.

Is Mineral Water an Underrated Supplement?

Many of us try to eat an ancestrally-influenced diet, which brings up a question: should we also drink the same types of water our ancestors did? There are a few practical reasons why mineral water makes for an interesting source of nutrients. Unlike food, water is calorie-free. And unlike supplements, you don’t have to remember to take water each day. Plus some people really enjoy the taste of mineral water.

So let’s take a peek into the world of mineral waters and health. The reason we’re focusing on mineral water is that it’s a type of water that contains measurable nutrients, and is thus somewhat less susceptible to pseudoscientific claims (yes, I’m looking at you “alkaline water”).

What does mineral water have that you want?

Minerals, duh! Which ones though? The Cadillac of mineral waters, Gerolsteiner, tested as having 112 mg of magnesium per liter of water and 368 mg of calcium, along with 134 mg of sodium.

The Ford Focus of mineral waters, Poland Spring, has just one milligram of magnesium and calcium per liter, and four times that much sodium! (meaning four milligrams…just showing the power of ratios to deceive) So what do these numbers mean in terms of health benefits or detriments?
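
For concreteness, here’s a quick sketch comparing the two waters against rough daily targets. The targets are ballpark adult values I’m assuming for illustration, not official recommendations for any particular person:

```python
# Mineral content per liter (mg) from the figures cited above, compared
# against rough daily targets. Targets are assumed ballpark adult values.
waters = {
    "Gerolsteiner": {"Mg": 112, "Ca": 368, "Na": 134},
    "Poland Spring": {"Mg": 1, "Ca": 1, "Na": 4},
}
targets = {"Mg": 400, "Ca": 1000}  # mg/day (assumed)

for name, minerals in waters.items():
    for mineral, target in targets.items():
        pct = 100 * minerals[mineral] / target
        print(f"{name}: 1 liter covers {pct:.0f}% of daily {mineral}")
# Gerolsteiner: ~28% of Mg and ~37% of Ca per liter.
# Poland Spring: well under 1% of each, whatever its Na:Mg ratio suggests.
```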

Let’s start with magnesium. Magnesium is a PHD-recommended daily supplement, unless you get enough from food. Rather than choking down two horse pills, some people prefer powder, Epsom salt baths, or spray. Magnesium has oodles of therapeutic functions outside of its commonly known roles in heart and bone health, extending to bowel disease, migraines, anti-aging effects, and reducing chronic inflammation. Unfortunately, magnesium levels have taken a hit from routine municipal water softening as well as lower concentrations in crops.

Most tap water contains negligible amounts of magnesium. So unless you live in Lubbock, Texas (where the tap water has 60 mg/L of magnesium, almost twice as much as any other major US city), bottled waters are the best option for liquid magnesium replenishment. My local Trader Joe’s sells Gerolsteiner and San Pellegrino, and other grocery stores sell Perrier and Evian. The latter two contain very little magnesium, while San Pellegrino has half as much as Gerolsteiner.

Sodium and calcium…eh

High-sodium mineral waters sometimes get a bad rap, even though low salt intake is associated with higher mortality rates. Plus the sodium in mineral water is usually in the form of sodium bicarbonate, which has been shown to actually decrease blood pressure in hypertensive patients.

If you’ve read the Perfect Health Diet book then you know the potential dangers of supplementing calcium. On the other hand, for those who lack leafy greens or dairy in their diets, mineral waters could be a good option. European mineral waters, that is. As might be expected of the land of higher life expectancy, finer wines, longer maternity leaves, and smellier cheeses, European mineral waters tend to have far more calcium (and magnesium) than their American counterparts. Popular European waters such as the aforementioned Gerolsteiner, San Pellegrino, Perrier, and Apollinaris all have between roughly 100 and 370 mg of calcium per liter. The higher end of this range would bring a calcium-deficient diet close to optimal levels when drinking 1-2 liters a day.

“Trace” Minerals sound so insignificant

…but they’re not! One of the most important trace minerals is lithium. While high-dose supplementation of lithium may impair immune and thyroid function (these doses are prescribed for psychiatric disorders), an optimal lower dose (but higher than what Americans typically take in through food) is linked to longer lifespans and lower rates of mental illness. Areas where tap water has the lowest lithium levels have higher suicide and homicide rates.

Rather than splitting lithium supplement pills to get small enough doses, one could get low doses of lithium through mineral water. “Lithia waters”, mineral waters high in lithium, were a craze in the late 1800s and early 1900s due to numerous testimonials on their miraculous health benefits. But most mineral waters actually have quite low lithium levels, so you have to look hard to find a water that provides enough to equal very low-dose supplements.

If you can find a brand that has somewhere between 0.1-0.3 mg per liter (or more) of lithium, it may produce some beneficial effects. Gerolsteiner, for example, contains 0.13 mg per liter. It might not take much to produce benefit: microdoses of lithium as small as 0.3 mg per day have been shown to improve cognitive impairment in Alzheimer’s patients. Note that diets low in plants and seafood typically have lower lithium levels, and lithium concentrations in plants vary widely (Texas and other western states have much higher lithium levels in soil and water than the rest of the country).
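
If you want to see how far a bottle gets you, the arithmetic is simple division, using the figures above:

```python
# Liters of Gerolsteiner needed to reach the low-dose lithium range above.
gerolsteiner_li_mg_per_l = 0.13
for target_mg in (0.1, 0.3):
    liters = target_mg / gerolsteiner_li_mg_per_l
    print(f"{target_mg} mg/day of lithium ~ {liters:.1f} L of Gerolsteiner")
# ~0.8 L for the low end of the range, ~2.3 L for the high end.
```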

To sparkle or not to sparkle, that is the question

Some people love sparkling water. But in an informal survey (me surveying myself), sparkling water proved difficult to drink in large quantities: it doesn’t go down as smoothly while gulping, and it makes you burp. That undercuts the ease of using water as a supplement. There’s an obvious way to get around this: just let the water go flat.

Alternatively, you can choose a water that is not carbonated (aka “still water”). The problem is that still waters typically have much lower mineral levels than naturally carbonated waters. For example, Gerolsteiner offers a still water that has less than half as much magnesium as its popular sparkling water. Gerolsteiner’s website explains why this is:

“Gerolsteiner takes its mineral water from various sources in the depths of the Volcanic Eifel…It is the natural carbonic acid that allows the water to absorb the valuable minerals and trace elements from the rock.”

Note that a couple of California-sourced still mineral waters, such as Adobe Springs (also sold as “Noah’s Spring Water”), have magnesium levels comparable to fancy European sparkling waters, and also have that distinctive mineral water taste. But while some people love the taste of mineral water, others find hard water off-putting. To counter this, you can add lemon or let some cut berries infuse in it. Classy!

What does mineral water have that you don’t want?

One notable effect of drinking mineral water is a reduction in the mean weight of your wallet. To reduce the cost and help with portability, you could try Concentrace, a concentrated little bottle of dried minerals from the Great Salt Lake that has the salt removed. I’m torn between the ease of using Concentrace and its possible dangers. First the good part: you just put a few drops of Concentrace in your water, and it becomes highly mineralized with not just magnesium but a variety of trace minerals. Concentrace may improve joint pain, as shown in this trial of knee osteoarthritis (although this study doesn’t appear to be indexed by PubMed…hmmm…).

However, be careful with the Concentrace. Tap water is not allowed to have more than 0.01 parts per million of arsenic. The Concentrace instructions say to use a total of 40-80 drops per day, so let’s say you use 20 drops in a glass of water. That comes out to right around 0.01 parts per million of arsenic. Uh-oh?
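
Here’s the rough arithmetic behind that estimate. Both the drop volume and the arsenic concentration of the concentrate are my assumptions for illustration (chosen to be consistent with the estimate above), not figures from the Concentrace label:

```python
# Rough arsenic arithmetic for mineral drops in a glass of water.
# Both the drop volume and the concentrate's arsenic level are assumptions.
drops = 20
drop_ml = 0.05                 # ~20 drops per mL (assumed)
arsenic_mg_per_l = 2.5         # assumed arsenic level of the concentrate
glass_l = 0.25                 # a 250 mL glass of water

arsenic_mg = drops * drop_ml / 1000 * arsenic_mg_per_l
ppm = arsenic_mg / glass_l     # for water, mg/L is effectively ppm
print(f"{ppm:.3f} ppm arsenic")  # ~0.010 ppm, right at the tap water limit
```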

So is mineral water an underrated supplement?

There’s probably a reason why mineral water springs were highly prized by so many ancient cultures, with people traveling many miles to seek health benefits. In modern times, the World Health Organization has recognized magnesium levels in drinking water to be an important public health issue, due to the possible heart disease benefits of drinking hard water.

We didn’t even get into potential benefits from taking in higher amounts of other trace minerals. Nor did we discuss benefits from higher intake of bicarbonate, which is present in many mineral waters. Or how about skin hydration benefits, or the ability of hard water to avoid the mineral leaching that happens when boiling foods in soft water?

The magnesium and calcium in mineral water are also typically highly bioavailable (even more so when consumed with a meal). All in all, mineral water may be a useful addition for those who can afford it, as it can provide a reliable daily boost to levels of important nutrients.

The Case of the Killer Protein

Earlier this week a paper was released to much fanfare, claiming that diets with over 20% of energy as animal protein might be as life-threatening as smoking.

  • The Huffington Post said, “Atkins aficionados, Paleo enthusiasts, and Dukan devotees, you may want to reconsider what’s on your plate. While high-protein diets have been all the rage over the last few years for their waist-whittling goodness, a new study says they could be as bad for you as smoking.”
  • Scientific American said “People who eat a high-protein diet during middle age are more likely to die of cancer than those who eat less protein, a new study finds.”
  • NPR said, “Americans who ate a diet rich in animal protein during middle age were significantly more likely to die from cancer and other causes.” They added, “In an age when advocates of the Paleo Diet and other low-carb eating plans such as Atkins talk up the virtues of protein because of its satiating effects, expect plenty of people to be skeptical of the new findings.” A sound prognostication!

Ray, Alex, Navy87Guy, Kat, Sam, and others asked for my thoughts.

What the Researchers Did

The article appeared in Cell Metabolism, a high-impact journal which likes long complex papers reporting years of work. [1] A common strategy for getting into such journals is to piece together a great variety of work into one article, weaving a narrative theme to unite them. That’s what this article did, using the theme “high protein diets may shorten lifespan” to link several relatively disconnected projects.

The NHANES Findings

The work that generated most of the buzz was an analysis of data from the National Health and Nutrition Examination Survey (NHANES). They looked at a group of 6,381 NHANES respondents and found, “Respondents aged 50–65 reporting high protein intake had a 75% increase in overall mortality and a 4-fold increase in cancer death risk during the following 18 years. These associations were either abolished or attenuated if the proteins were plant derived.”

Here’s their Figure 1:

[Image: Longo et al., Figure 1]

Two oddities in this result raise red flags:

  • First, protein appears harmful at age 50, neutral at age 65, and beneficial at age 80. This reversal of effects is incompatible with most mechanisms by which protein could affect aging or disease risk. In animal studies, we see the opposite pattern: protein restriction extends maximum lifespan (lower mortality at high ages) but increases the risk of early death (higher mortality in middle age).
  • Second, they report that the effect was specific to animal protein: “[W]hen the percent calories from animal protein was controlled for, the association between total protein and all-cause or cancer mortality was eliminated or significantly reduced, respectively, suggesting animal proteins are responsible for a significant portion of these relationships. When we controlled for the effect of plant-based protein, there was no change in the association between protein intake and mortality, indicating that high levels of animal proteins promote mortality.” Yet, plant and animal proteins are biologically similar.

These two oddities strongly suggest that the appearance of negative health outcomes from protein is due to confounding factors – behaviors or foods associated with animal protein consumption in middle age, rather than effects caused by the protein itself.

When we look at how the analysis was performed, we find more reasons to doubt that protein is at fault. All of this data was found using a model which adjusted for the following covariates:

Model 1 (baseline model): Adjusted for age, sex, race/ethnicity, education, waist circumference, smoking, chronic conditions (diabetes, cancer, myocardial infarction), trying to lose weight in the last year, diet changed in the last year, reported intake representative of typical diet, and total calories.

Adjustment for a host of health-related conditions – waist circumference, diabetes, cancer, myocardial infarction, and even total calories, which is effectively a proxy for obesity – can radically distort results, and even transform effects from positive to negative. I’ve discussed this issue previously in The Case of the Killer Vitamins.

In practice, many factors are highly correlated. The variables being studied – protein intake, waist circumference, total calorie intake, and others – are beset by the problem of collinearity. Attempting multiple regression analysis on collinear variables can generate very peculiar results, and the more adjustment factors you add, the stranger things tend to happen to the data.
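
Here’s a minimal simulation (synthetic data, not the NHANES set, with illustrative variable names) of what collinearity does to regression coefficients: two nearly identical predictors jointly predict the outcome well, yet their individual coefficients are wildly unstable, swinging from negative to positive across resamples of the same data:

```python
# Synthetic demonstration of coefficient instability under collinearity.
# Variable names are illustrative, not the paper's actual covariates.
import numpy as np

rng = np.random.default_rng(0)
n = 200
protein = rng.normal(size=n)
calories = protein + rng.normal(scale=0.05, size=n)  # nearly collinear
risk = protein + calories + rng.normal(size=n)       # only their sum matters

def protein_coef(idx):
    X = np.column_stack([np.ones(n), protein, calories])[idx]
    return np.linalg.lstsq(X, risk[idx], rcond=None)[0][1]

# Bootstrap resamples: the "effect" of protein flips sign from run to run.
coefs = [protein_coef(rng.integers(0, n, n)) for _ in range(1000)]
print(np.percentile(coefs, [2.5, 97.5]))  # interval spans negative to positive
```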

If they wanted us to understand whether their results are trustworthy, the authors would present the raw data, then a sensitivity analysis showing how introducing each covariate individually affects the results, then how combinations of two covariates affect the results, and so forth. This would help us judge how robust the results are to alternative methods of analysis.
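
A skeleton of that sensitivity analysis might look like the following. This is a sketch on fabricated placeholder data with hypothetical column names; the real exercise would use the actual NHANES variables:

```python
# Skeleton of a one-covariate-at-a-time sensitivity analysis.
# The data here is random placeholder data; column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame(rng.normal(size=(n, 4)),
                  columns=["protein_pct", "waist", "smoking", "calories"])
df["died"] = (rng.random(n) < 0.2).astype(int)

for extra in ["waist", "smoking", "calories"]:
    X = sm.add_constant(df[["protein_pct", extra]])
    fit = sm.Logit(df["died"], X).fit(disp=0)
    # Watch how the protein estimate moves as each covariate enters.
    print(extra, round(fit.params["protein_pct"], 3))
```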

Of course, authors do not do this. Instead, they ask us to trust the analysis they have chosen to present – which is only one of billions they could have done. (This study adjusted for 13 covariates. The NHANES survey may have gathered data on, say, 40 variables. There are 40 choose 13, or about 12 billion, possible multivariate regression analyses that could be performed using 13 covariates on this data set. Each of those 12 billion analyses would generate different outcomes.)
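
That count is easy to verify with Python’s standard library:

```python
# Number of ways to choose 13 covariates out of 40 candidate variables.
import math
print(math.comb(40, 13))  # 12033222880, i.e. about 12 billion
```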

Are the authors trustworthy? Unfortunately, most academics today are not. Career and funding pressures are severe, and by and large those who are good at gaming the funding and publishing processes have triumphed professionally over careful, diligent truth seekers. It is much easier to construct a narrative that will garner attention and publicity and interest, than to carefully exclude non-robust results and publish only those results that are solidly supported.

Frankly, I give little credence to their NHANES analysis. And, judging by comments in the press, other epidemiologists don’t seem to give it much credence either. From the NPR article:

But could eating meat and cheese really be as bad for you as smoking, as the university news release describing the new Cell Metabolism paper suggested?

Well, that may be an exaggeration, according to Dr. Frank Hu, a researcher at the Harvard School of Public Health who studies the links between health, diet and lifestyle.

“The harmful effects of smoking on cancer and mortality are well-established to be substantial, while the harmful effects of red meat consumption are modest in comparison,” Hu wrote to us in an email.

The Mouse Experiments

So let’s turn to the next part of the study, the mouse experiments:

Eighteen-week-old male C57BL/6 mice were fed continuously for 39 days with experimental, isocaloric diets designed to provide either a high (18%) or a low (4%–7%) amount of calories derived from protein …

The low protein diets are really starvation diets, in terms of protein intake. The reason the low protein diets were sometimes 4% and sometimes 7% was that mice will often lose weight on 4% protein diets due to starvation (in the paper’s experiments on BALB/c mice, “the mice had to be switched from a 4% to a 7% kcal from protein diet within the first week in order to prevent weight loss.”). Animal care committees do not allow experiments to continue if the mice are obviously starving.

[B]oth groups were implanted subcutaneously with 20,000 syngeneic murine melanoma cells (B16).

This is an unusually small number of cells. Typically, cancer researchers implant a million cells to create a syngeneic tumor. Presumably they used this small number of cells in order to ensure that some mice would not develop tumors during the 39 day experiment. As it happened, this was a lucky (canny?) choice of cell quantity: while 10 of 10 mice on the high-protein diet developed tumors during the experiment, only 9 of 10 mice on the low-protein diet did. If they had used more cells, all mice on both diets would have developed tumors; if they had used fewer cells, some mice on the high protein diet would have failed to develop tumors. Either way, the results would appear less damning for the high protein diet.

The outcomes:

[Image: Longo et al., Figure 3]
Due to the small number of cells injected, it takes at least two weeks before the tumors reach detectable size (normally they would be visible in ten days). They seem to be similar in size at about two weeks after implantation.

However, when the tumors reach larger sizes, growth is impaired on the low protein diets. A mouse weighs 20 grams, and a 2000 mm3 tumor weighs 2 grams, or 10% of body weight – equivalent to a 15-pound tumor in humans. Growing a tumor of this size requires building a large amount of tissue — blood vessels, extracellular matrix, and more. The ability to construct new tissue is constrained on a protein-starved diet, so it’s not surprising that tumor growth is slower when the tumor is large and protein is severely restricted.
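
The scaling arithmetic, spelled out (assuming tumor tissue density near 1 g/cm³, and a 150 lb human for the comparison, since that is the weight implied by the 15-pound figure):

```python
# Tumor burden arithmetic from the paragraph above.
tumor_mm3 = 2000
tumor_g = tumor_mm3 / 1000        # 1000 mm^3 = 1 cm^3 ~ 1 g of tissue
mouse_g = 20
fraction = tumor_g / mouse_g      # 0.10, i.e. 10% of body weight
human_lb = fraction * 150         # ~15 lb tumor in a 150 lb human (assumed)
print(fraction, human_lb)         # 0.1 15.0
```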

Animal protocols generally require that mice be sacrificed when tumors reach 2000 mm3. Extrapolating the tumor growth curves, it looks like the mice in experiment (B) would be sacrificed 5 weeks after implantation on the high protein diet, or 8 weeks after implantation on the low protein diet; in experiment (G), mice on the high protein diet would be sacrificed about 9 weeks after implantation, while mice on low protein diets would have been sacrificed about 11 weeks after implantation.

In other words, tumors still kill you, just a bit more slowly if you are starving yourself.

It’s important to note a couple of things. First, the word “starving” is appropriate. 4% to 7% protein intakes are starvation levels for mice. In a nice blog post closely relevant to this topic, Chris Masterjohn notes that a 5% protein intake completely stunts the growth of young rats:

Chris rhetorically asks: “How many of us would deliberately feed a two-year old a diet that would cause them to stop growing altogether?”

Second, as Chris also points out in the same post, such low protein intakes actually make cancer more likely in the context of exposure to mutagens. For instance, aflatoxin exposure leads to cancer (or pre-cancerous neoplasms) much more frequently in rats on low-protein diets than in rats on high-protein diets:

In this experiment, there were two diets, 5% protein and 20% protein, and two diet periods, one during exposure to aflatoxin and one afterward. Rats exposed to aflatoxin while on a 5% protein diet were far more likely to develop neoplasms than rats exposed to aflatoxin on a higher protein diet. That is, the “20-5” rats had far fewer cancers than the “5-5” rats, and the “20-20” rats had far fewer cancers than the “5-20” rats. High protein for the win!

However, once the rats had neoplasms, the tumors grew more slowly on the low-protein diet. Just as the new study found.

So, if your goal is to avoid getting cancer, it is better to eat adequate protein. If you already have cancer, or if researchers have injected you with highly metastatic melanoma cells, you can buy yourself slightly slower tumor growth by starving yourself of protein. In laboratory mice, this extends lifespan a few weeks because they are not allowed to die from cancer, but are sacrificed when tumors reach a specific size. In humans, however, cancer death commonly follows from cachexia, or wasting of lean tissue. A low protein diet might promote cachexia and accelerate cancer death in humans. It is not possible to infer from this study that there would be a clinical benefit to a low protein diet in human cancer patients.

Other Negative Effects of Low-Protein Diets

The study noted a significant negative effect of low protein diets in older mice. While young mice (18 weeks, equivalent to young adults) lost only a few percent of body weight on the starvation low protein diets, elderly mice (2 years old) wasted away on low protein diets. The data:

[Image: Longo et al., Figure 4]

Both young and old mice managed to gain a bit of weight on the high protein diets, and both young and old mice lost weight on the low protein diets. The weight loss was much more severe in elderly than young mice.

Considering that wasting away commonly precedes death in the elderly, this is not a good sign for the low protein diets. The authors themselves argue that this is consistent with the NHANES finding that high protein diets become beneficial after age 65: “old but not young mice on a low protein diet lost 10% of their weight by day 15, in agreement with the effect of aging on turning the beneficial effects of protein restriction on mortality into negative effects.”

However, while I think it is clear that the dramatic weight loss in the elderly mice fed low protein is harmful, it is far from clear that the slight weight loss of the younger mice was harmless. Though they maintained their weight better than elderly mice, they may have been starving as well. To actually support the NHANES survey, the researchers should have maintained the mice on low or high protein diets for several years, and seen which group lived longer. They did not do this.

If they had, I speculate that the high protein mice would have lived longer.

Conclusion

This is a study in the line of T. Colin Campbell and other vegetarians who have tried to show that animal protein promotes cancer and mortality. These studies are unconvincing. They simply do not prove the conclusions they purport to draw.

The Perfect Health Diet takes a middle ground in regard to protein: we recommend eating about 15% protein, and argue that both high protein and low protein diets are likely to be harmful. High protein diets may harm by accelerating aging or by making protein available to gut bacteria for fermentation (producing a less beneficial gut flora and generating nitrogenous toxins); low protein diets, by starving the body of a key nutrient needed to maintain bodily functions, especially liver, kidney, and immune function.

Nothing in this study persuades me that those recommendations need revision.

References

[1] Levine ME et al. Low Protein Intake Is Associated with a Major Reduction in IGF-1, Cancer, and Overall Mortality in the 65 and Younger but Not Older Population. Cell Metabolism 19, 407–417, March 4, 2014. http://www.cell.com/cell-metabolism/retrieve/pii/S155041311400062X.