
Evidence for Jaminet’s Corollary

Note to Abby: I did get distracted. Lemon juice next week.

In Friday’s post, I offered Jaminet’s Corollary to the Ewald Hypothesis. The Ewald hypothesis states that since the human body would have evolved to be disease-free in its natural state, most disease must be caused by infections. A consequence of the Ewald hypothesis is that, since microbes evolve very quickly, they will optimize their characteristics, including their virulence, depending on the human environment. If human-human transmission is easy, microbes will become more virulent and produce acute, potentially fatal disease. If transmission is hard, microbes will become less virulent, and will produce mild, chronic diseases.

Jaminet’s corollary is that such an evolution has been happening over the last hundred years or so, caused by water and sewage treatment and other hygienic steps that made transmission more difficult. The result has been a decreasing number of pathogens that induce acute deadly disease, but an increasing number that induce milder, chronic, disabling disease.

Indeed, most of the diseases we now associate with aging – including cardiovascular disease, cancer, autoimmune diseases, dementia, and the rest – are probably of infectious origin and the pathogens responsible may have evolved key characteristics fairly recently. Many modern diseases were probably non-existent in the Paleolithic and may have substantially changed character in just the last hundred years.

I predict that pathogens will continue to evolve into more successful symbionts with human hosts, and that chronic infections will have to become the focus of medicine.

Is there evidence for Jaminet’s corollary?  I thought I’d spend a blog post looking at gross statistics.

When did hygienic improvements occur?

Since the evolution of pathogens should have begun when water and sewage treatment were adopted, it would be good to know when that occurred.

Historical Statistics of the United States, Millennial Edition, volume 4, p 1070, summarizes the history as follows:

[I]n the nineteenth century most cities – including those with highly developed water systems – relied on privy vaults and cesspools for sewage disposal…. Sewers were late to develop because at least initially privy vaults and cesspools were acceptable methods of liquid waste disposal, and they were considerably less expensive to build and operate than sewers.

Sewers began to replace privy vaults and cesspools as running water became more common and its use grew. The convenience and low price of running water led to a great increase in per capita usage. The consequent increase in the volume of waste water overwhelmed and undermined the efficacy of cesspools and privy vaults. According to Martin Melosi, “the great volume of water used in homes, businesses, and industrial plants flooded cesspools and privy vaults, inundated yards and lots, and posed not just a nuisance but a major health hazard” (Melosi 2000, p 91).

Joel Tarr also notes the impact of the increasing popularity of water closets over the later part of the nineteenth century (Tarr 1996, p 183). Water closets further increased the consumption of water, thus contributing to the discharge of contaminated fluids.

The data is not really adequate to tell when the biggest improvements were made. The most relevant data series, Dc374 and Dc375, begin only in 1915. They show that investments in sewer and water facilities were high before World War I, fell during the war and post-war depression, were very high again in the 1920s, and fell again after the Great Depression. It’s likely that the peak in water and sewage improvements occurred before 1930. In constant dollar terms, investment in water facilities peaked in 1930 at 610 million 1957 dollars and didn’t reach that level again until 1955. Investment in sewer facilities peaked at 734 million 1957 dollars in 1936 – probably due to Depression-era public works spending – and didn’t reach those levels again until 1953.

It seems likely that hygienic improvements were undertaken continuously from the late 1800s and were largely complete in most of the US by the 1930s, and in rural areas by the 1960s. Systems to deliver tap water were built mostly in the last quarter of the 19th century and the first half of the 20th. The first flush toilets appeared in 1857-1860, and the toilets popularized by Thomas Crapper were marketed in the 1880s.

Mortality

Historical Statistics of the United States, Millennial Edition, volume 1, p 385-6, summarizes the trends in mortality as follows:

Recent work with the genealogical data has concluded that adult mortality was relatively stable after about 1800 and then rose in the 1840s and 1850s before commencing long and slow improvement after the Civil War. This finding is surprising because we have evidence of rising real income per capita and of significant economic growth during the 1840-1860 period. However, … urbanization and immigration may have had more deleterious effects than hitherto believed. Further, the disease environment may have shifted in an unfavorable direction (Fogel 1986; Pope 1992; Haines, Craig and Weiss 2003).

Of course, urbanization and a worsening of the disease environment would be expected to coincide: with lack of hygienic handling of sewage, cities were mortality sinks throughout medieval times and that would have continued into the 19th century. Under the Ewald hypothesis, we would expect microbes to have become more virulent as cities became more densely populated in the 1840s and 1850s.

We have better information for the post-Civil War period. Rural mortality probably began its decline in the 1870s because of improvements in diet, nutrition, housing, and other quality-of-life aspects on the farm. There would have been little role for public health systems before the twentieth century in rural areas. Urban mortality probably did not begin to decline prior to 1880, but thereafter urban public health measures – especially construction of central water distribution systems to deliver pure water and sanitary sewers – were important in producing a rapid decline of infectious diseases and mortality in the cities that installed these improvements (Melosi 2000). There is no doubt that mortality declined dramatically in both rural and urban areas after about 1900 (Preston and Haines 1991).

The greatest improvements in mortality occurred between 1880 and 1950. Here is life expectancy at birth between 1850 and 1995 (series Ab644):

Life expectancy was only 39.4 years in 1880, but increased to 68.2 years by 1950 – an increase of 28.8 years. In the subsequent 40 years, life expectancy went up only a further 7.2 years.
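
These figures imply that the pace of improvement slowed sharply after 1950. Here is a quick back-of-the-envelope check (a hypothetical sketch; the 1990 figure of 75.4 years is inferred from the quoted 7.2-year gain, not taken directly from the series):

```python
# Sanity check of the life-expectancy gains quoted above (series Ab644).
# The 1990 value is an assumption inferred from "a further 7.2 years".
E_1880, E_1950, E_1990 = 39.4, 68.2, 75.4  # life expectancy at birth, years

gain_early = E_1950 - E_1880               # gain over 1880-1950
gain_late = E_1990 - E_1950                # gain over 1950-1990

rate_early = gain_early / (1950 - 1880) * 10   # years gained per decade
rate_late = gain_late / (1990 - 1950) * 10

print(round(gain_early, 1), round(gain_late, 1))   # 28.8 7.2
print(round(rate_early, 1), round(rate_late, 1))   # 4.1 1.8
```

So life expectancy rose by roughly 4 years per decade during the hygiene era, versus under 2 years per decade afterward.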

Causes of Death

From Table Ab929-951 of volume 1, we can get a breakdown of death rates by cause from 1900 to 1990. Here are death rates from various infectious diseases:

And here for comparison are death rates from cancer, cardiovascular and renal diseases, and diabetes:

Overall, death rates have declined, consistent with rising life expectancy. However, death rates from chronic diseases have actually increased, while death rates from acute infections have, save for influenza and pneumonia, gone pretty much to zero.

Conclusion

Death rates from acute infections plummeted in the period 1880 to 1950 when hygienic improvements were being made. By and large, these decreases in infectious disease mortality preceded the development of antimicrobial medicines. Penicillin was discovered only in 1928, and by then mortality from infectious diseases had already fallen by about 70%.

We can’t really evaluate the Jaminet corollary from this data, other than to say that the data is consistent with the hypothesis. Nothing here rules out the idea that pathogens have been evolving from virulent, mortality-inducing germs into mild, illness-inducing germs.

Sometime later this year, I’ll look for evidence that individual pathogens have evolved over the last hundred years. It should be possible to find evidence regarding the germs for tuberculosis and influenza, since those continue to be actively studied.

There is great concern over the evolution of antibiotic resistance among bacteria. This data suggests that antibiotic resistance will not generate a return to the high mortality rates of the 19th century. Those mortality rates were high not due to a lack of antibiotics, but due to a lack of hygiene that encouraged microbes to become virulent.

As long as we keep our hands and food clean and our running water pure, we can expect mortality rates to stay low. Our problem will be a growing collection of chronic diseases.

Our microbes will want to keep us alive — that is good. But they will increasingly succeed at making us serve them as unwilling hosts. We will be increasingly burdened by parasites.

Diet, nutrition, and antimicrobial medicine are our defenses. Let’s use them.

Jaminet’s Corollary to the Ewald Hypothesis

In Tuesday’s comments, Kriss brought up Paul Ewald, father of the “Ewald hypothesis.” (Also brought up by Dennis Mangan here.) Ewald did some of his work in collaboration with Gregory Cochran, who may be familiar to many for his appearances on blogs (notably at Gene Expression) and for his recent book The 10,000-Year Explosion.

In a 1999 Atlantic article, “A New Germ Theory,” Judith Hooper summarizes Ewald’s hypothesis:

Darwinian laws have led Ewald to a new theory: that diseases we have long ascribed to genetic or environmental factors — including some forms of heart disease, cancer, and mental illness — are in many cases actually caused by infections.

Regular readers won’t be surprised to hear that we wholeheartedly endorse the Ewald hypothesis. We believe that nearly all diseases are caused by infections and bad diet. Malnourishing, toxin-rich diets impair immune function and create vulnerability to infectious disease.

The Ewald Hypothesis

Ewald’s reasoning goes as follows. Quotations are from the Atlantic essay.

First, genetic causes of disease are unlikely. Any gene that led to impaired functioning of the human body would be selected against and removed from the genome. Therefore, genetic diseases should occur at the frequency of random mutations – about 1 in 100,000 people:

As noted, the background mutation rate — the rate at which a gene spontaneously mutates — is typically about one in 50,000 to one in 100,000. Not surprisingly, genetic diseases that are severely fitness-impairing (for example, achondroplastic dwarfism) tend to have roughly the same odds, depending on the gene.

Diseases that are fitness-impairing and reach higher prevalence – and this includes nearly all major diseases – must have a cause other than genetic mutations.

Germs, on the other hand, are plausible candidates as causes for disease. Germs can benefit from doing us harm. At a minimum, they would like to modify human functioning in order to make us better hosts for themselves — by suppressing immune function, for instance. Also, they wish to induce behaviors that help them spread to new hosts – like sneezing, coughing, diarrhea, or sexual promiscuity.

Germs evolve quickly. Gene exchange, and lack of error checking during gene replication, modifies genomes quickly. Short reproductive time scales – on the order of 20 minutes – mean that helpful mutations proliferate rapidly. Big evolutionary changes can occur in a few weeks:

“The time scale is so much shorter and the selective pressures so much more intense [in microbes]. You can get evolutionary change in disease organisms in months or weeks.”

This means that germs quickly optimize their disease characteristics through natural selection. For example, virulence, or the severity of the disease that a pathogen causes, is rapidly optimized.

One factor determining virulence is how easily the organism can spread to a new host. If the organism can spread easily, there’s little cost to harming the current host, and microbes produce severe disease. If it’s hard to spread, on the other hand, organisms will be mild and peaceable toward their hosts. It pays to keep their hosts alive and healthy.

Ewald and his students collected empirical data supporting their explanation for virulence:

The dots on Saunders’s graphs made it plain that cholera strains are virulent in Guatemala, where the water is bad, and mild in Chile, where water quality is good. “The Chilean data show how quickly it can become mild in response to different selective pressures,” Ewald explained…. Strains of the cholera agent isolated from Texas and Louisiana produce such small amounts of toxin that almost no one who is infected with them will come down with cholera.

In the last few decades, evidence has only grown for the infectious origins of most diseases. In 1999, over 80% of serious diseases were known to be caused by pathogens:

Of the top forty fitness-antagonistic diseases on the list, thirty-three are known to be directly infectious and three are indirectly caused by infection; Cochran believes that the others will turn out to be infectious too. The most fitness-antagonistic diseases must be infectious, not genetic, Ewald and Cochran reason, because otherwise their frequency would have sunk to the level of random mutations.

If this analysis were repeated today, the percentage would be still closer to 100%. More cancers are now known to be caused by viruses, and the links between microbes and cardiovascular disease, dementia, and multiple sclerosis are stronger than ever.

I think Ewald and Cochran are correct in asserting that mental and neurological illnesses are especially likely to be infectious in origin. These illnesses tend to have a big impact on number of descendants, supporting the evolutionary argument for an infectious origin. And, due to their dependence on glucose, neurons are unusually susceptible to infections.

Schizophrenia is a good example of a disease that must be infectious in origin:

From the fitness perspective, schizophrenia is a catastrophe. It is estimated that male schizophrenics have roughly half as many offspring as the general population has. Female schizophrenics have roughly 75 percent as many. Schizophrenia should therefore approach the level of a random mutation after many generations.

Ewald and Cochran suggest we need a “Human Germ Project”:

In Ewald and Cochran’s view, evolutionary laws dictate that infection must be a factor in schizophrenia. “They announced they had the gene for schizophrenia, and then it turned out not to be true,” Cochran said one day when I mentioned genetic markers. “I think they found and unfound the gene for depression about six times. Nobody’s found a gene yet for any common mental illness. Maybe instead of the Human Genome Project we should have the Human Germ Project.”

I concur. Medical research should make much bigger investments in detecting, understanding the effects of, and developing treatments for human infections. Many existing lines of research, including many of the “autoimmune” and genetic hypotheses for disease origins, are not panning out, but continue to monopolize funding.

Jaminet’s Corollary

In the last century, sewage and water treatment have cleaned up our water supply and removed water and sewage as vectors for disease transmission. Hygienic practices, such as daily bathing and the use of soap, also tend to inhibit disease transmission.

Just as cholera is an extremely mild constituent of gut flora in hygienic Texas, but creates acute disease in unclean Guatemala, so we can expect that germs that created acute disease in (unclean) 1900 will have evolved to create mild chronic infections in (hygienic) 2011.

This is Jaminet’s corollary to the Ewald hypothesis:  Microbes are evolving away from severe acute disease toward milder chronic disease.

The focus of modern medicine on acute conditions, and its neglect of chronic conditions, adds to the selective pressures on microbes. Any pathogen that creates acute disease is subject to the full arsenal of modern antimicrobial drugs. But pathogens that create mild chronic disease are generally left untreated.

Modern medicine has created a powerful selective pressure on pathogens to generate chronic illnesses that are just mild enough, and that resemble aging closely enough, to elude the attention and antimicrobial arsenal of medical doctors.

Why No Dementia in Kitava?

Staffan Lindeberg in the Kitava Study found no evidence of stroke, diabetes, dementia, heart disease, obesity, hypertension, or acne on Kitava.

Why were these diseases absent? Partly due to the Kitavans’ excellent toxin-free diet, no doubt, but partly also due to an absence of the pathogens that cause these diseases.

Why was there no multiple sclerosis in the Faeroe Islands until British troops were stationed there in World War II? Because the pathogen that causes MS was absent from the islands, until the Brits introduced it.

Why has the incidence of chronic diseases increased tremendously in the last century? Partly due to longer-lived populations, but also, I believe, due to evolution of pathogens toward these diseases.

I predict the incidence of chronic disease will increase further in decades to come; and we will gradually come to appreciate that nearly all forty year olds today are not fully healthy, but are mildly impaired by a collection of chronic infections.

Conclusion

Fifty thousand years ago there were a few hundred thousand humans in the world. Today there are over 6 billion.

If a pathogen today wants to adapt to a specific host, its best bet is to adapt to humans. And within humans, its best way to flourish is to develop a chronic infection that persists for many decades.

The evolutionary arms race is not over. It has simply moved to a new field of battle. And medicine will have to evolve as the microbes do. The microbes are developing a new style of fighting. Medicine needs to shift its focus toward this rising threat of mild chronic diseases.

They’ve Got Us Surrounded

Note:  Our best wishes and prayers for a quick recovery to erp, who has surgery tomorrow. Get well soon, e!

We think that pathogens – viruses, bacteria, fungi, and protozoa – are, along with toxic and malnourishing diets, the main cause of human disease.

People who think we exaggerate the impact of microbes on health may not have fully appreciated the ubiquity of these pathogens. We live in a sea of microbes, many of whom would like nothing better than to live at our expense.

So today, let’s look at just how abundant microbes are.

In the Water

When you swim in the ocean, how many viruses are you swallowing?

… pause … time for reader to guess …

The answer is in a fascinating story in The Scientist:

Once thought not to exist in marine environments, scientists now realize that there are some 50 million viruses in every milliliter of seawater.

These viruses can not only infect cellular life, they frequently kill it:

Every day, marine viruses kill about 20 percent of the ocean’s microorganisms, which produce about half the oxygen on the planet.

It’s not just viruses: Vibrio cholerae, the bacterium that causes epidemic cholera, is widespread in ocean water, and is the most common cause of food poisoning from eating shellfish.

In the Air

What about the air?  How many microbes do you inhale when you breathe?

… pause … time for reader to guess …

The Scientist once again came to my aid:

Every cubic meter of air holds upwards of 100 million microorganisms …

Lungs contain about 2.4 liters of air, of which 0.5 liters is expelled every breath. A cubic meter has 1,000 liters, so a single breath takes in 50,000 microorganisms.
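
The arithmetic is simple enough to check directly (a hypothetical back-of-the-envelope sketch using the round figures quoted above):

```python
# Microbes inhaled per breath, from the figures in the text.
microbes_per_m3 = 100e6   # "upwards of 100 million microorganisms" per cubic meter
liters_per_m3 = 1000      # liters in a cubic meter
breath_liters = 0.5       # air exchanged in a single breath

microbes_per_liter = microbes_per_m3 / liters_per_m3   # 100,000 per liter
microbes_per_breath = microbes_per_liter * breath_liters

print(int(microbes_per_breath))   # 50000
```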

Some more information for the curious:

Recent research published in PNAS suggests that the diversity of microbial life in the air is on par with the soil, at least in urban areas, yet the air remains vastly understudied in comparison.

“Just seven or ten years ago we didn’t realize bacteria existed in clouds,” said Anne-Marie Delort, professor of microbiology and organic chemistry at Université Blaise Pascal in France. Now researchers know microbes act as a surface for the condensation of water vapor in the atmosphere, thus forming clouds. Recent research published in Science shows microbes also play the same role during snowflake formation and other types of precipitation.

Which Is More Dangerous, Air or Water?

Not all microbes flourish in the human body, but all have to be dealt with by our immune defenses. And some can, and do, establish lasting infections in humans.

Since both air and water have pathogenic microbes, it seems fair to ask which environment is more likely to make you sick.

Luckily scientists have done a controlled trial. [1] They sent two sets of people to the beach, and instructed half to remain in the air and the other half to venture into the water. ScienceDaily has details:

A yearlong beach study led by a team of University of Miami researchers suggests that swimmers at sub-tropical beaches face an increased risk of illness….

B.E.A.C.H.E.S. (Beach Environmental Assessment and Characterization Human Exposure Study) enlisted more than 1,300 volunteers, all local residents who regularly use South Florida beaches. Researchers divided study participants into two groups: volunteers who went into the water and those instructed to stay out of the water. The group that went in the water was asked to dunk themselves completely in the water three times over a fifteen-minute period. A few days later both sets of participants received follow-up calls from researchers, checking on their health and well being.

“We found that when swimming in sub-tropical beach areas with no known pollution or contamination from sewage or runoff, you still have a chance of being exposed to the kind of microbes that can make you sick,” said Dr. Lora Fleming …

The study found that the swimmers were 1.76 times more likely to report a gastrointestinal illness, and 4.46 times more likely to report having a fever or respiratory illness. Swimmers in the study were also nearly six times more likely to report a skin illness than those volunteers who stayed out of the water.

The obvious flaw in this study was the lack of a control group placed in a vacuum. It would have been nice to know if complete isolation from microbes would have improved health even further. Perhaps the scientists lacked funding for this third group.

(Warning: inside joke coming.) Of course, it may be impossible for this study ever to be replicated in the US, since after these results how can an ocean swimming group ever be permitted by an Institutional Review Board? It seems that follow-up studies will have to be performed on foreign beaches, perhaps in Rio, the French Riviera, or Tahiti.

Conclusion

It seems the microbes have us surrounded. Whether you venture into the air or the water, you have a chance to get sick.

Is there anything you can do to protect yourself, besides staying home and cowering under your bed? Possibly. We’ll look into that in upcoming posts.

References

[1] Fleisher JM et al. The BEACHES Study: health effects and exposures from non-point source microbial contaminants in subtropical recreational marine waters. Int J Epidemiol. 2010 Oct;39(5):1291-8. http://pmid.us/20522483.

Tryptophan Poisoning and Chronic Infections

On Friday I discussed a recent paper showing that a high-tryptophan diet caused mice, after 4 to 12 weeks, to start harming themselves by tearing out fur from their bellies and forepaws. The mice also developed ulcerative dermatitis – open sores on their skin. [1]

I closed with a promise that on Monday I would suggest some reasons why a high tryptophan diet might cause these diseases.

However …

C57BL/6 Mice Are Prone to Ulcerative Dermatitis

The mice used in the study, C57BL/6, are an inbred strain with genetic mutations that make them prone to ulcerative dermatitis and compulsive behavior.

Upon looking into the literature, it seems that if you look at these mice cross-wise they get ulcerative dermatitis.

For instance, vitamin A causes ulcerative dermatitis in these mice because of mutations that impair the disposal of excess retinol:

A number of C57BL/6 (B6) substrains are commonly used by scientists for basic biomedical research. One of several B6 strain-specific background diseases is focal alopecia that may resolve or progress to severe, ulcerative dermatitis…. Four B6 substrains tested have a polymorphism in alcohol dehydrogenase 4 (Adh4) that reduces its activity and potentially affects removal of excess retinol. Using immunohistochemistry, differential expression of epithelial retinol dehydrogenase (DHRS9) was detected, which may partially explain anecdotal reports of frequency differences between B6 substrains. The combination of these 2 defects has the potential to make high dietary vitamin A levels toxic in some B6 substrains … [2]

Malnourishment seems to lead to spontaneous development of ulcerative dermatitis. The cause may be oxidative stress, since supplemental vitamin E cures the ulcerative dermatitis:

In this study, we fed a standard NIH-31 diet fortified with vitamin E to C57BL/6 mice and strains of mice with a C57BL/6 background that had spontaneously developed ulcerative dermatitis (UD)…. Of 71 mice, 32 (45%) had complete lesion re-epithelialization with hair regrowth. Complete lesion repair was not influenced by sex, age, or coat color. The average time to complete lesion repair ranged from 2 to 5 weeks, and there was no correlation with sex or coat color. The positive response to vitamin E suggests that protection from oxidative injury may play a role in the resolution of UD lesions … [3]

When scientists studying cancer applied a carcinogenic toxin to the skin of these mice, they developed not cancer but – you guessed it – ulcerative dermatitis:

In this study, heterozygous p53-deficient (p53(+/-)) mice … and wild-type (WT) litter mates were subjected to a two-stage skin carcinogenesis protocol with 7,12-dimethylbenz[a]anthracene and 12-O-tetradecanoylphorbol-13-acetate. Instead of skin carcinomas, however, the chemical treatment protocol caused ulcerous skin lesions, and 89% of mice fed ad libitum died from infection/septicemia. When WT mice were restricted to 60% of the average calorie intake of the respective ad libitum group, however, only 33% developed such lesions, and the CR mice survived twice as long on average as the ad libitum mice. [4]

It’s interesting that calorie restriction reduces the rate of ulcerative dermatitis. Note that the mice went on to die of sepsis from an unknown infection; this suggests that these mice probably had some chronic pre-existing infection that was lying dormant, but emerged to become acute when the mice were stressed by the toxins.

So far we’ve seen that malnutrition (lack of antioxidants) and exposure to toxins (vitamin A, carcinogens) can cause ulcerative dermatitis. Our three major causes of disease are malnutrition, toxins, and pathogens, so to complete our survey we should check whether infections cause ulcerative dermatitis in these mice.

Indeed they do. The most common infection in laboratory mice is fur mites, and fur mite infections induce ulcerative dermatitis in all strains of C57BL mice. [5]

Since diseases become more common with age, we shouldn’t be surprised to see the rate of spontaneous ulcerative dermatitis rise with age in these mice:

A spontaneous, severely pruritic ulcerative dermatitis was initially observed in 33/201 (16.4%) aged C57BL/6NNia mice obtained from the National Institute of Aging. This ulcerative dermatitis also developed in 21/98 (21%) aged C57BL/6 mice in a subsequent experimental group obtained from the same source. The average age of onset in the initial group was 20 months. These animals were negative for ectoparasite infestation and primary bacterial or fungal infection. The lesions varied from acute epidermal excoriation and ulceration to chronic ulceration with marked dermal fibrosis…. The elucidation of the pathogenesis of this disease is important because of the significant percentage of animals affected … [6]

Given that there are so many possible triggers of ulcerative dermatitis in these mice, I decided to shift the topic of this post a bit, and focus on the specific possibility that chronic infections might play a role.

Lab Mice Are At Risk For Chronic Infections

As I’ve said many times, everyone gets chronic infections. The world is saturated in germs, and sooner or later the ones that can maintain persistent infections take up residence in all of us. So health is determined by the relative balance of power between immune system and pathogens, not by exposure.

Lab mice are at heightened risk for chronic infections because of their living conditions. In the case of the tryptophan-poisoned mice, here’s a description of their accommodations:

All subjects were adult (over 6 months of age) C57BL/6 mice, bred from C57BL/6J progenitors, and housed with same-sex siblings…. Lights were on a 14-10-h light-dark cycle. Mice were caged in standard shoebox cages (12.7 cm high, 433 cm2 floor area) with wire lids…. Ages ranged from 24-66 weeks of age (mean, 47 weeks). The number of mice per cage was variable, ranging from 2-4 mice per cage. [1]

These shoebox cages are small: 433 cm2 floor area translates to roughly 15 by 29 cm or 6 by 11 inches. The lid is 5 inches high. With four mice to a cage, each mouse gets a 3 by 5 inch area.
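
The per-mouse arithmetic can be checked directly (a hypothetical sketch using the cage dimensions quoted from the paper; the "3 by 5 inch" figure in the text is a rough approximation):

```python
# Floor space per mouse in a standard shoebox cage, from the methods quoted above.
floor_cm2 = 433           # cage floor area in square centimeters
mice_per_cage = 4         # worst case reported (2-4 mice per cage)
cm2_per_in2 = 2.54 ** 2   # square centimeters per square inch

per_mouse_cm2 = floor_cm2 / mice_per_cage
per_mouse_in2 = per_mouse_cm2 / cm2_per_in2

print(round(per_mouse_cm2, 2))   # 108.25 cm² per mouse
print(round(per_mouse_in2, 2))   # 16.78 in² — roughly a 3 by 5.5 inch patch
```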

There is little room to exercise. Crowding is stressful.

Moreover, there is no natural sunlight, and therefore no vitamin D production.

All of these factors tend to impair immune function. Their casein, cornstarch, and soybean oil diets can’t be helpful to immune function either.

Cages are generally fairly closely packed in animal facilities. The mice generally cannot touch mice in adjacent cages, but respiratory pathogens and fur mites spread easily. If one mouse in a facility gets fur mites, usually all the others get infected soon afterward, and the whole room has to be quarantined.

Note also that the mice are old enough – 6 to 15 months – to have had extensive exposure to any chronic pathogens carried by either their mouse neighbors or their human handlers.

So it wouldn’t be surprising if the lab mice in this study had chronic infections.

Chlamydiae Infections in Lab Mice

C. pneumoniae is a zoonotic bacterium that crossed from reptiles to mammals about 60 million years ago and can infect both humans and animals.

Unfortunately, nobody seems to have bothered to see if C. pneumoniae is commonly present in “healthy” lab mice. However, in test tubes C. pneumoniae infects mouse brain cells just fine:

Inspired by the suggested associations between neurological diseases and infections, we determined the susceptibility of brain cells to Chlamydia pneumoniae (Cpn). Murine astrocyte (C8D1A), neuronal (NB41A3) and microglial (BV-2) cell lines were inoculated with Cpn…. Our data demonstrate that the neuronal cell line is highly sensitive to Cpn, produces viable progeny and is prone to die after infection by necrosis. Cpn tropism was similar in an astrocyte cell line, apart from the higher production of extracellular Cpn and less pronounced necrosis. In contrast, the microglial cell line is highly resistant to Cpn as the immunohistochemical signs almost completely disappeared after 24 h. Nevertheless, significant Cpn DNA amounts could be detected, suggesting Cpn persistence. [7]

So in mouse microglial cells, which were the most resistant, C. pneumoniae could maintain an infection even if it didn’t do much. In mouse neurons and astrocytes, C. pneumoniae was able to reproduce freely and would eventually kill the cells.

In a different mouse microglial cell line, EOC-20, C. pneumoniae replicates rapidly, increasing 9-fold in 3 days. [8] So it appears that C. pneumoniae can infect all the major cell types of the mouse brain.

In humans, C. pneumoniae generally reaches the brain by first infecting the blood vessels that feed the brain. In our C57BL/6 mice, C. pneumoniae can create a vascular infection after a single dose in the nose:

In C57BL/6J mice on a nonatherogenic diet, C. pneumoniae were detected in the aorta only 2 weeks after a single intranasal inoculation in 8% of mice. The persistence of C. pneumoniae in atheromas suggests a tropism of C. pneumoniae to the lesion. [9]

Repeated inoculation of C57BL/6J mice resulted in inflammatory changes in the heart and aorta in 8 of 40 mice … [10]

In a ranking of mouse strains in susceptibility to C. pneumoniae infection, C57BL/6 mice occupy a middle position:

The first mouse models for C. pneumoniae infection … used several mouse strains that differed in susceptibility to infection. Swiss Webster mice and NIH/S mice were highly susceptible, followed by C57BL/6 mice, whereas BALB/c mice were least susceptible. [11]

C. pneumoniae infection typically begins with a brief, mild respiratory infection that is followed by spread of the pathogen to other organs via infected white blood cells and establishment of a persistent infection:

Infection of mice with C. pneumoniae resulted in a self-limiting pneumonia [30–32,77]. Depending on the dose given, mice displayed symptoms, like dyspnea, weakness and weight loss. The symptoms reached a maximum within 2 to 4 days and rarely lasted longer than 1 week….

Isolation of viable C. pneumoniae organisms from the lungs was generally possible up to 4 weeks after infection and occasionally up to 6 weeks [31,32,42,47,49,52,54,66,72]. However, C. pneumoniae antigens and DNA were detected in the lungs for a much longer period, up to 20 weeks after infection [42]. The presence of C. pneumoniae antigens in the lungs was limited to macrophages in alveoli and bronchus-associated lymphoid tissue [63]. These findings suggested a latent persistence of the organisms, which was confirmed by reactivation of pulmonary infection using cortisone-induced immunosuppression experiments [50,51]….

C. pneumoniae infection was not limited to the respiratory tract only. Following local infection, spreading of C. pneumoniae to multiple organs throughout the body was detected by PCR and immunohistochemical staining up to 20 weeks [42]. The dissemination was probably mediated by peripheral blood monocytes as they were positive both by PCR and isolation in some studies, whereas blood plasma was negative [38,47]. [11]

Interferon Gamma and the Immune Defense Against Chlamydiae

In both humans and mice, immune defense against Chlamydiae is mediated principally by interferon gamma. Mice that lack interferon gamma or its receptor suffer prolonged severe infections:

The central role of IFN-γ in clearance is evidenced by prolonged infections that occur in IFN-γ-, and IFN-γ receptor-, deficient mice (10, 11). [12]

Infant mice exposed to Chlamydiae die within 2 weeks if they lack interferon gamma:

Importantly, infected mice deficient in IFN-gamma or IFN-gamma receptor demonstrated enhanced chlamydial dissemination, and 100% of animals died by 2 wk postchallenge. [13]

Might Chlamydia Infection Have a Connection to Compulsive Behavior and Ulcerative Dermatitis?

I started looking into these Chlamydia infection papers to see if they might explain why C57BL/6 mice given high levels of tryptophan develop compulsive behavior and ulcerative dermatitis.

It’s possible. The logic goes like this.

First, tryptophan is crucial to Chlamydial growth: it is a precursor to niacin, a vitamin central to bacterial metabolism, and is also essential for a number of Chlamydial proteins. So a high-tryptophan diet will promote Chlamydial infections.

Second, Chlamydiae trigger the innate immune response mediated by interferon gamma, so interferon gamma will be elevated in Chlamydiae-infected mice. Interestingly, the only study I found that gave interferon gamma directly to mice found that it caused ulcerative dermatitis:

Daily subcutaneous doses of 0.02, 0.2, or 2 mg/kg/d of recombinant murine interferon-gamma (rmuIFN-gamma) were given to mice on postnatal days 8 through 60 … Males given 0.2 and 2 mg/kg/d had swelling and ulcerative dermatitis around the urogenital area, which were observed after sexual contact and attributed to a bacterial infection. [14]

What about the compulsive behavior? Well, in mice one of the principal effects of interferon gamma is to stimulate the release of nitric oxide (NO):

Gamma interferon (IFN-gamma)-induced effector mechanisms have potent antichlamydial activities that are critical to host defense. The most prominent and well-studied effectors are indoleamine dioxygenase (IDO) and nitric oxide (NO) synthase. The relative contributions of these mechanisms as inhibitors of chlamydial in vitro growth have been extensively studied using different host cells, induction mechanisms, and chlamydial strains with conflicting results. Here, we have undertaken a comparative analysis of cytokine- and lipopolysaccharide (LPS)-induced IDO and NO using an extensive assortment of human and murine host cells infected with human and murine chlamydial strains. Following cytokine (IFN-gamma or tumor necrosis factor alpha) and/or LPS treatment, the majority of human cell lines induced IDO but failed to produce NO. Conversely, the majority of mouse cell lines studied produced NO, not IDO. [15]

That’s interesting, because nitric oxide (NO) induces compulsive behavior in mice. A standard measure of obsessive-compulsive behavior in mice is marble burying. Nitric oxide increases marble burying:

In view of the reports that nitric oxide modulates the neurotransmitters implicated in obsessive-compulsive disorder, patients with obsessive-compulsive disorder exhibit higher plasma nitrate levels, and drugs useful in obsessive-compulsive disorder influence nitric oxide, we hypothesized that nitric oxide may have some role in obsessive-compulsive behavior. We used marble-burying behavior of mice as the animal model of obsessive-compulsive disorder, and nitric oxide levels in brain homogenate were measured using amperometric nitric oxide-selective sensor method. Intraperitoneal administration of nitric oxide enhancers … significantly increased marble-burying behavior as well as brain nitrites levels, whereas treatment with 7-nitroindazole-neuronal nitric oxide synthase inhibitor (20-40 mg/kg, i.p.) or paroxetine-selective serotonin reuptake inhibitor (5-10 mg/kg, i.p.) dose dependently attenuated marble-burying behavior and nitrites levels in brain…. In conclusion, obsessive compulsive behavior in mice appears related to nitric oxide in brain … [16]

Giving mice arginine, which raises NO levels, reverses the effect of SSRI antidepressants:

enhancement of NO synthesis by l-arginine reversed the effect of SSRI antidepressants, further demonstrating the role of NO in regulating the marble-burying behavior [17]

In rats allowed access to cocaine, blocking NO production decreased their self-administration of the drug. [18] This raises the possibility that Chlamydial infection, by increasing NO production, would increase self-administration of cocaine.

So if your pet mice are addicted to cocaine, maybe you should give them antibiotics!

Conclusion

Chronic infections are widespread in both humans and animals, and can have odd effects on behavior and health. Yet researchers have barely begun to detect their existence, much less trace their effects.

I don’t know whether chronic infections were involved in the apparent poisoning of C57BL/6 mice by tryptophan. But since tryptophan is a very strong promoter of bacterial growth, and bacterial infections trigger interferon gamma and nitric oxide release, which can induce ulcerative dermatitis and compulsive behavior in this strain, the possibility can’t be ruled out.

References

[1] Dufour BD et al. Nutritional up-regulation of serotonin paradoxically induces compulsive behavior. Nutr Neurosci. 2010 Dec;13(6):256-64. http://pmid.us/21040623.

[2] Sundberg JP et al. Primary Follicular Dystrophy With Scarring Dermatitis in C57BL/6 Mouse Substrains Resembles Central Centrifugal Cicatricial Alopecia in Humans. Vet Pathol. 2010 Sep 22. [Epub ahead of print]. http://pmid.us/20861494.

[3] Lawson GW et al. Vitamin E as a treatment for ulcerative dermatitis in C57BL/6 mice and strains with a C57BL/6 background. Contemp Top Lab Anim Sci. 2005 May;44(3):18-21.  http://pmid.us/15934718.

[4] Perkins SN et al. Calorie restriction reduces ulcerative dermatitis and infection-related mortality in p53-deficient and wild-type mice. J Invest Dermatol. 1998 Aug;111(2):292-6. http://pmid.us/9699732.

[5] Dawson DV et al. Genetic control of susceptibility to mite-associated ulcerative dermatitis. Lab Anim Sci. 1986 Jun;36(3):262-7. http://pmid.us/3724051.

[6] Andrews AG et al. Immune complex vasculitis with secondary ulcerative dermatitis in aged C57BL/6NNia mice. Vet Pathol. 1994 May;31(3):293-300. http://pmid.us/8053123.

[7] Boelen E et al. Chlamydia pneumoniae infection of brain cells: an in vitro study. Neurobiol Aging. 2007 Apr;28(4):524-32. http://pmid.us/16621171.

[8] Ikejima H et al. Chlamydia pneumoniae infection of microglial cells in vitro: a model of microbial infection for neurological disease. J Med Microbiol. 2006 Jul;55(Pt 7):947-52. http://pmid.us/16772424.

[9] Moazed TC et al. Murine models of Chlamydia pneumoniae infection and atherosclerosis. J Infect Dis. 1997 Apr;175(4):883-90. http://pmid.us/9086145.

[10] Campbell LA et al. Mouse models of C. pneumoniae infection and atherosclerosis. J Infect Dis. 2000 Jun;181 Suppl 3:S508-13. http://pmid.us/10839749.

[11] de Kruif MD et al. Chlamydia pneumoniae infections in mouse models: relevance for atherosclerosis research. Cardiovasc Res. 2005 Feb 1;65(2):317-27. http://pmid.us/15639470.

[12] Kaiko GE et al. Chlamydia muridarum infection subverts dendritic cell function to promote Th2 immunity and airways hyperreactivity. J Immunol. 2008 Feb 15;180(4):2225-32. http://pmid.us/18250429.

[13] Jupelli M et al. Endogenous IFN-gamma production is induced and required for protective immunity against pulmonary chlamydial infection in neonatal mice. J Immunol. 2008 Mar 15;180(6):4148-55. http://pmid.us/18322226.

[14] Bussiere JL et al. Reproductive effects of chronic administration of murine interferon-gamma. Reprod Toxicol. 1996 Sep-Oct;10(5):379-91. http://pmid.us/8888410.

[15] Roshick C et al. Comparison of gamma interferon-mediated antichlamydial defense mechanisms in human and mouse cells. Infect Immun. 2006 Jan;74(1):225-38. http://pmid.us/16368976.

[16] Umathe SN et al. Role of nitric oxide in obsessive-compulsive behavior and its involvement in the anti-compulsive effect of paroxetine in mice. Nitric Oxide. 2009 Sep;21(2):140-7. http://pmid.us/19584001.

[17] Krass M et al. Nitric oxide is involved in the regulation of marble-burying behavior. Neurosci Lett. 2010 Aug 9;480(1):55-8. http://pmid.us/20553994.

[18] Collins SL, Kantak KM. Neuronal nitric oxide synthase inhibition decreases cocaine self-administration behavior in rats. Psychopharmacology (Berl). 2002 Feb;159(4):361-9. http://pmid.us/11823888.