Chapter 44 Nutritional Medicine
Nutritional medicine, as described in this textbook, consists of the use of diet and nutritional supplementation as therapeutic modalities. The foundation of nutritional medicine is a health-promoting diet that focuses on the consumption of whole, natural foods. Nutritional supplements are used in the overall context of nutritional medicine as complementary agents, not as sole primary medicines. Diet is always primary, and supplementation secondary.
Although the human gastrointestinal tract is capable of digesting both animal and plant foods, several physical characteristics indicate that Homo sapiens evolved primarily to eat plant foods. Specifically, our teeth include 20 molars and premolars, which are well suited for crushing and grinding plant foods, along with eight front incisors, which are well suited for biting into fruits and vegetables. Only our four canine teeth are designed for tearing meat. Our jaws swing both vertically to tear and laterally to crush, whereas carnivores’ jaws swing only vertically. Additional evidence supporting the body’s preference for plant foods is the length of the human intestinal tract: carnivores typically have a short bowel, whereas herbivores have a bowel length proportionally comparable to that of humans. Thus, human bowel length favors plant foods.1
To answer the question, “What should humans eat?” many researchers look to other primates, such as chimpanzees, monkeys, and gorillas. Wild nonhuman primates are also omnivores or, as often described, herbivores and opportunistic carnivores. They eat mainly fruits and vegetables but may also eat small animals, lizards, and eggs given the opportunity. Animal foods make up only 1% of the total calories consumed by gorillas and 2% of those consumed by orangutans; the remainder of their diet comes from plant foods. Because human body weight falls between that of the gorilla and the orangutan, it has been suggested that humans are designed to eat around 1.5% of their calories as animal foods.2 Most Americans derive well over 50% of their calories from animal foods.
Although most primates eat a considerable amount of fruit, it is critical to point out that the cultivated fruit in American supermarkets is far different from the highly nutritious wild fruit these animals rely on. Wild fruit has a slightly higher protein content and a higher content of certain essential vitamins and minerals, whereas cultivated fruit tends to be higher in sugars. This makes cultivated fruit very tasty to humans, but because it has a higher sugar content and lacks the fibrous pulp and multiple seeds of wild fruit that slow the digestion and absorption of sugars, cultivated fruit raises blood glucose levels much more quickly than its wild counterpart.
Wild primates fill up not only on fruit but also on other highly nutritious plant foods. As a result, wild primates weighing one tenth as much as a typical human ingest nearly 10 times the level of vitamin C and much higher amounts of many other vitamins and minerals. Other differences in the wild primate diet are also important to point out, such as a higher ratio of α-linolenic acid, the essential ω-3 fatty acid, to linoleic acid, the essential ω-6 fatty acid2 (Table 44-1).
Table 44-1. Mineral | Total daily intake of a 7-kg adult monkey (mg) | Recommended daily allowance for a 70-kg man (mg)
Determining which foods are best suited for humans may not be as simple as looking at the diet of wild primates, because there are some structural and physiologic differences between humans and apes. The key difference may be a larger, more metabolically active brain. It has been theorized that a shift in dietary intake toward more animal foods provided the stimulus for brain growth. The shift itself was probably the result of limited food availability, which forced early humans to hunt grazing mammals such as antelope and gazelle. Archeological data support this association: human brains began to grow and become more developed at about the same time that bones of animals butchered with stone tools appear in increasing numbers at the sites of early villages.
Improved dietary quality alone cannot fully explain why human brains grew, but it definitely appears to have played a critical role. With a bigger brain, early humans were able to engage in more complex social behavior, which led to improved foraging and hunting tactics, which in turn led to even higher quality food intake, fostering additional brain evolution.
Data from anthropologists studying hunter–gatherer cultures provide much insight into what humans are designed to eat; however, it is important to point out that these groups were not entirely free to determine their diets. Instead, their diets were shaped by what was available to them. For example, the diet of the Inuit is far different from that of the Australian Aborigines. It may not be appropriate to answer the question, “What should humans eat?” simply by looking at these studies. Nonetheless, regardless of whether hunter–gatherer communities relied on animal or plant foods, the rates of diseases of civilization, such as heart disease and cancer, are extremely low in such communities.3
It should also be pointed out that the meat that our ancestors consumed was much different from the meat found in supermarkets today. Domesticated animals have always had higher fat levels than their wild counterparts, but the desire for tender meat has led to the breeding of cattle that produce meat with a fat content of 25% to 30% or more, compared with less than 4% for free-living animals and wild game. In addition, the type of fat is considerably different. Domestic beef contains primarily saturated fats and virtually undetectable amounts of ω-3 fatty acids. In contrast, the fat of wild animals contains more than five times more polyunsaturated fat per gram and has a good amount of beneficial ω-3 fatty acids (approximately 4%).4
Considerable evidence indicates that a high intake of red or processed meat increases the risk of mortality. For example, in a cohort study of half a million people aged 50 to 71 years at baseline, men and women in the highest versus lowest quintile of red and processed meat intake had elevated risks for overall mortality.5
In another prospective cohort study, subjects were followed from 1980 (women) or 1986 (men) until 2006. Low-carbohydrate diet scores, either animal-based (emphasizing animal sources of fat and protein) or vegetable-based (emphasizing vegetable sources of fat and protein), were computed from several validated food-frequency questionnaires administered during follow-up.6 An animal-based low-carbohydrate diet was associated with higher all-cause mortality in both men and women, whereas a vegetable-based low-carbohydrate diet was associated with lower all-cause and cardiovascular disease mortality rates.
The evidence supporting diet’s role in chronic degenerative diseases is overwhelming. Two basic facts underlie the diet–disease connection: (1) a diet rich in plant foods (i.e., whole grains, legumes, fruits, and vegetables) is protective against many diseases that are extremely common in so-called Western society, and (2) a diet providing a low intake of plant foods is a causative factor in the development of these diseases and creates conditions under which other causative factors are more active.
Much of the link between diet and chronic disease originated from the work of two medical pioneers, Denis Burkitt, MD, and Hugh Trowell, MD, authors of Western Diseases: Their Emergence and Prevention, first published in 1981.7 Although now extremely well-recognized, their work is actually a continuation of the landmark work of Weston A. Price, a dentist and author of Nutrition and Physical Degeneration. In the early 1900s, Dr. Price traveled the world observing changes in teeth and palate (orthodontic) structure as various cultures discarded traditional dietary practices in favor of a more “civilized” diet. Price was able to follow individuals as well as cultures over 20 to 40 years and carefully documented the onset of degenerative diseases as their diets changed. On the basis of extensive studies examining the rate of diseases in various populations (epidemiologic data) and his own observations of primitive cultures, Burkitt formulated the following sequence of events:
First stage. In cultures consuming a traditional diet of largely unprocessed plant foods, the chronic degenerative diseases common in the West are rare.
Second stage. As westernization of the diet begins, obesity and diabetes commonly appear, first in privileged groups.
Third stage. As more and more people abandon their traditional diet, conditions that were once quite rare become extremely common. Examples are constipation, hemorrhoids, varicose veins, and appendicitis.
Fourth stage. Finally, with full westernization of the diet, other chronic degenerative or potentially lethal diseases, such as heart disease, cancer, osteoarthritis, rheumatoid arthritis, and gout, become extremely common.
Since Burkitt and Trowell’s pioneering research, a virtual landslide of data has continually verified the role of the Western diet as the key factor in virtually every chronic disease, especially obesity and diabetes. Box 44-1 lists diseases with convincing links to a diet low in plant foods. Many of these now common diseases were extremely rare before the twentieth century.
During the twentieth century, food consumption patterns changed dramatically (Table 44-2). Total dietary fat intake rose from 32% of the calories in 1909 to 43% by the end of the century. Overall carbohydrate intake dropped from 57% to 46%, and protein intake remained fairly stable at about 11%.
Compounding these detrimental changes are the individual food choices accounting for them. The biggest shifts were significant rises in the consumption of meat, fats and oils, and sugars and sweeteners, together with decreased consumption of noncitrus fruits, vegetables, and whole grain products. The single greatest change in human nutrition over the last 100 years is the switch from a diet high in complex carbohydrates, as found naturally in grains and vegetables, to one deriving dramatically more calories from simple sugars. Currently, more than half of the carbohydrates consumed are sugars (sucrose, corn syrup, etc.) added to foods as sweetening agents. High consumption of refined sugars is linked to many chronic diseases, including obesity, diabetes, heart disease, and cancer.
Throughout the years, various governmental organizations have published dietary guidelines, but the recommendations of the United States Department of Agriculture (USDA) have become the most widely known. In 1956, the USDA published Food for Fitness—A Daily Food Guide. This became popularly known as the Basic Four Food Groups. The Basic Four were composed of the following:
1. Milk group: milk, cheese, and other dairy products
2. Meat group: meat, fish, poultry, and eggs, with dried beans, peas, and nuts as alternates
3. Fruits and vegetables
4. Breads and cereals
One of the major problems with the Basic Four Food Groups model was that it graphically suggested that the food groups were equal in health value. The result was overconsumption of animal products, dietary fat, and refined carbohydrates, along with insufficient consumption of fiber-rich foods such as fruits, vegetables, and legumes. This dietary pattern in turn contributed to many premature deaths, chronic diseases, and increased health care costs.
As the Basic Four Food Groups became outdated, various other governmental as well as medical organizations developed guidelines of their own, designed to reduce the risk of either a specific chronic degenerative disease, such as cancer or heart disease, or of all chronic diseases.
In an attempt to create a new model for nutrition education, the USDA first published the “Eating Right Pyramid” in 1992. It received harsh criticism from numerous experts and organizations. One big question that should be asked is, “Is it appropriate for the USDA to make these recommendations?” After all, the USDA serves two somewhat conflicting roles: (1) it represents the food industry, and (2) it is in charge of educating consumers about nutrition. Many people believe that the pyramid was weighted toward dairy products, red meat, and grains because of influence from the dairy, beef, and grain farming and processing industries. In other words, the pyramid was designed not to improve the health of Americans but rather to promote the USDA agenda of supporting multinational agrifoods giants (Figure 44-1).
One of the main criticisms of the Eating Right Pyramid is that it does not stress strongly enough the importance of quality food choices. For example, the bottom of the pyramid represents the foods that the USDA thinks should make up the bulk of a healthy diet: the Bread, Cereal, Rice, and Pasta Group. Eating 6 to 11 servings a day from this group is supposedly the path to a healthier life. Yet the pyramid sets a person up for insulin resistance, obesity, and type 2 (adult-onset) diabetes if he or she consistently chooses refined rather than whole grain products in this important category. This is one example of how the Eating Right Pyramid does not take into consideration how quickly blood glucose levels rise after a certain type of food is eaten, an effect referred to as the food’s glycemic index (GI). The GI is a numerical scale used to indicate how fast and how high a particular food raises blood glucose (blood sugar) levels. There are two versions of the GI: one uses glucose as the standard of comparison (set at 100), and the other uses white bread. Foods are tested against the chosen standard. Foods with a lower GI produce a slower rise in blood sugar, whereas foods with a higher GI produce a faster rise.
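The GI comparison described above can be sketched numerically: the blood glucose response to a test food is expressed as a percentage of the response to the reference food. The sketch below is illustrative only, not a clinical protocol; the glucose readings, sampling times, and function names are hypothetical, and real GI determination averages results across multiple subjects.

```python
def incremental_auc(times, glucose, baseline=None):
    """Trapezoidal area under the blood glucose curve above the fasting baseline.

    times   -- minutes after eating
    glucose -- blood glucose readings (mg/dL) at those times
    """
    if baseline is None:
        baseline = glucose[0]  # fasting level before the test meal
    area = 0.0
    for i in range(1, len(times)):
        # Rise above baseline at each end of the interval; dips below
        # baseline are clipped to zero (incremental-AUC convention).
        h1 = max(glucose[i - 1] - baseline, 0)
        h2 = max(glucose[i] - baseline, 0)
        area += (h1 + h2) / 2 * (times[i] - times[i - 1])
    return area

def glycemic_index(test_auc, reference_auc):
    """GI relative to the chosen standard (glucose = 100 or white bread = 100)."""
    return test_auc / reference_auc * 100

# Hypothetical 2-hour response curves (mg/dL), sampled every 30 minutes,
# after 50 g of carbohydrate from each food.
times       = [0, 30, 60, 90, 120]
glucose_std = [85, 160, 140, 110, 90]   # pure glucose (reference, GI = 100)
test_food   = [85, 120, 115, 100, 90]   # test food with a gentler rise

gi = glycemic_index(incremental_auc(times, test_food),
                    incremental_auc(times, glucose_std))
# gi ≈ 52: the test food raises blood glucose about half as much as glucose itself
```

Note that the same food scores differently depending on which standard is used: because white bread itself has a GI of about 70 on the glucose scale, GI values on the white-bread scale run correspondingly higher.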
Another major problem with the Eating Right Pyramid is that some of the foods it directs Americans to eat more of, such as breads, cereals, rice, and pasta, have high GIs. These foods can greatly stress blood sugar control, especially when derived from refined grains, and are now being linked to an increased risk of obesity, diabetes, and cancer. Although the goal of the Eating Right Pyramid was to improve the health of Americans and slow the growing trend toward obesity and diet-related disease, poor individual food choices within its categories have only worsened the problem.
On June 2, 2011, the USDA unveiled a new food icon, MyPlate, to replace the food pyramid (see Figure 44-2). This simplified illustration is designed to help Americans make healthier food choices. MyPlate is the first step in a multiyear effort to raise awareness and educate consumers about eating more healthfully. The initial launch came with some simple recommendations:
Hopefully, this new campaign will be more successful than prior efforts and, rather than yielding to political pressure, will focus on communicating important nutritional guidance (Figure 44-2).
On the basis of existing evidence, we have created the Optimal Health Food Pyramid (Figure 44-3). The major difference from the USDA pyramid is that the Optimal Health Food Pyramid incorporates the best of two of the most healthful diets ever studied: the traditional Mediterranean diet (see later) and the traditional Asian diet. In addition, the Optimal Health Food Pyramid more clearly defines the healthy components within each category and stresses the importance of vegetable oils and regular fish consumption as part of a healthful diet. Appendix 8 provides a patient handout that clearly defines the components of the Optimal Health Food Pyramid.