
These non-evolutionary food classes carry with
them certain health risks at elevated intakes, risks now being
traced at the biochemical and genetic levels.

The Late Role of Grains and Legumes
in the Human Diet, and Biochemical Evidence
of their Evolutionary Discordance

by Loren Cordain, Ph.D.
Copyright © 1999 by Loren Cordain. All rights reserved.

Based on and edited from postings made to the Paleodiet listgroup on 3/29/97, 4/9/97, 4/26/97, 4/29/97, 5/23/97, 6/3/97, 6/8/97, 6/11/97, 6/16/97, 6/23/97, 10/1/97, 1/22/98, and 3/30/98. Note that a much more comprehensive treatment of the issues in this article can be found in professor Cordain's recent peer-reviewed paper on the topic, "Cereal grains: humanity's double-edged sword." (1999) World Review of Nutrition and Dietetics, vol. 84, pp. 19-73.


Introduction: The principle of evolutionary discordance

To set the context for this discussion, let's first briefly recap the basic evolutionary processes to which all species are subject. These principles are fundamental to understanding the repercussions of grains or any other food on the genetics that govern human biology.

An appreciation of how evolutionary change occurs is fundamental to the science of evolutionary biology itself, and it can also elicit a deeper understanding of dietary fitness and health: it is the "how" and "why" that explains the reasons species come to have the nutritional requirements they do. The foods that humanity originally evolved to eat and those we now eat in modern civilization are in many cases significantly different--yet our basic underlying genetic inheritance remains essentially what it was before agriculture, having evolved only very slightly since then. Thus, many of the foods we now eat are discordant with our genetic inheritance. (This is not simply an idle or "just so" hypothesis; as we proceed, we will look at the considerable clinical evidence supporting this picture.) Such "evolutionary discordance" is a fundamental part of the evolutionary equation that governs fitness and survival (in which health plays a key role), and it bears directly on the question of which diet humans are best adapted to handle from the genetic standpoint.

To begin with, we will examine evolutionary discordance from a general standpoint, by looking at the mismatch between the foods eaten since the "agricultural revolution" that began about 10,000 years ago and those of our genus' prior two-million-year history as hunter-gatherers. As the article progresses, however, we'll look at some of the actual genetics involved, so it can be seen that "evolutionary discordance" is not merely a theoretical concept but a very real issue, relevant to how diseases can be genetically expressed in response to dietary factors.

With this key concept in mind, let's now begin with a look at the history of grains and legumes in the human diet (quite recent in evolutionary time), after which we'll move on to some of the evolutionarily discordant effects of their consumption on human beings, as seen in modern clinical and genetic studies.

Evidence for the late evolutionary role of grains in the human diet

Digestive considerations and technology required

Question: Granted, grains would not have made up a large portion of the diet. Nevertheless, if people could in some way have comfortably eaten some amount of wild grains without technology, then given the opportunistic nature of human beings, there's not much reason to think they wouldn't have, is there?

Commentary: People can put many plant items as well as non-edible items (stones, bones, feathers, cartilage, etc.) into their gastrointestinal tracts by way of putting them into their mouths. The key here is the ability of the GI tract to extract the nutrients (calories, protein, carbohydrate, fat, vitamins, and minerals). Bi-gastric herbivores (those having second stomachs) have evolved an efficient second gut with bacteria that can ferment the fiber found in leaves, shrubs, grasses, and forbs (broad-leaved herbs other than grass) and thereby extract nutrients in an energetically efficient manner. (That is, there is more energy in the food than in the energy required to digest it.) We can clearly put grasses and grass seeds into our mouths; however, we do not have a GI tract that can efficiently extract the energy and nutrients from them.

The starch and protein (and hence most of the calories) in cereal grains occur inside the cell walls of the grain. Because the cell walls of cereal grains are almost completely resistant to the mechanical and chemical action of the human GI tract, cereal grains have been shown to pass through the entire GI tract and appear intact in the feces [Stephen 1994]. In order to make the nutrients in cereal grains available for digestion, the cell walls must first be broken (by milling) to liberate their contents, and then the resultant flour must be cooked. Cooking causes the starch granules in the flour to swell and be disrupted by a process called gelatinization, which renders the starch much more accessible to digestion by pancreatic amylase [Stephen 1994]. It has been shown that the protein digestibility of raw rice is only 25%, whereas cooking increases it to 65% [Bradbury 1984].

The main cereal grains that humans now eat (wheat, rice, corn, barley, rye, oats, millet, and sorghum) are quite different from the wild, ancestral species from which all were derived within the past 10,000 years. We have deliberately selected for large, easily harvestable grains with minimal chaff. The wild counterparts of these grains were smaller and difficult to harvest. Further, separation of the chaff from the grain was time-consuming and required fine baskets for the winnowing process. Once the chaff is separated from the grain, the grains have to be milled and the resultant flour cooked. This process is time-consuming and obviously could have come about only in very recent geologic times. Further, the 8 cereal grains now commonly eaten are endemic to quite narrow geographic locations, and by this geographic isolation would have been unavailable to all but a select few populations of hominids.

As touched upon previously, the issue of antinutrients in raw cereal grains is a very real one. There are components in raw cereal grains which wreak absolute havoc with human health and well-being. The primary storage form of phosphorus in cereal grains is phytate, and phytates bind virtually all divalent ions (i.e., minerals, for our purposes). Excessive consumption of whole-grain unleavened breads (50-60% of total calories) commonly results in rickets [Robertson 1981; Ewer 1950; Sly 1984; Ford 1972, 1977; MacAuliffe 1976; Hidiroglou 1980; Dagnelie 1990], retarded skeletal growth [Reinhold 1971; Halsted 1972; Sandstrom 1987; Golub 1996] including hypogonadal dwarfism, and iron-deficiency anemia (references available upon request). The main lectin in wheat (wheat germ agglutinin) has catastrophic effects upon the gastrointestinal tract [Pusztai 1993a]. Additionally, the alkylresorcinols of cereals influence prostanoid tone and induce a more inflammatory profile [Hengtrakul 1991], as well as depressing growth [Sedlet 1984].

Given the barriers to grain consumption facing primitive hominids, who did not possess the more sophisticated technology seen only within about the last 15,000 years, optimal foraging theory again strongly suggests that any consumption would have been at extremely minimal levels. Given also that the human gut has not adapted to prevent the negative effects of grain consumption, effects which are only partially mitigated by such technology, it is extremely unlikely that cereal grains were ever more than a very minute fraction of the human diet until very recent times.

Genetic changes to the human gut in evolutionary perspective

Question: What evidence is there for the speed at which genetic changes governing the way the gastrointestinal tract functions can occur? Isn't there evidence showing, for example, that changes in the genes governing lactose tolerance can occur quite rapidly in evolutionary terms? What basis is there for believing that the human gut is really the same as that of our hominid ancestors during Paleolithic times?

Commentary: There are calculations which estimate how long it took for the gene for adult lactase persistence (ALP) to increase in northern Europeans from a pre-agricultural frequency of 5% to its present frequency of approximately 70% [Aoki 1991]. (Note: The enzyme lactase is required to digest the sugar lactose in milk, and normally is not produced in significant quantity in human beings after weaning.) In order for the gene frequency to increase from 0.05 to 0.70 within the 250 generations which have occurred since the advent of dairying, a selective advantage in excess of 5% may have been required [Aoki 1991].
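The flavor of this kind of calculation can be sketched with a minimal, deterministic one-locus model of selection, assuming (as is the case for lactase persistence) that the advantageous allele is dominant and mating is random. This toy model omits the gene-culture coevolution in Aoki's actual analysis, so the selection coefficient `s` here is purely illustrative, not Aoki's estimate:

```python
def generations_to_reach(p0, p_target, s):
    """Generations for a dominant allele to go from frequency p0 to p_target
    under selective advantage s, in a deterministic one-locus model with
    random mating (genotype fitnesses: AA = Aa = 1+s, aa = 1)."""
    p, gens = p0, 0
    while p < p_target:
        q = 1.0 - p
        # Mean fitness over Hardy-Weinberg genotype frequencies:
        # (1+s)(p^2 + 2pq) + q^2  =  1 + s(1 - q^2)
        mean_fitness = 1.0 + s * (1.0 - q * q)
        # Standard recursion: allele frequency after one round of selection
        p = p * (1.0 + s) / mean_fitness
        gens += 1
    return gens

# Illustrative run: a 5% advantage carries a dominant allele from 0.05
# to 0.70 comfortably within the ~250 generations since dairying began.
print(generations_to_reach(0.05, 0.70, 0.05))
```

Running the model with smaller coefficients (e.g., `s = 0.01`) takes several times longer, which illustrates the general point of the passage: a selective advantage on the order of a few percent is what makes a frequency shift of this size possible within a few hundred generations.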

Therefore, some genetic changes can occur quite rapidly, particularly in polymorphic genes (those with more than one variant of the gene already in existence) with wide variability in their phenotypic expression. ("Phenotypic expression" means the physical characteristic(s) which a gene produces.) Because humans normally maintain lactase activity in their guts until weaning (approximately 4 years of age in modern-day hunter-gatherers), the type of genetic change (neoteny) required for adult lactase maintenance can occur quite rapidly if there is sufficient selective pressure. Maintenance of childlike genetic characteristics (neoteny) is what occurred with the geologically rapid domestication of the dog during the late Pleistocene and Mesolithic [Budiansky 1992].

The complete re-arrangement of gut morphology, or the evolution of new enzyme systems capable of handling novel food types, is quite unlikely to have occurred in humans in the short time period since the advent of agriculture. Some populations have had 500 generations to adapt to the new staple foods of agriculture (cereals, legumes, and dairy) whereas others have had only 1-3 (e.g., the Inuit, Amerindians, etc.). Because anatomical and physiological studies among and between various racial groups indicate few differences in the basic structure and function of the gut, it is reasonable to assume that there has been insufficient evolutionary experience (500 generations) since the advent of agriculture to create large genetic differences among human populations in their ability to digest and assimilate various foods.

The population differences in gastrointestinal function which have been identified are generally associated with an increased ability to digest disaccharides (lactose and sucrose) via varying disaccharidase activity. Although insulin metabolism is not a direct component of gastrointestinal function, there is substantial evidence to indicate that recently acculturated populations are more prone to hyperinsulinemia and its various clinical manifestations, including non-insulin-dependent diabetes mellitus (NIDDM), obesity, hypertension, coronary heart disease, and hyperlipidemia [Brand-Miller and Colagiuri 1994].

It is thought that these abnormalities, collectively referred to as "syndrome X" [Reaven 1994], are the result of a so-called "thrifty gene" [Neel 1962] which some groups have suggested codes for glycogen synthase [Schalin-Jantti 1996]. Consequently, the ability to consume increasing levels of carbohydrate without developing symptoms of syndrome X is likely genetically based and a function of relative time exposure of populations to the higher carbohydrate contents of agriculture [Brand-Miller and Colagiuri 1994].

There are no generally recognized differences among human populations in the enzymes required to digest fats or proteins. Additionally, no human group, regardless of genetic background, has been able to overcome the deleterious effects of phytates and other antinutrients in cereal grains and legumes. Iranian, Inuit, European, and Asian populations all suffer from divalent-ion (calcium, iron, zinc, etc.) sequestration with excessive (>50% of total calories) cereal or legume consumption. Nor has any racial group evolved gut characteristics which would allow it to digest the food energy potentially available in the major type of fiber contained in cereal grains. Further, most of the antinutrients in cereal grains and legumes (alkylresorcinols, amylase inhibitors, lectins, protease inhibitors, etc.) wreak their havoc upon human physiology irrespective of differing genetic backgrounds.

Thus, most of the available evidence supports the notion that, except for the evolution of certain disaccharidases and perhaps changes in some genes involving insulin sensitivity, the human gut remains relatively unchanged from Paleolithic times.

Celiac disease as evidence of genetic and evolutionary discordance

Simoons' classic work on the incidence of celiac disease [Simoons 1981] shows that the distribution of the HLA B8 haplotype of the human major histocompatibility complex (MHC) closely follows the spread of farming from the Mideast to northern Europe. Because there is strong linkage disequilibrium between HLA B8 and the HLA genotypes associated with celiac disease, this distribution indicates that those populations with the least evolutionary exposure to cereal grains (wheat primarily) have the highest incidence of celiac disease. This genetic argument is perhaps the strongest evidence supporting Yudkin's observation that humans are incompletely adapted to the consumption of cereal grains.

Thus, the genetic evidence for human disease (here I have used celiac disease, though other models of autoimmune disease could have been used) is supported by the archaeological evidence, which in turn supports the clinical evidence. The extrapolation of paleodiets has thereby provided important clues to human disease--clues which might have gone unnoticed without the convergence of data from many diverse fields (archaeology, nutrition, immunology, genetics, anthropology, and geography).

For a celiac, a healthy diet is definitely cereal-free--why is this so? Perhaps now the evolutionary data is finally helping to solve this conundrum.

Biotin deficiency and the case of Lindow Man

Lindow Man, whose preserved body was found in a peat bog in Cheshire, England in 1984, is one of the more extensively studied of the so-called "bog mummies" [Stead, Bourke, and Brothwell 1986]. The principal last meal of Lindow Man likely consisted of a non-leavened whole-meal bread, probably made of emmer wheat, spelt wheat, and barley. Unleavened whole-grain breads such as this represented a dietary staple for most of the less-affluent classes during this time. Excessive consumption of unleavened cereal grains negatively impacts a wide variety of physiological functions, which ultimately present themselves phenotypically (i.e., via changes in physical form or growth). The well-documented phytates of cereal grains sequester many divalent ions including calcium, zinc, iron, and magnesium, which can impair bone growth and metabolism. Further, there are antinutrients in cereal grains which directly impair vitamin D metabolism [Batchelor 1983; Clement 1987]; and rickets is routinely induced in animal models via consumption of high levels of cereal grains [Sly 1984].

Less well appreciated is the ability of whole grains to impair biotin metabolism. My colleague Bruce Watkins [Watkins 1990], as well as others [Blair 1989; Kopinski 1989], has shown that biotin deficiencies can be induced in animal models by feeding them high levels of wheat, sorghum, and other cereal grains. Biotin-dependent carboxylases are important enzymes in the metabolic pathways of fatty-acid synthesis, and biotin deficiencies severely inhibit the chain elongation and desaturation of 18:2n6 (linoleate) to 20:4n6 (arachidonic acid). Human dietary supplementation trials with biotin have shown this vitamin to reduce the fingernail brittleness and ridging that are associated with deficiencies of this vitamin [Hochman 1993].

Careful examination of the photograph of Lindow Man's fingernail (still attached to a phalange of the right hand [Stead 1986, p. 66]) shows the characteristic "ridging" of biotin deficiency. It is likely that regular daily consumption of high levels (>50% of daily calories) of unleavened cereal-grain breads, such as Lindow Man may have consumed, caused a biotin deficiency, which in turn caused the nail ridging.

Antinutritional properties of legumes

Question: So far we have been discussing grains. What about legumes? Could they have been realistically eaten as a staple by primitive groups without cooking, and if they are natural to the human evolutionary experience, why do they cause gas which indicates fermentation of indigestible products in the gut? If they are not natural for us, how do we account for the !Kung and other primitive groups who eat them?

Commentary: As with grain consumption, there are hunter-gatherers who have been documented eating legumes. In most cases, however, the legumes are cooked, or the tender early sprouts are eaten raw rather than the mature pod. Some legumes in their raw state are less toxic than others. However, most legumes in their mature state are non-digestible and/or toxic to most mammals when eaten in even moderate quantities. I refer interested readers to:

These references summarize the basics about legume indigestibility/toxicity; however, there are hundreds if not thousands of citations documenting the antinutritional properties of legumes. Legumes contain a wide variety of antinutrient compounds which influence multiple tissues and systems, and normal cooking procedures do not always eliminate these [Grant 1982]. There are a variety of compounds in beans which cause gas. Mainly, these are the non-digested carbohydrates raffinose, stachyose, and sometimes verbascose, which provide substrate for intestinal microflora to produce flatus [Calloway 1971].

