
(Fire and Cooking in Human Evolution--continued, Part C)

When was fire first controlled by human beings?


So the next question is obvious: How long have fire and cooking been around, and how do we know whether that length of time has been enough for us to have adapted sufficiently?

Let's take the question one part at a time. The short answer to the first part of the question is that fire was first controlled by humans anywhere from about 230,000 years ago to 1.4 or 1.5 million years ago, depending on which evidence you accept as definitive.

Crux of the question: first control of fire vs. earliest widespread use. Now of course, the crucial question for us isn't just when the earliest control of fire was; it's at what date fire was being used consistently--and more specifically for cooking, so that more-constant genetic selection pressures would have been brought to bear. Most of the evidence available at this time indicates that 125,000 years ago is the earliest reasonable estimate for widespread control.*[108]

Another good reason it may be safer to base adaptation to fire and cooking on the figure of 125,000 years ago is that more and more evidence indicates modern humans are descended from a group of ancestors who were living in Africa 100,000-200,000 years ago, and who then spread out across the globe to replace other human groups.[109] If true, this would mean the fire sites in Europe and China are those of separate human groups who did not leave descendants surviving to the present. Given, too, that the African fire sites in Kenya and South Africa from about 1.5 million years ago are under dispute, widespread usage at 125,000 years ago seems the safest figure for our purposes here.

Sequence of stages in control: fire for warmth vs. fire for cooking. One thing we can say about the widespread use of fire probably in place by 125,000 years ago, however, is that it would almost certainly have included the use of fire for cooking.* Why can this be assumed? It has to do with the sequence of progressive stages of control over fire that would have had to take place before fire usage became commonplace. And the most interesting of these is that cooking would almost inevitably have been one of the first uses humans put fire to, rather than some later-stage use.*

The first fires on earth occurred approximately 350 million years ago--the geological evidence for fire in remains of forest vegetation being as old as the forests themselves.[110] It is usual to focus only on fire's immediately destructive effects on plants and wildlife, but there are also benefits. In response to occasional periodic wildfires, for example, certain plants and trees known as "pyrophytes" have evolved, for whose existence periodic wildfires are essential. Fire revitalizes them by destroying their parasites and competitors; such plants include grasses eaten by herbivores as well as trees that provide shelter and food for animals.[111]

Opportunistic exploitation of animal kills by predators after wildfires. Fires also provide other unintended benefits to animals. Even while a wildfire is still burning, birds of prey (such as falcons and kites)--the first types of predators to appear at fires--are attracted to the flames to hunt fleeing animals and insects. Later, as the ashes smolder and die out, land predators appear to pick out the burnt victims for consumption. Still others, such as deer and bovines, appear after that to lick the ashes for their salt content. Notable as well is that most mammals appear to enjoy the heat radiated at night from sites of recently burned-out fires.[112]

It is inconceivable, therefore, that human beings, being similarly observant and opportunistic creatures, would not also have partaken of the dietary windfall provided by wildfires they came across. Thus, even before humans had learned to control fire purposefully--and without here getting into the later stages of control over fire--their early passive exposures to it would already have introduced them, like the other animals, to the role fire could play in obtaining edible food and providing warmth.


Potential adaptation to cooking in light of genetic rates of change


So if fire has been used on a widespread basis for cooking since roughly 125,000 years ago, how do we know if that has been enough time for us to have fully adapted to it?

To answer that, we have to be able to determine the rate at which the genetic changes constituting evolutionary adaptation take place in organisms as a result of environmental or behavioral change--which in this case means changes in food intake.

Rates of genetic change as estimated from speciation in the fossil record. The two sources of estimates for the rates at which genetic change takes place are students of the fossil record and population geneticists. Where the fossil record is concerned, Niles Eldredge and Stephen Jay Gould, two of the most well-known modern evolutionary theorists, estimated the time span required for "speciation events" (the time required for a new species to arise in response to evolutionary selection pressures) to be somewhere within the range of "five to 50,000 years."[113] Since this rough figure is based on the fossil record, it is difficult to be much more precise than that range. Eldredge also comments that "some evolutionary geneticists have said that the estimate of five to 50,000 years is, if anything, overly generous."[114] Remember, too, that this time span is for changes large enough to result in a new species classification. Since the digestive changes we are talking about here may or may not be large enough to result in a new species (though changes in diet often are in fact behind the origin of new species), it's difficult to say from this particular estimate whether adaptation to changes in food would require a somewhat shorter or longer time span.

Measurements of genetic change from population genetics. Fortunately, however, the estimates from the population geneticists are more precise. There are even mathematical equations to quantify the rate at which genetic change takes place in a population, given evolutionary "selection pressures" of a given magnitude that favor survival of individuals with a certain genetic trait.[115] The difficulty lies in how accurately one can numerically quantify the intensity of real-world selection pressures. It turns out, however, that there have been a few actual examples where it has been possible to do so at least approximately, and they are interesting enough that I'll mention a couple of them briefly here to give a feel for the situation.
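For readers who want to see the machinery behind such calculations, here is a minimal sketch, in Python, of the standard textbook one-locus, two-allele selection model from population genetics. The function names and all parameter values (the selection advantage s, the starting and target frequencies, the 25-year generation length) are illustrative assumptions of mine, not the published figures behind the estimates cited in this article.

```python
# Minimal sketch of the standard one-locus, two-allele diploid selection
# model from population genetics. All numbers below are illustrative
# assumptions, not the published parameters behind the estimates in the text.

def next_freq(p, s, dominant=True):
    """Frequency of allele A after one generation of selection.

    Fitnesses: AA = 1+s; Aa = 1+s if A is dominant, else 1+s/2; aa = 1.
    """
    q = 1.0 - p
    w_AA = 1.0 + s
    w_Aa = 1.0 + s if dominant else 1.0 + s / 2.0
    w_aa = 1.0
    w_bar = p * p * w_AA + 2 * p * q * w_Aa + q * q * w_aa  # mean fitness
    return (p * p * w_AA + p * q * w_Aa) / w_bar

def generations_until(p0, target, s):
    """Generations for allele A to rise in frequency from p0 to target."""
    p, gens = p0, 0
    while p < target:
        p = next_freq(p, s)
        gens += 1
    return gens

if __name__ == "__main__":
    # Assumed values: a strong cultural selection advantage (s = 0.15)
    # carrying a rare dominant allele from 1% to 70% of the gene pool.
    # With these numbers the sweep takes roughly 55-60 generations,
    # i.e., about 1,400-1,500 years at 25 years per generation.
    gens = generations_until(0.01, 0.70, s=0.15)
    print(gens, "generations, or about", gens * 25, "years at 25 yrs/gen")
```

The model simply tracks how an allele's frequency changes each generation when its carriers reproduce at a rate 1+s relative to non-carriers; the stronger the selection pressure (the larger s), the fewer generations the allele needs to spread through the population.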

The most interesting of these examples relates directly to our discussion here, and has to do with the gene for lactose tolerance in adults. Babies are born with the capacity to digest lactose via production of the digestive enzyme lactase; otherwise they wouldn't be able to make use of mother's milk, which contains the milk sugar lactose. But sometime after weaning, this capacity is normally lost, and there is a gene responsible. Most adults--roughly 70% of the world's population overall--do not retain the ability to digest lactose into adulthood,[116] an outcome known as "lactose intolerance." (Actually this is something of a misnomer, since adult lactose intolerance would have been the baseline, normal condition for virtually everyone in the human race up until Neolithic (agricultural) times.[117]) If these people attempt to drink milk, the result may be bloating, gas, intestinal distress, diarrhea, etc.[118]

Influence of human culture on genetic selection pressures. However--and this is where it gets interesting--those population groups that do retain the ability to produce lactase and digest milk into adulthood are those descended from the very people who first began domesticating animals for milking during the Neolithic period several thousand years ago.[119] (The earliest milking populations in Europe, Asia, and Africa began the practice probably around 4,000 B.C.[120]) And even more interestingly, in population groups where cultural changes have created "selection pressure" for adapting to certain behavior--such as drinking milk in this case--the rate of genetic adaptation to such changes significantly increases. In this case, the time span for widespread prevalence of the gene for lactose tolerance within milking population groups has been estimated at approximately 1,150 years[121]--a very short span of time in evolutionary terms.
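For what it's worth, the illustrative sketch given earlier lands in the same ballpark: under its assumed values (a strong advantage of s = 0.15 for a dominant allele, as the lactase-persistence allele is), the allele climbs from 1% to 70% of the gene pool in roughly 55-60 generations--about 1,400-1,500 years at 25 years per generation, the same order of magnitude as the 1,150-year estimate quoted here.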

Relationship between earliest milking cultures and prevalence of lactose tolerance in populations. There is a very close correlation between the 30% of the world's population who are tolerant to lactose and descent from the earliest human groups who began milking animals. Lactose-tolerant individuals are most heavily represented among modern-day Mediterranean, East African, and Northern European groups, and among emigrants from these groups to other countries. Only about 20% of white Americans in general are lactose intolerant, but among sub-groups the rates are higher: 90-100% among Asian-Americans (as well as Asians worldwide), 75% among African-Americans (most of whose ancestors came from West Africa), and 80% among Native Americans; about 50% of Hispanics worldwide are lactose intolerant.[122]

Now whether it is still completely healthy for the 30% of the world's population who are lactose tolerant to be drinking animals' milk--a very recent food in our evolutionary history--I can't say. It may well be that successfully digesting and making use of milk without health side-effects involves factors beyond the ability to produce lactase--I haven't looked into that particular question yet. But for our purposes here, the example does powerfully illustrate that genetic adaptations for digestive changes can take place much more rapidly than was perhaps previously thought.*

Genetic changes in population groups who crossed the threshold from hunting-gathering to grain-farming earliest. Another interesting example of the spread of genetic adaptations since the Neolithic involves two specific genes whose prevalence has been found to correlate with how long populations in different geographical regions have been eating the grain-based, high-carbohydrate diets common since the transition from hunting and gathering to Neolithic agriculture began 10,000 years ago. (These two genes are the gene for angiotensin-converting enzyme--or ACE--and the one for apolipoprotein B; if the proper forms of these are not present, one's chances of cardiovascular disease may increase.)[123]

In the Middle East and Europe, rates of these two genes are highest in populations (such as those of Greece, Italy, and France) closest to the Middle Eastern "fertile crescent" where agriculture in this part of the globe started, and lowest in the areas furthest away, which the migrations of early Neolithic farmers with their grain-based diets took longest to reach (i.e., Northern Ireland, Scotland, Finland, Siberia). Closely correlating with both the occurrence of these genes and the historical duration of grain consumption are corresponding rates of death due to coronary heart disease. Those in Mediterranean countries who have been eating high-carbohydrate grain-based diets the longest (since approximately 6,000 B.C. in France and Italy, for example) have the lowest rates of heart disease, while those in areas where dietary changes due to agriculture were last to take hold, such as Finland (perhaps only since 2,000 B.C.), have the highest rates of death due to heart attack. Breast cancer rates in Europe are likewise higher in countries that have been practicing agriculture for the least amount of time.[124]

Whether grain-based diets are actually causing these health problems (rather than being merely coincidentally correlated) in people whose ancestors adopted such diets only recently (and who therefore lack the appropriate gene) is at this point a hypothesis under study. (One study with chickens, however--birds which in their natural environment eat little grain--showed much less atherosclerosis on a high-fat, high-protein diet than on a low-fat, high-carbohydrate diet.[125]) But again, and importantly, the key point here is that genetic changes in response to diet can be more rapid than perhaps once thought. The difference in time since the advent of Neolithic agriculture between the countries with the highest and lowest incidences of these two genes is on the order of 3,000-5,000 years,[126] showing again that cultural selection pressures relating to diet can force more rapid genetic changes than might occur otherwise.

Recent evolutionary changes in immunoglobulin types, and genetic rates of change overall. Now we should also look at the other end of the time scale for some perspective. The Cavalli-Sforza population-genetics team, one of the pioneers in tracking the spread of genes around the world due to migrations and/or interbreeding of populations, has also looked into the genes that control immunoglobulin types (an important component of the immune system). Their estimate here is that the current variants of these genes were selected for within the last 50,000-100,000 years, and that this time span would be more representative for most groups of genes. They also feel that, in general, it is unlikely gene frequencies for most groups of genes would undergo significant changes in time spans of less than about 11,500 years.[127]

However, the significant exception they mention--and this relates especially to our discussion here--is where there are cultural pressures for certain behaviors that affect survival rates.[128] The two examples cited above--the gene for lactose tolerance (milk-drinking) and the genes associated with high-carbohydrate grain consumption--both involve cultural selection pressures that came with the change from hunting and gathering to Neolithic agriculture. Again, cultural selection pressures for genetic changes operate more rapidly than any other kind. Nobody yet, so far as I can tell, really knows whether the observed genetic changes relating to the spread of milk-drinking and grain consumption are enough to confer a reasonable level of adaptation to these foods among the populations who carry them, and the picture seems mixed.*

Rates of gluten intolerance (gluten is a protein in certain grains such as wheat, barley, and rye that makes dough sticky and conducive to bread-baking) are lower than those for lactose intolerance--which one would expect, given that milk-drinking has been around for less than half as long as grain consumption. Official estimates of gluten intolerance range from 0.3% to 1% worldwide depending on population group.[129] Some researchers, however, believe that gluten intolerance is but the tip of the iceberg of problems due to grain consumption (or more specifically, wheat). Newer research suggests that anywhere from 5% to as much as 20-30% of the population with certain genetic characteristics (resulting in what is called a "permeable intestine") may absorb incompletely digested peptide fragments from wheat, with adverse effects that could lead to a range of possible diseases.[130]

What do common genetic rates of change suggest about potential adaptation to cooking? We have gone a little far afield here getting some kind of grasp on rates of genetic change, but I think it's been necessary to gain a good sense of the time ranges involved. So to bring this back around to the question of adaptation to cooking: given the time span involved (likely 125,000 years since fire and cooking became widespread), the chances are very high that we are in fact adapted to the cooking of whatever foods were consistently cooked.* From what we know of the fossil record, I would include among these foods meat, as well as some vegetable foods--particularly coarser ones such as starchy root vegetables like yams, which have long been thought to have been cooked[131]--and perhaps others.
