A year ago I blogged about Richard Wrangham’s book, Catching Fire: How Cooking Made Us Human (it’s a great book, and I recommend it). My main interest in the post was, what were the implications for the diet of our ancestors? And what are they for those of us who follow the Paleo Diet?
As I recollect it (I don’t have the book in Toulouse, where I am currently visiting), Richard’s argument is that our Australopithecine ancestors fueled their relatively small brains, 450 cubic cm (compared with 350-400 cubic cm for forest apes), by digging up the underground storage organs of plants, which are rich in starches.
The first members of genus Homo, such as Homo habilis, probably fueled their larger brains, 600 cubic cm, by shifting to eating meat, marrow, and brains of animals they scavenged. Then the big change came – Homo erectus with 870 cubic cm and Homo sapiens (that’s us) with 1400 cubic cm. Richard Wrangham thinks that this jump in cranial capacity was literally fueled by fire, or rather by the ability to cook. Cooking made large guts unnecessary and allowed efficient extraction of energy from food. Let’s call this the Wrangham scenario.
Last week I was discussing this idea by e-mail with Loren Cordain, a Paleo diet pioneer (see his book The Paleo Diet). Loren makes an interesting point about how we can estimate when human beings learned to make fire.
The main way that people made fire in prehistory was by using wood on wood friction.
As an aside, in medieval times the main method switched to striking flint with steel, but that required having access to iron, which became widely available very recently, as far as human evolutionary history goes.
Of course, “friction” doesn’t mean that you just rub two pieces of wood together. Instead, you use a fire drill (see Loren’s blog post about it).
The discovery of this method was clearly a side effect of drilling holes to produce, for example, beads for necklaces. It’s an interesting case of cumulative culture. You first need to develop a technology for drilling holes in objects. Then, one day as you are drilling a hole in a wooden bead, it suddenly goes up in flames. It’s hard to imagine how a fire drill could be invented any other way.
Items with drilled holes appear relatively late. Thus, according to Loren’s argument, the discovery of the fire drill and the ability to produce fire at will must be dated to between 100,000 and 75,000 years ago.
The implication that he draws from this is that fire became important only quite late in our evolutionary history. Since tubers and roots need to be cooked for safe consumption, these starchy foods must also be a relatively recent evolutionary innovation, and should not be part of the Paleo diet.
This is an interesting argument (let’s call it the Cordain scenario), but I am not convinced by it. If the increase in human brain size was not driven by plentiful calories obtained from cooked underground storage organs, then where did the energy come from? It must have been the bone marrow and brains of scavenged animals. But did early members of the genus Homo have access to a dependable and plentiful supply of such foods?
I still prefer the Wrangham alternative. And to me the question of when human beings learned to reliably produce fire is interesting, but a side issue. Learning how to maintain fire and move it around (e.g., in special fire containers) is much more straightforward, and can be perfected in small steps. If one family’s fire is extinguished, they can obtain it from others. So human beings could have been using fire for many hundreds of thousands of years before they learned how to make it.
Unfortunately for the Wrangham hypothesis, there is no evidence that fire was used by people before 800,000 years ago, and widespread use of fire is documented starting only 300-400 thousand years ago. This is clearly a problem. But any alternative to the Wrangham hypothesis would have to come up with an explanation of where the calories came from and, even more importantly, how early humans could afford to shrink their guts. If you want to thrive on a raw vegetarian diet, then you want to look like this:
I too like the Wrangham hypothesis. People surely learned to get fire from natural fires (lightning, volcanoes, whatever) and keep it alive hundreds of thousands of years before they thought of the fire drill. They also surely first used fire to clear brush and drive game. Only later would they have figured out how to use it for heat, and maybe only after that to cook. Naturally “cooked” food following natural fires may have given them the idea, though burned food is usually pretty well ruined. I actually extend Wrangham considerably: I think the whole task of controlling fire and cooperating to burn brush was a major driver of the evolution of sociability and of resource management abilities.
In general, I agree, but Wrangham also makes the point, somewhere in the book, that chimps are known to use fire for searing food.
A recent archaeological study from a site at Schöningen, North Germany, concludes that those early inhabitants of Northern Europe did not use fire. Presumably they ate raw meat! And kept themselves warm in winter without fire somehow!
http://www.sciencedirect.com/science/article/pii/S0047248415000858
Abstract
When and how humans began to control fire has been a central debate in Paleolithic archaeology for decades. … We conclude that the analyzed features and artifacts present no convincing evidence for human use or control of fire. … The lack of evidence for the human control of fire at Schöningen raises the possibility that fire control was not a necessary adaptation for the human settlement of northern latitudes in the Lower Paleolithic.
Meanwhile, other groups elsewhere were using fire 300 kya. It seems like there may have been fire users, intermittent fire users, and non-fire users for a very long period.
I wonder what the climate was like then. I can see how you could get enough calories and nutrients given an abundant supply of large game, but it’s hard to see how you could survive without fire in a typical Russian winter, which is what I grew up with. More generally, as work by Rob Boyd and others on the Inuit shows, surviving in cold climates requires very elaborate cultural adaptations.
Sandgathe and colleagues argue that stratified Neandertal sites in France show a pattern of fire use during warmer and drier parts of the Middle Paleolithic but an absence of fire during cooler and wetter periods. This is consistent with the idea that Neandertals were pretty good at conserving fire but could not make it, in line with Peter T’s picture. In warmer, drier periods they could frequently renew their fire from wildfires, but in cooler, wetter periods they could not. Sandgathe et al.’s idea has stirred up some debate.
Given their large brain size and high energy requirements, and Wrangham’s strong physiological argument, how could Neandertals persist in NW Europe without fire, particularly in cold periods? Could NW Europe have been repeatedly colonized by fire-bearing people from the southeast who always had access to fire, and who lingered for a while after they lost it before going extinct or retreating south and east? I think we probably underestimate the dynamism of past human populations, particularly in the face of high-amplitude, high-frequency climate variation.
Sandgathe et al., PaleoAnthropology 2011: 216−242.
This is a good point. After losing fire, isolated groups could linger for a while, and then go extinct (or go elsewhere).
Interestingly, maintaining fire is a lot like maintaining culture: it’s much easier if you have a large and well-connected population.
Whoops! The preceding anonymous comments were by me!
Tasmanians had fire, but reportedly didn’t know how to make it. Maybe the loss of fire explains the Flores island “hobbits,” with their small bodies and chimp-sized brains. Take a population of something like H. erectus and let them lose fire. On an island without predation from lions and tigers and bears, and without competition from other hominins, they can survive, but they evolve into something not really human (except in a purely cladistic sense) any more.
That’s a great suggestion, Doug!
“But did early members of the genus Homo have access to dependable and plentiful supply of such foods?”
They pretty clearly did; they’d evolved all the characteristics that allowed for persistence hunting, which can be successful without the use of weapons, especially for larger animals. See Bramble & Lieberman’s 2004 Nature article, “Endurance running and the evolution of Homo”.
The biggest problem with Wrangham’s hypothesis is that the inclusion of cooked starches as a regular part of the diet reliably leads to tooth decay (caries), and there’s no evidence of this until ~10,000 years ago, at the very end of the Paleolithic.
Lieberman notes in his book “The Story of the Human Body” that we do have some adaptations—notably molars much larger than those of chimps—that seem to indicate we adapted to eating the fibrous USOs typical in Africa, but it seems to me most likely that this was a passing phase as we continued on our journey to becoming the planetary apex predator.
There’s another consequence of eating a diet of cooked foods: the jaw fails to develop properly. Again, this isn’t seen until much later than Wrangham’s hypothesis would require.
So I think the best explanation is that we evolved to be persistence hunters in the long period prior to the invention of weapons, and USOs were merely a fallback food.
Just got back from Germany, where I went to see all the fossil hominids at the famous paleo museums in Berlin & Tübingen. There’s just no doubt when you have all the skulls & bones in front of your eyes. CJ Hunt’s correct – we were apex carnivore predators, and we were tall, with amazingly sturdy broad faces & jaws of freaking thickness and amazingly perfect teeth. As we become Neolithic we get short, our faces change shape, our jaws get thinner, the teeth are disgusting. By Roman times we’re minute, with thin, delicate faces, fragile jaws that look like china teacups compared to our ancestors’, and just horrible teeth. Even if we could eat tubers – we shouldn’t. It literally deforms us, de-evolves us, as do grains. Don’t take my word for it. Go to the museums yourself.
But Romans did not eat tubers – they ate cereals. So it doesn’t follow that tubers are bad.
Also, I went Paleo 3.5 years ago. Within 2-3 months after I stopped eating cereals I noticed a great improvement, and 6 months later it was truly remarkable. But I continued, and continue, eating potatoes.
Can I ask a simple question? What is the best guess for why our earliest human ancestors came down from the trees to begin harvesting USOs and other foods?
Another simple question: at what point did man begin doing things repetitively? An example – an animal got cooked in a wildfire; I, as prehistoric man, encounter it, enjoy it, and try to repeat it again tomorrow using fire. The repetition is due to an image – of enjoyment, of increased security through perceived higher nutrition, or whatever. What I’m trying to get at is the example of psychological continuity… a human living today in a slightly modified form based on some image held on to from yesterday. We are alone in this foible, and I’d like to narrow down at what point this began to happen. The psychological entity surely begins at this point of holding images. Commonly this is probably called intelligence, but I’m looking at it from a different angle: the repetition enabling so-called intelligence, leading to a mechanical mode of living (psychologically).
Hope that makes some sense.
That first part should read “when”; the “why” is in the second part.
I have answered the first part of my question to my satisfaction – “Ardipithecus ramidus is about half a million years older than the earliest Australopithecus afarensis and is a bit closer to the last common ancestor between living chimpanzees and humans… A. ramidus was a forest dweller that does appear to have been bipedal due to the forward placement of the foramen magnum. It may have lived a partly arboreal existence, with ape-like grasping feet that facilitated tree climbing. At 4.4 million years ago, the foot of Ethiopia’s Ardipithecus ramidus retains many features of tree-dwelling primates, including a divergent, mobile first toe.”
But it is the beginning of intelligence, or image forming, insofar as that specifically interests me. If it’s a given (is it?) that this process began with increasing brain size due to increased consumption of USOs for some environmental/climate reason, why then did other apes not also follow this path of USO consumption? For prehumans, it became habitual – why?
If my questions are stupid, so be it, but I’m just getting interested in this whole subject and am at the ground level of understanding.
Hi Brian,
No, your question is very interesting. I don’t know the answer to it, apart from just saying that climate changes all the time: forests grow and recede, and so do savannas. As ecological niches open up, they invite colonization by organisms. So, I would imagine, our distant ancestors responded to such an environmental opportunity.
It’s important to keep in mind that humans are not unique in becoming ground-dwellers. Think of baboons, a very successful ground-dwelling primate. Incidentally, they also live in rather large groups, probably for cooperative defense against predators.
Where did the calories come from? The answer is most probably fat. Prey animals contain 40-50% fat in caloric terms: visceral fat, intermuscular fat, intramuscular fat. Brains and marrow make up only a small portion of an animal’s fat. See my paper Man the Fat Hunter for details.
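A minimal back-of-the-envelope illustration of how a moderate fat content by weight can amount to 40-50% of calories (the 25% fat-by-mass figure below is an assumption chosen for the arithmetic, not a number from the paper): fat supplies roughly 9 kcal/g versus about 4 kcal/g for lean protein, so for edible tissue that is 25% fat by mass,

\frac{0.25 \times 9}{0.25 \times 9 + 0.75 \times 4} = \frac{2.25}{5.25} \approx 0.43,

i.e., roughly 43% of calories would come from fat.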
Can you provide a link?