Over time, the inhabitants of Europe and Africa diverged. Africans - some of them, anyhow - gradually came to look more like modern humans. The earliest specimens showing these tendencies date back almost 200,000 years. AMH (Anatomically Modern Humans) had changed a lot above the neck: they were reasonably modern looking, with high foreheads, chins, and flat faces. They were, however, still much more robust than modern humans, with thicker bones and stronger muscles. Although their bodies had changed, their minds had not, at least not yet, not in ways that show up in the archeological record. Since they are our ancestors, we rather expect to see signs of increased innovation associated with modern anatomy, but for a long time there were none. Their stone tools had gradually improved over time - for example, they began to make hafted weapons - but were on the whole very similar to those used by the Neanderthals in Europe. For most of this period, there is no evidence of art, symbolism, or trade among AMH. We would say that they were not yet 'behaviorally modern'; that is, not yet good at inventing and learning complex behaviors.
The strongest proof of this was, we think, their response to the Eemian interglacial period. There have been occasional warm periods in between the ice ages: we're in one right now, the Holocene. The Eemian was another, lasting from 131,000 to 114,000 years ago, with weather quite similar to that which we experience today.
Conditions in glacial periods were very different: most important, they were very unfavorable for agriculture. The world was drier and of course colder, but climate was also much more variable than it has been over the past ten thousand years. There were strong warming and cooling trends over thousand-year periods, and those trends were themselves choppy, interrupted by big temperature swings that occurred in as little as a decade. By ‘strong’, we mean temperature swings of up to 15 degrees Fahrenheit, in contrast to swings of about 3.5 degrees Fahrenheit in recent millennia. CO2 levels in the ice age were 25% lower than they were in pre-industrial times (50% lower than today), and those levels were unfavorable to plant growth. Considering these negative factors, Richerson et al. have made the case that agriculture was impossible under glacial conditions. The high climate variability would have ensured that no single plant species would do well most of the time, so even if some people experimented with farming, the effort would probably have been abandoned within a generation or two as the weather changed. Far milder climatic swings, such as the Little Ice Age (1600-1850), have seriously disrupted agriculture and led to severe famines.
Interglacial conditions were favorable to the development of agriculture, since climate was warmer, wetter, and far more stable than in the ice ages, while carbon dioxide levels were higher. In the current interglacial period, agriculture was developed in short order – more than once, perhaps as many as seven times independently. Before you knew it we had beer, cities and writing. If humans 130,000 years ago had had modern behavioral capabilities, they should have developed agriculture in the Eemian - but nothing much seems to have happened. Anatomically modern humans from Africa moved up into Palestine, along with other characteristically African fauna, but that population seems to have disappeared when the Earth cooled. The other human strains did no better: Neanderthals moved farther north as the ice melted, but retreated south when the Eemian ended. Back then, neither anatomically modern humans in Africa nor archaic humans like Neanderthals had what it took.
There are plenty of other challenges that humans of that era (~100,000 years ago) never met: for example they never colonized the high Arctic, the Americas, or Australia/New Guinea. Even though Neanderthals and Africans had brains that were as large as or larger than those of modern humans, even though humans in Africa were reasonably modern-looking, modern behavioral capacities did not yet exist. They didn't yet have the spark. Come to think of it, most people today still don't. We'll have more to say on that in a moment.
The first Neanderthal skeleton recognized as such was found in a limestone quarry in the Neander Valley in Germany in 1856. At first, this rather odd skeleton was thought to be that of some medieval guy crippled by arthritis (or a Celt, or a diseased Cossack): it was only identified as a representative of an extinct type of human somewhat later. This other human race was named after the site: spelling reform later changed its name to Neandertal and eventually most paleontologists followed, driven by obscure interdepartmental struggles. We’re sticking with ‘Neanderthal’, though: an archaic spelling seems only appropriate for a vanished species.
We know quite a bit about the Neanderthals, more in fact than we know about our anatomically modern African ancestors of that period. In part this is because physical conditions in Western Europe favored preservation so that there really are more fossil remains. In addition, a high level of general education meant that farmers and quarrymen were more likely to call in a professor when they found cave paintings or an odd-looking skeleton: that is, more likely in nineteenth-century Germany or France than in nineteenth-century Africa or China. There were plenty of professors close by, since Neanderthals left their remains conveniently close to famous universities and five-star restaurants. Many Neanderthal skeletons, sites, and artifacts have been found – remains from a few hundred individuals, in sharp contrast to the handful of known African human fossils from the same period.
The Neanderthals had big brains (averaging about 1500 cubic centimeters, noticeably larger than those of modern people) and a technology like that of their anatomically modern contemporaries in Africa, but were quite different in a number of ways: different physically, but also socially and ecologically. Neanderthals were cold-adapted, with relatively short arms and legs in order to reduce heat loss - something like Arctic peoples today, only much more so. Considering that the climate the Neanderthals experienced was considerably milder than the high Arctic (more like Wisconsin), their pronounced cold adaptation suggests that they may have relied more on physical than cultural changes. Of course they spent at least six times as many generations in the cold as any modern human population has, and that may have had something to do with it as well.
We don’t yet know for sure, but it seems likely that, as part of their adaptation to cold, Neanderthals were furry. Chimpanzees have ridges on their finger bones that stem from the way that they clutch their mother’s fur as infants. Modern humans don’t have these ridges, but Neanderthals do. Moreover, we know that humans can become furry with very simple genetic changes, since there are a few people working in the circus in which such a change has already taken place.
They were chinless and had big honking noses, heavy brow ridges, a pulled-forward face, and long, low skulls that tended to bulge outwards at the sides. The body form differences are seen in children – and so they were innate, rather than being a consequence of their way of life. They were heavily built and muscular, judging from their skeletons, which had larger areas of muscle attachment than those seen in people today or in their contemporary African cousins. This means that they were stronger than us, probably much stronger. You could think of them as being born wrestlers.
Those conclusions about Neanderthal physique come from studying fossils, but modern methods have allowed researchers to draw less obvious conclusions as well.
You’ve probably heard of radiocarbon dating, which takes advantage of the small amount of radioactive C-14 incorporated in living plants and animals. The amount of C-14 in once-living material decreases by 50% in about 5,730 years – thus we can estimate the age of biological materials such as wood or bone by measuring how much C-14 remains. This method can be used to date objects back to 50,000 years old or thereabouts – much older and it’s difficult to accurately measure the tiny amount of remaining C-14. This means that this, our most effective dating technique (even better than crashing weddings), can only be used to study the last Neanderthal remains.
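The arithmetic behind that 50,000-year cutoff is plain exponential decay. A quick sketch (the 5,730-year half-life is the conventional figure; everything else here is just arithmetic):

```python
import math

HALF_LIFE = 5730.0  # conventional C-14 half-life, in years

def age_from_fraction(fraction_remaining):
    """Estimate the age of a sample from the fraction of its original C-14."""
    return -HALF_LIFE * math.log2(fraction_remaining)

# One half-life gives back the half-life itself:
print(round(age_from_fraction(0.5)))  # 5730

# At 50,000 years only about a quarter of a percent of the original
# C-14 survives, which is why the method fades out around that age:
print(round(0.5 ** (50000 / HALF_LIFE) * 100, 2))  # percent remaining
```

The tiny surviving fraction, not the method itself, is the limit: past eight or nine half-lives the remaining C-14 is easily swamped by contamination.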
More recently, people have begun to look at variations in naturally-occurring stable isotopes, which can sometimes be used to determine ancient diets. For example, nitrogen, a key component of protein, has two stable isotopes that are chemically similar but differ in weight - 14N and 15N. The nitrogen in the atmosphere is mostly (> 99%) 14N, while a little less than half a percent is 15N. Biological processes can cause slight changes in these percentages: in particular, herbivores tend to have higher levels of 15N than the plants they eat, while the carnivores that eat them have still higher levels of 15N. Thus we can learn something about the diet of ancient people by measuring the nitrogen isotope ratios of their bones.
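Workers in this field usually express those tiny shifts in ‘delta’ notation: parts per thousand (per mil) relative to atmospheric nitrogen. A minimal sketch of the bookkeeping (the roughly 3 to 4 per mil enrichment per trophic step is the commonly cited figure; the sample ratios below are invented for illustration):

```python
R_AIR = 0.003676  # 15N/14N ratio of atmospheric N2, the standard

def delta_15n(r_sample):
    """delta-15N in per mil, relative to the atmospheric (AIR) standard."""
    return (r_sample / R_AIR - 1.0) * 1000.0

# Hypothetical ratios: each trophic step enriches 15N by a few per mil,
# so bone from a pure carnivore reads well above an herbivore's.
plant     = delta_15n(0.003687)
herbivore = delta_15n(0.003698)
carnivore = delta_15n(0.003709)
print(round(plant, 1), round(herbivore, 1), round(carnivore, 1))
```

The differences are minute in absolute terms, a few parts in ten thousand, which is why this only became practical with modern mass spectrometry.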
Judging from the isotopic composition of their remains, Neanderthals were meat-eaters, pure top carnivores, comparable to lions or wolves. This is very different from most contemporary hunter-gatherers, who usually depend more on plant foods than meat.
It also seems that they ate almost no fish, which is somewhat surprising, considering that a number of the Neanderthal sites have been along rivers with strong salmon runs.
Like other top carnivores, Neanderthals were thin on the ground. The typical Neanderthal site has relatively more remains of cave bears than later sites occupied by modern humans, which suggests that Neanderthals were scarcer than later humans. We think that there may have been as few as 30,000 of them in all of Europe.
Neanderthal sites are generally found in caves and beneath rock overhangs, which functioned as shelters and also tended to preserve their remains. We have found signs of open-air camps as well, but any structures built seem to have been very simple. Such camps may have been common, but were far less likely to be preserved than sites in caves and rock shelters: as so often in archaeology, differences in preservation can completely obscure the original distribution of things.
Neanderthals used rather sophisticated stone tools, but used almost no bone, ivory, or shell. They had fewer types of tools than their successors. Their “Mousterian” tools (named for the southwestern French site Le Moustier), consisted of carefully shaped flake tools and small hand axes, almost always made from local materials. We find Mousterian tools over vast areas of Europe and western Asia, but there is little variation in space or time, which supports the general impression that their capacity to innovate was low. François Bordes, a famous French archaeologist, has said their technology consisted of “beautiful tools made stupidly” - by rote, or perhaps by instinct.
There are indications that they often used those stone tools to make things out of wood, but only a few such wooden objects have been preserved. We find awls that could pierce animal hides, but no bone needles with eyes, which their successors used to make tailored clothing. There is evidence that they processed hides, presumably for clothing, but they must have been used for blankets or ponchos rather than coats or parkas. That kind of clothing was good enough to let them survive in Ice Age Europe, but evidently not good enough to allow the Neanderthals to settle the high Arctic: ultimately this also kept them out of the Americas. Their front teeth show an unusual pattern of wear – apparently they were used as a third hand or vice, or perhaps to prepare animal skins. Somewhat similar patterns of wear have been observed in peoples like Eskimos that prepare hides by chewing.
Neanderthals had fire, and probably could not have settled ice-age Europe without it. However, they didn’t do anything fancy with it: there were no specialized hearths and there is no evidence that they used lamps.
Neanderthals are the first humans known to have buried their dead, but there is no clear evidence of ceremony or ritual in those burials. We don’t find weapons or decorative objects associated with those graves as we often do with the graves of modern humans. It may be that burial was for them more a way of disposing of unpleasant remains than a ritual occasion. It may have been more like flushing a goldfish down the toilet.
We know that Neanderthals hunted big game (red deer, European bison, sometimes mammoths and rhinos) and took big risks in the process, judging from their many healed fractures. The injury pattern is like that of bronco riders, as documented by Eric Trinkaus. At root, these high risks were a consequence of their lack of projectile weapons, which allow hunters to bring down big game without getting dangerously close – that and a lack of any other safe way of making a living. Neanderthals used stabbing spears and were probably ambush hunters. Using this strategy, they had to get up close and personal with desperate animals that outweighed them several fold, a good way to get hurt.
Since they were so often injured, they had to help each other. That's the only way in which they could have survived while recovering from those serious injuries. You see a similar pattern in some other cooperative-hunting species such as lions: injured members of the pride manage to feed off the kills of others while they recover. We see lots of healed injuries in saber-tooth tigers as well, whose hunting pattern of stabbing from ambush was rather similar to that of Neanderthals. Come to think of it, their heavy, almost bear-like build (compared to that of the other big cats) was also similar. In some cases, Neanderthals carried this cooperation very far, providing care that allowed permanently crippled individuals to reach advanced ages. The most famous example of this is the skeleton of a forty-something man found in Shanidar, in northern Iraq. His right arm was withered and had suffered multiple fractures, while the lower arm and hand had been lost. He had a crippled and withered right leg. In addition, he had suffered a crushing blow to the face that likely left him blind in one eye. All of these were long-healed injuries. Clearly, this guy had been around the block – twice, on his face.
Groups of Neanderthals were able to kill big game, but that would have been far harder for individuals. As a member of the group, they also received the Paleolithic equivalent of health insurance, a necessity in their kind of high-risk hunting. Because of group efficiency in hunting and the high degree of within-group cooperation, membership in a Neanderthal band or tribe was valuable. It would have been almost impossible to survive outside such a group. As Bill Hamilton has said, such arrangements are vulnerable to free-riders, individuals that take advantage of the benefits but don’t pull their weight – which in this case would have meant avoiding hunting and its risks. A high degree of within-band relatedness would have mitigated this tendency – due to kin selection, the principle that behaviors that cost the individual but help close relatives can be favored by selection. This suggests that Neanderthal bands may have been reluctant to accept outsiders, especially males, if we assume that males had the primary responsibility for hunting. Unrelated newbies would have had the most to gain by shirking dangerous duties.
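Kin selection can be stated compactly as Hamilton's rule: an act that costs the actor C fitness units and delivers a benefit B to a relative is favored when r × B exceeds C, where r is the coefficient of relatedness. A toy illustration (the fitness numbers are invented, not estimates for Neanderthals):

```python
def favored_by_kin_selection(cost, benefit, relatedness):
    """Hamilton's rule: an altruistic act spreads when r * B > C."""
    return relatedness * benefit > cost

# A risky hunt that costs the hunter 1 fitness unit and delivers
# 3 units of benefit to a bandmate (illustrative numbers only):
print(favored_by_kin_selection(1.0, 3.0, 0.5))    # full sibling, r = 1/2
print(favored_by_kin_selection(1.0, 3.0, 0.125))  # first cousin, r = 1/8
```

The same risky act that pays among close kin fails among distant ones, which is why high within-band relatedness would have propped up dangerous cooperative hunting, and why unrelated newcomers were the likeliest shirkers.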
Hamilton pointed out that, among social carnivores, the would-be immigrant often has to go through a difficult probationary period without necessarily succeeding – we see a similar pattern in some recent hunter-gatherers. There would have to have been a way in which new individuals could join the tribe, in order to avoid dangerous levels of inbreeding. It may be that only females changed bands, which is apparently the case in chimpanzees and is the most common pattern in humans.
Trends of this sort may have existed in anatomically modern humans as well, but the high risks associated with Neanderthals’ specialized big-game hunting may have taken things much further. We expect that they were more cooperative than our African ancestors and more clannish. This might have interfered with inter-band relations and made the development of trade and other inter-band social interactions more difficult.
If you take too many chances in the process of making a living, you'll get yourself killed before you manage to raise a family. Therefore there is a maximum sustainable risk per calorie acquired from hunting. If the average member of the species incurs too much risk, more than that sustainable maximum, the species goes extinct. The Neanderthals must have come closer to that red line than anatomically modern humans in Africa. Risks were particularly high because the Neanderthals seem to have had no way of storing food – they had no drying racks or storage pits in frozen ground like those used by their successors. Think of it this way: storage allows more complete usage of a large carcass such as a bison that might weigh over a thousand pounds – it wouldn’t be easy to eat all of that before it went bad. Higher utilization - using all of the buffalo - drops the risk per calorie.
Since women in Africa were probably gathering vegetable foods, men there didn't have to produce as much food or produce it as steadily, which meant that they could choose game animals that were safer but less abundant. And that's what they did in Africa's Middle Stone Age (MSA). They went after relatively uncommon but mild-mannered eland, rather than abundant, deadly dangerous Cape buffalo. And as a corollary, anatomically modern humans in Africa didn't have as heavy a build as Neanderthals - they didn't need it.
Although Neanderthals must have had very high levels of within-group cooperation, they were no angels: they had a weakness for long pork. As we said before, they may have experienced evolutionary pressures favoring clannishness or even hostility to outsiders. That's natural: reduced competition at one level allows more competition at a higher level. If the members of some ethny could 'all just get along', they could conquer the world, and likely would. We have found clear-cut evidence of cannibalism at several Neanderthal sites. At Krapina, every long Neanderthal bone found had been split open for marrow.
Like other early humans, Neanderthals were relatively uncreative; their tools changed very slowly and they show no signs of art, symbolism, or trade. Their brains were large and had grown larger over time, in parallel with humans in Africa, but we really have no idea what they did with them. Since brains are metabolically expensive, natural selection wouldn't have favored an increase in brain size unless it increased fitness, but we don't know what function those big brains served. Usually people explain that those big brains are not as impressive as they seem, since the brain-to-body weight ratio is what’s really important, and Neanderthals were heavier than modern humans of the same height.
You may wonder why we normalize brain size by body weight. We wonder as well.
Among less intelligent creatures, such as amphibians and reptiles, most of the brain is busy dealing with a flood of sensory data. You’d expect that brain size would have to increase with body size in some way in order to keep up. If you assume that the key is how much surface the animal has, in order to monitor what’s causing that nagging itch and control all the muscles needed for movement, brain size should scale as the 2/3rds power of weight. If an animal has a brain that’s bigger than predicted by that 2/3rds power scaling law, then maybe it’s smarter than average. That argument works reasonably well for a wide range of species, but it can’t make sense for animals with big brains. In particular it can’t make sense for primates, since in that case we know that most of the brain is used for purposes other than muscle control and immediate reaction to sensation. Look at it this way - if dividing brain volume by weight is a valid approach, Nero Wolfe must be really, really stupid.
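For what it's worth, the 2/3rds-power normalization is usually done through Jerison's ‘encephalization quotient’: observed brain mass divided by the mass the scaling law predicts for that body size. A sketch (the 0.12 constant is Jerison's fitted value for mammals, with masses in grams):

```python
def expected_brain_g(body_g, k=0.12):
    """Brain mass (grams) predicted by the 2/3-power scaling law."""
    return k * body_g ** (2.0 / 3.0)

def encephalization_quotient(brain_g, body_g):
    """Ratio of actual to predicted brain mass; 1.0 means 'typical mammal'."""
    return brain_g / expected_brain_g(body_g)

# A modern human: roughly a 1,350 g brain on a 65,000 g body,
# about seven times the brain a typical mammal of that size carries.
print(round(encephalization_quotient(1350, 65000), 1))
```

Note that this formalizes the 2/3rds-power argument; it does not rescue it for big-brained animals, for exactly the reasons given above.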
We think that Neanderthal brains really were large, definitely larger than those of people today. This doesn’t necessarily mean that they were smarter, at least not as a culture. The archaeological record certainly indicates that they were not, since their material culture was definitely simpler than that of their successors. In fact, they may have been relatively unintelligent, even with their big brains. Although brain size certainly is correlated with intelligence in modern humans, it is not the only factor that affects intelligence. By the way, you may have read somewhere (The Mismeasure of Man) that brain volume has no relationship to intelligence, but that’s just a lie.
One paradoxical possibility is that Neanderthals lacked complex language and so had to be smart as individuals in order to learn their culture and technology, while that same lack severely limited their societal achievements. Complex language of the type we see in modern humans makes learning a lot easier: without it, learning to create even Mousterian tools may have been difficult. In that case, individuals would have to repeatedly re-invent the wheel (so to speak) while there would have been little societal progress.
It could also be that Neanderthal brains were less powerful than you’d expect because there just weren’t enough Neanderthals. That may sound obscure, but bear with us. The problem is that evolution is less efficient in small populations, in the same way that any statistical survey – polls, for example – becomes less accurate with fewer samples. Natural selection is pretty good at eliminating a defective gene when its disadvantage is significantly greater than the inverse of the population size. When the disadvantage is smaller than that, the defective gene has a reasonable probability of reaching high frequency by drift. It can even become universal in that population. This tendency is insignificant in large populations, but it can lead to problems in small ones, as more and more slightly deleterious mutations accumulate. There is a countervailing tendency – the generation of favorable mutations, which are likely to spread – but that tendency becomes weaker and weaker as the population becomes smaller. Thus, over the long term, a population that is too small is likely to go extinct for purely genetic reasons, if some other disaster doesn’t strike first. This is an issue that concerns conservationists who are trying to maintain endangered species such as the whooping crane or Florida panther.
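That cutoff, drift winning once the disadvantage drops below roughly the inverse of the population size, falls out of Kimura's diffusion approximation for fixation probability. A sketch for a diploid population of size N and a new mutation with selection coefficient s (negative for deleterious; the particular numbers below are illustrative):

```python
import math

def fixation_probability(N, s):
    """Kimura's approximation: the chance that a single new mutation with
    selection coefficient s eventually fixes in a diploid population of N."""
    p0 = 1.0 / (2 * N)  # a new mutation starts as one copy out of 2N
    if s == 0:
        return p0       # neutral case: pure drift
    return (1 - math.exp(-4 * N * s * p0)) / (1 - math.exp(-4 * N * s))

# The same mildly deleterious mutation (s = -0.0005):
print(fixation_probability(1_000, -0.0005))    # small population: a real chance
print(fixation_probability(100_000, -0.0005))  # large population: effectively never
```

In the small population the mutation's fate is close to the neutral lottery of 1 in 2N; in the large one, selection sees it clearly and its fixation probability collapses by scores of orders of magnitude.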
Neanderthals were not so rare as to risk extinction by genetic load. But the same argument has other implications. Even if a population is big enough for long-term survival, it may still suffer some genetic load. This would matter most for extremely complicated adaptations that relied on precise action of many genes: the more complicated the adaptation, the more vulnerable it would be to this kind of genetic sand in the gears. As it happens, the most complicated human adaptation is the brain, and one might expect that it would show the greatest vulnerability to such problems. There is some direct evidence of this, concerning a different kind of mutation load. Children whose parents are closely related, first cousins or closer, are significantly more likely to have two copies of deleterious recessive mutations – and their IQ is affected to a greater extent than other traits such as height. We think that the long-term effective population size of Neanderthals was less than that of anatomically modern humans, since Africa was less affected by the ice ages, which at their worst made most of Europe uninhabitable – so Neanderthals may have had more problems with genetic load. Because of this, Neanderthals may have had less efficient brains than their anatomically modern contemporaries or humans today.
Our favorite hypothesis is that Neanderthals and other archaic humans had a fundamentally different kind of learning than moderns. One of the enduring puzzles is the near-stasis of tool kits in early humans - as we have said before, the Acheulean hand-axe tradition lasted for almost a million years and extended from the Cape of Good Hope to Germany, while the Mousterian lasted for a quarter of a million years. Somehow these early humans were capable of transmitting a simple material culture for hundreds of thousands of years with little change. More information was transmitted to the next generation than in chimpanzees, but not as much as in modern humans. At the same time, that information was transmitted with surprisingly high accuracy. This must be the case, since random errors in transmission would have caused changes in those tool traditions, resulting in noticeable variation over space and time – which we do not see.
It looks to us as if toolmaking in those populations was, to some extent, innate: genetically determined. Just as song birds are born with a rough genetic template that constrains what songs are learned, early humans may have been born with genetically determined behavioral tendencies that resulted in certain kinds of tools. Genetic transmission of that information has the characteristics required to explain this pattern of simple, near-static technology, since only a limited amount of information can be acquired through natural selection, while the information that is acquired is transmitted with very high accuracy.
We know less about the archaic humans who lived east of the Movius line. They had long, low skulls, no chins, and low foreheads, but they didn't look like Neanderthals.
They had short flat faces, strong cheekbones, shovel-shaped incisors, and had no occipital bun. There were skeletal differences between the inhabitants of China and those found in Indonesia and Southeast Asia; those southeastern skulls looked more like those of early erectus. We also know that brain size increased over time in this population as well.
Starting 70,000 or 80,000 years ago, we begin to see some signs of increased cultural complexity in Africa. There is evidence of long-distance transport of tool materials (obsidian) in Ethiopia, which could be the first signs of trade. A set of pierced snail shells (~75,000 years old) in Blombos Cave in South Africa seem, judging from wear, to be the remains of a necklace, although there is no evidence that tools were used to pierce the shells. In that same site, researchers found pieces of ochre with a crosshatched pattern inscribed. We have found manufactured ostrich-egg beads in Kenya that are about 50,000 years old, the first clear examples of artificial decorative or symbolic (that is to say, useless) objects. We see a new kind of small stone point that must have been used on darts considerably smaller than previous spears. Although it would seem likely that such darts would have been propelled by atlatls, no atlatls have yet been found that date anywhere near that far back. There are reports of 90,000 year-old bone fish spears from central Africa which, if correct, would be evidence of a significant advance in tool complexity. However, since no other similar tools found in Africa are older than 30,000 years, those fish spears are roughly as anomalous as a Neanderthal-era thumb drive, and we have our doubts about that date. On the whole, the African archeological data of this period furnishes examples of new technology and simple symbolic objects, but the evidence is patchy, and it seems that some innovations appeared and then faded away for reasons that we don’t understand.
A note on behavioral modernity: the consensus seems to be that any clear evidence of a population making symbolic or decorative objects establishes their behavioral modernity, defined as cultural creativity and reliance on abstract thought. For some reason, anthropologists treat behavioral modernity as a qualitative character: an ancient population either had it or not, just as women are pregnant or not, never a ‘little bit pregnant’. It’s treated as a Boolean variable. Like so many basic notions in anthropology, this makes no sense. The components of ‘behavioral modernity’ had to be evolved traits with heritable variation, subject to natural selection – how else would they have come into existence at all? Surely ancient individuals and populations varied in their capacity for abstract thought and cultural innovation – behavioral modernity must be more like height than pregnancy.
In practice, this means that wildly different levels of cultural complexity are all lumped together as ‘behavioral modernity’: a few scratches on a rock, art that could easily be created by a three-year-old, is equated with the sophisticated representational art produced by modern humans in ice-age Europe. Obviously, the capabilities required are really the same, in just the same way that three-year-olds today are just as capable as adults. This argument has a familiar ring: it’s the same rhetorical trick exemplified in the famous phrase ‘weapons of mass destruction’, which lumped together everything from First World War weapons like chlorine and phosgene gas to multi-megaton fusion bombs – weapons whose dangers varied over four orders of magnitude.
When you think about it further, behavioral modernity is an odd trait. It is a population trait, characteristic of a group rather than an individual. A group in which 99% of all individuals never have a single new idea is considered behaviorally modern, as long as a few individuals occasionally invent something new that is widely adopted and eventually discovered by us. Surely those creative individuals were uncommon - this certainly seems to be the case today. In fact, even in the most creative societies the number of significant new ideas generated per generation is orders of magnitude smaller than the population size. It looks as if there are two requirements for modernity: creative individuals (they can be and usually are rare) and a much larger number of people, most of the population, that are able to adopt and transmit new ideas without necessarily generating any. One can imagine populations that could adopt new ideas but were essentially incapable of generating any on their own. For example, chimpanzees can learn to smoke cigarettes but would never have come up with the filthy habit by themselves. This could well be the case for the late-Neanderthal Châtelperronian culture. And as we said before, one can easily imagine populations that differ in the number of creative individuals they throw off and in the complexity of ideas generated. One can also imagine variation in the ability of groups to learn new ideas: the ability to adopt simple new ideas does not necessarily imply the ability to adopt much more complex ones.
Although this can’t be the whole story, a population’s average intelligence may well have determined whether it could adopt innovations of a certain degree of complexity, while the fraction that exceeded some fairly high intelligence threshold was the key factor in generating new ideas. This model treats ‘behavioral modernity’ as a quantitative biological trait like many others, rather than something magical.
The fact that the ability to learn complex new ideas and transmit them to the next generation is universal in modern humans suggests that natural selection favored that kind of receptivity. On the other hand, the rarity of individual creativity suggests that the trait itself was not favored by selection in the past, but is instead a rare side effect.
We think that the archaeological record in Africa before the expansion of modern humans shows a gradual but slow increase in such abilities, which is the usual pattern for a trait favored by selection. On the other hand, the rate of change in the European Upper Paleolithic seems faster, almost discontinuous – but there is a well-understood biological pattern that may explain that as well.
The most dramatic evidence of some kind of significant change is the fact that anatomically modern humans expanded out of Africa about 50,000 years ago.
There is no hard evidence for modern humans outside of Africa before then (other than a temporary occupation of Palestine during the Eemian), and the earliest solid date for modern humans in Australia is about 46,000 years ago, which indicates that humans had left Africa before that time. Such an expansion required some new advantage, since it could only have occurred if those Africans were able to out-compete Neanderthals and Easterners on their own turf. Since those archaic humans would have been well-adapted to their local environments, the advantage must have been substantial.
We aren’t certain what that advantage was. If we judge by the Australian archeological record, it wasn't improved tools, at least not at first - the stone tool tradition of Australia is very old-fashioned, not noticeably advanced over that of Neanderthals even long after settlement. Although, again, the fact that they succeeded in settling Australia indicates that they must have had new abilities, since Homo erectus had not managed to reach Australia even though they had occupied Indonesia for more than a million years. Perhaps modern humans had invented floaties.
The expansion out of Africa appears to have been most rapid in the southern, warmer regions of Eurasia. Expansion into Europe came somewhat later, about 40,000 years ago, well after modern humans settled Australia.