February 9, 2010


Many Upper Palaeolithic sites feature elements interpreted as evidence of housing. These are commonly patterns of bone or stone concentrations that seem to delineate hut or tent structures. At the sites of Étiolles and Pincevent, in France, the distribution of stone artifacts, animal bones, hearths, and postholes has been interpreted as evidence of clearly defined huts. At Mezhirich, in Ukraine, and Kostenki, in Russia, hut structures were found made of stacked or aligned mammoth bones. Distinctive hearths, often lined or ringed with rocks, are much more common in the Upper Palaeolithic Period than in earlier times.


Stone for tools was often obtained from more distant sources, sometimes in larger quantities than seen previously in the Stone Age. Occasionally, stone was traded or carried over several hundred kilometres. It seems likely, therefore, that trade and transport routes were more formalized than they had been in earlier times. The Upper Palaeolithic Period also saw trade in exotic materials, such as marine shells and exotic semiprecious stones, used for personal ornamentation in forms such as beads, necklaces, and decorated attire.

Ritual activity is also evident in the Upper Palaeolithic Period, most clearly in elaborate burials, and there is a strong possibility that religious ceremonies, however different from today’s, had their beginnings during this time. For example, at Sungir, in Russia, three individuals were buried with ivory spears, pendants and necklaces of shells and animal teeth, and thousands of ivory beads that had apparently been sewn into their clothing.

Upper Palaeolithic people also painted, sculpted, and engraved, producing the earliest substantial body of art and, with it, a window into the prehistoric mind. Sites in Europe are famous for their artwork, but prehistoric Stone Age art has also been richly documented in Africa, Australia, and other parts of the world. Animals are common subjects of Upper Palaeolithic art, and human figures and abstract elements such as lines, dots, chevrons, and other geometric designs are also found.

Early humans around the world used natural materials such as red and yellow ochre, manganese, and charcoal to create cave art. Among the hundreds of European sites with Upper Palaeolithic cave paintings, the best known include Altamira, in Spain, and Lascaux and the more recently discovered (and archaeologically oldest) Chauvet, in France. Animals such as bison, wild cattle, horses, deer, mammoths, and woolly rhinoceroses are represented in European Upper Palaeolithic cave art, along with less common human figures. Later Stone Age paintings of animals have been found at sites such as Apollo 11 Cave, in Namibia, and stylized engravings and paintings of circles, animal tracks, and meandering patterns have been found in Australia’s Koonalda Cave and Early Man Shelter.

Many small sculptures of human female forms (often called Venus figurines) have been found at sites across Europe and Asia. Small, stylized ivory animal figures made more than 30,000 years ago were discovered at Vogelherd, Germany, and clay sculptures of bison were found in Le Tuc d’Audoubert, in the French Pyrenees. In addition, many utilitarian objects, such as spear throwers and batons, were superbly decorated with engravings, sculptures of animals, and other motifs.

The earliest known musical instruments also come from the Upper Palaeolithic. Flutes made from long bones and whistles made from deer foot bones have been found at a number of sites. Some experts believe that Upper Palaeolithic people may have used large bones or drums with skin heads as percussion instruments.

The archaeological record of the Upper Palaeolithic shows a creative explosion of new technological, artistic, and symbolic innovations. There is little doubt that these populations were essentially modern in their biology and cognitive abilities and had fully developed language capabilities. There is a much greater degree of stylistic variation geographically (some archaeologists have suggested that this is evidence of the emergence of ethnicity) and a more rapid developmental pace during the Upper Palaeolithic Period than in any previous archaeological period. Anthropologists debate whether these new Upper Palaeolithic patterns are due to a biological transition or whether they are simply the products of accumulated cultural knowledge and complexity through time.

The Mesolithic (also known as the Epipaleolithic) extends from the end of the Pleistocene Ice Age, about 10,000 years ago, until the period when farming became central to peoples’ livelihood, which occurred at different times around the world. The term Mesolithic is generally applied to the period of post-Pleistocene hunting and gathering in Europe and, sometimes, parts of Africa and Asia. In the Americas the post-glacial hunter-gatherer stage that predates the dominance of agriculture is usually called the Archaic. In the rest of the world, Mesolithic sites are usually characterized by microliths. Microlithic blade segments were commonly retouched into a range of shapes, including crescents, triangles, rectangles, trapezoids, and rhomboids, and thus the tools are often called geometric microliths. These forms often have multiple sharp edges. Many of these microliths probably served as elements of composite tools, such as barbed or edge-tipped spears or arrows, or wooden-handled knives. The microliths were likely inserted into shafts or handles of wood or antler and reinforced with some type of adhesive.

The end of the ice age brought rapid environmental change in much of the world. With the warmer, post-glacial conditions of the Holocene Epoch, ice sheets retreated and sea levels rose, inundating coastal areas worldwide. Temperate forests spread in many parts of Europe and Asia. As these climatic and vegetative changes occurred, large herds of mammals, such as reindeer, were replaced by more solitary animals, such as red deer, roe deer, and wild pig. Cold-adapted animals, such as the reindeer, elk, and bison, retreated to the north, while others, such as the mammoth, giant deer, and woolly rhinoceros, went extinct. The rich artistic traditions of Upper Palaeolithic Western Europe declined markedly after the end of the ice age. This may in part be because the changing environment made the availability of food and other resources less predictable, requiring populations to spend more time searching for resources and leaving less time to maintain the artistic traditions.

Well-studied Mesolithic/Archaic sites include Star Carr, in England; Mount Sandel, in Ireland; Skara Brae, in Britain’s Orkney Islands; Vedbæk, in Denmark; Lepenski Vir, in Serbia; Jericho, in the West Bank; Nittano, in Japan; Carrier Mills, in Illinois; and Gatecliff Rockshelter, in Nevada. In sub-Saharan Africa, many Later Stone Age sites of the Holocene Epoch could broadly be termed Mesolithic, given their geometric microliths and bow-and-arrow technology.

During the Mesolithic, human populations in many areas began to exploit a much wider range of foodstuffs, a pattern of exploitation known as a broad-spectrum economy. These foods included wild cereals, seeds and nuts, fruits, small game, fish, shellfish, aquatic mammals, tortoises, and invertebrates such as snails. It was also during this period that the dog was domesticated: wolves were domesticated in Eurasia and North America to become dogs used as hunting companions, sentinels, and, in some societies, food.

One of the most puzzling features of animal domestication is the seeming arbitrariness with which some species have been domesticated while their close relatives have not. It turns out that all but a few candidates for domestication have been eliminated by the Anna Karenina principle. Humans and most animal species make for an unhappy marriage, for one or more of many possible reasons: the animal’s diet, growth rate, mating habits, disposition, tendency to panic, and several distinct features of social organization. Only a few wild mammal species ended up in happy marriages with humans, by virtue of compatibility on all those separate counts.
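The logic of the Anna Karenina principle, that success requires passing every one of many independent tests while a single failure disqualifies a candidate, can be sketched in a few lines of code. The criteria names and species profiles below are illustrative placeholders, not data from the text:

```python
# The Anna Karenina principle as code: a candidate species is domesticable
# only if it passes EVERY criterion; one failure disqualifies it.
# The criteria and species profiles here are hypothetical illustrations.

CRITERIA = ["diet", "growth_rate", "captive_breeding",
            "disposition", "calm_temperament", "social_structure"]

candidates = {
    "sheep":   {c: True for c in CRITERIA},  # compatible on every count
    "zebra":   {**{c: True for c in CRITERIA}, "disposition": False},
    "gazelle": {**{c: True for c in CRITERIA}, "calm_temperament": False},
}

def domesticable(profile):
    # Success requires avoiding every separate possible cause of failure.
    return all(profile[c] for c in CRITERIA)

for name, profile in candidates.items():
    failures = [c for c in CRITERIA if not profile[c]]
    verdict = "domesticable" if domesticable(profile) else f"fails on {failures}"
    print(f"{name}: {verdict}")
```

The single `all(...)` call is the whole principle: there is no weighting or averaging of strengths, so a species that excels on five counts but fails on one still fails overall.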

To appreciate the changes that have developed under domestication, compare wolves, the wild ancestors of domestic dogs, with the many breeds of dogs. Some dogs are much bigger than wolves (Great Danes), while others are much smaller (Pekingese). Some are slimmer and built for racing (greyhounds), while others are short-legged and useless for racing (dachshunds). They vary enormously in hair form and colour, and some are even hairless. Polynesians and Aztecs developed dog breeds specifically raised for food. Comparing a dachshund with a wolf, you would not even suspect that the former had been derived from the latter if you did not already know it. Wolves were independently domesticated to become dogs in the Americas and probably in several different parts of Eurasia, including China and Southwest Asia. Modern pigs are derived from independent sequences of domestication in China, western Eurasia, and possibly other areas as well. These examples reemphasize that the same few suitable wild species attracted the attention of many different human societies.

That is, domestication involves wild animals’ being transformed into something more useful to humans. Truly domesticated animals differ in two ways from their wild ancestors. These differences result from two processes: human selection of those individual animals more useful to humans than other individuals of the same species, and automatic evolutionary responses of animals to the altered forces of natural selection operating in human environments as compared with wild environments.

The wild ancestors of the ancient species of big domestic herbivorous mammals were spread unevenly over the globe. South America had only one such ancestor, which gave rise to the llama and alpaca. North America, Australia, and sub-Saharan Africa had none at all. The lack of domestic mammals indigenous to sub-Saharan Africa is especially astonishing, since a main reason why tourists visit Africa today is to see its abundant and diverse wild mammals. In contrast, the wild ancestors of most of the big herbivorous domestic mammals were confined to Eurasia.

We are reminded of the many ways in which big domestic mammals were crucial to those human societies that possessed them. Most notably, they provided meat, milk products, fertilizer, land transport, leather, military assault vehicles, plow traction, and wool, as well as germs that killed previously unexposed peoples.

In addition, of course, small domestic mammals and domestic birds and insects have been useful to humans. Many birds were domesticated for meat, eggs, and feathers: the chicken in China, various duck and goose species in parts of Eurasia, the turkey in Mesoamerica, the guinea fowl in Africa, and the Muscovy duck in South America. Wolves were domesticated in Eurasia and North America to become dogs used as hunting companions, sentinels, pets, and, in some societies, food. Rodents and other small mammals domesticated for food include the rabbit in Europe, the guinea pig in the Andes, a giant rat in West Africa, and possibly a rodent called the hutia on Caribbean islands. Ferrets were domesticated in Europe to hunt rabbits, and cats were domesticated in North Africa and Southwest Asia to hunt rodent pests. Small mammals domesticated as recently as the 19th and 20th centuries include foxes, mink, and chinchillas grown for fur and hamsters kept as pets. Even some insects have been domesticated, notably Eurasia’s honeybee and China’s silkworm moth, kept for honey and silk, respectively.

Many of these small animals thus yielded food, clothing, or warmth. But none of them pulled plows or wagons, none bore riders, none except dogs pulled sleds or became war machines, and none of them has been as important a source of food as have big domestic mammals.

The ways in which domesticated animals have diverged from their wild ancestors include changes in size: cows, pigs, and sheep became smaller under domestication, while guinea pigs became larger. Sheep and alpacas were selected for retention of wool and reduction of hair, while cows have been selected for high milk yields. Several species of domestic animals have smaller brains and less developed sense organs than their wild ancestors, because they no longer need the bigger brains and more developed sense organs on which their ancestors depended to escape from wild predators.

Dates of domestication provide a line of evidence confirming Galton’s view that early herding peoples quickly domesticated all big mammal species suitable for being domesticated. All species for which we have archaeological evidence of dates of domestication were domesticated between about 8000 and 2500 BC, that is, within the first few thousand years of the sedentary farming-herding societies that arose after the end of the last Ice Age. The era of big mammal domestication began with the sheep, goat, and pig and ended with camels. Since 2500 BC there have been no significant additions.

It’s true, of course, that some small mammals were first domesticated long after 2500 BC. For example, rabbits were not domesticated for food until the Middle Ages, mice and rats for laboratory research not until the 20th century, and hamsters for pets not until the 1930s. The continuing development of domesticated small mammals isn’t surprising, because there are literally thousands of wild species as candidates, and because they were of too little value to traditional societies to warrant the effort of raising them. But big mammal domestication virtually ended 4,500 years ago. By then, all of the world’s 148 candidate big species must have been tested innumerable times, with the result that only a few passed the test and no other suitable ones remained.

Almost all domesticated large mammals prove to be species that live in herds, maintain a well-developed dominance hierarchy among herd members, and occupy overlapping home ranges rather than mutually exclusive territories. When the herd is on the move, its members maintain a stereotyped order: in the rear, the stallion; in the front, the top-ranking female, followed by her foals in order of age, with the youngest first; and behind her, the other mares in order of rank, each followed by her foals in order of age. In this way, many adults coexist in the herd without constant fighting and with each knowing its rank.

That social structure is ideal for domestication, because domestic animals in effect accept humans at the top of the dominance hierarchy and follow a human leader as they would normally follow the top-ranking female. Herds or packs of sheep, goats, cows, and ancestral dogs (wolves) have a similar hierarchy. As young animals grow up in such a herd, they imprint on the animals that they regularly see nearby. Under wild conditions these are members of their own species, but captive young animals also see humans nearby and imprint on humans as well.

Such social animals lend themselves to herding. Since they are tolerant of each other, they can be bunched up. Since they instinctively follow a dominant leader and will imprint on humans as that leader, they can readily be driven by a shepherd or sheepdog. Herd animals do well when penned in crowded conditions, because they are accustomed to living in densely packed groups in the wild.

In contrast, members of most solitary territorial animal species cannot be herded. They do not tolerate each other, they do not imprint on humans, and they are not very submissive. Who ever saw a line of cats (solitary and territorial in the wild) following a human or allowing themselves to be herded by a human? Every cat lover knows that cats are not submissive to humans in the way dogs instinctively are. Cats and ferrets are the sole territorial mammal species that were domesticated, because our motive for doing so was not to herd them in large groups raised for food but to keep them as solitary hunters or pets.

While most solitary territorial species haven’t been domesticated, it’s not conversely the case that most herd species can be domesticated. Most can’t, for one of several additional reasons.

First, herds of many species don’t have overlapping home ranges but instead maintain exclusive territories against other herds. It’s no more possible to pen two such herds together than to pen two males of a solitary species.

Again, many species that live in herds for part of the year are territorial in the breeding season, when they fight and do not tolerate each other’s presence. That’s true of most deer and antelope species (again with the exception of reindeer), and it’s one of the main factors that has disqualified the social antelope species for which Africa is famous from being domesticated. One’s first association with African antelope may be of vast dense herds spreading across the horizon, but in fact the males of those herds space themselves into territories and fight fiercely with each other when breeding. Hence those antelope cannot be maintained in crowded enclosures in captivity, as can sheep or goats or cattle. Territorial behaviour similarly combines with a fierce disposition and a slow growth rate to banish rhinos from the farmyard.

Finally, many herd species, including again most deer and antelope, do not have a well-defined dominance hierarchy and are not instinctively prepared to become imprinted on a dominant leader (hence to become imprinted on humans). As a result, though many deer and antelope species have been tamed (think of all those true Bambi stories), one never sees such tame deer and antelope driven in herds like sheep. That problem also derailed domestication of North American bighorn sheep, which belong to the same genus as the Asiatic mouflon sheep, ancestor of our domestic sheep. Bighorn sheep are similar to mouflons and might seem suitable to us in most respects except a crucial one: they lack the mouflon’s stereotyped behaviour whereby some individuals behave submissively toward other individuals whose dominance they acknowledge.

The Fertile Crescent’s biological diversity over small distances contributed to another advantage: its wealth in ancestors not only of valuable crops but also of domesticable big mammals. By comparison, there were few or no wild mammal species suitable for domestication in the other Mediterranean-climate zones of California, Chile, southwestern Australia, and South Africa. In contrast, four species of big mammals, the goat, sheep, pig, and cow, were domesticated very early in the Fertile Crescent, possibly earlier than any other animal except the dog anywhere else in the world. Those species remain today four of the world’s five most important domesticated mammals. But their wild ancestors were commonest in different parts of the Fertile Crescent, so the four species were domesticated in different places: sheep possibly in the central part, goats either in the eastern part at higher elevations (the Zagros Mountains of Iran) or in the southwestern part (the Levant), pigs in the north-central part, and cows in the western part, including Anatolia. Nevertheless, although the areas of abundance of these four wild progenitors thus differed, all four lived in sufficiently close proximity that they were readily transferred after domestication from one part of the Fertile Crescent to another, and the whole region ended up with all four species.

Agriculture was launched in the Fertile Crescent by the early domestication of eight crops, termed “founder crops” (because they founded agriculture in the region and possibly in the world). Those eight founders were the cereals emmer wheat, einkorn wheat, and barley; the pulses lentil, pea, chickpea, and bitter vetch; and the fibre crop flax. Of these eight, only two, flax and barley, range in the wild at all widely outside the Fertile Crescent and Anatolia. Two of the founders had very small ranges in the wild, chickpea being confined to southeastern Turkey and emmer wheat to the Fertile Crescent itself. Thus, agriculture could arise in the Fertile Crescent from domestication of locally available wild plants, without having to wait for the arrival of crops derived from plants domesticated elsewhere. Conversely, two of the eight founder crops could not have been domesticated anywhere in the world except in the Fertile Crescent, since they did not occur wild elsewhere.

Another advantage of early food production in the Fertile Crescent is that it may have faced less competition from the hunter-gatherer lifestyle than in other areas, including the western Mediterranean. Southwest Asia has few large rivers and only a short coastline, providing meager aquatic resources (in the form of river and coastal fish and shellfish). One of the important mammal species hunted for meat, the gazelle, originally lived in huge herds but was overexploited by the growing human population and reduced to low numbers. Thus, the food-production package quickly became superior to the hunter-gatherer package. Sedentary villages based on cereals were already in existence before the rise of food production and predisposed those hunter-gatherers to agriculture and herding. In the Fertile Crescent the transition from hunter-gatherer to food production took place fast: as late as 9000 BC people still had no crops and domestic animals and were entirely dependent on wild foods, but by 6000 BC some societies were almost completely dependent on crops and domestic animals.

The situation in Mesoamerica contrasts strongly: that area provided only two domesticable animals (the turkey and the dog), whose meat yield was far lower than that of cows, sheep, goats, and pigs, and corn, Mesoamerica’s staple grain, was difficult to domesticate and perhaps slow to develop. As a result, domestication may not have begun in Mesoamerica until around 3500 BC (the date remains very uncertain); those first developments were undertaken by people who were still nomadic hunter-gatherers, and settled villages did not arise there until around 1500 BC.

Our comparison begins with food production, a major determinant of local population size and societal complexity and hence an ultimate factor behind the conquest. The most glaring difference between American and Eurasian food production involved big domestic mammal species. The Anna Karenina principle can be extended to understanding much else about life besides marriage. We tend to seek easy, single-factor explanations of success. For most important things, though, success actually requires avoiding many separate possible causes of failure. The principle explains a feature of animal domestication that had heavy consequences for human history, namely that so many seemingly suitable big wild mammal species, such as zebras and peccaries, were never domesticated, and that the successful domestications were almost exclusively Eurasian. It likewise explains why so many wild plant species seemingly suitable for domestication were never domesticated.

That enormous set of differences between Eurasian and Native American societies was due largely to the Late Pleistocene extinction (extermination?) of most of North and South America’s former big wild mammal species. If it had not been for those extinctions, modern history might have taken a different course. When Cortés and his bedraggled adventurers landed on the Mexican coast in 1519, they might have been driven into the sea by thousands of Aztec cavalry mounted on domesticated native American horses. Instead of the Aztecs dying of smallpox, the Spaniards might have been wiped out by American germs transmitted by disease-resistant Aztecs. American civilizations resting on animal power might have been sending their own conquistadores to ravage Europe. But those hypothetical outcomes were foreclosed by mammal extinctions thousands of years earlier.

Part of the explanation for Eurasia’s having been the main site of big mammal domestication is that it was the continent with the most candidate species of wild mammals to start with, and that it lost the fewest candidates to extinction in the last 40,000 years. It is also true that the percentage of candidates actually domesticated is highest in Eurasia (18 percent) and especially low in sub-Saharan Africa (no species domesticated out of fifty-one candidates). Particularly surprising is the large number of species of African mammals that were never domesticated despite having Eurasian counterparts that were. Why Eurasia’s horses, but not Africa’s zebras? Why Eurasia’s pigs, but not American peccaries or Africa’s three species of true wild pigs? Why Eurasia’s five species of wild cattle (aurochs, water buffalo, yak, gaur, banteng), but not the African buffalo or American bison? Why the Asian mouflon sheep (ancestor of our domestic sheep), but not North American bighorn sheep?

Nevertheless, no one would seriously describe this evolutionary process as domestication, because bats and other animal consumers do not fulfill the other part of the definition: they do not consciously grow plants. In the same way, the early unconscious stages of crop evolution from wild plants consisted of plants evolving in ways that attracted humans to eat and disperse their fruit without yet intentionally growing them. Human latrines, like those of aardvarks, may have been a testing ground of the first unconscious crop breeders.

Latrines are merely one of the many places where we accidentally sow the seeds of wild plants that we eat. When we gather edible wild plants and carry home our bounty, some spill en route or at our house. Some fruit rots while still containing perfectly good seeds and gets thrown out uneaten into the garbage. Strawberry seeds, like other parts of the fruit that we take into our mouths, are tiny and inevitably swallowed and defecated, but other seeds are large enough to be spat out. Thus, our spittoons and garbage dumps joined our latrines to form the first agricultural research laboratories.

From your berry-picking days, you know that you select particular berries or berry bushes. Eventually, when the first farmers began to sow seeds deliberately, they would inevitably sow those from the plants they had chosen to gather, even though they didn’t understand the genetic principle that big berries have seeds likely to grow into bushes yielding more big berries. So, amid the mosquitoes on a hot, humid day, you do not settle for just any strawberry bush. Even if unconsciously, you decide which bush looks most promising and whether it’s worth it at all. What are your unconscious criteria?

One criterion, of course, is size. You prefer large berries, because it is not worth your while getting sunburned and mosquito-bitten for some lousy little berries. That provides part of the explanation why many crop plants have much bigger fruits than their wild ancestors do. It is especially familiar that modern supermarket strawberries and blueberries are gigantic compared with wild ones; those differences arose only in recent centuries.

Still another obvious difference between seeds that we grow and many of their wild ancestors is in bitterness. Many wild seeds evolved to be bitter, bad-tasting, or poisonous, to deter animals from eating them. Thus, natural selection acts oppositely on seeds and on fruits: plants whose fruits are tasty get their seeds dispersed by animals, but the seed itself within the fruit has to be bad-tasting. Otherwise, the animal would also chew up the seed, and it could not sprout.

While size and tastiness are the most obvious criteria by which human hunter-gatherers select wild plants, other criteria include fleshy or seedless fruits, oily seeds, and long fibres. Wild squashes and pumpkins have almost no fruit around their seeds, but the preferences of early farmers selected for squashes and pumpkins consisting of far more flesh than seeds. Cultivated bananas were selected long ago to be all flesh and no seed, thereby inspiring modern agricultural scientists to develop seedless oranges, grapes, and watermelons as well. Seedlessness provides a good example of how human selection can completely reverse the original evolved function of a wild fruit, which in nature serves as a vehicle for dispersing seeds.

In ancient times many plants were similarly selected for oily fruits or seeds. Among the earliest fruit trees domesticated in the Mediterranean world were olives, cultivated since around 4000 BC and used for their oil. Cultivated olives are not only bigger but also oilier than wild ones. Ancient farmers selected sesame, mustard, poppies, and flax as well for oily seeds, while modern plant scientists have done the same for sunflower, safflower, and cotton.

Perhaps that’s why Darwin, in his On the Origin of Species, didn’t start with an account of natural selection but instead opened with a lengthy account of how our domesticated plants and animals arose through artificial selection by humans. Rather than discussing the Galápagos Islands usually associated with him, Darwin began by discussing how farmers develop varieties of gooseberries. He wrote, “I have seen great surprise expressed in horticultural works at the wonderful skill of gardeners, in having produced such splendid results from such poor materials; but the art has been simple, and, as far as the final result is concerned, has been followed almost unconsciously. It has consisted in always cultivating the best-known variety, sowing its seeds, and, when a slightly better variety chanced to appear, selecting it, and so onwards.” Those principles of crop development by artificial selection still serve as our most understandable model of the origin of species by natural selection.

In some cases the same wild species was domesticated independently at several different sites. Such cases can often be detected by analysing the resulting morphological, genetic, or chromosomal differences between specimens of the same crop or domestic animal in different areas. For instance, India’s zebu breeds of domestic cattle possess humps lacking in western Eurasian cattle breeds, and genetic analyses show that the ancestors of modern Indian and western Eurasian cattle breeds diverged from each other hundreds of thousands of years ago, long before animals were domesticated anywhere. That is, cattle were domesticated independently in India and western Eurasia within the last 10,000 years, starting with wild Indian and western Eurasian cattle subspecies that had diverged hundreds of thousands of years earlier.

Did all those peoples of Africa, the Americas, and Australia, despite their enormous diversity, nonetheless share some cultural obstacles to domestication not shared with Eurasian peoples? For example, did Africa’s abundance of big mammals, available to kill by hunting, make it superfluous for Africans to go to the trouble of tending domestic stock?

The answer to that question is unequivocal: no. The interpretation is refuted by several types of evidence: the rapid acceptance of Eurasian domesticates by non-Eurasian peoples, the universal human penchant for keeping pets, the rapid domestication of the Ancient Species of Big Domestic Mammals, the repeated independent domestications of some of them, and the limited successes of modern efforts at further domestication.

When Eurasia’s domestic mammals reached sub-Saharan Africa, they were adopted by the most diverse African peoples wherever conditions permitted. Those African herders thereby achieved a huge advantage over African hunter-gatherers and quickly displaced them. In particular, Bantu farmers who acquired cows and sheep spread out of their homeland in West Africa and within a short time overran the former hunter-gatherers in most of the rest of sub-Saharan Africa. Even without acquiring crops, Khoisan peoples who acquired cows and sheep around 2,000 years ago displaced Khoisan hunter-gatherers over much of southern Africa. The arrival of the domestic horse in West Africa transformed warfare there and turned the area into a set of kingdoms dependent on cavalry. The only factor that prevented horses from spreading beyond West Africa was the trypanosome diseases borne by tsetse flies.

The same pattern repeated itself elsewhere in the world, whenever peoples lacking native wild mammal species suitable for domestication finally had the opportunity to acquire Eurasian domestic animals. European horses were eagerly adopted by Native Americans in both North and South America, within a generation of the escape of horses from European settlements. For example, by the 19th century North America’s Great Plains Indians were famous as expert horse-mounted warriors and bison hunters, but they did not even obtain horses until the late 17th century. Sheep acquired from Spaniards similarly transformed Navajo Indian societies and led to, among other things, the weaving of the beautiful woolen blankets for which the Navajo have become renowned. Within a decade of Tasmania’s settlement by Europeans with dogs, Aboriginal Tasmanians, who had never before seen dogs, began breeding them for use in hunting. Thus, among the thousands of culturally diverse native peoples of Australia, the Americas, and Africa, no universal cultural taboo stood in the way of animal domestication.

Surely, if some local wild mammal species of those continents had been domesticable, some Australian, American, and African peoples would have domesticated them and gained advantage from them, just as they benefited from the Eurasian domestic animals that they immediately adopted when those became available. For instance, consider all the peoples of sub-Saharan Africa living within the range of wild zebras and buffalo. Why wasn’t there at least one African hunter-gatherer tribe that domesticated those zebras and buffalo and thereby gained sway over other Africans, without having to await the arrival of Eurasian horses and cattle? All these facts indicate that the explanation for the lack of native mammal domestication outside Eurasia lay with the locally available wild mammals themselves, not with the local peoples.

Further strong evidence for the same interpretation comes from pets. Keeping wild animals as pets, and taming them, constitutes an initial stage in domestication. Yet pets have been reported from virtually all traditional human societies on all continents. The variety of wild animals thus tamed is far greater than the variety eventually domesticated, and includes some species that we would scarcely have imagined as pets.

For example, people in New Guinea villages are often seen with pet kangaroos, possums, and birds ranging from flycatchers to ospreys. Most of these captives are eventually eaten, though some are kept just as pets. New Guineans even regularly capture chicks of wild cassowaries (a large, flightless, ostrich-like bird) and raise them to eat as a delicacy, even though captive adult cassowaries are extremely dangerous and now and then disembowel village people. Some Asian peoples tame eagles for purposes of hunting, although those powerful pets have also been known on occasion to kill their human handlers. Ancient Egyptians and Assyrians, and modern Indians, tamed cheetahs for use in hunting. Paintings made by ancient Egyptians show that they further tamed hoofed mammals such as gazelles and hartebeests, birds such as cranes, surprisingly even giraffes (which can be dangerous), and also hyenas. African elephants were tamed in Roman times despite the obvious danger, and Asian elephants are still being tamed today. Perhaps the most unlikely pet is the European brown bear, which the Ainu people of Japan regularly captured as a young animal, tamed, and reared to kill and eat in a ritual ceremony.

Thus, many wild animal species entered the first stages of the sequence of animal-human relations leading to domestication, but only a few emerged at the other end of that sequence as domestic animals. Over a century ago, the British scientist Francis Galton summarized this discrepancy succinctly: “It would appear that every wild animal has had its chance of being domesticated, that [a] few . . . were domesticated long ago, but that the large remainder, who failed sometimes in only one small particular, are destined to perpetual wildness.”

Still another line of evidence that some mammal species are much more suitable than others is provided by the repeated independent domestications of the same species. Genetic evidence based on the portions of our genetic material known as mitochondrial DNA recently confirmed, as had long been suspected, that the humped cattle of India and the humpless cattle of Europe were derived from two separate populations of wild ancestral cattle that had diverged hundreds of thousands of years ago. That is, Indian peoples domesticated the local Indian subspecies of wild aurochs, Southwest Asians independently domesticated their own Southwest Asian subspecies of aurochs, and North Africans may have independently domesticated the North African aurochs.

Similarly, wolves were independently domesticated to become dogs in the Americas and probably in several different parts of Eurasia, including China and Southwest Asia. Modern pigs are derived from independent sequences of domestication in China, western Eurasia, and possibly other areas as well. These examples reemphasize that the same few suitable wild species attracted the attention of many different human societies.

The failure of modern efforts provides a final type of evidence that past failures to domesticate the large residue of wild candidate species arose from shortcomings of those species, rather than from shortcomings of ancient humans. Europeans today are heirs to one of the longest traditions of animal domestication on Earth, the one that began in Southwest Asia around 10,000 years ago. Since the fifteenth century, Europeans have spread around the globe and encountered wild mammal species not found in Europe. European settlers, such as those in New Guinea with its pet kangaroos and possums, have tamed or made pets of many local mammals, just as indigenous peoples have. European herders and farmers emigrating to other continents have also made serious efforts to domesticate some local species.

In the 19th and 20th centuries at least six large mammals (the eland, elk, moose, musk ox, zebra, and American bison) have been the subjects of especially well-organized projects aimed at domestication, carried out by modern scientific animal breeders and geneticists. For example, eland, the largest African antelope, have been undergoing selection for meat quality and milk quantity in the Askaniya-Nova Zoological Park in the Ukraine, as well as in England, Kenya, Zimbabwe, and South Africa; an experimental farm for elk (red deer, in British terminology) has been operated by the Rowett Research Institute at Aberdeen, Scotland; and an experimental farm for moose has operated in the Pechora-Ilych National Park in Russia. Yet these modern efforts have achieved only very limited successes. While bison meat occasionally appears in some U.S. supermarkets, and while moose have been ridden, milked, and used to pull sleds in Sweden and Russia, none of these efforts has yielded a result of sufficient economic value to attract many ranchers. It is especially striking that recent attempts to domesticate eland within Africa itself, where their disease resistance and climate tolerance would give them a big advantage over introduced Eurasian stock susceptible to African diseases, have not caught on.

Thus, neither indigenous herders with access to candidate species over thousands of years, nor modern geneticists, have succeeded in making useful domesticates of large mammals beyond the Ancient Species of Big Herbivorous Domestic Mammals, which were domesticated by at least 4,500 years ago. Yet scientists today could undoubtedly, if they wished, fulfill for many species that part of the definition of domestication that specifies control of breeding and food supply. For example, the San Diego and Los Angeles zoos are now subjecting the last surviving California condors to a more draconian control of breeding than that imposed upon any domesticated species. All individual condors have been genetically identified, and a computer program determines which male will mate with which female in order to achieve human goals. Zoos are conducting similar breeding programs for many other threatened species, including gorillas and rhinos. But the zoos’ rigorous selection of California condors shows no prospect of yielding an economically useful product. Nor do the zoos’ efforts with rhinos, although rhinos offer more than three tons of meat on the hoof. As we shall now see, the rhinoceros, like most other big mammals, presents insuperable obstacles to domestication.

Meanwhile, there are areas in which food production arose altogether independently, with the domestication of many indigenous crops (and, sometimes, animals) before the arrival of any crops or animals from other areas. There are only five such areas for which the evidence is now detailed and compelling: Southwest Asia, also known as the Near East or Fertile Crescent; China; Mesoamerica (the term applied to central and southern Mexico and adjacent areas of Central America); the Andes of South America, and possibly the adjacent Amazon Basin as well; and the eastern United States. Some of these areas may actually comprise several nearby centres where food production arose more or less independently, such as North China’s Yellow River valley and South China’s Yangtze River valley. Besides these five areas where food production definitely arose de novo, four others (Africa’s Sahel zone, tropical West Africa, Ethiopia, and New Guinea) are candidates for that distinction. However, there is some uncertainty in each case. Although indigenous wild plants were undoubtedly domesticated in Africa’s Sahel zone just south of the Sahara, cattle herding may have preceded agriculture there, and it is not yet certain whether those were independently domesticated Sahel cattle or, instead, domestic cattle of Fertile Crescent origin whose arrival triggered local plant domestication. It remains similarly uncertain whether the arrival of those Sahel crops then triggered the undoubted local domestication of indigenous wild plants in tropical West Africa, and whether the arrival of Southwest Asian crops is what triggered the local domestication of indigenous wild plants in Ethiopia. As for New Guinea, archaeological studies there have provided evidence of early agriculture well before food production in any adjacent areas, but the crops grown have not been definitely identified.

Food production in the Fertile Crescent may have faced less competition from the hunter-gatherer lifestyle than it did in other areas, including the western Mediterranean. Southwest Asia has few large rivers and only a short coastline, providing relatively meagre aquatic resources (such as river and coastal fish and shellfish). One of the important mammal species hunted for meat, the gazelle, originally lived in huge herds but was overexploited by the growing human population and reduced to low numbers. Thus, the food-production package quickly became superior to the hunter-gatherer package. Sedentary villages based on wild cereals were already in existence before the rise of food production and predisposed those hunter-gatherers to agriculture and herding. In the Fertile Crescent the transition from hunting and gathering to food production took place relatively quickly: as late as 9000 BC people still had no crops or domestic animals and were entirely dependent on wild foods, yet by 6000 BC some societies were almost completely dependent on crops and domestic animals.

Evidently, most of the Fertile Crescent’s founder crops were never domesticated again elsewhere after their initial domestication in the Fertile Crescent. Had they been repeatedly domesticated independently, they would exhibit legacies of those multiple origins in the form of varied chromosomal arrangements or varied mutations. Hence these are typical examples of the phenomenon of preemptive domestication: once domesticated, a crop spread quickly and preempted any need for other peoples, within the Fertile Crescent or elsewhere, to domesticate the same wild ancestor. Once the crop had become available, there was no further need to gather it from the wild and thereby set it on the path to domestication again.

In addition, of course, small domestic mammals and domestic birds and insects have also been useful to humans. Many birds were domesticated for meat, eggs, and feathers: the chicken in China, various duck and goose species in parts of Eurasia, turkeys in Mesoamerica, guinea fowl in Africa, and the Muscovy duck in South America. Wolves were domesticated in Eurasia and North America to become our dogs, used as hunting companions, sentinels, pets, and, in some societies, food. Rodents and other small mammals domesticated for food included the rabbit in Europe, the guinea pig in the Andes, a giant rat in West Africa, and possibly a rodent called the hutia on Caribbean islands. Ferrets were domesticated in Europe to hunt rabbits, and cats were domesticated in North Africa and Southwest Asia to hunt rodent pests. Small mammals domesticated as recently as the 19th and 20th centuries include foxes, mink, and chinchillas raised for fur, and hamsters kept as pets. Even some insects have been domesticated, notably Eurasia’s honeybee and China’s silkworm moth, kept for honey and silk, respectively.

Many of these small animals thus yielded food, clothing, or warmth. However, none of them pulled plows or wagons, none bore riders, none except dogs pulled sleds or became war machines, and none of them has been as important for food as have big domestic mammals.

It is true, of course, that some small mammals were first domesticated long after 2500 BC. For example, rabbits weren’t domesticated for food until the Middle Ages, mice and rats for laboratory research not until the 20th century, and hamsters for pets not until the 1930s. The continuing development of domesticated small mammals is not surprising, because there are literally thousands of wild species as candidates, and because they were of too little value to traditional societies to warrant the effort of raising them. By contrast, the domestication of big mammals virtually ended 4,500 years ago. By then, all of the world’s 148 candidate big species had been tested innumerable times, with the result that only a few passed the test and no other suitable ones remained.


Africa’s domesticated animal species can be summarized much more quickly than its plants, because there are so few of them. The sole animal that we know for sure was domesticated in Africa, because its wild ancestor is confined there, is a turkey-like bird, the guinea fowl. Wild ancestors of domestic cattle, donkeys, pigs, dogs, and house cats were native to North Africa but also to Southwest Asia, so we cannot yet be certain where those species were first domesticated, although the earliest dates currently known for domestic donkeys and house cats favour Egypt. Recent evidence suggests that cattle may have been domesticated independently in North Africa, Southwest Asia, and India, and that all three of these stocks have contributed to modern African cattle breeds. Otherwise, all the remainder of Africa’s domestic mammals must have been domesticated elsewhere and introduced as domesticates to Africa, because their wild ancestors occur only in Eurasia. Africa’s sheep and goats were domesticated in Southwest Asia, its chickens in Southeast Asia, its horses in southern Russia, and its camels probably in Arabia.

Like New Guinea, Australia had no domesticable native mammals, and the sole foreign domesticated mammal adopted there was the dog, which arrived from Asia (presumably in Austronesian canoes) around 1500 BC and established itself in the Australian outback to become the dingo. Native Australians kept captive dingos as companions, watchdogs, and even as living blankets, giving rise to the expression ‘five-dog night’ to mean a very cold night. However, they did not use dingos/dogs for food, as did Polynesians, or for the cooperative hunting of wild animals, as did New Guineans.

Some Mesolithic hunter-gatherers, such as the Natufians of the Near East, appear to have lived in small settlements based on an economy involving gazelle hunting and the harvesting of wild cereals using sickles with flint blade segments inset in bone handles. In the Near East and North Africa, Mesolithic populations processed wild plant foods using grinding stones.

Other Mesolithic technological innovations include the adz and axe (woodworking tools consisting of flaked stone blades set in bored antler sleeves and fastened to wooden handles), fishing weirs and traps, fish hooks, the first preserved bows and arrows, baskets, textiles, sickles, dugout canoes and paddles, sledges, and early skis. The Jomon culture of Japan produced pottery by 10,000 years ago, as did the Ertebølle culture of Scandinavia somewhat later.

The development of broad-spectrum economies in the post-glacial Mesolithic/Archaic period laid the foundations for the domestication of plants and animals, which in turn led to the rise of farming communities in some parts of the world. This development marked the beginning of the Neolithic.

Farming originated at different times in different places, as early as about 9,000 years ago in some parts of the world. In some regions, farming arose through indigenous developments, and in others it spread from other areas. Most archaeologists believe that the development of farming in the Neolithic was one of the most important and revolutionary innovations in the history of the human species. It allowed more permanent settlements, much larger and denser populations, the accumulation of surpluses and wealth, the development of more profound status and rank differences within populations, and the rise of specialized crafts.

Neolithic toolmaking generally shows a high degree of technological continuity with the Mesolithic. Neolithic industries often include blade and bladelet (small blade) technologies, sometimes accompanied by microliths. They also feature a wide range of retouched tools, including endscrapers (narrow scrapers for working hides), backed blades or bladelets (some of which were set into handles and used as sickles), and a widened range of projectile points. In addition, ground and polished axes and adzes, which would have been used for forest clearance to plant crops and for woodworking activities, are characteristic of the Neolithic. Such tools, although labour-intensive to manufacture, could be used for a long time without resharpening and consequently were highly prized by these early farmers. Large-scale trade networks in axes and stone are documented in the Neolithic, with artifacts sometimes found hundreds of miles from their sources. Other technological developments in the Neolithic include grinding stones, such as mortars and pestles, for the processing of cereal foods; the widespread use of pottery for surplus food storage and cooking; the construction of granaries for the storage of grain; the use of domesticated plant fibres for textiles; and weaving technology.

Archaeologists have several theories to explain why humans began farming, and the reasons probably differed from one region to another. Some theories maintain that population pressure or changes in the environment may have forced humans to find new economic strategies, which led to farming. Another theory maintains that a population of humans may have lived in a region where wild plants and animals were particularly easy to domesticate, making the development of agriculture almost a historical accident. Still another theory proposes that the rise of farming may have been driven by social change, as individuals began to use agriculture as a means of accumulating wealth in the form of food surpluses.

Different plant crops were cultivated in different places, depending on what wild plants grew naturally and how well they responded to cultivation. In the Near East, important crops included wheat, barley, rye, legumes, walnuts, pistachios, grapes, and olives. In China, millet and rice predominated. In Africa, millet, sorghum, African rice, and yams were commonly grown. Rice, plantains, bananas, coconuts, and yams were important in Southeast Asia. Finally, in the Americas, corn, squash, beans, potatoes, peppers, sunflowers, amaranths, and goosefoots were commonly grown.

Domesticated animals also varied from one region to another according, again, to availability and their potential to be domesticated. In Eurasia, Neolithic people domesticated dogs, sheep, goats, cattle, pigs, chickens, ducks, and water buffalo. In the Americas, domesticated animals included dogs, turkeys, llamas, alpacas, and guinea pigs. In Africa, the primary domesticated animals-cattle, sheep, and goats-probably spread from the Near East.

Well-studied early farming sites in Eurasia include Jericho, in the West Bank; Ain Ghazal, in Jordan; Ali Kosh, in Iran; Mehrgarh, in Pakistan; Banpocun (Pan-p’o-ts’un), in China; and Spirit Cave, in Thailand. Important African sites include Adrar Bous in Niger, Iwo Eleru in Nigeria, and Hyrax Hill and Lukenya Hill in Kenya. In the Americas, sites showing early plant domestication include Guila Naquitz, in Mexico, and Guitarrero Cave, in Peru.

Larger Neolithic settlements show a variety of new architectural developments. For instance, in the Near East, conical beehive-shaped houses or rambling, connected apartment-style housing was often constructed with mud bricks. In Eastern Europe, houses were made with wattle-and-daub (interwoven twigs plastered with clay) walls, and, in later times, long houses were constructed with massive timbers. In China, some settlements contain semisubterranean houses dug into clay, with evidence of walls and roofs made out of thatch or other materials and supported by poles.

The domestication of plants and animals led to profound social change during the Neolithic. Surpluses of food, such as stored grain or herds of livestock, could become commodities of wealth for some individuals, leading to social differentiation within farming communities. Trade of raw materials and manufactured products between different areas increased markedly during the Neolithic, and many foreign or exotic goods appear to have developed special symbolic value or status. Some Neolithic graves contain rich stores of goods or exotic materials, revealing differentiations in terms of wealth, rank, or power.

In certain areas, notably parts of the Near East and Western Europe, Neolithic peoples built massive ceremonial complexes, efforts that would have required extensive, dedicated work forces. Large earthworks and megalithic (‘giant stone’) monuments from the Neolithic (including the Avebury stone circle and the earliest stages of Stonehenge, in England, and the monuments of Carnac, in France) suggest more highly organized political structures and more complex social organization than among most hunter-gatherer populations. In the Americas, sites such as the mounds of Cahokia, in Illinois, also reflect a more complex, organized political and social order. The technological innovations and economic basis established and spread by Neolithic communities ultimately set the stage for the development of complex societies and civilizations around the world.

Humans produced metal tools and ornaments from beaten copper as early as 12,000 years ago in some parts of the world. By 6,000 years ago, early experiments in metallurgy, particularly extracting metal from copper ore (smelting), were being conducted in some parts of Eurasia, notably in Eastern Europe and the Near East. By 5,000 years ago, copper and tin ores were being smelted and alloyed in some regions, marking the dawn of the Bronze Age. The casting of bronze tools, such as axes, knives, swords, spearheads, and arrowheads, became increasingly common over time. At first, copper and bronze tools were rare and stone tools were still overwhelmingly common, but as time went on, metal gradually replaced stone as the principal raw material for edged tools and weapons.

In Eurasia and parts of Africa, the rise of metallurgical societies appears to coincide with the rise of the earliest state societies and civilizations, such as ancient Egypt, Sumer, Minoan Crete, Mycenae, and China. In the Americas, parts of sub-Saharan Africa, Australia, and the Pacific Islands, societies continued to use stone and other nonmetal materials as the principal raw materials for tools up to the time of European contact, starting in the 15th century AD. Although populations in these areas could technically be described as Stone Age groups, many had become agricultural societies and had formed flourishing civilizations.

Stone technology enjoyed a brief resurgence within iron-using societies with the coming of flintlock firearms, beginning in the 17th century. Carefully shaped flints, reminiscent of the geometric microliths of the Mesolithic and early Neolithic, were struck against steel to create a spark to ignite the firearm. By the end of the 20th century few human groups had a traditional stone technology, although a few groups on the island of New Guinea still relied on the use of stone adzes. Tools of metal, plastic, and other materials had replaced stone technologies virtually everywhere.

Cave Dwellers is the term used to designate ancient peoples who occupied caves in various parts of the world. Cave dwellers date generally from the Stone Age period known as the Palaeolithic, which began as early as 2.5 million years ago. Caves are natural shelters, offering shade and protection from wind, rain, and snow. As archaeological sites, caves are easy to locate and often provide conditions that encourage the preservation of normally perishable materials, such as bone. As a result, the archaeological exploration of caves has contributed significantly to the reconstruction of the human past.

Cave Painting: Palaeolithic artists painted scenes in caves more than 15,000 years ago, such as those found at Lascaux, France. The leaping cow and group of small horses there were painted with red and yellow ochre that was either blown through reeds onto the wall or mixed with animal fat and applied with reeds or thistles. It is believed that prehistoric hunters made these paintings to gain magical powers that would ensure a successful hunt.

Wherever caves were available, prehistoric nomadic hunters and gatherers incorporated them into the yearly cycle of seasonal camps. Most of their activities took place around campfires at the cave mouth, and some caves contain stone walls and pavements providing additional protection from winds and dampness. Hunting, particularly of reindeer, horse, red deer, and bison, was important; many caves are situated on valley slopes providing views of animal migration routes.

Stone Toolmaking. Humans first made tools of stone at least 2.5 million years ago, initiating the so-called Stone Age. The Stone Age advanced through three stages over time: the Palaeolithic (which is subdivided into Lower, Middle, and Upper periods), the Mesolithic, and the Neolithic. Blade toolmaking was a development of the Upper Palaeolithic Period, which began about 40,000 years ago. This technique produced a far greater variety and higher quality of tools than did earlier methods of toolmaking.

Artifacts have been found in caves in France, Spain, Belgium, Germany, Italy, and Great Britain. The association of these remains with the bones of extinct animals, such as the cave bear and sabre-toothed tiger, attests to the great antiquity of many cave deposits. A variety of stone and bone points discovered in excavated caves document the importance of spears until the bow and arrow appeared in the late Palaeolithic era. Other common tools included stone scrapers for working hides and wood, burins for engraving, and knives for butchering and cutting. Throughout the Palaeolithic period such tools became increasingly diverse and well made. Bone needles, barbed harpoons, and spear-throwers were made and decorated with carved designs. Evidence of bone pendants and shell necklaces also exists. Among the caves that have yielded relics of early humans are Cro-Magnon and Vallonnet in France.

Wall paintings and engravings dating from 25,000 to 10,000 years ago have been found in more than 200 caves, largely in Spain and France. Frequently found deep inside the caves, the paintings depict animals, geometric signs, and occasional human figures. In the cave of La Colombière in France, a remarkable series of sketches engraved on bone and smoothed stones was unearthed in 1913. In caves such as Altamira in Spain and Lascaux in France, multicolored animal figures were drawn using mineral pigments mixed with animal fats. Some paintings adorn walls of large chambers suitable for ritual gatherings; others are found in narrow passages accessible only to individuals. Hunting and fertility seem to have been important artistic themes. The ritual gatherings themselves promoted communication and intermarriage among the normally scattered small groups.

On every continent, prehistoric foragers used caves. In the Zhoukoudian (Chou-k'ou-tien) Cave near Beijing, China, remains of the bones and tools of Homo erectus (Peking Man) have been discovered. Chinese caves contain some of the earliest evidence of the human use of fire, from approximately 400,000 years ago. In the Shanidar Cave in Iraq, 50,000-year-old Neanderthal skeletons were unearthed in 1957. Ancient pollen buried with them has been interpreted as evidence that these cave dwellers had developed funeral rituals. In the western deserts of North America, caves have been found that contain plant foods, woven sandals, and baskets, representing the desert culture of about 9,000 years ago. Early inhabitants of Australia, the Middle East, and the Peruvian Andes have also left remains in caves.

Gradually people learned to grow food, rather than forage for it. This was the beginning of the Neolithic age, which, although ending in western Europe some 4500 years ago, continued elsewhere in the world until modern times. Once agriculture became important, people established villages of permanent houses and found new uses for caves, mainly as hunting and herding campsites and for ceremonial activities. In Europe, Asia, and Africa caves continued to be used as shelters by nomadic groups.

Notable cave dwellings are found in the Cappadocia region of Central Anatolia, near Göreme, Turkey. Known as ‘fairy chimneys’, they were carved into soft volcanic rock by anchorite (hermitic) Christian monks in the 4th century AD. Many of these dwellings are still occupied by local Turks, who consider them healthy and inexpensive places to live.

In dry caves, preservation is often excellent, owing to the dry air and limited bacterial activity. Organic remains such as charred wood, nutshells, plant fibres, and bones are sometimes found intact. In wet caves, artifacts and other remains are often found encrusted with, or buried beneath, calcareous deposits of dripstone. The accumulated evidence of human habitation on the cave floor was often buried under rock falls from the ceilings of caverns. Intentional burials have also been found in several cave sites.

Because of the unusual preservative nature of caves and the great age of many remains found in them, the fallacious belief has arisen that a race of cave people existed. In fact, most cave sites represent small, seasonal camps. Because prehistoric people spent much of the year in open-air camps, the caves contain the remains of only part of a group’s total activities. Also, the cultural remains outside caves were subject to greater decay. Thus, the archaeological record of remote times is better preserved in cave deposits.

Caves have been systematically excavated during the past one hundred years. Since they often contain the remains of repeated occupations, caves can document changing cultures. For example, the economic transition from food collecting to agriculture is documented by finds in highland Mexico and in Southeast Asia. Some caves in the Old World continued to be inhabited even after the close of the Stone Age; relics from the Bronze and Iron ages have been found in cave deposits. On occasion, material dating from the time of the Roman Empire has been recovered. The famous Dead Sea Scrolls, discovered in 1947, were preserved in caves.

In 1935 Doctor F. Kohl-Larsen discovered fragments of two skulls in the gravel at the northeast end of Lake Eyassi, Tanganyika Territory, Africa, in association with fossilized bones of antelopes, pigs, and hyenas resembling types of animals now living in that area. The two hundred fragments of the skulls have been painstakingly assembled by Doctor Hans Weinert of Kiel, Germany, so that there are now available for study the skull cap of one individual and part of the face of another. Though critical study of these East African finds is still far from completion, their closest resemblance may be to Pithecanthropus erectus, the famous Java ape man. These remains have been tentatively dated at about 100,000 years ago.

Doctor Robert Broom of the Transvaal Museum, Pretoria, has continued his study of the human-like ape remains found in South Africa. He believes the Australopithecus skulls to be definitely apelike, except for their teeth, which show a closer similarity to those of man than to those of the gorilla or chimpanzee. He concludes, therefore, that these creatures were not actual ancestors of man, but only survivors of a possible apelike ancestral stock that existed before Ice Age times.

The distal end of a humerus, the proximal end of an ulna, and the distal phalanx of a toe of Paranthropus robustus, and the distal end of a femur of Plesianthropus were excavated in the Pleistocene bone breccia of Kromdrai, near Krugersdorp, South Africa, under the direction of Doctor Broom. These finds suggest that this early type of ape-man was bipedal, capable of walking with an erect posture, a distinct departure from previous assumptions about the posture of this species.

Professor Raymond Dart of Witwatersrand University (South Africa), the discoverer of the controversial Taungs skull (Australopithecus africanus), states that a high culture existed in the present habitat of the Bantu-speaking peoples of South Africa in the Late Stone Age, before their arrival in that part of Africa. Skeletons associated with the Mapungobwa finds appear to indicate that the civilization centred at this place was associated with a race said to be intermediate between, and possibly a hybrid of, the Cro-Magnon and Neanderthal types, which, as known in Europe, are distinct races.

Finds of Neanderthaloid skulls and skeletons continue to be reported from widely separated areas. Digging in a cave at Mount Circeo on the Tyrrhenian Sea, 50 miles south of Rome, Italy, Alberto Carlo Blanc uncovered an almost perfectly preserved Neanderthal skull, marred only by a fracture in the right temporal region. It is the third of this type found in Italy. The two skulls previously reported were found in 1929 and 1935 in the Sacopastore region, near Rome, but in not nearly so well preserved a condition as the present find. No other human bones were found here, but the skull was accompanied by fossilized bones of elephants, rhinoceroses, and giant horses, all fractured, thus giving some evidence of the mode of life of Neanderthal man. Professor Sergio Sergi, of the Institute of Anthropology at the Royal University of Rome, who has studied this skull in detail, believes it to be 70,000 to 80,000 years old. He concludes also that Neanderthal man walked with as erect a posture as modern man, and not with his head thrust forward as had previously been supposed.

Another Neanderthal skeleton is reported to have been found in a cave in Middle Asia by A. P. Okladnikoff of the Anthropological Institute of Moscow University and the Leningrad Institute of Anthropology. The bones of the skeleton were badly shattered, but the jaw and teeth of the skull, which was crushed at the back, were almost complete.

The famous Chokoutien site near Peking, China, the home of ancient Peking man (Sinanthropus) previously reported, now proves also to have yielded additional skeletons of more modern type, studied by Doctor Franz Weidenreich and Doctor W. C. Pei, the leaders in research at this site. In the portion of the site known as the upper cave were found the remains of an advanced culture suggesting a resemblance to the Late or Upper Palaeolithic in Europe, thus implying an age of 100,000 to 200,000 years. These cultural remains were accompanied by skeletons of bear, hyena, and ostrich, long extinct forms, and of tiger and leopard, which have long since disappeared from this part of Asia. Detailed study of the three human skulls indicates that they probably belong to three different racial groups. Of the two female skulls, one bears a close resemblance to the skulls of modern Melanesians; the other, which shows frontal deformation, resembles Eskimo skulls. The brain case of the male skull is in some features much more primitive, almost at the Neanderthaloid stage, but in other features is reminiscent of Upper Palaeolithic man. The face is similar to, though not identical with, that of recent Mongolians. From this evidence it seems that racial mixture is no product of modern times, but has its roots in extreme antiquity. It should be noted also that though Mongolian types resembling the modern population of North China were not found in the upper cave, it does not necessarily mean that they were nonexistent during that period. It has been suggested that the population represented in the upper cave may have been a migrating group.
Historic and prehistoric American Indian skulls resembling Melanesian, Eskimo, or more primitive types have been reported from time to time in America, so that it would appear from the present finds at Chokoutien that, long before migrations from Asia to America are assumed to have taken place, types similar to those composing the native American populations were living permanently, or at least moving about, in Eastern Asia.

In the more recent past, the movement and counter-movement of peoples have led to accelerated mixing of stocks and mutual infusion of physical characteristics. Perhaps more important than the transmission of physical characteristics has been the transmission of cultural characteristics. The diffusion of cultures, including tools, habits, ideas, and forms of social organization, was a prerequisite for the development of modern civilization, which would probably have taken place much more slowly if people had not moved from place to place. For instance, use of the horse was introduced into the Middle East by Asian invaders of ancient Sumer and later spread to Europe and the Americas. Even important historical events can be linked to distant migrations; the downfall of the Roman Empire in the 3rd to the 6th century AD, for example, was probably hastened by migrations following the building of the Great Wall of China, which prevented the eastward expansion of Central Asian tribes, thus turning them in the direction of Europe.

A group of people may migrate in response to the lure of a more favourable region or because of some adverse condition or combination of conditions in the home environment. Most historians believe that non-nomadic peoples are disinclined to leave the places to which they are accustomed, and that most historic and prehistoric migrations were stimulated by a deterioration of home conditions. This belief is supported by records of the events preceding most major migrations.

The specific stimuli for migrations may be either natural or social causes. Among the natural causes are changes in climate, stimulating a search for warmer or colder lands; volcanic eruptions or floods that render sizable areas uninhabitable; and periodic fluctuations in rainfall. Social causes, however, are generally considered to have prompted many more migrations than natural causes. Examples of such social causes are an inadequate food supply caused by population increase; defeat in war, as in the forced migration of Germans from those parts of Germany absorbed by Poland after the end of World War II in 1945; a desire for material gain, as in the 13th-century invasion of the wealthy cities of western Asia by Turkish tribes; and the search for religious or political freedom, as in the migrations of Huguenots, Jews, Puritans, Quakers and other groups to North America.

Throughout history, the choice of migratory routes has been influenced by the tendency of groups to seek a type of environment similar to the one they left, and by the existence of natural barriers, such as large rivers, seas, deserts, and mountain ranges. The belts of steppe, forest, and arctic tundra that stretch from central Europe to the Pacific Ocean have been a constant encouragement to east-west migration of groups situated along their length. On the other hand, migrations from tropical to temperate areas, or from temperate to tropical areas, have been rare. The desert regions of the Sahara in northern Africa separated the African from the Mediterranean peoples and prevented the diffusion southward of Egyptian and other cultures, and the Himalayan mountain system of South Asia cut off approach to the great subcontinent of India except from its eastern and western borders. As a consequence of these and similar barriers, certain mountain passes and land bridges became traditional migratory routes. The Sinai Peninsula in northeastern Egypt, bounded on the east by the Arabian Peninsula, linked Africa and Asia; the Bosporus region of northwestern Turkey connected Europe and the Middle East; the Daryal Gorge in the Caucasus Mountains of Georgia, Armenia, Azerbaijan, and southwestern Russia was used by the successive tribes that poured out of the European steppes into the Middle East; and the broad valley between the Altay Mountains and the Tian Shan mountain system of Central Asia provided the route by which Central Asian peoples swept westward.

Among the distinct effects of migration are the stimulation of further migration through the displacement of other peoples; a reduction in the numbers of the migrating group because of hardship and warfare; changes in physical characteristics through intermarriage with the groups encountered; changes in cultural characteristics by adoption of the cultural patterns of peoples encountered; and linguistic changes, also effected by adoption. Anthropologists and archaeologists have traced the routes of many prehistoric migrations by the current persistence of such effects. Blond physical characteristics among some of the Berbers of North Africa are thought to be evidence of an early Nordic invasion, and the Navajo and Apache of the southwestern United States are believed to be descended from peoples of northwestern Canada, with whom they have a linguistic bond. The effects of migration are particularly evident in North, Central, and South America, where peoples of diverse origins live with common cultures.

Among the most far-reaching series of ancient migrations were those of the peoples who spread the Indo-European family of languages. According to a prevalent hypothesis, a large group of Indo-Europeans migrated from east-central Europe eastward toward the region of the Caspian Sea before 3000 BC. Beginning shortly after 2000 BC, the Indo-European people known as the Hittites crossed into Asia Minor from Europe through the Bosporus region, and at about the same time the bulk of the Indo-Europeans in the Caspian Sea area turned southward. The ancestors of the Hindus went southeastward into Punjab, in northwestern India, and along the banks of the Indus and Ganges rivers; the Kassites went south into Babylonia; and the Mitanni of northern Mesopotamia went southwestward into the valleys of the Tigris and Euphrates rivers and other parts of the region between the Persian Gulf and the Mediterranean Sea known as the Fertile Crescent.

A migration of great importance to Western civilization was the invasion of Canaan (later known as Palestine) by the tribes of the Hebrew confederacy, which developed the ideas on which the Jewish, Christian, and Islamic religions are founded. These nomadic Semitic tribes, from the Arabian Peninsula and the deserts southeast of the Jordan River, moved (15th-10th century BC) into a settled region that was alternately under the control of Egypt and Babylonia.

The civilizations of the ancient world arose in cities and countries situated along the edges of the great European and Asian landmass, around the Mediterranean Sea, in the Middle East, in India, and in China. The huge interior area was crossed and recrossed by nomadic tribes, which periodically overran the coastal settlements. Central Asia was the main reservoir of these nomadic hordes, and from it successive waves of migrations penetrated eastward into China, southward into India, and westward into Europe, driving before them subsidiary waves of displaced tribes and peoples. In the 3rd century BC, the Xiongnu (Hsiung-nu), who were possibly related to the Huns, advanced eastward from Central Asia toward China and westward toward the Ural Mountains, driving other groups before them.

In another movement the Cimbri, thought to have been a Germanic people, drove southward from the eastern Baltic Sea region and twice entered the Roman Empire in the 2nd century BC. In the 1st century BC, Germanic groups from the southwestern Baltic area, possibly as a consequence of Cimbri pressure, also drove down into central Europe, occupying the territory between the Rhine and the Danube rivers. By the 3rd century AD, a newly expanding group, the Mongols, had arisen in Central Asia. Because of their pressure, the Huns invaded China and crossed over the Ural Mountains into the Volga River region. This migration displaced the Goths, who travelled from southwestern Russia toward the European domains of the Roman Empire, and in turn forced the Germanic Vandals into Gaul and Spain at the beginning of the 5th century AD. The Visigoths (western Goths) continued their westward advance through Italy, Gaul, and Spain, driving the Vandals before them into northern Africa and eastward to present-day Tunis. The Ostrogoths (eastern Goths) followed the Visigoths into Italy and settled there. The Huns, who had begun their movement in Central Asia eight centuries earlier, followed the Goths into Europe, after being displaced by the Mongols, and settled in what is now Hungary about the middle of the 5th century. The Mongols also forced many Slavs into eastern Europe. Thus, one of the most momentous and far-reaching events of history, the disintegration of the Roman Empire in the 3rd to the 6th century of the Christian era, was largely caused by migrations.

After the Hun invasions in the 3rd and 5th centuries, a period of equilibrium began. In the East, the Chinese maintained their strength against the nomads. In the West, Europe consolidated its own strength.

The weakness and decay of the Persian and the Byzantine empires encouraged the spread of a new migration out of Semitic Arabia that was far more extensive than that of the Hebrews into Canaan. United under the banner of Islam in the 7th and early 8th centuries, Arab tribes swept eastward through Persia to Eastern Turkistan and into northwest India; westward through Egypt and across northern Africa into Spain and southern France; and northwestwards through Syria into Asia Minor. The Arab penetration into Central Asia stimulated nomadic raids on the frontiers of the Chinese Empire and forced the western Asian Magyar tribes to move in the direction of Europe, crossing the Ural Mountains and southern Russia and finally reaching Hungary, where they settled in the 9th century.

Expansion of Chinese frontiers under the Song (Sung) dynasty in the 11th century forced the Seljuk Turkish tribes out of Central Asia. These tribes moved westward across the Ural Mountains into the Volga River region and thence south into Persia, Armenia, Asia Minor, and Syria, settling among the peoples there. In the 13th century, Mongol tribes under famed conqueror Genghis Khan, in one of the most astounding military migrations of recorded history, swept out of Mongolia and captured China, Turkistan, Afghanistan, Iran, Mesopotamia, Syria, Asia Minor, southern Russia, and even parts of eastern Europe. The Ottoman Turks, forced from their pasturelands in western Asia during the brief period of Mongol supremacy, migrated westward and entered Asia Minor in the 14th century, taking Constantinople (then the capital of the Byzantine Empire, in what is now northwestern Turkey) and advancing as far as Vienna, Austria, in the 15th century.

The maritime region consisting of Scandinavia and other lands bordering the North and the Baltic seas was a subsidiary reservoir of migratory groups. In the 5th and 6th centuries, Angles, Saxons, and Jutes, displaced by the Visigoths, sailed from northwest Germany and overran southern Britain. Norwegian mariners captured the Shetland, Orkney, Faroe, and Hebrides islands in the 7th and 8th centuries. In the 9th century, Swedish fighters poured out of the Baltic region through southern Finland, sweeping down into Russia and through the Ukraine along the Dnieper River. During the 9th century, Norwegians settled in Iceland and in Normandy (Normandie) in France. Icelanders reached Greenland in the late 10th century and established a colony there. Subsequently, they sailed even as far as North America but left no permanent settlers. The growth of the system of nation-states in Europe during the 2nd millennium AD again restored the equilibrium in the West, and no important ethnic invasions occurred thereafter.

More people have moved and resettled during the past 450 years than in any similar period of human history. The migrations preceding this period were collective acts, mostly voluntarily undertaken by the members of a group, but many of the more recent migrations have differed in at least two significant ways: they have been either voluntary individual acts, or they have been enforced group movements, entirely against the will of the people being moved. The two types of migration began almost simultaneously after Europeans arrived in America in the late 1400s, and they have continued in one form or another up to the present day.

The era of modern migrations that began with the opening of the western hemisphere was continued under the impetus of the Industrial Revolution. Millions of western, and then eastern, Europeans, seeking political or religious freedom or economic opportunity, settled in North and South America, Africa, Australia, New Zealand, and other parts of the globe. As many as 20 million Africans were forcibly carried to the Americas by slave traders and sold into bondage. Millions of Chinese settled in Southeast Asia and moved overseas to work in the Philippine Islands, Hawaii, and the Americas. A large colony of Hindus was established in southern Africa, and many people from Arab lands migrated to North and South America.

The migrations from Europe were principally voluntary, in the sense that the emigrants could have stayed in their respective original homelands if they had accepted certain religions, creeds, political allegiances, or economic privations. The involuntary migrations were primarily those of the Africans captured for slave labour, but slave shipments were halted during the first half of the 19th century. However, a large-scale, essentially forced migration took place from southern Africa to the central and eastern parts of the continent, spurred by the expansionist force of the Zulu. Finally, many of the Chinese, Indian, and other Asian migrations, as well as some of the migrations of eastern and southern Europeans, were not strictly definable as either free or unfree. The individual migrants signed agreements to travel in consignments of contract labour. Although ultimately many of these labourers settled permanently and with equal rights in the lands to which they went, the terms of their original contracts often severely limited their freedom and, in effect, left them little better than slaves for long periods of time.

Migration into the Americas refers to the early movement or movements of humans to the Western Hemisphere. The first people to come to the Americas arrived during the late Pleistocene Epoch (1.6 million to 10,000 years before present). Most scholars believe that these ancient ancestors of modern Native Americans were hunter-gatherers who migrated to the Americas from northeastern Asia.

For much of the 20th century it was widely believed that the first Americans were the Clovis people, known by their distinctive spearpoints and other tools found across North America. The earliest Clovis sites date to 11,500 years ago. However, recent excavations in South America show that people have lived in the Americas for at least 12,500 years. A growing body of evidence, from other archaeological sites to studies of the languages and genetic heritage of Native Americans, suggests the first Americans may have arrived even earlier.

Many details concerning the first settlement of the Americas remain shrouded in mystery. Today the search for answers involves researchers from diverse fields, including archaeology, linguistics, skeletal anatomy, and molecular biology. The challenge for researchers is to find evidence that can help determine when the first settlers arrived, how these people made their way into the Americas, and if migrating groups travelled by different routes and in multiple waves. Some archeologists and physical anthropologists have suggested that one or more of these migrations originated from places outside Asia, although this view is not widely accepted.

Whoever they were and whenever they arrived, the first Americans faced extraordinary challenges. These hardy settlers encountered a vast, trackless new world, one rich in animals and plants and yet entirely without other peoples. As they entered new territories, they had to locate essential resources, such as water, food, and materials to make or repair their tools. They had to learn which of the unfamiliar animals and plants would feed or cure them, and which might hurt or kill them. Their efforts ultimately proved successful. By the time European exploration of the Americas began in the late 15th century, the descendants of these ancient colonizers numbered in the millions.

From their evolutionary origins in Africa, anatomically modern humans, Homo sapiens, steadily spread out across Earth’s landmasses. By 25,000 to 35,000 years ago, humans had reached the far eastern reaches of modern Siberia in northeastern Asia, a region believed to be the most likely point of departure for any early migration to North America. Humans arrived in this remote corner of the world during the last major period of the Pleistocene Epoch, or Ice Age. Great glaciers covered much of the Northern Hemisphere at this time. In North America two immense ice sheets, the Laurentide in the east and the Cordilleran in the west, buried much of modern Canada and Alaska, as well as northern portions of the continental United States.

Pleistocene climates and environments were different from those of today, and so too was the Earth’s surface. Glaciers had captured a significant amount of the world’s water on land. Because that water no longer drained back to the oceans, worldwide sea levels dropped. Average sea levels were as much as 135 m (440 ft) lower than they are today.

As sea levels fell, large expanses of previously submerged continental shelf became dry land, including the area beneath what is now the Bering Sea. This area formed a 1,600-km- (1,000-mi-) wide land bridge that connected the northeastern tip of Asia and the western tip of modern Alaska. Known as Beringia, this natural land bridge existed from about 25,000 to nearly 10,000 years ago. It was a flat, cold, and dry landscape, covered primarily in grassland, with occasional shrubs and small trees. People and animals could use Beringia to walk from Siberia to Alaska and back.

Migrants from northeastern Asia could have trekked to Alaska with relative ease when Beringia was above sea level. But travelling south from Alaska to what is now the continental United States posed significant challenges for any would-be colonizers. There were two possible routes south for migrating people: down the Pacific coast, or by way of an interior passage along the eastern flank of the Rocky Mountains. When the Laurentide and Cordilleran ice sheets were at their maximum extent, both routes were likely impassable. The Cordilleran reached across to the Pacific shore in the west, and its eastern edge abutted the Laurentide, near the present border between British Columbia and Alberta.

Geological evidence suggests the Pacific coast route was open for overland travel before 23,000 years ago and after 14,000 years ago. During the coldest millennia of the last ice age, roughly 23,000 to 19,000 years ago, lobes of glaciers hundreds of kilometres wide flowed down to the sea. Deep crevasses scarred their surfaces, making travel across them dangerous. Even if people travelled by boat, a claim for which there is currently no direct archaeological evidence, the journey would have been difficult. There were almost certainly fleets of icebergs to outmanoeuvre. Rivers of sediment draining Cordilleran glacial fields severely restricted the availability of near-shore marine life, which early colonizers would have relied on for nourishment. By 14,000 to 13,000 years ago, however, the coast was ice-free. By then, too, the climate had warmed, and coastal lands were covered in grass and trees. Hunter-gatherer groups could have readily replenished their food supplies, repaired clothing and tents, and replaced broken or lost tools.

The warming climate gradually opened a second possible migration route through the massive frozen wilderness in the continental interior. Geologic evidence indicates that by 11,500 years ago the Cordilleran and Laurentide ice sheets had retreated far enough to open a habitable ice-free corridor between them. By then, much of the exposed land was probably restored enough to support plants and animals on which migrating hunter-gatherer peoples depended.

Scientific inquiry into the peopling of the Americas began in the 1870s. At that time, many scholars wondered if modern humans had lived in the Americas for as long as they had in Europe, where numerous Stone Age sites indicated a Pleistocene-era occupation. Excavations at these sites revealed hand axes and other relatively simple stone tools, human bones, and the remains of several now-extinct animals, including the woolly mammoth. The discovery of Pleistocene-age animals alongside human bones and artifacts helped 19th-century archeologists establish the age of ancient human encampments in Europe.

Yet, search as they might, American archeologists found no comparable evidence of a Pleistocene-era human presence. Nonetheless, several sites revealed stone artifacts that some scholars believed looked similar to the ancient stone tools found in Europe. On the basis of this similarity, these experts claimed the American artifacts must be as old. By the 1890s, however, other scholars had challenged this claim. They argued the American and European artifacts did not really look alike, and they noted the American artifacts were of uncertain antiquity because none were found securely embedded in Pleistocene-age geological deposits. A lengthy debate ensued between those who saw evidence for ancient human settlement in the Americas and those who did not. This debate, often loud and sometimes bitter, remained unresolved for more than three decades.

In 1927 archaeologists finally demonstrated that humans had occupied the Americas during the Pleistocene. This breakthrough occurred at a site discovered by ranch foreman George McJunkin near Folsom in northeastern New Mexico. Excavations at the site uncovered a stone projectile point embedded in the rib bones of a now-extinct bison, an ancestor of the modern North American buffalo. Clearly, a human hunter had killed this Pleistocene-era animal. The Folsom discovery proved beyond doubt that humans had lived in the Americas since the last ice age.

The spearpoints used to bring down the Folsom bison were distinctive, finely made points possessing a flute, or channel, on each face. These Folsom points were quite unlike those of the European Stone Ages. American archaeologists coined the term Paleo-Indian to identify the ancient Pleistocene Americans who had produced these well-crafted artifacts.

In the decade after Folsom, more Paleo-Indian sites were discovered. Some held Folsom spearpoints, but others revealed larger, less finely made fluted points. These large points occasionally appeared with the bones of mammoths. The first such find became known in 1933 at a site near Clovis in eastern New Mexico, where archaeologists found spearpoints and fossils in sediments below those that had produced Folsom artifacts. This meant that the Clovis people, as they came to be known, represented an even older Paleo-Indian culture. Just how much older was determined soon after the development of radiocarbon dating in the late 1940s. This modern dating technology showed that the people who made Clovis artifacts had inhabited North America by about 11,500 years ago, some 600 years before the Folsom culture appeared.

The age of the earliest Clovis sites coincided neatly with geological evidence that by 11,500 years ago the Laurentide and Cordilleran ice sheets had retreated far enough to open a habitable ice-free corridor, a fact first recognized by University of Arizona archaeologist C. Vance Haynes. It appeared that Clovis groups had moved south from Alaska through the continental interior as soon as it became possible to do so. That no excavated site older than Clovis was found, at least initially, seemed to confirm that Clovis people were the first colonizers of the Americas.

Once they had travelled south of the ice sheets, Clovis groups spread rapidly. Soon after 11,500 years ago, Clovis and Clovis-like materials appear throughout North America. The oldest sites are in the Great Plains and the southwestern United States; younger sites are found in eastern North America. No subsequent group would achieve such a wide distribution, but Clovis groups did not stop in North America. According to the Clovis-first theory, they must have continued to South America. As these groups pushed south, the traditional thinking went, they developed different tools and other artifacts that were no longer readily recognizable as Clovis. They arrived at Tierra del Fuego on the southern tip of South America within 1,000 years of leaving Alaska.

The rapid dispersal of Clovis peoples throughout the hemisphere was remarkable given the landscape they traversed. Not only did they travel through desert, plains, and forest, they did so during the environmental upheaval that marked the end of the last Ice Age. Climates were growing warmer, drier in some areas and wetter in others, and the distributions of plants and animals were shifting in complex ways in response to the changing climates. As they entered each new habitat, Clovis groups must have quickly learned to find suitable plant and animal foods. They would need stone to repair their toolkits, fresh water to drink, and the ability to overcome environmental challenges encountered along the way.

A long-favoured explanation for the rapid spread of Clovis people was that they preyed on large animals, such as mammoth and mastodon. These animals were themselves wide-ranging in their distribution. Archaeologists believed a reliance on big-game hunting meant that Clovis groups would have less need to learn about available local resources.

Archaeologists initially found some support for the big-game hunting hypothesis in archaeological excavations, as well as in the Clovis toolkit itself. Along the San Pedro River in Arizona, for example, are four Clovis sites separated by less than 20 km (12 mi). Each site yielded Clovis points embedded in the skeletons of mammoths. So similar are the points at these sites that they may be the handiwork of a single group, which evidently found good hunting in the area. The artifacts at San Pedro and other Clovis sites include a variety of tools handy for hunting, killing, and butchering game animals. There are the distinctive fluted spearpoints, shown experimentally by University of Wyoming archaeologist George Frison to be capable of bringing down elephant-sized animals. In addition, there are stone knives, scrapers, gravers (tools for scoring bone), drills, and a few preserved artifacts of ivory and bone. These tools, which occur in Clovis sites across North America, support the view that Clovis peoples were practising the same way of life.

Clovis tools were typically made of superior-quality, fine-grained stone, including chert, jasper, and chalcedony. Such stone is durable and readily flaked by skilled toolmakers into a desired, sharp-edged form. More important, it is easily resharpened and reused. That would be important to hunters pursuing wide-ranging big game. They could continue to use their stone tools as they tracked game far from the quarries where they acquired their stone. Analysis of these tools suggests that Clovis groups commonly travelled distances of 300 km (185 mi). In one instance, a dozen Clovis points quarried from the Texas Panhandle were left as a cache in northeastern Colorado, 485 km (300 mi) away. These distances indicate a range of movement across the landscape far greater than is observed in later periods of American prehistory.

The idea that Clovis people were big-game hunters could help explain an unsolved puzzle of the Americas in the late Pleistocene: the catastrophic extinction of dozens of species of large animals. Across the Americas millions of large animals known as megafauna disappeared. These animals included the mammoth, the mastodon, and the giant ground sloth, as well as the horse, the camel, and many other herbivores. Some very large and formidable carnivores also died out, including the American lion, the sabre-toothed cat, and the giant short-faced bear. These extinctions were thought to coincide with the arrival of Clovis groups, a chronological coincidence that led University of Arizona ecologist Paul Martin to propose the hypothesis of Pleistocene overkill. This hypothesis, first put forward in 1967, contends that Clovis big-game hunters caused the extinctions. Martin suggested that overkill was especially likely, even inevitable, if Clovis groups were the first Americans. For if the megafauna had never before faced human hunters, they would have been especially vulnerable prey for this new, dangerous, two-legged predator.

For decades the Clovis-first theory seemed to fit well with the available geological and archaeological evidence. However, some archaeologists always harboured doubts about the Clovis-first scenario. These doubts intensified toward the end of the 20th century. A reassessment of Clovis subsistence led many to challenge the traditional view of Clovis people as big-game hunting specialists. In addition, the discovery of a pre-Clovis human presence in the Americas undermined the claim that Clovis people were the first Americans.

Since the 1980s there has been increasing skepticism about the traditional view that Clovis groups were dependent on big-game hunting. Despite many years of searching, few Clovis archaeological sites have yielded evidence to support this view. The San Pedro Valley sites have proved to be the exception, not the rule. There are scarcely a dozen Clovis big-game kill sites known, mostly in western North America, with two possible kill sites in eastern North America. These sites contain the skeletal remains of just two of the Pleistocene megafauna: mammoth and mastodon. Clovis people did kill big game, but apparently not as often as once supposed.

A broader view of Clovis subsistence now suggests that Clovis people often targeted slower, smaller, less dangerous prey. The roasted remains of turtles, for example, have been found at many sites, including Aubrey and Lewisville in Texas, Little Salt Spring in Florida, and even at the original Clovis site in New Mexico. Other sites indicate that the diet in Clovis times included small and medium-sized mammals, such as beaver, snowshoe hare, and caribou, as well as fish and a variety of gathered plants.

Over time it became clear that the Pleistocene overkill hypothesis was not strongly supported by the archaeological record. Archaeologists have yet to document a single Clovis sloth kill, horse kill, camel kill, or a kill of any of the other several dozen megafaunal species. Whatever caused the extinction of these animals, it was not human hunting. Scientists are currently pursuing alternative hypotheses to explain megafaunal extinctions, such as the possibility that they were caused by late Pleistocene climatic and environmental change, or perhaps by disease. The puzzle remains unsolved.

A revised view of Clovis subsistence coincides with a reevaluation of the Clovis toolkit. Analysis of Clovis spearpoints shows they were adequate weapons for bringing down big game, but they were not always used that way. Few spearpoints show the kinds of damage that routinely occur when stone projectiles meet animal bone. Clovis points, like many items in the Clovis toolkit, were most likely used as multipurpose tools; many spearpoints show wear patterns indicating they were used as knives. There is also more variety in the Clovis toolkit than traditionally supposed. Clovis groups in different areas occasionally fashioned tools needed for particular tasks in the environments in which they found themselves. In addition, they probably made tools, perhaps wooden digging sticks or woven plant-fibre nets for catching fish or small game, that have not been preserved from that remote time. A varied, multipurpose toolkit is to be expected of groups that hunt and gather a range of foods.

If they were not pursuing wide-ranging big game, why were Clovis groups moving such great distances across the landscape? The answer may be exploration. Hunter-gatherer peoples need to know where to go when resources in one location begin to diminish, as animals are hunted out or flee and as available plants are gathered up. For colonizers in an unfamiliar landscape, that means ranging widely across newly discovered lands to see what resources occur where, when, and in what abundance. Not knowing where they might encounter stone to refurbish their tools on their journeys, it is not surprising that Clovis explorers selected only the highest-quality stone for their toolkits, or that they left caches of tools along their way, as the cache in Colorado demonstrates. They could return to the caches to replace diminished supplies without having to walk all the way back to a distant stone quarry.

Claims of a pre-Clovis human occupation in the Americas have been around for decades. By the 1980s, dozens of such sites had been reported, some estimated to be as much as 200,000 years old.

Archaeologists have carefully scrutinized each site to determine whether three basic criteria are met. Sites that fail to meet all three criteria cannot be accepted as valid. First, the site must contain genuine artifacts produced by humans or human skeletal remains. Second, these artifacts or remains must be found in unmixed geological deposits to ensure that younger objects have not been accidentally buried in older layers of sediment. Third, these artifacts or remains must be accompanied by reliable radiocarbon dates that indicate a pre-Clovis occupation. For decades all sites reputed to be of pre-Clovis age failed to meet these criteria. All, that is, except one.

In the mid-1970s University of Kentucky archaeologist Tom Dillehay began excavating at Monte Verde, a site on the banks of Chinchihuapi Creek in southern Chile. Monte Verde is an extraordinary site. Unusual geological conditions quickly buried the remains of an ancient camp beneath wet, swampy sediments. Since the remains left on the surface by the site’s inhabitants were not exposed to the air, many organic remains, which normally decay and disappear, were preserved.

Dillehay’s team found an astonishing array of organic materials. These included wooden foundation timbers of roughly rectangular huts, finely woven string, and chewed leaves, seeds, and other plant parts from nearby species, many with food or medicinal value. In addition, excavations revealed burned bones of mastodon along with pieces of its meat and hide. Some bits of hide still clung to pieces of wooden timbers, the apparent remnants of hide coverings that once draped over the huts. Also found were the footprint of a child in the once-sticky mud, an assortment of hearths, and hundreds of stone, bone, and wood artifacts. Dillehay’s team firmly radiocarbon dated these organic remains to 12,500 years ago, 1,000 years before Clovis times.

The excavations at Monte Verde lasted nearly a decade, and the laboratory research, analysis, and writing about what Dillehay’s team had found took another dozen years. Dillehay’s findings had to be carefully studied and presented in order to overcome the skepticism of archaeologists who had grown accustomed to seeing pre-Clovis claims fail. When Dillehay’s second book on the results of his investigations appeared in 1997, most archaeologists were convinced; the Clovis barrier had fallen at last.

Since Monte Verde, several new candidates for a pre-Clovis settlement in North America have appeared. The Cactus Hill site in Virginia has yielded artifacts below layers in which Clovis-like fluted points were found. Precisely how old those more deeply buried artifacts might be is uncertain, however: the layer in which they were found has produced widely varying radiocarbon ages, from 16,000 years ago to modern times. Archaeologists have also refocused attention on the Meadowcroft Rockshelter in Pennsylvania. Excavations at Meadowcroft in the 1970s and 1980s produced unmistakable artifacts in deposits perhaps as much as 14,250 years old. Questions remain, however, about whether the artifacts and organic remains are as old as the radiocarbon-dated charcoal. For the time being, neither site, nor any of a sprinkling of other recent pre-Clovis claims, is fully accepted by a still-cautious archaeological community.

The excavations at Monte Verde conclusively demonstrated that people inhabited the Americas in pre-Clovis times. Yet Monte Verde also raised many new questions about the first Americans. Several new theories have been advanced to explain the identity, antiquity, and entryway of the first Americans.

Most archaeologists believe the first Americans, whether travelling in a single migration or in multiple migrations, originated in northeastern Asia. This view is based mainly on geological evidence that a land bridge once connected Asia and North America and on genetic similarities between northeastern Asian peoples and Native Americans. It is not, however, founded on any direct archaeological evidence. The kinds of tools typical of the Monte Verde site or the Clovis culture are not found in either northeastern Asia or Beringia. Then again, Monte Verde is far from that region and in a very different environmental setting, so it is not surprising that its artifacts are different.

Although archaeologists have yet to find a single Clovis spearpoint in northeastern Asia, one artifact comes close: a stone point from the site of Uptar in Siberia that has a flute on one face. Even so, the age of the point is unknown, and it is not otherwise similar to Clovis fluted points. There are archaeological sites in Alaska, such as those of the Nenana Complex, that slightly predate Clovis. However, these sites lack the hallmark of Clovis technology: fluted stone projectile points. A few Clovis-like fluted points have been found in Alaska, but these are younger, not older, than those to the south.

The absence of similar artifacts in Siberia or Alaska is not surprising. Finding archaeological traces of a small group, or several groups, that briefly passed through this vast area is a difficult task. In addition, the occupation at Monte Verde evidently occurred some 2,500 years before a habitable ice-free corridor opened in the North American interior. A coastal migration could explain how people arrived at Monte Verde 12,500 years ago. By the time the interior route opened, the ancient Monte Verdeans had long since departed from the banks of Chinchihuapi Creek.

Finding sites occupied by coastal migrants, however, is no easy task. Much of the late Pleistocene-age shoreline along which migrating groups would have travelled was later submerged when the continental ice sheets melted and their waters returned to the sea. To meet this challenge, researchers are using sonar and taking core samples from the sea floor to explore and probe underwater landscapes and coastlines.

Archaeological excavations have occurred at sites on several islands off the coasts of Alaska and British Columbia. The effort has had some initial success. A cave on Prince of Wales Island in southeastern Alaska has yielded artifacts and human remains radiocarbon dated to about 10,000 years ago. Bear remains from another part of the same cave are dated to 41,000 years ago. These findings provide tantalizing hope that still older traces of a human presence can be found in this area. Farther south, on one of the Channel Islands off the coast of California, and at several coastal Peruvian sites, material as much as 11,000 years old has been found. Still, none of these sites has produced remains old enough to be those of the first Americans.

Some archaeologists believe the first Americans did not come from northeastern Asia, but from Europe, crossing the North Atlantic Ocean by boat. No ancient boats have been found, but proponents note that modern humans travelled by boat to Australia perhaps 30,000 to 40,000 years ago. Archaeological support for this theory is based mainly on similarities observed between Clovis artifacts and those of the Solutrean Period of prehistoric Europe. Some researchers also find support for a North Atlantic route in several ancient human skeletons found in the Americas. These skeletons, proponents argue, appear to have more anatomical similarities with modern Europeans than with modern Native Americans.

Despite the claimed similarities, Solutrean and Clovis artifacts show important differences in form, method of manufacture, and materials. Most obviously, Solutrean points lack fluting, and Solutrean sites include many stone artifacts and bone tools never found in the Americas. Most archaeologists believe the similarities in artifacts that do exist can be explained as the result of cultural convergence, the idea that different groups at different times and places might create or use similar materials or tools in similar ways. Solutrean and Clovis cultures are also separated by many thousands of kilometres, most of them ocean, and by 5,000 years: the Solutrean Period ended more than 16,500 years ago, while the earliest Clovis site is only 11,500 years old.

The ancient American skeletons considered by some archaeologists to be anatomically distinct from modern Native Americans also fail to support a North Atlantic route. After more detailed anatomical study, those remains, such as the 8,500-year-old skeleton found in Washington State known as Kennewick Man, proved to be far less similar to Europeans than initially believed. Kennewick Man does differ from modern Native Americans. However, many physical anthropologists believe this individual, like all other ancient skeletal remains found in the Americas, is an ancestral Native American. The fact that ancient and modern Native Americans do not precisely resemble each other is not surprising: many thousands of years of anatomical and evolutionary change separate them. In addition, for several thousand years after the Americas were first settled, the human population was small, widely scattered, and groups were relatively isolated for long periods of time. Under these circumstances, variability in anatomical features can emerge. Groups of ancient Americans would not necessarily look alike, let alone resemble their descendants many thousands of years later.

If the first Americans migrated from northeast Asia, then the study of modern Native American people-descendants of the first Americans-may hold vital clues about the number and timing of the ancestral migrations to the Americas. Linguists and geneticists have searched for these clues in the languages and genetic heritage of modern Native Americans.

Linguistic studies are based on the assumption that ancient elements, or ‘echoes’, of an ancestral language can still be heard in the shared words, grammar, sounds, and meanings of the diverse languages spoken by modern Native Americans. By searching for these elements, researchers hope to learn whether all Native American languages evolved from a single ancestral tongue. This common ancestral tongue, if it existed, may be the language spoken by the earliest Americans. If these elements are not present, however, their absence could indicate that the Americas were peopled at different times by groups speaking distinct or unrelated languages.

Linguists are still searching for answers. Most linguists, however, believe the sheer number and variety of Native American languages-of which hundreds are known-bespeaks a long period of language diversification. University of California linguist Johanna Nichols estimates that language diversification in the Americas began as early as 35,000 years ago.

Historical studies of the genetic material of modern Native Americans appear to offer additional clues about the earliest Americans. These studies are based on the knowledge that some types of deoxyribonucleic acid (DNA, the chemical that encodes genetic information) are inherited strictly from one parent or the other, but not both. Mitochondrial DNA (mtDNA) is passed from mothers to their offspring, and Y-chromosome DNA is passed from fathers to sons. Genetic change in these types of DNA results from mutation, not from recombination of the parents’ DNA. By looking at the genetic differences in mtDNA or Y-chromosome DNA over time, researchers can determine how closely related certain populations are and how much time has elapsed since they were members of the same population.
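The underlying logic is a molecular-clock calculation; the symbols and the illustrative numbers below are assumptions chosen for exposition, not figures reported in this article:

```latex
d = 2\mu t \quad\Longrightarrow\quad t = \frac{d}{2\mu}
```

Here \(\mu\) is an assumed constant mutation rate (substitutions per site per year) and \(d\) is the observed fraction of sites at which two lineages differ; each lineage accumulates \(\mu t\) changes after separating, so their divergence grows at rate \(2\mu\). With, say, \(\mu = 10^{-8}\) and \(d = 0.0005\), the estimated separation time is \(t = 0.0005 / (2 \times 10^{-8}) = 25{,}000\) years.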

Genetic studies have shown that virtually all Native Americans share a set of four major mtDNA lineages and at least two such lineages on the Y chromosome. This indicates these groups are all closely related to one another. The nearest relatives of Native Americans beyond the Americas are the native peoples of northeastern Asia; Native Americans are not closely related genetically to Europeans. Geneticists have variously estimated that the peoples of Asia and the Americas were part of the same population from 21,000 to 42,000 years ago.

Geneticists, like linguists, still debate when and how many migratory bands may have trekked from Asia to the Americas. Some scholars believe the evidence indicates a single migration. Others see support for multiple movements of people across Beringia and back. How this is resolved, and how the genetic heritage and languages of modern Native Americans are linked to ancient archaeological data, such as Clovis artifacts, remain important unsolved challenges.

One of the most obvious ways of directly linking ancient and modern Native Americans is by examining the DNA found in prehistoric human skeletal remains. Such remains are extremely rare, however, and recovering DNA from ancient remains can be difficult, if the DNA is even preserved. In the United States, the difficulty of linking ancient remains with modern Native Americans would be a strictly scientific concern were it not for legislation that has influenced the progress and conduct of such research.

The Native American Graves Protection and Repatriation Act (NAGPRA), signed into law in 1990, was aimed at righting the wrongs of earlier generations of scientists. In the past, researchers sometimes indiscriminately collected the bones of Native Americans for study and display in museums and universities. Native American peoples were not the only groups to receive such treatment, but their remains and artifacts were gathered in lopsided numbers. To many Native Americans, this was one more instance of mistreatment at the hands of Euro-Americans. In response, NAGPRA required institutions in possession of Native American skeletal remains and artifacts to return them at the request of known lineal descendants.

In the wake of NAGPRA, thousands of skeletons and associated artifacts were returned to Native American peoples. Many of these objects are only a few hundred years old. In such cases, debates over the identity of the descendants have been rare. Other cases, particularly those involving older remains, are more difficult to resolve. Proving lineal descent in cases of greater antiquity is no easy task. This is because descendants of early Americans formed new groups as populations grew, and these groups moved away to settle new lands. A group living 11,000 years ago would almost certainly be ancestral to many modern Native American tribes, not just one. In the future, geneticists may identify sufficiently precise genetic markers to link DNA extracted from ancient human skeletal remains with a group of modern tribes. Nonetheless, in most cases, making the link to only one tribe will be difficult.

In one prominent case involving the 8,500-year-old remains of Kennewick Man, the debate over lineal descent ended in a court of law. These remains were found in Washington State in 1996 on property belonging to the federal government. Five Native American tribes living in the area submitted a joint claim under NAGPRA for the return of the remains. A group of archaeologists and physical anthropologists then filed a lawsuit to block the return until detailed scientific studies, including analysis of Kennewick Man’s DNA, could be conducted.

The lawsuit sparked several years of legal and scientific wrangling. The Native American groups felt scientific studies were an unnecessary desecration of the remains. They believed they had lived in the area since the beginning of human prehistory in the Americas; therefore, Kennewick Man must be one of their ancestors. The scientists bringing the lawsuit, however, argued that ancestry could not be ascertained without detailed study. This research, they noted, would also add vital information to the meager knowledge about ancient American peoples. Both sides were well intentioned, and under the ambiguous terms of NAGPRA, both were right. NAGPRA allows lineal descendants to be identified not just by DNA, but also by tribal traditions and geographic proximity. The dispute remains unresolved.

Fortunately, few NAGPRA cases have been as contentious as that surrounding Kennewick Man. The human remains from Prince of Wales Island, found at about the same time, were excavated and analysed without pitting science against tribal tradition, or archaeologists against Native Americans. Ensuring there is room for both perspectives remains an important challenge under the framework established by NAGPRA.

Studies of the first Americans entered the 21st century on the cusp of change. The traditional view that the first Americans were fast-moving Clovis big-game hunters who migrated into the North American interior on the heels of retreating ice sheets has been undermined. Evidence from Monte Verde demonstrates that humans arrived in the Western Hemisphere in pre-Clovis times, and a reassessment of Clovis subsistence suggests Clovis people were not the big-game hunting specialists imagined in the past. As yet, no widely accepted theory has arisen to replace the older Clovis-first theory. Researchers are proposing many new ideas. Which of these ideas will succeed or fail remains to be seen.

The instruments of archaeological study continue to improve at a rapid pace. Shovels and trowels, the traditional tools of excavation, are now being used alongside ground-penetrating radar, seismic studies of surface features, and other techniques to find now-buried sites. A variety of new studies are providing information about where the materials to make ancient stone artifacts were acquired, how the artifacts were made, and how they were used. These include studies of the geological sources of stone artifacts, experimental work in stone fracture mechanics to better understand how stone tools were made, and analyses of microscopic wear patterns visible on such artifacts. A battery of techniques is now available to study the chemical composition of bone, plant, shell, and other organic and inorganic remains, providing archaeologists with a clearer picture of the environments to which the first Americans adapted. New dating techniques under development should allow archaeologists to reliably date sites more than 50,000 years old, the current limit of radiocarbon dating. These techniques could prove useful in the event that sites of greater antiquity are eventually found in the Americas.

The time-honoured process of acquiring archaeological evidence through careful and meticulous site excavation continues. Where the oldest preserved sites might be is not yet known. There are obvious places to look, however, including eastern Siberia, which is still relatively unknown to archaeologists. Other promising locations for future research include the remnants of Beringia, coastal islands of the Pacific, the Isthmus of Panama, through which any group headed into South America must have passed, and perhaps places not yet imagined. Some of the most interesting discoveries in years to come may even be made in museums, when new techniques of analysis are applied to old collections of artifacts and human remains, ideally with the interest and cooperation of Native American groups.

Archaeologists may never find evidence of the very first humans to arrive in the Western Hemisphere. It is, after all, a very big place. However, ongoing research is sure to reveal much about how the first Americans colonized a new world.

In America, the search for additional evidence of Folsom Man continues. Near Fort Collins, Colorado, Doctor Frank H. Roberts, Jr., continued excavation at a camp site, uncovering a variety of tools and weapons and the first known decorated objects from any Folsom site, two decorated beads. This earliest American, Folsom Man, may have lived contemporaneously with Old World Cro-Magnon Man, or some 25,000 years ago. This tentative date was assigned recently by Doctors Kirk Bryan and Louis L. Ray of Harvard University, based on studies made at the Folsom camp site, known as the Lindenmeier site, in northeastern Colorado. Many stone points, identified as typically Folsom, were found in an earth stratum above the floor of an ancient valley that is traceable to a terrace on a local stream. The terrace has been dated to the late Ice Age. The dating is based on the assumed correlation of this late Ice Age stage with the Mankato of the Middle West and the Pomeranian of Europe. From this it appears that the culture-bearing layer of the Lindenmeier site was developed at the end of the glacial advance, or 25,000 years ago.

An attempt has been made to adapt the method of dating ruins by analysis of tree rings, carried out so successfully in America, specifically in the Southwestern United States, to Viking ruins in southern Norway. E. de Geer, who has been carrying on this work, reports that, from a study of the remaining timbers of a wooden burial chamber in a Viking mound, the chamber was constructed in AD 931. A Swedish fort in Gotland was found by the same method to have been built in AD 5.

Homo habilis is an extinct primate belonging to the same subfamily as modern humans. Scientists believe this species lived in Africa between two million and 1.5 million years ago. H. habilis is the earliest known member of the genus Homo, the branch of hominines believed to have evolved into modern humans. The term Homo habilis means ‘handy man’, a name chosen because deposits of primitive tools were found near the first H. habilis fossils discovered.

Scientists distinguish H. habilis from the australopithecines, the more primitive hominines from which it evolved, by analysing key physical characteristics. H. habilis had a larger brain than the australopithecines: its braincase measured at least 600 cubic centimetres (37 cu in), compared with the 500 cu cm (31 cu in) typical of australopithecines. Australopithecines had long arms and short legs, similar to the limbs of apes, and their overall body form was also apelike in having a large body bulk relative to height. In its limb proportions and its smaller body bulk relative to height, H. habilis more closely resembled modern humans. H. habilis also had smaller cheek teeth (molars) and a less protruding face than earlier hominines, and it was taller than the australopithecines but shorter than Homo erectus, a later, more humanlike species.

The use of primitive tools implies that H. habilis had developed a different way of gathering food from earlier hominines, which fed only on vegetation. H. habilis probably ate meat as well as fruits and vegetables. Anthropologists disagree on whether H. habilis obtained this meat through hunting, scavenging, or a combination of both.

British-Kenyan anthropologist Louis Leakey discovered the first fossil evidence of H. habilis at Olduvai Gorge in northern Tanzania in 1960. Other anthropologists have since discovered specimens in northern Kenya, South Africa, and Malawi. Although all these specimens had larger brains than australopithecines, some had especially large brains (almost 800 cu cm, or 49 cu in) and more modern skeletons, yet their large and slightly protruding faces seem more primitive than those of other H. habilis specimens. Most scientists now believe that these fossils represent a distinct species, named Homo rudolfensis. Scientists debate which of these two species evolved into the later, even larger-brained H. erectus; many consider H. rudolfensis the more likely candidate because of its large brain and more modern skeleton. For anthropology, the science of man, 1964 was an eventful and exciting year. Perhaps the most important development of 1964 was the discovery in Africa of a new humanlike, tool-using species, possibly a direct ancestor of man. Nor was this the only remarkable thing: the new species, named Homo habilis, was very old, probably 1.75 million years old, which makes it nearly twice as old as any previously known tool-using animal. The appearance of Homo habilis on the scene caused great excitement among palaeontologists and physical anthropologists and has led many of them to a major reconsideration of much of man's biological history.

The new discovery, like so many other important finds of recent years, was made by Louis S. B. Leakey, curator of the Coryndon Museum Centre for Prehistory and Palaeontology in Nairobi, Kenya, and one of the foremost fossil finders of the 20th century. Leakey's work is invariably done with his wife, Mary, a geologist, and their three sons, who have also recovered important fossil materials. The finds were made in the extraordinarily fossil-rich Olduvai Gorge, an arid chasm in the Serengeti Plain of mainland Tanzania (formerly Tanganyika). The section of Olduvai Gorge excavated by the Leakeys is the most spectacular single prehistoric site in the world. The gorge cuts directly through four main stratigraphic levels, or beds, and in these four beds there are undisturbed palaeontological and archaeological deposits covering a time span of nearly two million years. The gorge contains the stratified record of the development of stone tools from the simplest beginnings to elaborately fashioned hand axes; it contains fossil evidence of four major types of men or near-men; and it is rich in fossil remains of ancient fauna, including insects, fish, reptiles, and mammals of the Lower and Middle Pleistocene.

The Homo habilis discoveries were announced by Dr. Leakey at the National Geographic Society in Washington, D.C., and in the April 4, 1964, issue of Nature. The Olduvai fossil remains are being studied by Professor Phillip Tobias of the University of the Witwatersrand, Johannesburg, and Dr. John R. Napier of the Royal Free Hospital School of Medicine, London.

The Leakeys found bones and teeth representing sixteen hominid individuals in Beds I and II (the two lowest beds) of Olduvai Gorge. One of these was the well-known Zinjanthropus, which is now placed in the genus Australopithecus. The australopithecines were near-men living about one million years ago, perhaps a little earlier; they were originally considered close to the direct line of man's ancestry, but this is now in doubt. All of the other remains were considered by Leakey, Tobias, and Napier to represent Homo habilis, a more advanced hominid intermediate in size and shape between Australopithecus and Homo, the genus that includes modern man and his immediate ancestors of the past 500,000 years. The specific name habilis is from the Latin and means ‘able, handy, mentally skilful, vigorous’.

Not all of the 206 bones making up a complete skeleton of Homo habilis have yet been discovered. The recovered parts, however, are numerous enough to give a good picture of his anatomy and, by inference, of his behaviour. They include the remains of two or three skulls, three mandibles (jawbones), about forty teeth, parts of a hand and foot, the bones of a lower leg, a collarbone, and some rib fragments.

Some features distinguishing modern man from his ancestors of earlier epochs include legs and feet adapted for an upright posture and bipedal gait; hands adapted for tool use rather than locomotion; teeth and jaws adapted for a meat-eating rather than a purely herbivorous diet; a brain adapted for the good hand-eye coordination needed in tool manufacture and use; and the ability to communicate with language of the human sort. Language leaves no fossil trace, but the foot, hands, jaws, teeth, and braincase of Homo habilis suggest that he stood close to the boundary between the prehuman and human grades.

The fossil foot is nearly complete, lacking only the back part of the heel and the toes. The foot bones are within the range of variation of Homo sapiens: the large toe is stout and carried parallel to the other toes, and the longitudinal and transverse arch system is like ours. The bones of the foot and leg show that the adult Homo habilis had an upright posture and bipedal locomotion, a slender body build, and a stature of about four feet.

The hands are not entirely apelike, nor are they typically human. The hand bones are heavier than ours, and the finger bones are curved inward. The tips of the fingers and thumb are broad, stout, and covered by flat nails, like modern man's. Homo habilis probably could not oppose his thumb and fingertips in the precision, pen-holding grip of modern man, but his hand could make stone tools.

The jaws are smaller than those of Australopithecus; the front of the lower jaw is retreating, with no development of an external bony chin. The incisor teeth are large, the canines are large compared with the premolars, and the premolars and molars are narrow in the tongue-to-cheek dimension. Both the manlike proportions of the teeth and the remains of fish, reptiles, and small mammals found in his living sites show that Homo habilis had an omnivorous diet.

The skull is intermediate in shape between Australopithecus and modern man. The mass of the facial skeleton relative to the cranial part of the skull is reduced, making it more like that of advanced forms. The greatest breadth of the skull is high on the vault. The curvature of the parietal bones is intermediate; that of the occipital bone resembles Homo sapiens.

The braincase of the Olduvai specimen known as No. 7 has an estimated endocranial volume of 680 cc. The endocranial volume of australopithecines ranges from 435 to 600 cc, that of pithecanthropines from 775 to 1,225 cc, and that of modern man from about 1,000 to 2,000 cc, with an average of about 1,350 cc. Thus the brain of Homo habilis, although both absolutely and proportionally larger than that of any australopithecine, was not large by modern human standards. A typical adult Homo habilis had a body weight of about 75 pounds and a brain weight of a little more than one pound, whereas a modern man of 150 pounds has a brain weight of about 3 pounds. In the period following Homo habilis, hominid body weight doubled, but the weight of the brain tripled.
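The brain-to-body comparison above is simple arithmetic; a minimal sketch using the approximate figures quoted in the text (about 1 lb of brain for a 75-pound Homo habilis, about 3 lb for a 150-pound modern man):

```python
def brain_to_body_ratio(brain_lb, body_lb):
    """Relative brain size: brain weight as a fraction of body weight."""
    return brain_lb / body_lb

# Approximate values from the text above.
habilis_ratio = brain_to_body_ratio(1.0, 75.0)   # Homo habilis: ~1 lb brain, ~75 lb body
modern_ratio = brain_to_body_ratio(3.0, 150.0)   # modern man: ~3 lb brain, ~150 lb body

# Body weight doubled (75 -> 150 lb) while brain weight tripled (1 -> 3 lb),
# so relative brain size increased by half.
```

On these figures the modern ratio (0.02) is 1.5 times the Homo habilis ratio (about 0.013), which is the text's point that brain weight grew faster than body weight.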

The stone tools found in association with Homo habilis are typical of the Oldowan industry first recognized by Leakey 30 years ago. Similar tools are found elsewhere in East Africa, and in South Africa, Angola, and North Africa. These tools are commonly called pebble tools because most of them are made from waterworn pebbles. Most of the Oldowan choppers are worked on both faces to produce a sharp but irregular cutting edge.

These rough choppers made from potato-sized pieces of stone are the earliest known stone tools; they date from the very beginning of the Pleistocene. There is abundant evidence from Olduvai Gorge showing that the great hand ax or Chelles-Acheul culture evolved directly from the Oldowan stone industry.

Oldowan pebble tools and the skeletal remains of Homo habilis are associated at six sites. At some East and South African sites, pebble tools are also found in association with Australopithecus, but Homo habilis is, according to Professor Tobias, always associated with Oldowan tools, whereas Australopithecus is not. The evidence from the six sites shows unmistakably that early hominids regularly manufactured tools of a set design before they developed hands or brains like those of modern man.

The age of Homo habilis is as startling as the fossils themselves. Before these new finds, most anthropologists thought the earliest toolmaker lived less than one million years ago. The potassium-argon method of dating has more than doubled the known age of tool manufacture.

The principle of the potassium-argon technique is simple. The radioactive isotope potassium 40 (K40), found in volcanic rock, disintegrates into calcium 40 and argon 40 (A40), an inert gas. The rate of transmutation is constant and very slow: half of the K40 atoms decay in about 1.3 billion years. The potassium-containing mineral anorthoclase is found in the volcanic deposits of Olduvai Gorge. While the lava was in a molten state beneath the earth, no A40 accumulated in the mineral because the gas boiled away. After the lava erupted and cooled, however, nearly all newly formed A40 atoms were imprisoned in the crystalline structure of the anorthoclase. By heating the mineral, scientists can collect the released A40 atoms and count them in a mass spectrometer. Because no A40 was initially present and the rate of accumulation is known, this count gives an estimate of the age of the rock. Several samples give age estimates ranging from 1.57 to 1.89 million years, with an average of 1.75 million, for Bed I at Olduvai, where Homo habilis was found and where the first tools of hominid manufacture appear.
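The age calculation described above can be sketched in a few lines. This is only an illustrative model: the roughly 1.3-billion-year half-life is the figure given in the text, while the branching fraction (the share of K40 decays that yield A40 rather than calcium 40) is an assumed round value of about 0.105, not something stated in the source.

```python
import math

HALF_LIFE_K40 = 1.3e9                      # years, the half-life given in the text
DECAY_CONSTANT = math.log(2) / HALF_LIFE_K40
AR_BRANCH_FRACTION = 0.105                 # assumed share of K40 decays producing A40

def k_ar_age(a40_atoms, k40_atoms):
    """Estimate the age of a rock (in years) from the trapped A40
    and remaining K40 counted in the mineral sample."""
    # Total K40 that has decayed, inferred from the argon that was trapped.
    accumulated = (a40_atoms / k40_atoms) / AR_BRANCH_FRACTION
    return math.log(1.0 + accumulated) / DECAY_CONSTANT
```

Under these assumptions, an A40-to-K40 ratio of roughly one part in ten thousand yields an age on the order of the 1.75-million-year figure quoted for Bed I.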

Homo erectus is an extinct primate classified in the subfamily Homininae and the genus Homo, which includes humans. Scientists learn about extinct species such as Homo erectus by studying fossils, petrified bones buried in sedimentary rock. Based on their analysis of these fossils, scientists believe that Homo erectus lived from about 1.8 million to 30,000 years ago. Until recently, Homo erectus was considered a direct evolutionary ancestor of modern humans, or Homo sapiens.

The anatomical features of Homo erectus are more humanlike than those of earlier hominines such as the australopithecines and Homo habilis. Homo erectus had a larger brain, measuring up to 1,150 cc, and a rounder cranium (the portion of the skull that covers the brain) than earlier hominines. Homo erectus was also taller, with a flatter face and smaller teeth. Large differences in body size between males and females, characteristic of earlier hominine species, are less evident in Homo erectus specimens.

This larger brain and more modern body enabled Homo erectus to do many things its hominine ancestors had never done. Homo erectus appears to have been the first hominine to venture beyond Africa. It was the first hominine to engage in systematic hunting, the first to establish anything resembling home bases (campsites), and the first to use fire. Evidence suggests that the childhood of Homo erectus lasted longer than that of earlier hominines, providing an extended period in which to learn complex skills. These skills are reflected in the comparatively sophisticated stone tools found among Homo erectus remains. Although still primitive compared with the tools made by early Homo sapiens, they are much more complex than the simple, small pebble tools of earlier hominines. The most characteristic of these tools was a teardrop-shaped hand ax, known to archaeologists as an Acheulean ax.

Scientific study of Homo erectus began in the late 19th century. Excited by Charles Darwin’s theory of evolution and by fossil discoveries in Europe, scientists began to search for the fossilized remains of the ‘missing link’, the evolutionary ancestor of both human beings and modern apes. In 1891 Dutch anthropologist Eugene Dubois travelled to Java, Indonesia, where he unearthed the top of a skull and a leg bone of an extinct hominine. Measurements of the skull suggested that the creature had possessed a large brain, measuring 850 cc, while the leg-bone anatomy suggested that it had walked upright. In recognition of these characteristics, Dubois named the species Pithecanthropus erectus, or ‘erect ape-man’.

Canadian anthropologist Davidson Black found similar fossils in China in the late 1920s. Black named his discovery Sinanthropus pekinensis, or ‘Peking Man’. Later studies by Dutch scientist G. H. von Koenigswald and German scientist Franz Weidenreich showed that the fossils discovered by Dubois and Black came from the same species, which was eventually named Homo erectus.

Since these earliest discoveries, Homo erectus fossils have been found in East Africa, South Africa, Ethiopia, and various parts of Asia. Kenyan fossil hunter Kamoya Kimeu discovered an almost complete Homo erectus skeleton, known as the Turkana boy, near Lake Turkana in northern Kenya in 1984. The oldest known specimen, dated at almost two million years old, also comes from northern Kenya, and recently developed dating methods have shown that Homo erectus lived in Java almost two million years ago as well. Scientific assumptions about Homo erectus have changed dramatically since the early 1990s. Anthropologists long assumed that the species spread from Africa to parts of Asia and Europe and that these dispersed populations gradually evolved into Homo sapiens, or modern humans. Most anthropologists now think it more likely that Homo sapiens originated from a small population in Africa within the past 200,000 years. According to this theory, descendants of this African population of Homo sapiens spread throughout the eastern hemisphere, replacing populations of more ancient hominines, perhaps with limited interbreeding.

Many anthropologists now believe that some Homo erectus specimens should be classified as a separate species, named Homo ergaster. According to this view, Homo ergaster appeared first in East Africa and quickly spread into Asia, where it evolved into Homo erectus; Homo sapiens arose in Africa from a population descended from Homo ergaster. Until recently, Homo erectus was thought to have died out about 300,000 years ago, but recent studies of Homo erectus populations in Java suggest that they may have survived until as recently as 30,000 years ago, long after the evolution of modern humans.

Anthropologists also debate whether Homo erectus used language. Some scientists argue that the brain size of Homo erectus, the shape of its vocal structures, and the complexity of its behaviour suggest that it had a capacity for spoken language far beyond the rudimentary vocalizations of apes. Other anthropologists reject this conclusion. They point out that the first evidence of artistic expression, a trait closely linked with language, appears only about 40,000 years ago. These skeptics also point to the primitive quality of the tools associated with Homo erectus. Some anatomical evidence likewise suggests that Homo erectus lacked full language abilities: the spinal column of early Homo erectus was much narrower than that of modern humans, implying that Homo erectus had fewer nerves to control the subtle movements of the rib cage required for the production of spoken language. The question may remain unanswered, because, unlike stone tools, spoken words never become part of the archaeological record.

The skulls and teeth of early African populations of Middle Homo differed subtly from those of later H. erectus populations from China and the island of Java in Indonesia. H. ergaster makes a better candidate for an ancestor of the modern human line because Asian H. erectus has some specialized features not seen in later humans, including our own species. H. heidelbergensis has similarities to both H. erectus and the later species H. neanderthalensis, and it may have been a transitional species between Middle Homo and the line to which modern humans belong.

Homo ergaster probably first evolved in Africa around two million years ago. This species had a rounded cranium with a brain size of between 700 and 850 cu cm (43 to 52 cu in), a prominent brow ridge, small teeth, and many other features that it shared with the later H. erectus. Many paleoanthropologists consider H. ergaster a good candidate for an ancestor of modern humans because it had several modern skull features, including proportionally thin cranial bones. Most H. ergaster fossils come from the time range of 1.8 million to 1.5 million years ago.

The most important fossil of this species yet found is a nearly complete skeleton of a young male from West Turkana, Kenya, which dates to about 1.55 million years ago. Scientists determined the sex of the skeleton from the shape of its pelvis, and from patterns of tooth eruption and bone growth they found that the boy had died when he was between nine and twelve years old. The Turkana boy, as the skeleton is known, had elongated leg bones and arm, leg, and trunk proportions that essentially match those of modern humans, in sharp contrast with the apelike proportions of H. habilis and Australopithecus afarensis. He appears to have been quite tall and slender. Scientists estimate that, had he grown into adulthood, the boy would have reached a height of 1.8 m (6 ft) and a weight of 68 kg (150 lb). The anatomy of the Turkana boy shows that H. ergaster was particularly well adapted for walking and perhaps for running long distances in a hot environment (a tall and slender body dissipates heat well) but not for any significant amount of tree climbing. The oldest humanlike fossils outside Africa, dated at nearly 1.75 million years old, have also been classified as H. ergaster. These finds, from the Dmanisi site in the southern Caucasus Mountains of Georgia, consist of several crania, jaws, and other fossilized bones. Some are strikingly like East African H. ergaster, but others are smaller or larger, suggesting a high degree of variation within a single population.

H. ergaster, H. rudolfensis, and H. habilis, along with possibly two robust australopiths, may all have coexisted in Africa around 1.9 million years ago. This finding goes against a traditional paleoanthropological view that human evolution consisted of a single line that evolved progressively over time: an australopith species followed by early Homo, then Middle Homo, and finally H. sapiens. It now appears that periods of species diversity and extinction have been common during human evolution, and that modern H. sapiens has the rare distinction of being the only living human species today.

Although H. ergaster appears to have coexisted with several other human species, they probably did not interbreed. Mating rarely succeeds between two species with significant skeletal differences, such as H. ergaster and H. habilis. Many paleoanthropologists now believe that H. ergaster descended from an earlier population of Homo, perhaps one of the two known species of early Homo, and that the modern human line descended from H. ergaster.

Sophisticated dating techniques combined with new fossil discoveries suggest that skeletal remains unearthed in Africa in 1995 come from the earliest known human ancestors to walk upright, according to a report published in the journal Nature on May 7, 1998.

Researchers said the new findings suggested that bipedalism (walking on two legs) emerged 4.07 million to 4.17 million years ago, about 500,000 years earlier than previously believed. Experts said the new research had important implications for the study of human origins, because bipedalism is widely considered a key evolutionary adaptation that set the human lineage apart from that of other primates.

The new findings are based on fossils found three years earlier in northern Kenya near Lake Turkana. Scientists identified the fossils as belonging to a newly discovered early human species, Australopithecus anamensis, a creature with apelike teeth and jaws, long arms, and a small brain.

Initial efforts to establish the age of the sediments in which the fossils were discovered failed, raising doubts about the fossils' antiquity. In addition, a lower-leg bone providing critical evidence of bipedalism was found in a different sedimentary layer, suggesting that the bone could be younger or from a different species.

Nevertheless, a new dating effort, led by anthropologist Meave G. Leakey of the National Museums of Kenya, used an argon-dating technique that analysed crystals in sedimentary volcanic ash. Researchers said the technique showed the lower-leg bone to be only slightly younger than the other fossils, which were dated at 4.07 million to 4.17 million years old, indicating that the remains belonged to the same species. The dating analysis was further supported by the subsequent discovery of dozens of new fossils in the area, the researchers said.

Before the discovery of Australopithecus anamensis, the earliest known bipedal human ancestor was Australopithecus afarensis, the species of the famous “Lucy” skeleton discovered in Ethiopia in 1974 and estimated to be three million to 3.7 million years old. Based on the new findings, some scientists believe that A. anamensis may be the most ancient species of australopithecine.

One of the earliest defining human traits, bipedalism, walking on two legs as the primary form of locomotion, evolved more than four million years ago. Fossils show that the evolutionary line leading to us had achieved a substantially upright posture by around four million years ago, then began to increase in body size and in relative brain size around 2.5 million years ago. Other important human characteristics, such as a large and complex brain, the ability to make and use tools, and the capacity for language, developed more recently, and many advanced traits, including complex symbolic expression such as art and elaborate cultural diversity, emerged mainly during the past 100,000 years.

Few books have rocked the world the way On the Origin of Species did. Influenced in part by British geologist Sir Charles Lyell’s theory of a gradually changing earth, British naturalist Charles Darwin spent decades developing his theory of gradual evolution through natural selection before he published his book in 1859. The logical, and intensely controversial, extension of Darwin’s theory was that humans, too, evolved through the ages. For people who accepted the biblical view of creation, the idea that human beings shared common roots with lower animals was shocking. In this excerpt from On the Origin of Species, Darwin carefully sidesteps the issue of human evolution (as he did throughout the book), focussing instead on competition and adaptation in lower animals and plants. The Darwinian process of evolution by natural selection is fundamentally very simple: natural selection occurs whenever genetically influenced variation among individuals affects their survival and reproduction. If a gene codes for characteristics that result in fewer viable offspring in future generations, that gene is gradually eliminated. For instance, genetic mutations that increase vulnerability to infection, or that cause foolish risk taking or lack of interest in sex, will never become common. On the other hand, genes that confer resistance to infection, appropriate risk taking, and success in choosing fertile mates are likely to spread in the gene pool, even if they have substantial costs.

A classic example is the spread of a gene for dark wing colour in a British moth population living downwind from major sources of air pollution. Pale moths were conspicuous on smoke-darkened trees and easily caught by birds, while rare mutant forms of the moth, whose colour more closely matched that of the bark, escaped the predators’ beaks. As the tree trunks became darker, the mutant gene spread rapidly and largely displaced the gene for pale wing colour. That is all there is to it. Natural selection involves no plan, no goal, and no direction, just genes increasing and decreasing in frequency depending on whether individuals with those genes have, relative to other individuals, greater or lesser reproductive success.

The simplicity of natural selection has been obscured by many misconceptions. For instance, Herbert Spencer’s nineteenth-century catchphrase ‘survival of the fittest’ is widely thought to summarize the process, but it actually promotes several misunderstandings. First, survival is of no consequence by itself. This is why natural selection has created some organisms, such as salmon and annual plants, that reproduce only once and then die. Survival increases fitness only insofar as it increases later reproduction. Genes that increase lifetime reproduction will be selected for even if they result in reduced longevity. Conversely, a gene that decreases total lifetime reproduction will be eliminated by selection even if it increases an individual’s survival.

Further confusion arises from the ambiguous meaning of ‘fittest’. The fittest individual, in the biological sense, is not necessarily the healthiest, strongest, or fastest. In today’s world, as in many of those of the past, individuals of outstanding athletic accomplishment need not be the ones who produce the most grandchildren, a measure that should be roughly correlated with fitness. To someone who understands natural selection, it is no surprise that parents are so concerned about their children’s reproduction.

A gene or an individual cannot be called ‘fit’ in isolation, but only with reference to a particular species in a particular environment. Even in a single environment, every gene involves compromises. Consider a gene that makes rabbits more fearful and thereby helps to keep them out of the jaws of foxes. Imagine that half the rabbits in a field have this gene. Because they do more hiding and less eating, these timid rabbits might be, on average, a bit less well fed than their bolder companions. If, hunkered down in the March snow waiting for spring, two thirds of them starve to death while this is the fate of only one third of the rabbits that lack the gene for fearfulness, then, come spring, only a third of the rabbits will have the gene for fearfulness. It has been selected against. It might be nearly eliminated by a few harsh winters, while milder winters or an increased number of foxes could have the opposite effect. It all depends on the current environment.
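The rabbit example above is a one-generation allele-frequency calculation; a minimal sketch, using the starting frequency and survival fractions given in the text:

```python
def frequency_after_selection(p, carrier_survival, noncarrier_survival):
    """Frequency of a trait among survivors after one episode of
    differential survival, starting from frequency p."""
    surviving_carriers = p * carrier_survival
    surviving_others = (1.0 - p) * noncarrier_survival
    return surviving_carriers / (surviving_carriers + surviving_others)

# Half the rabbits carry the fearfulness gene; two thirds of the carriers
# starve (one third survive) while only one third of the bold rabbits do
# (two thirds survive).
spring_freq = frequency_after_selection(0.5, 1.0 / 3.0, 2.0 / 3.0)
```

Come spring, only a third of the rabbits carry the gene, exactly as the passage states; swapping the two survival fractions (a winter in which boldness is the liability) pushes the frequency up to two thirds instead, which is the passage's point that fitness depends on the current environment.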

While natural selection has been changing us in many small ways over the last ten thousand years, this is but a moment on the scale of evolutionary time. Our ancestors of ten thousand, or perhaps even fifty thousand, years ago looked and acted fully human. If we could magically transport babies from that time and rear them in modern families, we could expect them to grow up into perfectly modern lawyers or farmers or athletes or cocaine addicts.

The point is that we are specifically adapted to Stone Age conditions. Those conditions ended a few thousand years ago, but evolution has not had time since then to adapt us to a world of dense populations, modern socioeconomic conditions, low levels of physical activity, and the many other novel aspects of the modern environment. We are not referring merely to the world of offices, classrooms, and fast-food restaurants. Life on any primitive farm or in any third-world village may also be thoroughly abnormal for people whose bodies were designed for the world of the Stone Age hunter-gatherer.



Even more specifically, we seem to be adapted to the ecological and socioeconomic conditions experienced by tribal societies living in the semiarid habitat characteristic of sub-Saharan Africa. This is most likely where our species originated and lived for tens of thousands of years, and where we spent perhaps 90 percent of our history after becoming fully human and recognizable as the species we are today. Prior to that was a far longer period of evolution in Africa in which our ancestors’ skeletal features led scientists to give them names such as Homo erectus and Homo habilis. Yet even these more remote ancestors walked erect and used their hands for making and using tools. We can only guess at many aspects of their biology; speech capabilities and social organization are not apparent in stone artifacts and fossil remains, but there is no reason to doubt that their ways of life were rather similar to those of more recent hunter-gatherers.

Technological advances later allowed our ancestors to invade other habitats and regions, such as deserts, jungles, and forests. Beginning about one hundred thousand years ago, our ancestors began to disperse from Africa to parts of Eurasia, including seasonally frigid regions made habitable by advances in clothing, habitation, and food acquisition and storage. Yet despite the geographical and climatic diversity, people still lived in small tribal groups with hunter-gatherer economies. Grainfield agriculture, with its revolutionary alteration of human diet and socioeconomic systems, was practised first in southwestern Asia about eight thousand years ago, and shortly thereafter in India and China. It took another thousand years or more to spread to central and western Europe and tropical Africa and to begin independently in Latin America. Most of our ancestors of a few thousand years ago still lived in bands of hunter-gatherers. We are, in the words of some distinguished anthropologists, “Stone Agers in the fast lane.”

Even so, all humans are primates. Physical and genetic similarities show that the modern human species, Homo sapiens, has a very close relationship to another group of primate species, the apes. Humans and the so-called great apes (large apes) of Africa, the chimpanzees (including bonobos, or so-called pygmy chimpanzees) and gorillas, share ancestors that lived sometime between eight million and six million years ago. The earliest humans evolved in Africa, and much of human evolution occurred on that continent; the fossils of early humans who lived between six million and two million years ago come entirely from Africa.

Most scientists distinguish among twelve to nineteen different species of early humans. Scientists do not all agree, however, about how the species are related or which ones simply died out; many early human species, probably most of them, left no descendants. Scientists also debate how to identify and classify particular species of early humans, and what factors influenced the evolution and extinction of each species.

The tree of human evolution: fossil evidence suggests that the first humans evolved from ape ancestors at least six million years ago. Many species of humans followed, but only some left descendants on the branch leading to Homo sapiens.

Early humans first migrated out of Africa into Asia probably between two million and 1.7 million years ago. They entered Europe somewhat later, generally within the past one million years. Species of modern humans populated many parts of the world much later. For instance, people first came to Australia probably within the past 60,000 years, and to the Americas within the past 35,000 years. The beginnings of agriculture and the rise of the first civilizations occurred within the past 10,000 years.

The scientific study of human evolution is called paleoanthropology. Paleoanthropology is a subfield of anthropology, the study of human culture, society, and biology. Paleoanthropologists search for the roots of human physical traits and behaviour. They seek to discover how evolution has shaped the potentials, tendencies, and limitations of all people. For many people, paleoanthropology is an exciting scientific field because it illuminates the origins of the defining traits of the human species, and the fundamental connections between humans and other living organisms on Earth. Scientists have abundant evidence of human evolution from fossils, artifacts, and genetic studies. However, some people find the idea of human evolution troubling because it can seem to conflict with religious and other traditional beliefs about how people, other living things, and the world developed. Yet many people have come to reconcile such beliefs with the scientific evidence.

Modern and early humans have undergone major anatomical changes over the course of evolution. This illustration depicts Australopithecus afarensis (centre), the earliest of the three species; Homo erectus, an intermediate species; and Homo sapiens, the modern human. H. erectus and modern humans are much taller than A. afarensis and have flatter faces and much larger brains. Modern humans have a larger brain than H. erectus and an almost flat face beneath the front of the braincase.

All species of organisms originate through the process of biological evolution. In this process, new species arise from a series of natural changes. In animals that reproduce sexually, including humans, the term species refers to a group whose adult members regularly interbreed, resulting in fertile offspring-that is, offspring themselves capable of reproducing. Scientists classify each species with a unique, two-part scientific name. In this system, modern humans are classified as Homo sapiens.

The mechanism for evolutionary change resides in genes-the basic units of heredity. Genes affect how the body and behaviour of an organism develop during its life. The information contained in genes can change-a process known as mutation. The way particular genes are expressed-that is, how they affect the body or behaviour of an organism-can also change. Over time, genetic change can alter a species's overall way of life, such as what it eats, how it grows, and where it can live.

Genetic changes can improve the ability of organisms to survive, reproduce, and, in animals, raise offspring. This process is called adaptation. Parents pass adaptive genetic changes to their offspring, and ultimately these changes become common throughout a population-a group of organisms of the same species that share a particular local habitat. Many factors can favour new adaptations, but changes in the environment often play a role. Ancestral human species adapted to new environments as their genes changed, altering their anatomy (physical body structure), physiology (bodily functions, such as digestion), and behaviour. Over long periods, evolution dramatically transformed humans and their ways of life.

Geneticists estimate that the human line began to diverge from that of the African apes between eight million and five million years ago (paleontologists have dated the earliest human fossils to at least six million years ago). This figure comes from comparing differences in the genetic makeup of humans and apes, and then calculating how long it probably took for those differences to develop. Using similar techniques and comparing the genetic variations among human populations around the world, scientists have calculated that all people may share common genetic ancestors that lived sometime between 290,000 and 130,000 years ago.
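The calculation described above can be sketched as a simple molecular-clock estimate: divide the observed genetic difference between two species by the rate at which differences accumulate. The figures below (roughly 1.2 percent sequence divergence between humans and chimpanzees, and a substitution rate of one change per billion sites per year per lineage) are illustrative assumptions for demonstration, not values taken from the text.

```python
def divergence_time(genetic_distance, substitution_rate):
    """Estimate the time since two lineages split.

    genetic_distance: fraction of DNA sites that differ between the two genomes
    substitution_rate: substitutions per site per year, along one lineage
    Differences accumulate along BOTH lineages after the split,
    hence the factor of 2 in the denominator.
    """
    return genetic_distance / (2 * substitution_rate)

# Hypothetical inputs: ~1.2% human-chimpanzee divergence,
# rate of 1e-9 substitutions per site per year.
t = divergence_time(0.012, 1e-9)
print(f"Estimated divergence: {t / 1e6:.1f} million years ago")
```

With these assumed numbers the estimate lands at about six million years, consistent with the five-to-eight-million-year range quoted in the text; real studies calibrate the rate against fossils of known age and report a range rather than a single figure.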

Humans belong to the scientific order named Primates, a group of more than 230 species of mammals that also includes lemurs, lorises, tarsiers, monkeys, and apes. Modern humans, early humans, and other species of primates all have many similarities and some important differences. Knowledge of these similarities and differences helps scientists to understand the roots of many human traits, and the significance of each step in human evolution.

The origin of our own species, Homo sapiens, is one of the most hotly debated topics in paleoanthropology. This debate centres on whether modern humans are directly related to H. erectus or to the Neanderthals, a well-known group of humans who evolved within the past 250,000 years. Paleoanthropologists commonly use the term anatomically modern Homo sapiens to distinguish people of today from these similar predecessors.

Traditionally, paleoanthropologists classified as Homo sapiens any fossil human younger than 500,000 years old with a braincase larger than that of H. erectus. Thus, many scientists who believe that modern humans descend from a single line dating back to H. erectus use the name archaic Homo sapiens to refer to a variety of fossil humans that predate anatomically modern H. sapiens. The term archaic denotes a set of physical features typical of Neanderthals and other species of late Homo before modern Homo sapiens. These features include a robust skeleton; a large but low braincase (positioned behind, rather than over, the face); and a lower jaw lacking a prominent chin. In this sense, Neanderthals are sometimes classified as a subspecies of archaic H. sapiens-H. sapiens neanderthalensis. Other scientists think that the variation in archaic fossils falls into clearly identifiable sets of traits, and that any type of human fossil exhibiting a unique set of traits should receive a new species name. According to this view, the Neanderthals belong to their own species, H. neanderthalensis.

The Neanderthals lived in areas ranging from western Europe through central Asia from about 200,000 to about 28,000 years ago. The name Neanderthal comes from fossils found in 1856 in the Feldhofer Cave of the Neander Valley in Germany (tal, the modern form of thal, means “valley” in German). As for the transition to modern humans, did it take place in one geographical area, in one group of humans, who were thereby enabled to expand and replace the former human populations of other parts of the world? Or did it occur in parallel in different regions, in which case the human populations living today would be descendants of the populations living in those same regions before the transition? The modern-looking human skulls from Africa around 100,000 years ago have been taken to support the former view, with the transition occurring in Africa. Molecular studies (of so-called mitochondrial DNA) were initially also interpreted as pointing to an African origin of modern humans, though the meaning of those molecular findings is currently in doubt. On the other hand, skulls of humans living in China and Indonesia hundreds of thousands of years ago are considered by some physical anthropologists to exhibit features still found in modern Chinese and in Aboriginal Australians, respectively. If true, that finding would suggest parallel evolution and multiregional origins of modern humans, rather than origins in a single Garden of Eden. The issue remains unresolved.

Although Neanderthals lived in glacial times and were adapted to the cold, they penetrated no farther north than northern Germany and Kiev. That is not surprising, since Neanderthals apparently lacked needles, sewn clothing, warm houses, and other technology essential to survival in the coldest climates. Anatomically modern peoples who did possess such technology had expanded into Siberia by around 20,000 years ago (there are the usual much older disputed claims). That expansion may have been responsible for the extinction of Eurasia's woolly mammoth and woolly rhinoceros.

Scientists realized several years later that prior discoveries-at Engis, Belgium, in 1829 and at Forbes Quarry, Gibraltar, in 1848-also represented Neanderthals. These two earlier discoveries were the first early human fossils ever found. In the past, scientists claimed that Neanderthals differed greatly from modern humans. However, the basis for this claim came from a faulty reconstruction of a Neanderthal skeleton that showed it with bent knees and a slouching gait. This reconstruction gave the common yet mistaken impression that Neanderthals were far more primitive beings who lived crude lifestyles. On the contrary, Neanderthals, like the species that preceded them, walked fully upright without a slouch or bent knees. In addition, their cranial capacity was quite large at 1,500 cu cm (about 90 cu in), larger on average than that of modern humans. (The difference probably relates to the greater muscle mass of Neanderthals as compared with modern humans, which usually correlates with a larger brain size.)

Along with many physical similarities, Neanderthals differed from modern humans in several ways. The typical Neanderthal skull had a low forehead, a large nasal area (suggesting a large nose), a forward-projecting nasal and cheek region, a prominent brow ridge with a bony arch over each eye, a non-projecting chin, and a visible space behind the third molar (in front of the upward turn of the lower jaw).

Neanderthal and Modern Human Skulls: The skull of Homo neanderthalensis differs considerably from that of anatomically modern humans, or Homo sapiens. Neanderthals had thick-walled skulls, sloping foreheads, and heavy brow ridges. This contrasts with the thin-walled skulls, high foreheads, and flat faces of modern humans. Neanderthals also had more pronounced and powerful jaws but less of a chin than do modern humans.

Neanderthals also had more heavily built bodies and larger-boned skeletons than do modern humans. Other Neanderthal skeletal features included a bowing of the limb bones in some individuals, broad scapulae (shoulder blades), hip joints turned outward, a long and thin pubic bone, short lower leg and arm bones relative to the upper bones, and large surfaces on the joints of the toes and limb bones. Together, these traits made for a powerful, compact body of short stature-males averaged 1.7 m (5 ft 5 in) tall and 84 kg (185 lb), and females averaged 1.5 m (5 ft) tall and 80 kg (176 lb).

The short, stocky build of Neanderthals conserved heat and helped them withstand extremely cold conditions that prevailed in temperate regions beginning about 70,000 years ago. The last known Neanderthal fossils come from western Europe and date from approximately 36,000 years ago.

What is more, as Neanderthal populations grew in number in Europe and parts of Asia, other populations of nearly modern humans arose in Africa and Asia. Scientists also commonly refer to these fossils, which are distinct from but similar to those of Neanderthals, as archaic. Fossils from the Chinese sites of Dali, Maba, and Xujiayao display the long, low cranium and large face typical of archaic humans, yet they also have features similar to those of modern people in the region. At the cave site of Jebel Irhoud, Morocco, scientists have found fossils with the long skull typical of archaic humans but also such modern traits as a higher forehead and flatter mid-face. Fossils of humans from East African sites older than 100,000 years-such as Ngaloba in Tanzania and Eliye Springs in Kenya-also seem to show a mixture of archaic and modern traits.

Ancient Human Footprints: The oldest known footprints of an anatomically modern human are embedded in rock north of Cape Town, South Africa. Geologist David Roberts and palaeanthropologist Lee Berger announced the discovery of the footprints in August 1997. A human being made the footprints about 117,000 years ago by walking through wet sand, which eventually hardened into rock.

The oldest known fossils that possess skeletal features typical of modern humans date to between 130,000 and 90,000 years ago. Several key features distinguish the skulls of modern humans from those of archaic species. These features include a much smaller brow ridge, if any; a globe-shaped braincase; and a flat, only slightly projecting face of reduced size, positioned under the front of the braincase. Among all mammals, only humans have a face positioned directly beneath the frontal lobe (forward-most area) of the brain. As a result, modern humans tend to have a higher forehead than did Neanderthals and other archaic humans. The cranial capacity of modern humans ranges from about 1,000 to 2,000 cu cm (60 to 120 cu in), with the average being about 1,350 cu cm (80 cu in).

Scientists have found both fragmentary and nearly complete cranial fossils of early anatomically modern Homo sapiens from the sites of Singa, Sudan; Omo, Ethiopia; Klasies River Mouth, South Africa; and Skhûl Cave, Israel. Based on these fossils, many scientists conclude that modern H. sapiens had evolved in Africa by 130,000 years ago and started spreading to diverse parts of the world, beginning with a route through the Near East, sometime before 90,000 years ago.

The 1994 discovery in Sierra de Atapuerca, Spain, of well -preserved hominid bones pushed back the date for the arrival in Europe of our early human ancestors to 800,000 years ago. Anthropology professor Brian Fagan discusses these and other recent findings about the first members of the human family to live in Europe, and he dispels the widespread myth that Neanderthals were dumb and brutish.

Paleoanthropologists are engaged in an ongoing debate about where modern humans evolved and how they spread around the world. Differences in opinion rest on the question of whether the evolution of modern humans took place in a small region of Africa or over a broad area of Africa and Eurasia. By extension, opinions differ about whether modern human populations from Africa displaced all existing populations of earlier humans, eventually resulting in their extinction. Those who think modern humans originated only in Africa and then spread throughout the world support the out-of-Africa hypothesis. Those who think modern humans evolved over a large region of Eurasia and Africa support the so-called multiregional hypothesis.

Researchers have conducted many genetic studies and carefully assessed fossils to determine which of these hypotheses agrees better with the scientific evidence. The results of this research do not entirely confirm or reject either one. Therefore, some scientists think a compromise between the two hypotheses is the best explanation. The debate between these views has implications for how scientists understand the concept of race in humans. The question raised is whether the physical differences among modern humans evolved deep in the past or more recently.

Scientists reported in the May 16, 1996, issue of the journal Nature that later Neanderthals likely interacted, and perhaps even traded goods, with Cro-Magnons, their anatomically modern human neighbours. Researchers in Arcy-sur-Cure, France, 35 km (22 mi) southeast of Auxerre, said they found hominid fossils alongside bone and ivory jewellery nearly identical to artifacts attributed to anatomically modern humans.

The fossils were found in Arcy-sur-Cure long ago, but scientists could not determine to which human species the bones belonged. The shape of the inner ear gave anthropologists a clue that the 34,000-year-old fossil remains found decades ago were from a Neanderthal, not a modern human. The ear morphology may also shed light on the relationship of Neanderthals to humans of today.

The ornaments found at the Arcy site included a bone ring, grooved animal teeth, and animal claws with small holes made at one end, presumably so they could be strung on a cord and hung around the neck. They resemble jewellery found at sites in northern Spain and central and southwestern France where Cro-Magnons lived. Anthropologists Jean-Jacques Hublin of the Musée de l'Homme in Paris, France, and Fred Spoor of University College in London, England, the coauthors of the report, concluded that the presence of jewellery at the Arcy site nearly identical to jewellery at the Cro-Magnon sites showed that Neanderthals probably traded with Cro-Magnons rather than imitated the style of their contemporary neighbours. The resemblance was too close in appearance to nearby Cro-Magnon finds for imitation, they believe. Anatomically modern humans first arrived in Europe about 40,000 years ago.

The relationship of Neanderthals to modern humans has long been a topic of scientific debate. The fossil record suggests Neanderthals disappeared between 30,000 and 40,000 years ago. Neanderthal characteristics differ most obviously from those of anatomically modern humans in the formation of the skull and face. The Neanderthal had a sloping forehead, no chin, protruding browridges, large teeth, and strong jaw muscles. The brains of Neanderthals were larger than those of modern humans. Apart from the face, the Neanderthals had thicker bones and larger musculature, long bodies, and short legs. Some Neanderthal features, especially body proportions, were cold-weather adaptations similar to those developed by modern people living in arctic conditions, such as the Inuit.

Hublin and Spoor used high-resolution, computerized X rays to scrutinize a temporal (side) bone from the skull of a one -year-old Neanderthal. They found that the ear canal-known as the labyrinth-within the bone was distinctly different in size and location from the same bone in Homo erectus, an early human ancestor, and anatomically modern humans. The labyrinth consists of three hollow rings and is involved in maintaining balance.

Some scientists classify the Neanderthal as a separate species, Homo neanderthalensis. Because the features of the Neanderthal's labyrinth do not exist in modern humans, the scientists believe that the muscular hominid belongs to a separate species, or at least is not an ancestor of modern humans. Some experts believe that Neanderthals evolved from an archaic Homo sapiens into an evolutionary dead end. Other researchers have speculated that later Neanderthals may have interbred with Cro-Magnons, but Hublin argues that his new evidence does not support that theory. In their report to Nature, Hublin and Spoor said their findings did not show any trend toward more modern human characteristics.

Archives consist of articles that originally appeared in Collier's Year Book. Because they were published shortly after events occurred, they reflect the information available then. Cross references refer to Archive articles of the same year.

Archaeology. Top stories in archaeology in 1995 included new dates for the Neanderthals and the discovery of the frozen bodies of a Scythian equestrian and an Inca woman.

Last Neanderthals. New dates from Zafarraya Caves in southern Spain suggest that Neanderthals lived on for several millennia after scholars assumed they had become extinct. The dates also suggest that Neanderthals coexisted with modern humans in Western Europe for 10,000 years or more, rather than being replaced quickly by overwhelmingly superior modern groups, as many archaeologists have argued. Samples of animal bones and teeth found with Neanderthal remains and artifacts were subjected to both carbon and thorium/uranium testing, producing dates of around 30,000 years ago. In northern Spain, stone tools of a type generally associated with modern humans appeared between 40,000 and 38,000 years ago. Elsewhere in Europe, Neanderthal and modern human populations mixed, but in southern Spain, Neanderthals survived without strong biological or cultural interaction with the newcomers, probably because they were isolated. The existence of a Neanderthal population in southern Spain long after modern humans arrived in the north makes it unlikely that modern humans reached Western Europe from Africa via the Strait of Gibraltar.

Frozen Bodies. A frozen Scythian equestrian, dated to around 500 BC, was found in Siberia's Altai Mountains. The man, 25-30 years of age, was buried with his horse, bow and arrows, an ax, and a knife. He was wearing a thick wool cap, high leather boots, and a coat of marmot and sheepskin. On his right shoulder is a large tattoo of a stag. The horse's harness was decorated with wood carvings of griffins and animals covered in gold foil.
The horseman's body, like the body of a richly attired woman who also was discovered in the same area in 1993, had been buried in a log-lined chamber under more than 2 metres (7 feet) of permafrost. The horseman's mummy was moved to a Moscow lab for preservation. In southern Peru the frozen body of an Inca woman of 12-14 years of age, probably a sacrificial victim, was found near the summit of a 6,300-metre (20,700-foot) peak. The remains, dated around AD 1500, were discovered 60 metres (200 feet) below a stone sanctuary. The peak is usually ice-covered, but the recent eruption of a nearby volcano had blanketed it with ash. The dark-coloured ash absorbed the sun's warmth instead of reflecting it as the ice had, and the ice melted. Two more bodies were later found farther down the slope, along with the remains of a camp used by the sanctuary's builders and priests. Several small figurines of gold, silver, gold-copper alloy, and oyster-like shell were found near the girl's body, and two had been wrapped in the layers of wool and cotton cloth in which it was bundled. The body had an elaborate feather headdress. The most important aspect of the find is that the bodies were frozen, providing an opportunity to study Inca diet and health.

Early Bone Points From Africa. Archaeologists dated barbed bone points found in eastern Zaire to 90,000 years or older. The ability to make such tools so early supports an African origin of behaviourally and biologically modern humans, the archaeologists said. Barbed points do not occur before 14,000 to 12,000 years ago at sites in Eurasia. The barbed points, along with unbarbed points and a flat dagger-shaped object with rounded edges, came from three sites at Katanda in the Semliki River valley. Dating of the immediately overlying sands and hippo teeth found in them suggests an age of 80,000 to 90,000 years ago for the site. The barbed points were found with mammal and fish remains, of which catfish were most abundant.
The catfish were probably caught during the rainy season when they spawned on the inundated flood-plain and were easy to catch. The Katanda sites show that a complex bone industry and seasonal use of aquatic resources had developed by 90,000 years ago, following a specialized subsistence pattern most often associated in Europe with the end of the Ice Age nearly 80,000 years later.

Earliest Weaving. Impressions of woven fabric on four fragments of clay from Pavlov I, an Upper Palaeolithic site in the Czech Republic, are the earliest evidence of weaving ever found. The fragments were carbon dated in 1995 to between 26,980 and 24,870 years ago. The dates are at least 7,000 to 10,000 years earlier than those of any other evidence of weaving. Two of the better-preserved specimens show tightly spaced rows characteristic of a finely woven bag or mat. The fineness and the method of weaving used, known as twining, suggested that the material may have been produced using a loom and that the weavers were accomplished and not experimenting with a new technology. This means that the actual advent of weaving may be even earlier than the date of the Pavlov specimens. The impressions from Pavlov I show that a wide range of items, such as baskets, nets, and snares, were likely to have been available to the hunter-gatherers of the Upper Palaeolithic Period.

Chauvet Art. The spectacular decorated Grotte Chauvet in southern France, whose discovery was announced in January, has proved to have the world's oldest known cave paintings, carbon dated to more than 30,000 years ago. The cave also contains human and bear footprints, flints, bones, and hearths.

Submarines and Archaeology. The Confederate vessel Hunley, the first submarine ever to sink a warship in combat, was discovered in May off the coast of Charleston, SC. Famous for its attack on the USS Housatonic during the American Civil War, the submarine went down shortly after sinking the ship on February 17, 1864.
The Hunley was made from an iron locomotive boiler and carried a copper canister filled with 40 kilograms (90 pounds) of black powder at the end of a long spar. Manned by volunteers who powered its hand-cranked propeller, the Hunley placed its charge alongside the target and then backed up, detonating the explosive with a long cord that triggered the firing mechanism. The US Navy announced in 1995 that the NR-1, a formerly classified submarine, would be used to search the Mediterranean sea-floor for ancient shipwrecks. The submarine's windows and extensive light and sonar arrays make it well suited to searching for ancient sunken wrecks, and its remote-controlled arm can retrieve objects. The NR-1, the world's smallest nuclear submarine, will enable archaeologists to study the open-water trade routes of antiquity, not just the coastal routes. Its first archaeological mission will be to explore the trade route between Carthage, on the North African coast, and Rome. The discovery in 1995 of the Japanese submarine I-52, which was sunk on June 23, 1944, deepened concerns about the growing accessibility of the deep oceans. American and British treasure hunters were in a race to find the sub and its cargo, 2 metric tons of gold. Both groups hired Russian research vessels with sophisticated sonar and photographic capabilities. In May the American group found the submarine 5,000 metres (17,000 feet) down in the mid-Atlantic. The discoverer stated that the gold, valued at $25 million, would be recovered with the least disturbance possible to the vessel, which may still hold the remains of 109 men. The Japanese government may retain title to both vessel and contents. Nonetheless, the implications are clear: anyone with sufficient financial backing can find, and, if unscrupulous, pillage shipwrecks-ancient, medieval, or modern.

Egyptian Tombs.
Important discoveries were made in 1995 at both well-known and newly found cemeteries in Egypt. At Saqqara, near Cairo, French archaeologists discovered the necropolis of three queens of the Sixth Dynasty pharaoh Pepi I (2332-2283 BC). A pyramid 45 metres (150 feet) high, found buried in sand at Saqqara, is the tomb of Queen Meretites, a descendant of Pepi I. It may provide information on a turbulent period at the end of the dynasty, when powerful governors paid only nominal allegiance to the pharaoh. Egyptian and Canadian archaeologists excavated a vast predynastic cemetery at Tell Hassan Dawoud, 100 kilometres (60 miles) east of Cairo, dating to 3000 BC or earlier. Many tombs yielded gold, marble, and ceramic artifacts. Not all of the burials had grave offerings, however, suggesting that Egypt's society was strongly stratified 500 years before the pharaohs. The largest tomb ever found in Egypt's Valley of the Kings was partly explored in 1995. The tomb was the burial place of 100 or more offspring of Rameses II, who reigned around 1279-1212 BC. Artifacts recovered from the tomb bear the names of at least four of his sons, and the name of the firstborn, Amon-her-khepeshef, is painted on a wall. Before the discovery, little was known about most of the pharaoh's descendants. The tomb is unlikely to hold any great treasure, since a papyrus in Turin records its robbery in 1150 BC.

Syrian Bronze Age Cemetery. Archaeologists working at Tell es-Sweyhat on the Euphrates River in northern Syria discovered an intact tomb in what may be an unplundered cemetery containing up to 150 such tombs. Investigation of several tombs could provide a sample of human remains large enough to establish the biological relationships and social organization of the people through DNA analysis.
A large sample would also allow study of diet and disease in the population. The tomb, dated around 2500-2250 BC, held the remains of several individuals. More than 100 ceramic vessels were in the tomb, along with incised bone, beads, and shells. Copper and bronze objects included daggers, axes, and a javelin. Bones of many pigs, sheep, goats, and cows in the tomb are the remains of funerary offerings. Bird eggs had been placed in the eye sockets of one animal skull.

Primates are an order of mammals that includes humans; apes, which are the closest living relatives of humans; monkeys; and some less familiar mammals, such as tarsiers, lorises, and lemurs. Humans and other primates share a common evolutionary descent. Consequently, primates have always fascinated scientists, because their physical features, social organization, behavioural patterns, and fossil remains provide clues about our earliest human ancestors.

Primates evolved from tree-dwelling ancestors. Although some species, such as humans, have since taken to the ground, all primates share features that are related to their tree-climbing ancestry. These include arms and legs that can move more freely than those of most other mammals, flexible fingers and toes, forward-facing eyes that can judge distances accurately-a vital aid when moving about high above the ground-and large brains.

Primates live in a wide range of habitats but are restricted by their need for warmth. Most primates live in tropical jungles or dry forests, but some live in dry grasslands, and others have settled in cold, mountainous regions of China and Japan. The world's most northerly primate, the Japanese macaque, has learned to bathe in hot springs to survive through the winter snows. In parts of the tropics, monkeys can be seen within a few miles of busy city centres, but despite this adaptability, most of the world’s primates retain a close dependence on trees. Apart from humans, baboons are the only primates that have fully made the transition to life out in the open, and even they instinctively climb to safety if danger threatens.

Some primates, especially the smaller species, are active only at night, or nocturnal, while others are diurnal, active during the day. Most primate species-particularly monkeys-are highly sociable animals, sometimes living in troops of more than 100 members. Smaller primates, especially nocturnal ones, tend to be solitary and secretive.

Primates range in size from quite small to quite large. The world's largest species, the lowland gorilla, at 200 kg (440 lb), is more than 6,000 times the weight of the smallest primate, the pygmy mouse lemur of Madagascar. Measuring just 20 cm (8 in) from nose to tail and weighing about 30 g (1 oz), this tiny animal was first identified about two centuries ago, but was later assumed to be extinct until its rediscovery in 1993.

There are about 235 species of primates. Scientists use more than one way to classify primates, and one system divides the order into two overall groups, or suborders: the prosimians and the anthropoids.

The prosimians, or ‘primitive primates’, make up the smaller of these two groups, with about sixty species, and include lemurs, Pottos, galagos, lorises, and, in some classification systems, tarsiers. Lemurs are only found on the islands of Madagascar and Comoros, where they have flourished in isolation for millions of years. Pottos and galagos are found in Africa, while lorises and tarsiers are found in southeast Asia. Typical prosimians are small to medium-sized mammals with long whiskers, pointed muzzles, and well-developed senses of smell and hearing. Most prosimians are nocturnal, although in Madagascar some larger lemurs are active by day.

In the past, tree shrews were often classified as primates, but their place in mammal classification has been the subject of much debate. Today, based on reproductive patterns and on new fossil evidence, most zoologists classify them in an order of their own, the Scandentia.

The remainder of the world's primates makes up the anthropoid, or “humanlike” suborder, which contains about 175 species. This group consists of humans, apes, and monkeys. Most anthropoids, apart from baboons, have flat faces and a poor sense of smell. With a few exceptions, anthropoids are usually active during the day, and they find their food mainly by sight.

Evolution has affected the thumbs and big toes of primates. In most mammals, these digits bend in the same plane as the other fingers and toes. Nevertheless, in many primates, the thumbs or big toes are opposable, meaning that they are set apart in a way that permits them to meet the other digits at the tips to form a circle. This enables primates to grip branches, and equally importantly, pick up and handle small objects. Instead of having claws, most primates have flat nails that cover soft, sensitive fingertips-another adaptation that helps primates to manipulate objects with great dexterity.

Primate skulls show several distinctive features. One of these is the position of the eyes, which in most species is on the front of the skull looking forward, rather than on the side of the skull looking to the side as in many other mammals. The two forward-facing eyes have overlapping fields of view, which give primates stereoscopic vision. Stereoscopic vision permits accurate perception of distance, which is helpful for handling food or swinging from branch to branch high above the ground. Another distinctive feature of primate skulls, in anthropoids particularly, is the large domed cranium that protects the brain. The inside surface of this dome clearly shows the outline of an unusually large brain-one of the most remarkable characteristics of this group. The shapes of anthropoid brains are different from other mammals; the portion of the brain neuronally devoted toward vision is especially large, while the portion involved with smell is comparatively small.

The primate order includes a handful of species that live entirely on meat (carnivores) and a few that are strict vegetarians (herbivores), but it is composed chiefly of animals that have varied diets (omnivores). The carnivorous primates are the four species of tarsiers, which live in Southeast Asia. Using their long back legs, these pocket-sized nocturnal hunters leap on their prey, pinning it down with their hands and then killing it with their needle-sharp teeth. Tarsiers primarily eat insects but will also eat lizards, bats, and snakes.

Other prosimians, such as galagos and mouse lemurs, also hunt for insects, but they supplement their diet with different kinds of food, including lizards, bird eggs, fruit, and plant sap. This opportunistic approach to feeding is seen in most of monkeys and in chimpanzees. Several species of monkeys, and chimpanzees, but not the other apes, have been known to attack and eat other monkeys. Baboons, the most adept hunters on the ground, often eat meat and sometimes manage to kill small antelope.

Primates display a wide range of mating behaviours. Solitary primates, such as aye-ayes and orangutans, have simple reproductive behaviour. Within the territory that each male controls, several females live, each with their own territory. The male mates with any females within his territory that are receptive. Other species, such as gibbons, form small family groups consisting of a monogamous pair and they’re young. Gorillas form harems, in which one adult male lives with several adult females and they’re young. Among social primates, breeding can be complicated by the presence of many adults. Males may cooperate in defending their troop's territory, but they often fight each other for the chance to mate. In some species, only the dominant male mates with the females in the group. Chimpanzee females mate promiscuously with several adult males, although they usually pair up with one of the high-ranking males during the final few days of estrus, spending all of their time together and mating together exclusively.

Primates have the most highly developed brains in the animal kingdom, rivalled only by those of dolphins, whales, and possibly elephants. Anthropoid primates in particular are intelligent and inquisitive animals that are quick to learn new patterns of behaviour. This resourcefulness enables them to exploit a wide range of foods and may help them to escape attacks by predators.

Many zoologists believe that primates' large brains initially evolved in response to their tree-dwelling habits and their way of feeding. Anthropoid primates, which have the largest brains, live in a visual world, relying on sight to move about and to settle and manipulate food. Unlike smell or hearing, vision generates a large amount of complex sensory information that has to be processed and stored. In primate brains, these operations are carried out by part of the brain called the cerebral cortex, which evolved into such a large structure that the rest of the brain is hidden beneath it. Some unrelated mammals, such as squirrels, also live in trees, but they have less-developed eyesight and much smaller brain.

Increased brainpower has had impressive effects on the way primates live. It has helped them to move about and find food and enabled them to develop special skills. One of the most remarkable of these is Toolmaking, seen in chimpanzees and, to a far greater extent, in humans. Toolmaking, as opposed to simple tool use, involves a preconceived image of what the finished tool should look like-something that is only possible with an advanced brain.

The intelligence of primates is also evident in their social behaviour. For species that live in groups, daily life involves countless interactions with relatives, allies, and rivals. Mutual cleaning and grooming of the fur, which removes parasites, helps to reinforce relationships, while threats-sometimes followed by combat-maintain the hierarchy of dominance that permeates typical primate troops.

Primates use a variety of methods to communicate. In solitary prosimians, when animals are not within sight of each other, communication is often accomplished by using scents. Such animals use urine, faeces, or special scent glands to mark territory or to communicate a readiness to mate. In social anthropoids, visual and vocal signals are much more important. Most monkeys and apes communicate with a complex array of facial expressions, some of which are similar to the facial expressions used by humans. The earliest fossils of primates discovered date from the end of the Cretaceous Period, about sixty-five million years ago. These early fossils include specimens of a species called Notharctus, which resembles today's lemurs and had a long pointed snout. The ancestors of another prosimian group, the tarsiers, are known from fossils that date from the early Eocene Epoch, about fifty million years ago. In 1996 researchers in China recovered fossil bones of a primitive primate no bigger than a human thumb. The animal, named Eosimias, had existed of some forty-five million years ago. Many scientists believe that Eosimias is an example of a transitional animal in the evolution of prosimians to anthropoids. The origin of anthropoids is difficult to pin down. A single anthropoid fossil has been found that may come from the Eocene Epoch, but conclusive fossil evidence of anthropoids does not appear until the Oligocene Epoch, which was introduced some thirty-eight million years ago. These early anthropoids belonged to a lineage that led to the catarrhine primates-the Old World monkeys, apes, and humans. The platyrrhine primates, which include all New World monkeys, are presumed to have diverged from the Old World monkeys during the Eocene Epoch. They evolved in isolation on what was then the island continent of South America. 
Genetic analysis shows that New World monkeys clearly have the same ancestry with the catarrhines, which means that they must have reached the island continent from the Old World. Exactly how they did this is unclear. One possibility is that they floated across from Africa on logs or rafts of vegetation, journeying across an Atlantic Ocean that was much narrower than it is today

Of all primate groups, the apes and the direct ancestors of humans have been the most intensively studied. One key question that concerns once the two groups diverged. Based on the comparisons of genes and the structure of body parts, scientists think that the line leading to the orangutan diverged from the one leading to humans about twelve million years ago. The ancestral line leading to chimpanzees did not diverge until more recently, probably between five and seven million years ago. This evidence strongly suggests that chimpanzees are our closest living relatives. Apes and monkeys also play an important role in the field of medical research. Because their body systems work very much like our own, new vaccines and new forms of surgery are sometimes tried on apes and monkeys before they are approved for use on humans. Species that are most often used in this way include chimpanzees, baboons, and rhesus monkeys. This kind of animal experimentation has undoubtedly contributed to human welfare, but the medical use of primates is an increasingly controversial area, particularly when it involves animals captured in the wild.

The species most under threats are those affected by deforestation. This has been particularly severe in Madagascar, the only home of the lemurs, and it is also taking place at a rapid rate in Southeast Asia, threatening gibbons and orangutans. The almost total destruction of Brazil's Atlantic rainforest has proved catastrophic for several species, including the lion tamarins, which are found only in this habitat. Primates are also threatened by collection for the pet trade and by hunting. Illegal hunting is the chief threat facing the mountain gorilla, a rare African subspecies that lives in the politically volatile border region straddling Uganda, Rwanda, and the Democratic Republic of the Congo.

In the face of these threats, urgent action is currently underway to protect many of these endangered species. The Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) currently forbids the export of many primates, although not all countries have chosen to follow this law. More direct methods of species preservation include habitat protection and captive breeding programs. In some case-for example, the lion tamarin-these programs have met with considerable success. However, without the preservation of extensive and suitable natural habitats, many primate species are destined for

Our closest living relatives are three surviving species of great apes: the gorilla, the common chimpanzee, and the pygmy chimpanzee (also known as the bonobo). Their confinement to Africa, along with abundant fossil evidence, strongly suggests that the earliest stages of human evolution also played out in Africa. Human history, as something separate from the history of animals, began there about seven million years ago (estimates range from five to nine million years ago). Around that time, a population of African apes broke up into several populations, one of which proceeded to evolve into modern gorillas, a second into the two modern chimps, and a third into humans. The gorilla line apparently split off slightly before the split between the chimp and human lines.

Primates are the order of mammals that includes humans; apes, the closest living relatives of humans; monkeys; and some less familiar mammals, such as tarsiers, lorises, and lemurs. Humans and other primates share a common evolutionary descent. Consequently, primates have always fascinated scientists because their physical features, social organization, behavioural patterns, and fossil remains provide clues about our earliest human ancestors.

Primates evolved from tree-dwelling ancestors. Although some species, such as humans, have since taken to the ground, all primates share features that are related to their tree-climbing ancestry. These include arms and legs that can move more freely than those of most other mammals, flexible fingers and toes, forward-facing eyes that can judge distances accurately (a vital aid when moving about high above the ground), and large brains.

Primates live in a wide range of habitats but are restricted by their need for warmth. Most primates live in tropical jungles or dry forests, but some live in dry grasslands, and others have settled in cold, mountainous regions of China and Japan. The world's most northerly primate, the Japanese macaque, has learned to bathe in hot springs to survive through the winter snows. In parts of the tropics, monkeys can be seen within a few miles of busy city centres, but despite this adaptability, most of the world’s primates retain a close dependence on trees. Apart from humans, baboons are the only primates that have fully made the transition to life out in the open, and even they instinctively climb to safety if danger threatens.

Some primates, especially the smaller species, are active only at night, or nocturnal, while others are diurnal, active during the day. Most primate species, particularly monkeys, are highly sociable animals, sometimes living in troops of more than 100 members. Smaller primates, especially nocturnal ones, tend to be solitary and secretive.

Primates range in size from quite small to quite large. The world's largest species, the lowland gorilla, at 200 kg (440 lb) is more than 6,000 times the weight of the smallest primate, the pygmy mouse lemur from Madagascar. Measuring only 20 cm (8 in) from nose to tail, and weighing about 30 g (1 oz), this tiny animal was first identified about two centuries ago, but was later assumed to be extinct until its rediscovery in 1993.
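The "more than 6,000 times" figure above follows from a simple unit conversion; a quick check, using the weights as given in the text:

```python
# Check of the size comparison quoted above (figures from the text).
gorilla_kg = 200        # lowland gorilla, largest primate
mouse_lemur_g = 30      # pygmy mouse lemur, smallest primate

# Convert kilograms to grams before dividing.
ratio = (gorilla_kg * 1000) / mouse_lemur_g
print(round(ratio))     # prints 6667, i.e. "more than 6,000 times"
```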

There are about 235 species of primates. Scientists use more than one way to classify primates, and one system divides the order into two overall groups, or suborders: the prosimians and the anthropoids.

The prosimians, or ‘primitive primates’, make up the smaller of these two groups, with about sixty species, and include lemurs, pottos, galagos, lorises, and, in some classification systems, tarsiers. Lemurs are found only on the islands of Madagascar and the Comoros, where they have flourished in isolation for millions of years. Pottos and galagos are found in Africa, while lorises and tarsiers are found in Southeast Asia. Typical prosimians are small to medium-sized mammals with long whiskers, pointed muzzles, and well-developed senses of smell and hearing. Most prosimians are nocturnal, although in Madagascar some larger lemurs are active by day.

In the past, tree shrews were often classified as primates, but their place in mammal classification has been the subject of much debate. Today, based on reproductive patterns and on new fossil evidence, most zoologists classify them in an order of their own, the Scandentia.

The remainder of the world's primates makes up the anthropoid, or ‘humanlike’ suborder, which contains about 175 species. This group consists of humans, apes, and monkeys. Most anthropoids, apart from baboons, have flat faces and a poor sense of smell. With a few exceptions, anthropoids are usually active during the day, and they find their food mainly by sight.

Apes are found only in Africa and Asia. They have no tails, and their arms are longer than their legs. Monkeys from Central and South America, known as New World monkeys, have broad noses and nostrils that open sideways. They are called platyrrhine, which means broad-nosed. Monkeys from Africa and Asia, known as Old World monkeys, have narrow noses and nostrils that face downward, a characteristic also seen in apes and humans. Old World monkeys are called catarrhine, which means downward-nosed.
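The two-suborder classification described above can be summarized as a simple data structure. This is only a sketch: the grouping follows the one system the text describes (which places tarsiers with the prosimians), and the species counts are the approximate figures quoted.

```python
# Sketch of the primate classification described in the text.
# Grouping follows one common system; counts are approximate.
primates = {
    "prosimians": {   # 'primitive primates', about 60 species
        "members": ["lemurs", "pottos", "galagos", "lorises", "tarsiers"],
        "species": 60,
    },
    "anthropoids": {  # 'humanlike' primates, about 175 species
        "members": [
            "humans",
            "apes",
            "New World monkeys (platyrrhine)",
            "Old World monkeys (catarrhine)",
        ],
        "species": 175,
    },
}

# The two suborders together account for the order's species total.
total = sum(group["species"] for group in primates.values())
print(total)  # prints 235, matching the 'about 235 species' figure
```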

During evolution, primates have kept several physical features that most other mammals have lost. One of these is the clavicle, or collarbone. In primates, the clavicle forms an important part of the shoulder joint. It helps to stabilize the shoulder, permitting a primate to support its weight by hanging from its arms alone, something that few other mammals can do. Some primates, particularly gibbons and the siamang, use this ability to move through the trees from one branch to another by swinging from arm to arm. This type of locomotion is called brachiation.

During evolution, many mammals have gradually lost limb bones as they have adapted to different ways of life: horses, for example, have lost all but a single toe on each foot. Nearly all primates, by contrast, have retained a full set of five fingers and toes, and these digits have usually become increasingly flexible over time. In the aye-aye, a prosimian from Madagascar, the third finger on each hand is long and thin with a special claw at the end. Aye-ayes use these bony fingers to extract insect grubs from bark.

Evolution has affected the thumbs and big toes of primates. In most mammals, these digits bend in the same plane as the other fingers and toes. However, in many primates, the thumbs or big toes are opposable, meaning that they are set apart in a way that permits them to meet the other digits at the tips to form a circle. This enables primates to grip branches, and equally importantly, pick up and handle small objects. Instead of having claws, most primates have flat nails that cover soft, sensitive fingertips-another adaptation that helps primates to manipulate objects with great dexterity.

Tails are absent in humans and apes, but in most monkeys and prosimians, the tail plays a special role in maintaining balance during movement through the treetops. Many New World monkeys have prehensile tails, which can be wrapped around branches, gripping them like an extra hand or foot.

Primate skulls show several distinctive features. One of these is the position of the eyes, which in most species is on the front of the skull looking forward, rather than on the side of the skull looking to the side as in many other mammals. The two forward-facing eyes have overlapping fields of view, which give primates stereoscopic vision. Stereoscopic vision permits accurate perception of distance, which is helpful for handling food or swinging from branch to branch high above the ground. Another distinctive feature of primate skulls, in anthropoids particularly, is the large domed cranium that protects the brain. The inside surface of this dome clearly shows the outline of an unusually large brain, one of the most remarkable characteristics of this group. The shape of the anthropoid brain also differs from that of other mammals: the portion of the brain devoted to vision is especially large, while the portion devoted to smell is comparatively small.

The primate order includes a handful of species that live entirely on meat (carnivores) and a few that are strict vegetarians (herbivores), but it is composed chiefly of animals that have varied diets (omnivores). The carnivorous primates are the four species of tarsiers, which live in Southeast Asia. Using their long back legs, these pocket-sized nocturnal hunters leap on their prey, pinning it down with their hands and then killing it with their needle-sharp teeth. Tarsiers primarily eat insects but will also eat lizards, bats, and snakes.

Other prosimians, such as galagos and mouse lemurs, also hunt for insects, but they supplement their diet with other kinds of food, including lizards, bird eggs, fruit, and plant sap. This opportunistic approach to feeding is seen in most monkeys and in chimpanzees. Several species of monkeys, as well as chimpanzees (but not the other apes), have been known to attack and eat other monkeys. Baboons, the most adept hunters on the ground, often eat meat and sometimes manage to kill small antelope.

Most apes and monkeys eat a range of plant-based foods, but a few specialize in eating leaves. South American howler monkeys and African colobus monkeys eat the leaves of many different trees, but the proboscis monkey on the island of Borneo is more selective, surviving largely on the leaves of mangroves. These leaf-eating monkeys have modified digestive systems, similar to those of cows, which enable them to break down food that few other monkeys can digest. Other apes and monkeys eat mostly fruit, while some marmosets and lemurs depend on tree gum and sap.

Compared with many other mammals, primates have few young, and their offspring take a long time to develop. The gestation period, the time between conception and birth, is remarkably long compared with that of other mammals of similar size. A tarsier, for example, gives birth to a single young after a gestation period of nearly six months. By contrast, a similarly sized rodent will often give birth to six or more young after a gestation period of just three weeks. Most primates usually give birth to a single baby, although some species, such as dwarf lemurs, usually have twins or triplets.

Once the young are born, the period of parental feeding and protection can be even more drawn out. In small prosimians the young are often weaned after about five weeks, but in apes they are often fed on their mother's milk for three or four years, and they may continue to rely on her protection for six or more years. This long childhood-which reaches its extreme in humans-is a crucial feature of a primate's life because it enables complex patterns of behaviour to be passed on by learning.

Some primates have fixed breeding seasons, but many can breed at any time of the year. In many species, females signal that they are in estrus, receptive and ready to mate, by releasing special scents. In other species, females develop conspicuous swellings around their genitals to signal their readiness for mating. Such swellings are especially noticeable in chimpanzees. While most copulation occurs when the females are receptive, in some species, such as humans and pygmy chimpanzees, copulation frequently occurs even if the female is not in estrus.

Primates display a wide range of mating behaviours. Solitary primates, such as aye-ayes and orangutans, have simple reproductive behaviour. Several females live within the territory that each male controls, each with her own smaller territory, and the male mates with any females within his territory that are receptive. Other species, such as gibbons, form small family groups consisting of a monogamous pair and their young. Gorillas form harems, in which one adult male lives with several adult females and their young. Among social primates, breeding can be complicated by the presence of many adults. Males may cooperate in defending their troop's territory, but they often fight each other for the chance to mate. In some species, only the dominant male mates with the females in the group. Chimpanzee females mate promiscuously with several adult males, although they usually pair up with one high-ranking male during the final few days of estrus, spending all of their time together and mating exclusively with each other.

Primates have the most highly developed brains in the animal kingdom, rivalled only by those of dolphins, whales, and possibly elephants. Anthropoid primates in particular are intelligent and inquisitive animals that are quick to learn new patterns of behaviour. This resourcefulness enables them to exploit a wide range of foods and may help them to escape attacks by predators.

Many zoologists believe that primates' large brains initially evolved in response to their tree-dwelling habits and their way of feeding. Anthropoid primates, which have the largest brains, live in a visual world, relying on sight to move about and to find and manipulate food. Unlike smell or hearing, vision generates a large amount of complex sensory information that has to be processed and stored. In primate brains, these operations are carried out by part of the brain called the cerebral cortex, which evolved into such a large structure that the rest of the brain is hidden beneath it. Some unrelated mammals, such as squirrels, also live in trees, but they have less-developed eyesight and much smaller brains.

Increased brainpower has had important effects on the way primates live. It has helped them to move about and find food and enabled them to develop special skills. One of the most remarkable of these is toolmaking, seen in chimpanzees and, to a far greater extent, in humans. Toolmaking, as opposed to simple tool use, involves a preconceived image of what the finished tool should look like, something that is only possible with an advanced brain.

The intelligence of primates is also evident in their social behaviour. For species that live in groups, daily life involves countless interactions with relatives, allies, and rivals. Mutual cleaning and grooming of the fur, which removes parasites, helps to reinforce relationships, while threats-sometimes followed by combat-maintain the hierarchy of dominance that permeates typical primate troops.

Primates use a variety of methods to communicate. In solitary prosimians, when animals are not within sight of each other, communication is often accomplished by using scents. Such animals use urine, faeces, or special scent glands to mark territory or to signal a readiness to mate. In social anthropoids, visual and vocal signals are much more important. Most monkeys and apes communicate with a complex array of facial expressions, some of which are similar to the facial expressions used by humans.

Primates also communicate with a repertoire of sounds. These range from the soft clicks and grunts of the colobus to the songs of the gibbon and the roaring of the howler monkey, which can sometimes be heard more than 3 km (2 mi) away. Far-carrying calls are used in courtship, both to keep group members from getting separated and to mark and maintain feeding territories. Some primate utterances convey more precise messages, often denoting specific kinds of danger. In the wild, researchers have observed chimpanzees using as many as 34 different calls, and evidence suggests that they can pass on information, such as the location of food, using this form of communication.

Compared with many other groups of mammals, little is known about the origins of primates, because primates have left few fossil remains. The chief reason for the scarcity of fossils is that forests, the primary home of most early primates, do not create good conditions for fossilization. Instead of being buried by sediment, the bodies of early primates were more likely to have been eaten by scavengers and their bones dispersed.

The earliest primate fossils discovered date from the end of the Cretaceous Period, about sixty-five million years ago. These early fossils include specimens of a species called Notharctus, which resembled today's lemurs and had a long pointed snout. The ancestors of another prosimian group, the tarsiers, are known from fossils that date from the early Eocene Epoch, about fifty million years ago. In 1996 researchers in China recovered fossil bones of a primitive primate no bigger than a human thumb. The animal, named Eosimias, lived about forty-five million years ago. Many scientists believe that Eosimias is an example of a transitional animal in the evolution of prosimians into anthropoids.

The origin of anthropoids has been difficult to pin down. A single anthropoid fossil has been found that may come from the Eocene Epoch, but conclusive fossil evidence of anthropoids does not appear until the Oligocene Epoch, which began about thirty-eight million years ago. These early anthropoids belonged to a lineage that led to the catarrhine primates: the Old World monkeys, apes, and humans. The platyrrhine primates, which include all New World monkeys, are presumed to have diverged from the Old World monkeys during the Eocene Epoch. They evolved in isolation on what was then the island continent of South America. Genetic analysis shows that New World monkeys clearly share a common ancestry with the catarrhines, which means that they must have reached the island continent from the Old World. Exactly how they did this is unclear. One possibility is that they floated across from Africa on logs or rafts of vegetation, journeying across an Atlantic Ocean that was much narrower than it is today.

Of all primate groups, the apes and the direct ancestors of humans have been the most intensively studied. One key question concerns when the two groups diverged. Based on comparisons of genes and the structure of body parts, scientists think that the line leading to the orangutan diverged from the one leading to humans about twelve million years ago. The ancestral line leading to chimpanzees did not diverge until more recently, probably between five and seven million years ago. This evidence strongly suggests that chimpanzees are our closest living relatives.

The word primate means ‘the first’. When it was originally coined more than two centuries ago, it conveyed the widely held idea that primates were superior to all other mammals. This notion has since been discarded, but nonhuman primates still generate great interest because of their humanlike characteristics.

In scientific research, much of this interest has focussed on primate behaviour and its correspondence with human behaviour. Attempts have been made to train chimpanzees and orangutans to mimic human speech, but differences in anatomy make it very difficult for apes to produce recognizable words. A more revealing series of experiments has involved training chimpanzees, and later gorillas, to understand words and to respond using American Sign Language. In the late 1960s, a chimp named Washoe learned more than 130 signs. In the 1970s and 1980s, a gorilla named Koko learned to use more than 500 signs and to recognize an additional 500 signs. One outcome of these long-running experiments was that the chimps or gorillas occasionally produced new combinations of signs, suggesting that the animals were not simply repeating tricks that they had learned. More recently, chimps have been trained to communicate with humans by using coloured shapes or computer keyboards. They too have shown an ability to associate abstract symbols with objects and ideas, the underlying basis of language.

Apes and monkeys also play an important role in the field of medical research. Because their body systems work very much like our own, new vaccines and new forms of surgery are sometimes tried on apes and monkeys before they are approved for use on humans. Species that are most often used in this way include chimpanzees, baboons, and rhesus monkeys. This kind of animal experimentation has undoubtedly contributed to human welfare, but the medical use of primates is an increasingly controversial area, particularly when it involves animals captured in the wild.

According to figures published by the World Conservation Union (IUCN), more than 110 species of primates-nearly half the world's total-are currently under threat of extinction. This makes the primates among the most vulnerable animals on earth.

The species most under threat are those affected by deforestation. This has been particularly severe in Madagascar, the only home of the lemurs, and it is also taking place at a rapid rate in Southeast Asia, threatening gibbons and orangutans. The almost total destruction of Brazil's Atlantic rainforest has proved catastrophic for several species, including the lion tamarins, which are found only in this habitat. Primates are also threatened by collection for the pet trade and by hunting. Illegal hunting is the chief threat facing the mountain gorilla, a rare African subspecies that lives in the politically volatile border region straddling Uganda, Rwanda, and the Democratic Republic of the Congo.

In the face of these threats, urgent action is currently underway to protect many of these endangered species. The Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) currently forbids the export of many primates, although not all countries have chosen to join the convention. More direct methods of species preservation include habitat protection and captive breeding programs. Sometimes-as with the lion tamarins-these programs have met with considerable success. However, without the preservation of extensive and suitable natural habitats, many primate species are destined for extinction.

As primates, humans show physical and genetic similarities indicating that the modern human species, Homo sapiens, has a close relationship to another group of primate species, the apes. Humans and the so-called great apes (large apes) of Africa-chimpanzees (including bonobos, or so-called pygmy chimpanzees) and gorillas-share ancestors that lived sometime between eight million and six million years ago. The earliest humans evolved in Africa, and much of human evolution occurred on that continent. The fossils of early humans who lived between six million and two million years ago come entirely from Africa.

Humans and the great apes of Africa share a common ancestor that lived between eight million and five million years ago. Most scientists distinguish among twelve to nineteen different species of early humans. Scientists do not all agree, however, about how the species are related or which ones simply died out. Many early human species-probably most of them-left no descendants. Scientists also debate how to identify and classify particular species of early humans, and what factors influenced the evolution and extinction of each species.

The tree of human evolution and the fossil evidence show that the first humans evolved from apelike ancestors at least six million years ago. Many species of humans followed, but only some left descendants on the branch leading to Homo sapiens.

Early humans first migrated out of Africa into Asia probably between two million and 1.7 million years ago. They entered Europe much later, generally within the past one million years. Species of modern humans populated many parts of the world much later. For instance, people first came to Australia probably within the past 60,000 years, and to the Americas within the past 35,000 years. The beginnings of agriculture and the rise of the first civilizations occurred within the past 10,000 years.

The scientific study of human evolution is called paleoanthropology. Paleoanthropology is a subfield of anthropology, the study of human culture, society, and biology. Paleoanthropologists search for the roots of human physical traits and behaviour. They seek to discover how evolution has shaped the potentials, tendencies, and limitations of all people. For many people, paleoanthropology is an exciting scientific field because it illuminates the origins of the defining traits of the human species, and the fundamental connections between humans and other living organisms on Earth. Scientists have abundant evidence of human evolution from fossils, artifacts, and genetic studies. However, some people find the concept of human evolution troubling because it can seem to conflict with religious and other traditional beliefs about how people, other living things, and the world developed. Yet many people have come to reconcile such beliefs with the scientific evidence.

Modern and early humans underwent major anatomical changes during evolution. This illustration depicts three species: Australopithecus afarensis, the earliest; Homo erectus, an intermediate species; and Homo sapiens, a modern human. Modern humans are much taller than A. afarensis and have flatter faces and considerably larger brains. Modern humans also have a larger brain than H. erectus and an almost flat face beneath the front of the braincase.

All species of organisms originate through the process of biological evolution. In this process, new species arise from a series of natural changes. In animals that reproduce sexually, including humans, the term species refers to a group whose adult members regularly interbreed, producing fertile offspring that are themselves capable of reproducing. Scientists classify each species with a unique two-part scientific name. In this system, modern humans are classified as Homo sapiens.

The mechanism for evolutionary change resides in genes-the basic units of heredity. Genes affect how the body and behaviour of an organism develop during its life. The information contained in genes can change-a process known as mutation. The way particular genes are expressed-that is, how they affect the body or behaviour of an organism-can also change. Over time, genetic change can alter a species's overall way of life, such as what it eats, how it grows, and where it can live.

Genetic changes can improve the ability of organisms to survive, reproduce, and, in animals, raise offspring. This process is called adaptation. Parents pass adaptive genetic changes to their offspring, and ultimately these changes become common throughout a population-a group of organisms of the same species that share a particular local habitat. Many factors can favour new adaptations, but changes in the environment often play a role. Ancestral human species adapted to new environments as their genes changed, altering their anatomy (physical body structure), physiology (bodily functions, such as digestion), and behaviour. Over long periods, evolution dramatically transformed humans and their ways of life.

Geneticists estimate that the human line began to diverge from that of the African apes between eight million and five million years ago (paleontologists have dated the earliest human fossils to at least six million years ago). This figure comes from comparing the differences in the genetic makeup of humans and apes and then estimating how long it probably took for those differences to develop. Using similar techniques and comparing the genetic variations among human populations around the world, scientists have calculated that all people may share common genetic ancestors that lived sometime between 290,000 and 130,000 years ago.
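The logic of such molecular-clock estimates can be sketched with a toy calculation. The sketch below is illustrative only: the divergence and mutation-rate figures are rough, commonly cited round numbers assumed for the example, not data from this article, and the function name is hypothetical.

```python
# Toy molecular-clock estimate (illustrative assumptions, not a real tool).

def divergence_time(pairwise_divergence, mutation_rate_per_year):
    """Estimate years since two lineages split.

    Differences accumulate along BOTH lineages after the split,
    hence the factor of 2:  d = 2 * r * T  =>  T = d / (2 * r)
    """
    return pairwise_divergence / (2.0 * mutation_rate_per_year)

# Assumed inputs: roughly 1.2% human-chimp sequence divergence and a
# neutral substitution rate of about 1e-9 per site per year.
t = divergence_time(0.012, 1.0e-9)
print(f"Estimated split: {t / 1e6:.0f} million years ago")  # ~6 million
```

With these assumed numbers the estimate lands near the six-million-year figure quoted above; real analyses calibrate the rate against fossil dates and account for rate variation between lineages.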

Humans belong to the scientific order named Primates, a group of more than 230 species of mammals that also includes lemurs, lorises, tarsiers, monkeys, and apes. Modern humans, early humans, and other species of primates all have many similarities plus some important differences. Knowledge of these similarities and differences helps scientists to understand the roots of many human traits, and the significance of each step in human evolution.

All primates, including humans, share at least part of a set of common characteristics that distinguish them from other mammals. Many of these characteristics evolved as adaptations for life in the trees, the environment in which earlier primates evolved. These include more reliance on sight than smell; overlapping fields of vision, allowing stereoscopic (three-dimensional) sight; limbs and hands adapted for clinging on, leaping from, and swinging on tree trunks and branches; the ability to grasp and manipulate small objects (using fingers with nails instead of claws); large brains in relation to body size; and complex social lives.

The scientific classification of primates reflects evolutionary relationships between individual species and groups of species. Strepsirhine (meaning ‘turned-nosed’) primates-whose living representatives include lemurs, lorises, and other groups of species commonly known as prosimians-evolved earliest and are the most primitive forms of primates. The earliest monkeys and apes evolved from haplorhine (meaning ‘simple-nosed’) primates, of which the most primitive living representative is the tarsier. Humans evolved from ape ancestors.

Tarsiers have traditionally been grouped with prosimians, but many scientists now recognize that tarsiers, monkeys, and apes share some distinct traits, and group the three together. Monkeys, apes, and humans-who share many traits not found in other primates-together make up the suborder Anthropoidea. Apes and humans together make up the superfamily Hominoidea, a grouping that emphasizes the close relationship among the species of these two groups.

Paleoanthropologists Donald C. Johanson of the Cleveland Museum of Natural History and Tim D. White of the University of California, Berkeley, announced in January the discovery of the most ancient hominid (humanlike) species yet uncovered, which they have named Australopithecus afarensis. The fossils on which they base this claim are about three million to four million years old and were found during the 1970s at two widely separated localities in East Africa. The majority were collected at Hadar, a remote region of the Afar Depression of Ethiopia, by Johanson; the others were uncovered in northern Tanzania at Laetoli, 30 miles south of Olduvai Gorge, by anthropologist Mary Leakey. The Hadar material consists of bones from at least thirty-five individuals and includes the best-preserved australopithecine skeleton yet found. Nicknamed ‘Lucy’ by Johanson, this skeleton is about 40 percent complete and is evidently of a female who stood about 3.5 to 4 feet tall and lived some three million years ago. The Laetoli fossils, closer to four million years old, are astonishingly similar in many ways to the Hadar material. Because of the remarkable completeness and good preservation of both fossil collections, we are afforded a previously unavailable glimpse of early human evolution. Analysis suggests that these creatures had rather small brains, no larger than those of the gorilla, but the leg and pelvic bones clearly indicate that A. afarensis walked on two legs like humans. In these respects the newly described fossils do not differ substantially from previously described australopithecine species, which date from about 1.5 million to 2.5 million years ago. All previously recognized hominids, however, show larger cheek teeth (molars) and smaller front teeth (incisors and canines) than the apes. A. afarensis, in contrast, shows very broad incisors and large, projecting canines, more like those of the apes.
The appearance of such primitive dental characteristics in an australopithecine has profound implications for evolutionary history. The most widely held theory states that the evolutionary lines leading to modern humans and apes diverged some twelve million to fifteen million years ago, when the apes from which humans descended moved out of the trees and began to exploit the resources of open grasslands for food. This change in habitat is thought to have produced the characteristic humanlike dentition, which is more efficient at chewing tough food, such as seeds, roots, and tubers, than is the dentition of the apes. Fossil teeth and jaws of a humanlike character belong to the creature called Ramapithecus, which existed about ten million years ago and is generally considered ancestrally human. However, the primitive, more apelike teeth of A. afarensis have now cast doubt on the status of Ramapithecus as an ancestral hominid and made unclear the ultimate reason for the differentiation of human ancestors from the apes.

Researchers in South Africa have discovered what they believe are the oldest and best-preserved skull and skeleton of one of humanity's earliest ancestors, according to a report published in the December 9, 1998, issue of the South African Journal of Science. Paleontologists said the fossilized remains could be as much as two million years older than the oldest previously known complete hominid skeletons. The new finding is expected to reveal much about the anatomy and evolution of early humans, and may rank among the most important breakthroughs in paleoanthropology (the study of early human evolution). ‘It is one of many key elements from ape to man’, said Ronald J. Clarke, a paleoanthropologist at the University of Witwatersrand in Johannesburg, South Africa, who led the team that made the discovery. The skeleton was discovered in the fossil-rich area of the Sterkfontein Caves, near Krugersdorp in northeastern South Africa. The skeleton is of a small adult hominid who was about 1.2 m (4 ft) tall and weighed about thirty-two kg (70 lb). Clarke's team dated the bones at 3.2 million to 3.6 million years old.

The bones are believed to belong to a species of australopithecine, an early hominid that had both human and apelike features. However, most of the bones remain embedded in rock within the cave, and paleontologists will not be able to study the skeleton's anatomy fully until it is removed, a process that Clarke said could take a year or longer. Clarke made the discovery after unexpectedly finding four hominid foot bones in a box of unsorted fossils at the university in 1994. Another search of boxes in a university storage room in May 1997 revealed more foot and lower leg bones. To Clarke's astonishment, all of the bones appeared to belong to the same hominid.

Clarke's initial discovery, announced in 1995, added new evidence to a longstanding debate among anthropologists about the path of early hominid development. Clarke and several of his colleagues argued that the bones of the specimen, dubbed Little Foot, reflected a transition from four-legged tree dwellers to two-legged creatures that could walk upright. In particular, Clarke said the specimen's humanlike ankles and grasping, apelike big toe suggested that the creature was a capable tree climber who could also walk easily on two legs. Other anthropologists dismissed the idea, however, asserting that humans evolved from plains-dwelling hominids and did not live in trees.

After finding additional bones in 1997, Clarke believed that the rest of the skeleton might be present in Sterkfontein's Silberberg Grotto, where the bones had been originally excavated. Within days of searching, his assistants discovered a piece of fossilized bone protruding from the cave wall that perfectly matched one of Clarke's fossil fragments. Although the excavation is at a preliminary stage, Clarke said the remainder of the skeleton might be present and intact, lying face downward in limestone.

Before Clarke's find, the most comprehensive australopithecine skeleton was an Australopithecus afarensis specimen known as Lucy, discovered by anthropologist Donald Johanson in Ethiopia in 1974 and dated at 3.2 million years old. Lucy, however, is only about 40 percent complete. The oldest known complete hominid skeleton was a specimen of Homo erectus that had been excavated in Kenya and dated at 1.5 million years old.

The new discovery is considered extraordinary because the fossil record of early hominids is so fragmentary. Paleontologists have had to piece together knowledge about ancient human species by using bone fragments derived from many individuals, making generalizations about anatomy difficult. Once the bones have been chipped from the grotto rock, scientists will examine the hips and legs to figure out whether or not the creature could easily climb trees. In addition, they hope skeletal features will give them clues about the specimen's sex and how these early hominids lived, including their likely diet and possible foraging behaviours.

Scientists also believe the skeleton's intact skull could shed light on another key puzzle of early human evolution: the relation between brain size and upright locomotion. Many experts believe that it was the ability to walk on two legs-rather than brain size or use of tools-that set the human lineage apart from all other primates.

Another mystery scientists will explore is whether the fossils represent Australopithecus afarensis (like Lucy), the southern African hominid species known as Australopithecus africanus, or possibly an entirely new species. If the species is unrelated to Lucy and is older, then it could force anthropologists to reconsider their views about hominid evolution in Africa. Because of Lucy's age, many scientists now believe that A. afarensis is a common ancestor of all succeeding australopithecine species.

Nevertheless, some paleontologists cautioned that the age of the new find had not yet been conclusively established and could be only about two million years old. The most accurate forms of dating require the presence of volcanic ash, which contains radioactive elements that decay in a predictable way. No such material was present in the cave.

To date the skeleton, Clarke and his team found distinctive animal fossils near the hominid remains. The age of these animal fossils had already been determined at other datable sites. This technique is not foolproof, however, because movements in the rock layers could make fossils from animals that did not coexist appear next to each other, experts said.

Spanish paleoanthropologists recently described the fossil remains of several Ice Age human ancestors found at a cave site in northern Spain. Did their findings identify a distinct human ancestral species, as the Spaniards suggested? Their description has added to the complexity of theories about early humans in Europe, and the debate over the paths of human evolution continues. The researchers suggested that these early humans, who lived more than 780,000 years ago, may have been a separate species that preceded both modern humans and the now-extinct early humans known as Neanderthals.

The researchers said that among the fossils were the facial bones of a boy showing both primitive and modern features, identifying these human ancestors as a distinct species. They suggested the name Homo antecessor for the proposed new species. Spanish paleoanthropologist José Bermúdez de Castro of the National Museum of Natural Sciences in Madrid, Spain, and his colleagues described the fossils in the May 30, 1997, issue of the journal Science.

Although anthropologists agreed that the fossil find was very important, most were not ready to accept the ancient humans as representing a new species. Anthropologists pointed out that not only are the dental and facial bones of a boy scant evidence on which to identify a new species, but also there is a chance that some of the boy's features were in an intermediate stage that would have changed when the boy reached adulthood. The Spanish researchers' proposed path of human evolution also was controversial, because it pushes groups of early humans off the direct line leading to modern humans, suggesting that there may have been more dead ends in human evolution than previously thought.

The Spanish scientists first reported finding this group of fossils, the oldest remains of pre-humans ever found in Europe, in August 1995. Previously the oldest known Europeans were a group of early humans sometimes classified as a separate species, Homo heidelbergensis. The earliest known specimens from this group date from roughly 500,000 years ago. Using a technique known as paleomagnetic analysis, the Spanish researchers dated the fossil remains recently found in northern Spain to at least 780,000 years ago, in the Pleistocene Epoch.

Paleomagnetic dating is possible because the direction of the earth's geomagnetic field has reversed often during the history of the planet. The dates of these irregular reversals in geomagnetic polarity have been well documented. Currently the geomagnetic polarity of the earth points north, but less than a million years ago it pointed south. Magnetic traces preserved in the layers of rock surrounding the fossils showed that the fossils had been buried before the earth's magnetic field last switched direction, from south to north, 780,000 years ago.
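The reasoning above can be sketched as a small, purely illustrative helper: a fossil sealed in rock whose magnetic traces show reversed polarity must have been buried before the last reversal, so that reversal date becomes a minimum age. The function name and the two-state polarity labels are assumptions for the example, not a real dating tool.

```python
# Hypothetical sketch of the paleomagnetic reasoning described above.

# The earth's field last flipped from reversed (south-pointing) to the
# present normal (north-pointing) polarity about 780,000 years ago.
LAST_REVERSAL_YEARS_AGO = 780_000

def minimum_age(rock_polarity):
    """Return the minimum age (in years) implied by a rock's polarity.

    Reversed-polarity rock must predate the last reversal, so any fossil
    it contains is at least that old.
    """
    if rock_polarity == "reversed":
        return LAST_REVERSAL_YEARS_AGO
    return 0  # normal polarity yields no minimum from this reversal alone

print(minimum_age("reversed"))  # 780000
```

In practice geologists match a whole sequence of polarity zones in the rock against the documented reversal timeline, which brackets the age rather than giving only a minimum.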

The Spanish paleoanthropologists found more than eighty fossils representing at least six individuals, including both juveniles and adults. The fossils were found in a deep pit at a cave site known as Gran Dolina, in the Atapuerca Mountains near the city of Burgos in northern Spain.

The pattern of early human settlement in Europe is still unclear. Anthropologists agree that hominids-a family of bipedal primates that includes modern human beings and all extinct species of early humans-have their origins in Africa. The Spanish anthropologists speculated that the early humans they called Homo antecessor may have first evolved in Africa from a primitive human classified by some paleoanthropologists as Homo ergaster and by others as early Homo erectus, and that Homo sapiens developed from Homo antecessor in Africa. The researchers further proposed that Homo antecessor migrated to Europe, and that Homo heidelbergensis and then Neanderthals (also called Homo sapiens neanderthalensis) evolved from this line.

Under this proposed set of relationships, Homo heidelbergensis would not be a common ancestor of Homo sapiens and Neanderthals, as currently believed by many anthropologists. The Spanish anthropologists' model also pushes the later members of Homo erectus off the direct evolutionary line leading to modern human beings. An early Homo erectus appeared about 1.9 million to two million years ago in Africa, and more recent examples of this early human have been found in China and Java, Indonesia. In the Spanish scientists' proposed model, Asian members of Homo erectus become a side branch of human evolution, rather than an intermediate step between Homo ergaster and Homo heidelbergensis.

The human family tree has become less linear and more complex over the past decade as anthropologists have made more discoveries.

Archaeological study covers an extremely long span of time and a great variety of subjects. The earliest subjects of archaeological study date from the origins of humanity. These include fossil remains believed to be of human ancestors who lived 3.5 million to 4.5 million years ago. The earliest archaeological sites include those at Hadar, Ethiopia; Olduvai Gorge and Laetoli, Tanzania; East Turkana, Kenya; and elsewhere in East Africa. These sites contain evidence of the first appearance of bipedal (upright-walking), apelike early humans. Laetoli even preserves footprints of humans from 3.6 million years ago. Some sites also contain evidence of the earliest use of simple tools. Archaeologists have also recorded how primitive forms of humans spread out of Africa into Asia about 1.8 million years ago, then into Europe about 900,000 years ago.

The first physically modern humans, Homo sapiens, appeared in tropical Africa between 200,000 and 150,000 years ago-dates determined by molecular biologists and archaeologists working together. Dozens of archaeological sites throughout Asia and Europe show how people migrated from Africa and settled these two continents during the last Ice Age (100,000 to 15,000 years ago). Archaeological studies have also provided much information about the people who first arrived in the Americas more than 12,000 years ago.

In their search for the original cradle of humanity, anthropologists have long been looking for remains of early man in all corners of the world. In this effort they have eliminated the Americas and Australia from the competition, conveying the honour of having seen man's first emergence to either Africa or Asia. Any find of early man made in these areas takes on, therefore, a particular importance. In 1953 fragments of a human skull without the face were discovered in probably late Pleistocene levels near Hopefield, 76 miles north of Cape Town, South Africa, close to Saldanha Bay. This Saldanha skull is very big and low, has a strongly receding forehead and tremendous bone ridges across the eyebrows, and its bones are extremely thick. All these features are typical of the most primitive types of early man. Probably the most significant fact is that this new specimen resembles very closely the famous Rhodesian Man, known from a skull found in late Pleistocene levels at Broken Hill in 1921. This Rhodesian lowbrow is one of the most puzzling finds of early man ever made, for he combines certain extremely primitive characteristics with some very modern features. He has enormous brow ridges, the heaviest ever found in any type of early man, a very strongly projecting face, an unusually broad palate, a strongly receding forehead, and a low cranial vault. Still, his large skull has a brain volume within the range of recent man, and in spite of the various primitive features, he suffered from a truly modern disease: tooth decay. This plague of modern humans appears in the Neolithic, the period in which pottery was discovered and in which man began to boil his food. Every dentist will tell you that soft food is the greatest enemy of teeth, and thus the boiling of food instead of the earlier roasting caused the frequent occurrence of tooth decay in the Neolithic period.
Nevertheless, the Rhodesian Man must have been an unfortunate creature, for apart from the dubious honour of being the first man who needed a dentist, he suffered from mastoiditis and rheumatism, as appears from a careful inspection of his skull and his tibia. To top it all, Sir Arthur Keith, Britain's most distinguished anthropologist, suggested that this truly sick man suffered from acromegaly, a disease of overgrowth of the head, feet, and hands. Although such a diagnosis had previously been doubted on certain grounds, it is the merit of Saldanha man that he absolves his Rhodesian cousin of at least this latter verdict, thereby saving his face. Since both skulls resemble each other so closely, the large size of the skull and the enormous orbital ridges can no longer be considered pathological features but must be typical of early man in Africa.

Fossils suggest that the evolutionary line leading to us had achieved an upright posture by around four million years ago, then began to increase in body size and in relative brain size around 2.5 million years ago. Those protohumans-Australopithecus africanus, Homo habilis, and Homo erectus-apparently evolved into each other in that chronological succession. Although Homo erectus, the stage reached around 1.7 million years ago, was close to us modern humans in body size, its brain size was still barely half of ours. Stone tools became common around 2.5 million years ago, but they were merely the crudest of flaked or battered stones. In zoological significance and distinctiveness, Homo erectus was more than an ape, but still much less than a modern human.

Since the lawful regularities of living organisms display an active and intimate engagement with their environment that is categorically different from that of inorganic matter, organic and inorganic matter can be seen as profoundly opposed. Yet because they are constructs that cannot be applied simultaneously in the same situation, while both are required for a complete description of that situation, they must be viewed as complementary. Given that the lawful regularities displayed by organic and inorganic matter are different, a profound complementary relationship exists between the laws of physics and those of biology. For example, a complete description in mathematical physics of all the mechanisms of a DNA molecule would not be a complete description of organic matter, for an obvious reason: the quality of life associated with the known mechanism of DNA replication eludes objective description. It resides in the seamless web of interactions the organism maintains with its environment, suggesting that the laws of nature have accorded biological regularities, and behaviours we associate with life, that are not merely those of mathematical physics. Even if we could replicate all of the fundamental mechanisms of biological life by manipulating inorganic matter in the laboratory, this problem would remain: to prove that no laws other than those of mathematical physics are involved, we would be obliged to create life without any interaction with an environment in which the life form sustains itself.

Although most physical scientists probably assume that the mechanisms of biological life can be completely explained in terms of the laws of mathematical physics, many phenomena associated with life cannot be explained in these terms. For example, the apparent compulsion of individual organisms to perpetuate their genes, ‘selfish’ or not, is obviously a dynamic of biological regularities that is not apparent in an isolated system. This functional dynamic cannot be described in terms of the biochemical mechanisms of DNA or any other aspect of isolated organic matter. The specific evolutionary path followed by living organisms is unique and cannot be completely described on the basis of prior applications of the laws of physics.

More complex organisms evolved from symbiotic unions and are sometimes described in biology texts as factories or machines. A machine, however, as the mechanistic model of the relationship between part and whole suggests, is a unity of order and not of substance, and the order that exists in a machine is external to the parts. As the biologist Paul Weiss has pointed out, the part-whole relationship that exists within and between cells in complex life forms is not that of a machine.

The whole within the part that sets the boundary conditions of cells is DNA, and a complete strand of the master molecule of life exists in the nucleus of each cell. DNA evolved in an unbroken sequence from the earliest life form, and the evolution of even the most complex life forms cannot be separated from the co-evolution of microbial ancestors. DNA in the average cell codes for the production of about two thousand different enzymes, and each of these enzymes catalyzes a particular chemical reaction. The boundary conditions within each cell resonate with the boundary conditions of all other cells and maintain the integrity and uniqueness of whole organisms.

Evolution, in biology, is the complex process by which the characteristics of living organisms change over many generations as traits are passed from one generation to the next. The science of evolution seeks to understand the biological forces that caused ancient organisms to develop into the tremendous and ever-changing variety of life seen on Earth today. It addresses how, over time, various plant and animal species branch away to become entirely new species, and how different species are related through branching family trees that extend over millions of years.

Evolution provides an essential framework for studying the ongoing history of life on Earth. A central, and historically controversial, component of evolutionary theory is that all living organisms, from microscopic bacteria to plants, insects, and mammals, share a common ancestor. Species that are closely related share a recent common ancestor, while distantly related species have a common ancestor further in the past. The animal most closely related to humans, for example, is the chimpanzee. The common ancestor of humans and chimpanzees is believed to have lived approximately six million to seven million years ago. On the other hand, the common ancestor of humans and reptiles lived some 300 million years ago, and still more distantly related forms share common ancestors that lived even farther in the past. Evolutionary biologists attempt to reconstruct the history of lineages as they diverge and to explain how differences in characteristics developed over time.

Throughout history, philosophers, religious thinkers, and scientists have attempted to explain the history and variety of life on Earth. During the rise of modern science in western Europe in the 17th and 18th centuries, a predominant view held that God created every organism on Earth almost as it now exists. However, in that time of burgeoning interest in the study of apes and natural history, the beginnings of a modern evolutionary theory began to take shape. Early evolutionary theorists proposed that all of the life on Earth evolved gradually from simple organisms. Their knowledge of science was incomplete, however, and their theories left too many questions unanswered. Most prominent scientists of the day remained convinced that the variety of life on Earth could only result from an act of divine creation.

In the mid-19th century a modern theory of evolution took hold, thanks to British naturalist Charles Darwin. In his book On the Origin of Species by Means of Natural Selection, Darwin described the evolution of life as a process of natural selection. Life, he suggested, is a competitive struggle to survive, often in the face of limited resources. Living things must compete for food and space. They must evade the ravages of predators and disease while dealing with unpredictable shifts in their environment, such as changes in the climate. Darwin argued that, within a given population in a given environment, certain individuals possess characteristics that make them more likely to survive and reproduce. These individuals pass these critical characteristics on to their offspring. The number of organisms with these traits increases as each generation passes on the advantageous combination of traits. Outmatched, individuals lacking the beneficial traits gradually decrease in number. Slowly, Darwin argued, natural selection tips the balance in a population toward those with the combination of traits, or adaptations, best suited to their environment.

While On the Origin of Species was an instant sensation and a best seller, Darwin's theories met a hostile reception from critics who railed against his seemingly blasphemous ideas. Other critics pointed to questions left unresolved by Darwin's careful arguments. For instance, Darwin could not explain the mechanism that caused life forms to change from generation to generation.

Hostility eventually gave way to acclaim as scientists vigorously debated, explored, and built on Darwin's theory of natural selection. As the 20th century unfolded, scientific advances revealed the detailed mechanisms missing from Darwin's theory. The study of the complex chemistry of organisms revealed the structure of genes and the manner in which they are duplicated and passed from generation to generation. New statistical methods helped explain how genes in specific populations change over generations. These new methods provided insight into how populations remain adaptable to changing environmental circumstances and broadened our understanding of the genetic structure of populations. Advances in techniques used to determine the age of fossils provided clues about when extinct organisms existed and details about the circumstances surrounding their extinction. New molecular biology techniques compare the genetic structures of different species, enabling scientists to uncover evolutionary relationships between species that were previously undetectable. Today, evolution is recognized as the cornerstone of modern biology. Uniting such diverse scientific fields as cell biology, genetics, palaeontology, and even geology and statistics, the study of evolution reveals an exquisitely complex interaction of the forces that act upon every life form on Earth.

Natural selection is tied to traits that organisms pass from one generation to the next. In humans, these traits include hundreds of features such as eye colour, blood type, and height. Nature offers countless other examples of traits in living things, such as the pattern on a butterfly’s wings, the markings on a snail’s shell, the shape of a bird’s beak, or the colour of a flower’s petals.

Such traits are controlled by specific bits of biochemical instructions known as genes. Genes are composed of individual segments of the long, coiled molecule called deoxyribonucleic acid (DNA). They direct the synthesis of proteins, molecular labourers that serve as the building blocks of cells, control chemical reactions, and transport materials to and from cells. Proteins are themselves composed of long chains of amino acids, and the biochemical instructions found in DNA determine the arrangement of amino acids in a chain. The specific sequence of amino acids dictates the structure and resulting function of each protein.

All genetic traits result from different combinations of gene pairs, one gene inherited from the mother and one from the father. Each trait is thus represented by two genes, often in different forms. Different forms of the same gene are called alleles. Traits depend on very precise rules governing how genetic units are expressed through generations. According to the laws governing heredity, when a dominant allele (say, tongue rolling) and a recessive allele (no tongue rolling) combine, the trait will always be dictated by the dominant allele. The no-tongue-rolling trait, or any other recessive trait, will occur only in an individual that inherits two recessive alleles.
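These dominance rules can be sketched in a few lines of code. This is a hypothetical illustration (the allele symbols 'R'/'r' and the helper name `offspring_traits` are inventions for this example, not from any genetics library): crossing two parents who each carry one dominant and one recessive allele yields the classic three-to-one ratio of dominant to recessive traits.

```python
from itertools import product

def offspring_traits(parent1, parent2):
    """List the possible allele pairs, and the trait each expresses,
    for offspring of two parents. 'R' (tongue rolling) is dominant
    over 'r' (no tongue rolling)."""
    outcomes = []
    for a, b in product(parent1, parent2):   # one allele from each parent
        pair = "".join(sorted(a + b))        # normalize 'rR' to 'Rr'
        trait = "rolling" if "R" in pair else "no rolling"
        outcomes.append((pair, trait))
    return outcomes

# Two heterozygous parents: three of the four equally likely outcomes
# show the dominant trait; only the 'rr' pair shows the recessive one.
print(offspring_traits("Rr", "Rr"))
```

Running the cross of two 'Rr' parents enumerates the four equally likely allele pairs, of which only 'rr' expresses the recessive trait.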

Evolutionary change takes place in populations over many generations. Since individual organisms cannot evolve in a single lifetime, evolutionary science focuses on a population of interbreeding individuals. All populations contain some variations in traits. In humans, for example, some people are tall, some are short, and some are of medium height.

In interbreeding populations, genes are randomly shuffled among members of the population through sexual reproduction, the process that produces genetically unique offspring. Individuals of different sexes develop specialized sex cells called gametes. In humans and other vertebrates (animals with backbones), these gametes are sperm in males and eggs in females. When males and females mate, these sex cells join in fertilization. A series of cell divisions creates individuals with a unique assembly of genes. No individual members of any population (except identical twins, which develop from a single egg) are alike in their genetic makeup. This diversity, called genetic diversity or variation, is essential to evolution. The greater a population's genetic diversity, the more likely it is to evolve specific traits that enable it to adapt to new environmental pressures, such as climate change or disease. By the same token, such pressures might drive a population with a low degree of genetic diversity to extinction.

Sexual reproduction ensures that the genes in a population are rearranged in each generation, a process termed recombination. Although the combinations of genes in individuals change with each new generation, the gene frequency, or ratio of different alleles in the entire population, remains constant if no evolutionary forces act on the population. One such force is the introduction of new genes into the genetic material of the population, or gene pool.
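The claim that gene frequency stays constant under recombination alone can be checked with a small simulation. In this sketch (the population size, allele names, and starting frequencies are arbitrary assumptions for illustration), random mating reshuffles alleles every generation, yet the overall frequency of allele 'A' stays close to its starting value of 0.6, apart from small random drift in a finite population.

```python
import random

def next_generation(pop):
    """Random mating: each offspring receives one randomly chosen
    allele from each of two randomly chosen parents."""
    return [(random.choice(random.choice(pop)),
             random.choice(random.choice(pop)))
            for _ in range(len(pop))]

def allele_freq(pop, allele):
    """Fraction of all alleles in the population equal to `allele`."""
    alleles = [a for pair in pop for a in pair]
    return alleles.count(allele) / len(alleles)

random.seed(42)
# 10,000 individuals with allele 'A' at frequency 0.6
# (genotype proportions: 36% AA, 48% Aa, 16% aa).
pop = [("A", "A")] * 3600 + [("A", "a")] * 4800 + [("a", "a")] * 1600
for _ in range(5):
    pop = next_generation(pop)
print(round(allele_freq(pop, "A"), 2))  # stays near 0.6
```

This is the intuition behind the Hardy-Weinberg principle: recombination reshuffles genotypes but, by itself, does not change allele frequencies.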

When individuals move between one population and another, new genes may be introduced to populations. This phenomenon, known as gene flow, results from chance dispersal and intentional migration. Take, for example, two populations of related wildflowers, one red and one white, separated by a large tract of land. Under normal circumstances, the two groups do not interbreed because the wind does not blow hard enough to carry pollen between the populations so that pollination can occur. If an unusually strong wind happens to carry pollen from the red wildflower population to the white wildflower population, the gene for red flowers may be introduced to the white population's gene pool.

Genes themselves are constantly being modified through a process called mutation: a change in the structure of the DNA in an individual's cells. Mutations can occur during replication, the process in which a cell splits itself into two identical copies known as daughter cells. Normally each daughter cell receives an exact copy of the DNA from the parent cell. Occasionally, however, errors occur, resulting in a change in the gene. Such a change may affect the protein that the gene produces and, ultimately, change an individual’s traits. While some mutations occur spontaneously, others are caused by factors in the environment, known as mutagens. Examples of mutagens that affect human DNA include ultraviolet light, X-rays, and various chemicals.

Whatever their cause, mutations are a rare but continuous source of new genes in a gene pool. Many mutations are neutral; that is, they have no effect. Other mutations are detrimental to life, causing the immediate death of any organism that inherits them. Once in a great while, however, a mutation gives an organism an advantageous trait. A single organism with an advantageous trait is only half of the equation, however. For evolution to occur, the forces of natural selection must distribute that trait to other members of a population.

Natural selection sorts out the useful changes in the gene pool. When this happens, populations evolve. Beneficial new genes quickly spread through a population because members who carry them have greater reproductive success, or evolutionary fitness, and consequently pass the beneficial genes to more offspring. Conversely, genes that are detrimental to an organism are eliminated from the population, sometimes quickly and sometimes more gradually, depending on the severity of the gene's effects, because the individuals who carry them do not survive and reproduce as successfully as individuals without the detrimental gene. Over several generations, the gene and most of its carriers are eliminated from the population. Severely detrimental genes may persist at very low levels in a population, however, because they can be reintroduced each generation by mutation.
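The spread of a beneficial gene described above can be made concrete with a toy calculation. In this simple one-locus model (a deterministic sketch under assumed numbers, not a claim from the text), carriers of a beneficial allele have a 10% reproductive advantage, and the allele's frequency climbs from 1% toward near-fixation in roughly a hundred generations.

```python
def spread(freq, advantage, generations):
    """Deterministic one-locus model: carriers of the beneficial
    allele reproduce with relative fitness (1 + advantage); each
    generation the allele's share is reweighted by fitness."""
    history = [freq]
    w_b = 1.0 + advantage
    for _ in range(generations):
        mean_fitness = freq * w_b + (1.0 - freq) * 1.0
        freq = freq * w_b / mean_fitness
        history.append(freq)
    return history

# A beneficial allele starting at 1% frequency with a 10% advantage.
hist = spread(0.01, 0.10, 100)
print(round(hist[-1], 3))  # close to 1.0 after 100 generations
```

Even a modest fitness edge compounds generation after generation, which is why selection can transform a population so thoroughly given enough time.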

Natural selection only allows organisms to adapt to their current environment. Should environmental conditions change, new traits may prevail. Moreover, natural selection does not always favour a single version of a trait. Occasionally, multiple versions of the same trait may instill their carriers with equal evolutionary benefit. Nor does natural selection always favour change. If environmental conditions so dictate, natural selection preserves the status quo by eliminating extreme versions of a particular trait from the population.

Often, shifts in environmental conditions, such as climate change or the presence of a new disease or predator, can push a population toward one extreme for a trait. In periods of prolonged cold temperatures, for example, natural selection may favour larger animals because they are better able to withstand extreme temperatures. This mode of natural selection, known as directional selection, is evident in cheetahs. About four million years ago, cheetahs were more than twice as heavy as modern cheetahs, but quicker and lighter members of the population had greater reproductive success than did larger members. Over time, natural selection favoured smaller and smaller cheetahs.

Sometimes natural selection acts to preserve the status quo by favouring the intermediate version of a characteristic instead of one of two extremes. An example of this selective force, known as stabilizing selection, was evident in a study of the birth weight of human babies published in the middle of the 20th century. It showed that babies of intermediate weight, about 3.5 kg (8 lb), were more likely to survive. Babies with a heftier birth weight had lower chances of survival because they were more likely to cause complications during the delivery process, and lightweight babies were often born prematurely or with other health problems. Babies of intermediate birth weight, then, were more likely to survive to reproductive age.

Occasionally natural selection favours two extremes, causing alleles for intermediate forms of a trait to become less common in the gene pool. The African Mocker swallowtail butterfly has undergone this form of selection, known as disruptive selection. The Mocker swallowtail evades its predators by resembling poisonous butterflies in its ecosystem. Predators have learned to avoid these poisonous butterflies and to steer away from the look-alike Mocker swallowtails. The Mocker swallowtail has a large range, and in different regions it looks very different, depending on which species of poisonous butterfly it mimics. In some areas the butterfly displays black markings on a white background; in others the markings float on an orange background. Since a Mocker swallowtail appears poisonous to predators, it has a greater chance of survival and therefore a higher evolutionary fitness. Mocker swallowtails that do not look poisonous have a much lower evolutionary fitness because predators quickly eat them. Disruptive selection, then, favours the extreme colour patterns of white or orange, and nothing in between.

Speciation may occur even when no isolating mechanism is present. Here, a new species may form through the slow modification of a single group of organisms into an entirely new group. The evolving population gradually changes over generations until the organisms at the end of the line appear very different from the first. A species of foraminifera, tiny marine animals that live in the Indian Ocean, displays this process, known as vertical or phyletic evolution. From about ten million to six million years ago, the species remained unchanged. These organisms then began a slow and gradual change, lasting about 600,000 years, that left them so unlike their ancestors that biologists consider them an entirely new species.

Whatever the cause of their reproductive isolation, independently evolving populations tend to follow general patterns of evolutionary descent. Most often, environmental factors determine the pattern followed. A gradually cooling climate, for example, may result in a population of foxes developing progressively thicker coats over successive generations. This pattern of gradual evolutionary change occurs in a population of interbreeding organisms evolving together. When two or more populations diverge, they may evolve to be less alike or more alike, depending on the conditions of their divergence.

In the pattern known as divergent evolution, after two segments of a population diverge, each group follows an independent and gradual process of evolutionary change, leading them to grow increasingly different from each other over time. Over many generations, the two segments of the population look less and less like each other and their ancestor species. For example, when the Colorado River formed the Grand Canyon, a geographic barrier developed between two populations of antelope-squirrels. The groups diverged, resulting in two distinct species of antelope squirrel that have different physical characteristics. On the south rim of the canyon is Harris’s antelope squirrel, while just across the river on the north rim is the smaller, white-tailed antelope squirrel.

Sometimes divergence occurs simultaneously among several populations of a single species. In this process, known as adaptive radiation, members of the species quickly disperse to take advantage of the many different types of habitat niches, that is, the different ways of obtaining food and shelter in their environment. Such specialization ultimately results in many genetically distinctive but similar-looking species. This commonly occurs when a species colonizes a new habitat in which it has little competition. For example, a flock of one species of birds may arrive on some sparsely populated islands. Finding little competition, the birds may evolve rapidly into several species, each adapted to one available niche. Charles Darwin noted an instance of adaptive radiation on his visit to the Galápagos Islands off the coast of South America. He surmised that one species of finch colonized the islands thousands of years ago and produced the fourteen species of finch-like birds that exist there now. Darwin observed that the greatest differences in their appearance lay in the shapes of their bills, adapted for their modes of eating. Some species possessed large beaks for cracking seeds. Others had smaller beaks for eating vegetation, and still others featured long, thin beaks for eating insects.

Sometimes distantly related species evolve in ways that make them appear more closely related. This pattern, known as convergent evolution, occurs when members of distantly related species occupy similar ecological niches. Natural selection favours similar adaptations in each population.

Noticeable examples of convergent evolution are the marsupial mammals of Australia and their placental mammal counterparts on other continents. About fifty million years ago, the Australian continent separated from the rest of the Earth's continents. Biologists speculate that few if any placental mammals had migrated to Australia by the time the continents split. They also surmise that neither marsupial mammals nor their placental counterparts could cross the ocean after the landmasses drifted apart. As a result, the animals evolved entirely independently. Yet many marsupial mammals in Australia strongly resemble many placental mammals found on other continents.

For example, the marsupial mole of Australia looks very much like the placental moles found on other continents, yet these animals have evolved entirely independently of one another. The explanation for the moles' similar appearances lies in the principles of convergent evolution. Both species evolved to exploit similar ecological niches, in this case the realm just beneath the surface of the ground. Over millions of generations in both marsupial and placental moles, natural selection favoured adaptations suited for a life of burrowing: tube-shaped bodies; broad, shovel-like feet; and short, silky fur that sheds dirt or sand easily. The most striking difference between placental moles and marsupial moles is the colour of their fur. Placental moles are usually dark brown or gray, a colouration that enables them to blend in with the soil in their habitat. Marsupial moles burrow in the golden or reddish sand of Australia, so natural selection produced golden or golden-red fur.

Often two or more organisms in an ecosystem fall into evolutionary step with one another, each adapting to changes in the other, a pattern known as coevolution. Coevolution is often apparent in flowers and their pollinators. Hummingbirds, for example, have long, narrow beaks and a poor sense of smell, and they are attracted to the colour red. Fuchsias, flowering plants that rely on hummingbirds for pollination, usually have long, slender flowers in various shades of red, and they have almost no fragrance. What at first may seem a remarkable coincidence is, in fact, the product of thousands of generations of evolutionary fine-tuning. More likely to attract hummingbirds than fuchsias with different colouration, red-flowered individuals had greater reproductive success. Hummingbirds tended to spend more time extracting nectar from fuchsia flowers whose shapes matched the size of their slender beaks, thus increasing the likelihood of successful pollination. Similarly, those hummingbirds with long, slender beaks were best able to collect nectar from the long-necked flowers. Over many generations, long, slender beaks became the rule, rather than the exception, in hummingbird populations.

Species do not change overnight, or even during one lifetime. Evolutionary change usually occurs in tiny, almost imperceptible increments over thousands of generations, periods that range from decades to millions of years. To study the evolutionary relationships among organisms, scientists must derive indirect clues from the fossil record, patterns of animal distribution, comparative anatomy, and molecular biology, and, finally, from direct observation in laboratories and the natural environment.

One way biologists learn about the evolutionary relationships between species is by examining fossils. These ancient remains of living things are created when a dead plant or animal is buried under layers of mud or sand that gradually turn into stone. Over time, the organism's remains themselves may turn to stone, becoming preserved within the rock layer in which they came to rest. By measuring radioactivity in the rock in which a fossil is embedded, paleontologists (scientists who study the fossil record) can determine the age of a fossil.
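The age determination mentioned here rests on exponential radioactive decay, which can be computed directly. The helper below is an illustrative sketch; the potassium-40 half-life of roughly 1.25 billion years is a commonly cited figure for one isotope used to date volcanic rock.

```python
import math

def radiometric_age(fraction_remaining, half_life_years):
    """Age from exponential decay: N/N0 = (1/2) ** (t / half_life),
    rearranged to t = half_life * log2(N0 / N)."""
    return half_life_years * math.log2(1.0 / fraction_remaining)

# A sample retaining one quarter of its original potassium-40
# (half-life about 1.25 billion years) has sat through two half-lives.
print(radiometric_age(0.25, 1.25e9))  # 2500000000.0
```

In practice geologists measure the ratio of the parent isotope to its decay product rather than the remaining fraction directly, but the underlying arithmetic is the same.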

Fossils present a vivid record of the earliest life on Earth, and of a progression over time from simple to more-complex life forms. The earliest fossils, for example, are those of primitive bacteria, some of which are 3.5 billion years old and are embedded in the most ancient layers of rock. The first animal fossils appear as primitive jellyfish dating from about 680 million years ago. Still more-complex forms, such as the first vertebrates (animals with backbones), are documented by fossils some 570 million years old. Fossils also show that the first mammals appeared roughly 200 million years ago.

Although these ancient forms of life have not existed on Earth for millions of years, scientists have often been able to show a clear evolutionary line between extinct animals and their modern descendants. The horse's lineage, for example, can be traced back about fifty million years to a four-toed animal about the size of a dog. Fossils provide evidence of several different transitional forms between this ancient horselike animal and the modern species. In another example, the extinct, winged creature Archaeopteryx lived about 145 million years ago. Its fossil shows the skeleton of a dinosaur and the feathers of a bird. Many paleontologists consider this creature an intermediate step in the evolution of reptilian dinosaurs into modern birds. Fossils show clear evidence that the earliest human species had many apelike features. These features included large, strong jaws and teeth; short stature; long, curved fingers; and faces that protruded outward from the forehead. Later species evolved progressively more humanlike features.

Scientists also learn about evolutionary principles by studying the geographical distribution of different species of plants and animals in their natural states and how they relate to their environments and to one another. In particular, populations that exist on islands provide living clues of patterns of evolution. The study of these evolutionary relationships, known as island biogeography, has its roots in Darwin's observations of the adaptive radiation of the Galápagos finches. The Hawaiian Islands provide similar examples, particularly in the species of birds known as honeycreepers. Like the Galápagos finches, the honeycreepers of Hawaii evolved from a common ancestor and branched into several species, showing a striking variety of beak shapes adapted for obtaining different food sources in their various niches.

Detailed study of the internal and external features of different living things, a discipline known as comparative anatomy, also provides a wealth of information about evolution. The arm of a human, the flipper of a whale, the foreleg of a horse, and the wing of a bird have different forms and are adapted to different functions. Yet they correspond in some way, and this correspondence extends to many details. The arm, flipper, foreleg, and wing, for example, each show a similar bone structure. The study of comparative anatomy has revealed many instances of such correspondence within various groups of organisms, and these bodily structures are said to be homologous. Evolutionary biologists suggest that such homologous structures originated in a common ancestor. The differences arose as each group diverged from the common ancestor and adapted to different ways of life. The more recent the common ancestor, the more similar the species.

The skeletons of humans, for instance, retain evidence of a tail-like structure that is probably a relic from earlier mammalian ancestors. This feature, called the coccyx, or more commonly, the tailbone, has little apparent function in modern humans. Relic features such as the coccyx are called vestigial organs. Another vestigial organ in humans is the appendix, a narrow tube attached to the large intestine. In some plant-eating mammals, the appendix is a functioning organ that helps to digest plant material. In humans, however, the organ lacks this purpose and is considerably reduced in size, serving only as a minor source of certain white blood cells that guard against infection.

The field of embryology, the study of how organisms develop from a fertilized egg until they are ready for birth or hatching, also provides evolutionary clues. Scientists have noted that in the earliest stages of development, the embryos of organisms that share a recent common ancestor are very similar in appearance. As the embryos develop, they grow less similar. For example, the embryos of dogs and cats, both members of the mammal order Carnivora, are more similar in the early stages of development than just before birth. The same is true of human and ape embryos.

In the last few decades, researchers in molecular biology have sought evolutionary clues at the smallest level: within the molecules of living organisms. Despite the enormous variety of form and function seen in living things, the underlying genetic code, the molecular building material of life, displays a striking uniformity. Most living organisms have DNA, and in each case it consists of different pairings of the same building blocks: four nucleotide bases called adenine, thymine, guanine, and cytosine. Using different combinations of these bases, DNA directs the assembly of amino acids into functional proteins. The same uniform code operates within all living things.

These molecules contain more than the master plan for living organisms; each is also a record of an organism's evolutionary history. By examining the makeup of such molecules, scientists gain insights into how different species are related. For example, scientists compare the protein cytochrome from different species. In closely related species, the proteins have amino-acid sequences that are very similar, perhaps varying by one or a few amino acids. More distantly related organisms generally have proteins with fewer similarities. The more distant the relationship, the less alike the proteins.

The idea that species become genetically more different as they diverge from a common ancestor laid the groundwork for the concept of the molecular clock. Scientists know that, statistically, neutral mutations tend to accumulate at a regular rate, like ticks of a clock. Therefore, the number of molecular differences in a shared molecule is proportional to the time that has elapsed since the species had the same ancestor. This calculation has provided new knowledge of the evolutionary relationship between modern apes and modern humans. The ‘molecular clock’ concept is controversial, however, and has caused much disagreement between evolutionary scientists who study molecules and those who study fossils. This disagreement arises particularly when the molecular clock time estimates do not agree with the estimates derived from studying the fossil record.
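The molecular-clock reasoning can be reduced to one line of arithmetic: because neutral differences accumulate along both diverging lineages, the time since the common ancestor is the observed divergence divided by twice the per-year rate. The numbers below are illustrative assumptions (a 1.2% sequence divergence and a rate of 1e-9 substitutions per site per year), chosen only because they land near the six-to-seven-million-year human-chimpanzee estimate quoted earlier.

```python
def divergence_time(divergence_per_site, rate_per_site_per_year):
    """Molecular clock: neutral substitutions accumulate at a roughly
    constant rate along BOTH lineages after a split, so the elapsed
    time is divergence / (2 * rate)."""
    return divergence_per_site / (2.0 * rate_per_site_per_year)

# Illustrative inputs: 1.2% divergence at 1e-9 substitutions/site/year.
years = divergence_time(0.012, 1.0e-9)
print(f"{years:,.0f} years")  # 6,000,000 years
```

The controversy noted above stems largely from the rate: if the assumed substitution rate is wrong, or not constant across lineages, the estimated divergence time shifts proportionally.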

Information about evolutionary processes is also obtained by direct observation of species that undergo rapid modification in only a few generations. One of the most powerful tools in the study of evolutionary mechanisms is also one of the tiniest: the common fruit fly. These insects have short life spans and, therefore, short generations. This enables researchers to observe and manipulate fruit fly reproduction in the laboratory and learn about evolutionary change in the process.

Scientists also study organisms in their natural environments to learn about evolutionary processes, for example, how insects develop genetic resistance to human-made pesticides, such as DDT. While pesticides are often initially effective in killing crop-destroying pests, sometimes the insect populations bounce back. In every insect population a few individual insects are not affected by the pesticide. The pesticide wipes out most of the population, leaving only the genetically resistant individuals to multiply and flourish. Gradually, resistant individuals predominate in the population, and the pesticide loses its effectiveness. The same phenomenon has been observed in strains of disease-causing bacteria that have become resistant to even the most powerful antibiotics. Bacterial resistance forces scientists to develop new antibacterial compounds continuously. Scientists have hoped that curbing overuse of antibiotics might cause the drugs to become effective again. Recent research, however, suggests that bacteria may retain their resistance to antibiotics over many generations, even if they have not been exposed to the agent.

Beyond studying how life changes and diversifies over time, some evolutionary biologists are trying to understand how life originated on Earth. This too requires the careful examination and interpretation of many indirect clues. In one well-known series of experiments in 1953, the American chemists Stanley L. Miller and Harold C. Urey attempted to reproduce the atmosphere of the primitive Earth nearly four billion years ago. They circulated a mixture of gases believed to have been present at the time (hydrogen, methane, ammonia, and water vapour) over water in a sterile glass container. They then subjected the gases to the energy of electrical sparks, simulating the action of lightning on the primitive Earth. After about a week, the fluid turned brown and was found to contain amino acids, the building blocks of proteins. Subsequent work by these scientists and others also succeeded in producing nucleotides, the building blocks of DNA and other nucleic acids. While the artificial generation of these molecules in laboratories did not produce a living organism, this research offers some support that the first building blocks of life could have arisen from raw materials that were present in the environment of the primitive Earth.

Other theories regarding the origin of life on Earth point to outer space. Molecules formerly believed to be produced only by living systems have been found to form spontaneously, in great abundance, in space. Some scientists speculate that the building blocks of early life might have reached the primitive Earth on meteorites or from the dust of a comet tail.

Even once all the raw materials were in place, nucleic acids, proteins, and the other components of simple cells, it is not clear how the first self-replicating life forms came about. Recent theories centre on the role of a particular nucleic acid, ribonucleic acid (RNA), which in modern cells carries out the task of translating the instructions coded in DNA for the assembly of proteins. RNA can also act as a catalyst, that is, cause other chemical reactions and, perhaps most significantly, make copies of itself. Some scientists believe that the first self-replicating organisms were based on RNA.

According to the fossil record, the first single-celled bacteria appeared some 3.5 billion to 3.9 billion years ago. These microscopic creatures lived in the water, converting the Sun's light into chemical energy. This metabolic process, called photosynthesis, released oxygen gas as a byproduct. Photosynthesis slowly changed the composition of the early atmosphere, adding more oxygen to what scientists believe was a mixture of sulfur and carbon gases and water vapour. Perhaps two billion years ago, more-complex cells appeared. These were the first eukaryotic cells, containing a nucleus and other organized internal structures. At around the same time, the oxygen in the Earth's atmosphere increased to nearly its present level, another step crucial to the development of early life. Around one billion years ago, the first multicellular life forms began to appear. The beginning of the Cambrian Period (around 540 million years ago), known as the Cambrian explosion, marked an enormous expansion in the diversity and complexity of life. Following this great diversification, plant life found its way to land, while the first fishes evolved, ultimately giving rise to amphibians. Later came reptiles and, later still, mammals. The tumult of evolution was in full swing, as it remains today.

The origins of life on Earth have been a source of speculation among philosophers, religious thinkers, and scientists for thousands of years. Many human civilizations developed rich and complex creation stories and myths to explain the presence of living organisms. Ancient Greek philosophers and scientists were among the earliest to apply the principles of modern science to the mysterious complexity and variety of life around them. During early Christian times, ancient Greek ideas gave way to Creationism, the view that a single God created the universe, the world, and all life on Earth. For the next 1,500 years, evolutionary science was at a standstill. The dawn of the Renaissance in the early 14th century brought a renewed interest in science and medicine. Advances in anatomy highlighted physical similarities in the features of widely different organisms. Fossils provided evidence that life on this planet was vastly different millions of years ago. With each development came new ideas and theories about the nature of life.

The Greek philosopher Anaximander, who lived in the 500s BC, is generally credited as the earliest evolutionist. Anaximander believed that the Earth first existed in a liquid state. Further, he believed that humans evolved from fish-like aquatic beings who left the water once they had developed sufficiently to survive on land. The Greek scientist Empedocles speculated in the 400s BC that plant life arose first on Earth, followed by animals. Empedocles proposed that humans and animals arose not as complete individuals but as various body parts that joined randomly to form strange, fantastic creatures. Some of these creatures, being unable to reproduce, became extinct, while others thrived. Outlandish as his ideas seem today, Empedocles’ thinking anticipates the fundamental principles of natural selection.

The Greek philosopher and scientist Aristotle, who lived in the 300s BC, referred to a ‘ladder of nature’, a progression of life forms from lower to higher, but his ladder was a static hierarchy of levels of perfection, not an evolutionary concept. Each rung on this ladder was occupied by organisms of higher complexity than the rung before it, with humans occupying the top rung. Aristotle acknowledged that some organisms are incapable of meeting the challenges of nature and so cease to exist. As he saw it, successful creatures possessed a gift, or perfecting principle, that enabled them to rise to meet the demands of their world. Creatures without the perfecting principle died out. In Aristotle’s view it was this principle, not evolution, that accounted for the progression of forms in nature.

Many centuries later, the idea of a perfect and unchanging natural world, the product of divine creation, was predominant, not only in religion and philosophy but in science. Gradually, however, as knowledge accumulated from seemingly disparate areas, the beginnings of modern evolutionary theory began to take shape. A key figure in this regard was the Swedish naturalist Carolus Linnaeus, who became known as the father of modern taxonomy, the science of classifying organisms.

In his major work Systema Naturae (The System of Nature), first published in 1735, Linnaeus devised a system of classification of organisms that is still in use today. This system places living things within increasingly specific categories based on common attributes, from a general grouping (kingdom) down to the specific individual (species). Using this system, Linnaeus named nearly 10,000 plant and animal species in his lifetime. Not an evolutionist by any means, Linnaeus believed that each species was created by God and was incapable of change. Nevertheless, his orderly groupings of living things provided important insights for later theorists. Perhaps the most prominent of those who embraced the idea of progressive change in the living world was the early 19th-century French biologist Jean-Baptiste Lamarck. Lamarck’s theory, now known as Lamarckism and based in part on his study of the fossils of marine invertebrates, was that species do change over time. He believed, furthermore, that animals evolve because unfavourable conditions produce needs that animals try to satisfy. For example, short-necked ancestors of the modern giraffe voluntarily stretched their necks to reach leaves high in trees during times when food was scarce. Proponents of Lamarckism thought that this voluntary use slightly changed the hereditary characteristics controlling neck growth; the giraffe then transmitted these alterations to its offspring as what Lamarck called acquired characteristics. Modern scientists know that adaptation and natural selection are far more complicated than Lamarck supposed and do not depend on an animal’s voluntary efforts. Nonetheless, the idea of acquired characteristics, with Lamarck as its most famous proponent, persisted for many years.
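Linnaeus’s original scheme used fewer ranks than taxonomists use today (phylum and family, for instance, were added after his time), but the nested idea survives unchanged. A short illustration, using the familiar modern classification of our own species:

```python
# The nested Linnaean hierarchy, from the most general grouping down
# to the specific species, illustrated with the classification of humans.
ranks = ["kingdom", "phylum", "class", "order", "family", "genus", "species"]
human = ["Animalia", "Chordata", "Mammalia", "Primates",
         "Hominidae", "Homo", "Homo sapiens"]

for rank, taxon in zip(ranks, human):
    print(f"{rank:>8}: {taxon}")
```

Each rank narrows the grouping above it, which is exactly the property that later let evolutionists reinterpret the hierarchy as a family tree.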

French naturalist and paleontologist Georges Cuvier feuded with Lamarck. Unearthing the fossils of mastodons and other vanished species, Cuvier produced proof of long-extinct life forms on Earth. Unlike Lamarck, however, Cuvier did not believe in evolution. Instead, Cuvier believed that floods and other cataclysms destroyed such ancient species. He suggested that after each cataclysmic event, God created a new set of organisms.

At around the same time that Cuvier and Lamarck were squabbling, British economist Thomas Robert Malthus proposed ideas extremely influential in evolutionary theory. In his 1798 work An Essay on the Principle of Population, Malthus theorized that the human population would increase at a much greater rate than its food sources. This theory introduced the key idea of competition for limited resources, that is, there is not enough food, water, and living space to go around, and organisms must somehow compete with each other to obtain the resources necessary for survival. Another key idea came from Scottish geologist Charles Lyell, who supplied a deeper understanding of Earth’s history. In his book Principles of Geology (1830), Lyell set forth his case that the Earth was millions of years old rather than only a few thousand years old, as was maintained by those who accepted the biblical story of divine creation as fact.
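Malthus’s contrast between geometric and arithmetic growth can be made concrete with a small sketch. The doubling rate and the fixed food increment below are illustrative choices, not Malthus’s own figures:

```python
def malthus(generations=8, pop=1.0, food=1.0):
    """Population doubles each generation (geometric growth) while the
    food supply grows by a fixed increment (arithmetic growth)."""
    rows = []
    for g in range(generations):
        rows.append((g, pop, food, pop / food))
        pop *= 2      # geometric: 1, 2, 4, 8, ...
        food += 1     # arithmetic: 1, 2, 3, 4, ...
    return rows

for g, pop, food, ratio in malthus():
    print(f"gen {g}: population {pop:6.0f}, food {food:2.0f}, "
          f"mouths per unit of food {ratio:5.1f}")
```

After eight generations the population in this sketch has outstripped the food supply sixteenfold; that widening gap is the competition for limited resources that Darwin later put at the heart of natural selection.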

In 1831, Charles Darwin, who was intending to become a country minister, had an opportunity to sail as ship’s naturalist aboard the HMS Beagle on a five-year, around-the-world mapmaking voyage. During the journey, as the ship anchored off South America and other distant shores, Darwin had the opportunity to travel inland and make observations of the natural world. In the Galápagos Islands, he noted how species on the various islands were similar but distinct from one another. He also observed fossils and other geological evidence of the Earth's great age. The observations Darwin made on that voyage seemed to suggest the evolution, rather than the creation, of the many local forms of life.

In 1837, shortly after returning to England, Darwin began a notebook of his observations and thoughts on evolution. Although Darwin had developed the major components of his theory of evolution by natural selection in an unpublished 1842 paper circulated among his friends, he was unwilling to publish his results until he could present as complete a case as possible. He laboured for almost twenty additional years on his theory of evolution and on its primary mechanism, natural selection. In 1858 he received a letter from British naturalist Alfred Russel Wallace, a professional collector of wildlife specimens. Much to Darwin’s surprise, Wallace had independently hit upon the idea of natural selection to explain how species are modified by adapting to different conditions. Not wanting Darwin to be unfairly deprived of his share of the credit for the theory, some of Darwin’s scientific colleagues presented extracts of Darwin’s work along with Wallace’s paper at a meeting of the Linnean Society, a London-based science organization, in June 1858. Wallace’s paper stimulated Darwin to finish his work and get it into print. Darwin published On the Origin of Species by Means of Natural Selection on November 24, 1859. All 1,250 copies of the first printing were sold on that day.

Darwin’s book and the theory it popularized, evolution through natural selection, set off a storm of controversy. Some protest came from the clergy and other religious thinkers. Other objections came from scientists. Many scientists continued to believe in Lamarckism, the idea that living things could consciously strive to accumulate modifications during a lifetime and could pass these traits on to their offspring. Other scientists objected to the seemingly random quality of natural selection. If natural selection depended upon random combinations of traits and variations, critics asked, how could it account for such refined and complex structures as the human eye? Perhaps the most serious question, one for which Wallace and Darwin had no answer, concerned the inheritance of traits. How exactly were traits passed along to offspring?

Darwin did not know it, but the answer was nearby, although it would not be acknowledged in his lifetime. In the Augustinian monastery at Brünn (now Brno in the Czech Republic), Austrian monk Gregor Mendel experimented with the breeding of garden peas, observing how their traits were passed down through generations. In crossbreeding pea plants to produce different combinations of traits (colour, height, smoothness, and other characteristics), Mendel noted that although a given trait might not appear in every generation, the trait did not disappear. Mendel discovered that the expression of traits hinged on whether the traits were dominant or recessive, and on how these dominant and recessive traits combined. He learned that, contrary to what most scientists believed at the time, the mixing of traits in sexual reproduction did not result in a random blending: traits were passed along in discrete units. These units are now known as genes. Mendel carried out hundreds of experiments and produced precise statistical models and principles of heredity, now known as Mendel’s Laws, showing how dominant and recessive traits are expressed over generations. No one appreciated the significance of Mendel’s work until after his death, but it ultimately gave birth to the modern field of genetics.
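Mendel’s ratios follow directly from enumerating the ways parental units (alleles, in modern terms) can combine. A minimal sketch, using ‘A’ for a hypothetical dominant allele and ‘a’ for its recessive counterpart:

```python
from itertools import product
from collections import Counter

def cross(parent1, parent2):
    """Count the offspring genotypes of a cross by pairing every allele
    of one parent with every allele of the other."""
    return Counter("".join(sorted(pair)) for pair in product(parent1, parent2))

# Crossing true-breeding AA and aa plants gives only Aa hybrids: the
# recessive trait disappears from view, but its unit does not.
f1 = cross("AA", "aa")
print(f1["Aa"])  # 4 of 4 combinations are Aa

# Crossing two Aa hybrids brings the recessive trait back in a quarter
# of offspring: the 1 AA : 2 Aa : 1 aa pattern behind Mendel's 3:1 ratio.
f2 = cross("Aa", "Aa")
print(f2["AA"], f2["Aa"], f2["aa"])  # 1 2 1
```

The counting shows why a hidden trait can skip a generation and reappear: the unit is carried along intact even when it is not expressed.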

In 1900, the Dutch botanist Hugo Marie de Vries and others independently discovered Mendel’s laws. The following year, de Vries's book The Mutation Theory challenged Darwin's concept of gradual changes over long periods by proposing that evolution occurred in abrupt, radical steps. Having observed new varieties of the evening primrose plant coming into existence in a single generation, de Vries had subsequently determined that sudden change, or mutation, in the genetic material was responsible. As the debate over evolution continued in the early 20th century, some scientists came to believe that mutation, and not natural selection, was the driving force in evolution. In the face of these mutationists, Darwin's central theory threatened to fall out of favour.

As the science of genetics advanced during the 1920s and 1930s, several key scientists forged a link between Mendel’s laws of inheritance and the theory of natural selection proposed by Darwin and Wallace. British mathematician Sir Ronald Fisher, British geneticist J.B.S. Haldane, and American geneticist Sewall Wright pioneered the field of population genetics. By mathematically analysing the genetic variation in entire populations, these scientists showed that natural selection, and not just mutation, could result in evolutionary change.
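The style of calculation these population geneticists pioneered can be illustrated with the standard one-locus selection model: track the frequency of an allele when one genotype has reduced fitness. The fitness values and alleles below are illustrative assumptions for the sketch, not figures from Fisher, Haldane, or Wright:

```python
def select(p, w_AA=1.0, w_Aa=1.0, w_aa=0.8, generations=50):
    """Deterministic one-locus, two-allele selection model.
    p is the frequency of allele A; w_AA, w_Aa, w_aa are the genotype
    fitnesses. Each generation, genotypes reproduce in proportion to
    their fitness, shifting the allele frequency."""
    for _ in range(generations):
        q = 1 - p
        w_bar = p*p*w_AA + 2*p*q*w_Aa + q*q*w_aa   # mean population fitness
        p = (p*p*w_AA + p*q*w_Aa) / w_bar          # A's frequency next generation
    return p

# Selection alone, with no new mutation, steadily raises A's frequency.
print(select(0.1, generations=1) > 0.1)   # True
print(select(0.1) > 0.5)                  # True after 50 generations
```

With no fitness differences the frequency stays put, so any change the model produces is attributable to selection, which was precisely the population geneticists’ point.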

Further investigation into population genetics and such fields as palaeontology, taxonomy, biogeography, and the biochemistry of genes eventually led to what is called the modern synthesis. This modern view of evolution integrated discoveries and ideas from many different disciplines. In so doing, it reconciled the many disparate ideas about evolution into the all-encompassing evolutionary science studied today. The modern synthesis was advanced in such books as Genetics and the Origin of Species, published in 1937 by the Russian-born American geneticist Theodosius Dobzhansky; Evolution: The Modern Synthesis (1942) by British biologist Sir Julian Huxley; and Systematics and the Origin of Species (1942) by German-born American evolutionary biologist Ernst Mayr. In 1942, American paleontologist George Gaylord Simpson showed from the fossil record that rates and modes of evolution are correlated: new kinds of organisms arise when their ancestors invade a new niche, and they evolve rapidly to exploit the conditions in the new environment. In the late 1940s American botanist G. Ledyard Stebbins showed that plants display evolutionary patterns similar to those of animals, and especially that plant evolution has shown diverse adaptive responses to environmental pressures and opportunities.

In addition, biologists reviewed a broad range of genetic, ecological, and anatomical evidence to show that observation and experiment strongly supported the modern synthesis. The theory has formed the basis of evolutionary science since the 1950s. It has also led to an effort to classify organisms according to their evolutionary history and their physical similarities. Modern scientists use the principles of genetics and molecular biology to study relationships first proposed by Carolus Linnaeus more than 200 years ago.

In 1953, American biochemist James Watson and British biophysicist Francis Crick described the three-dimensional shape of DNA, the molecule that contains hereditary information in nearly all living organisms. In the following decade, geneticists developed techniques to compare DNA and proteins from different organisms rapidly. In one such procedure, electrophoresis, geneticists evaluate different specimens of DNA or proteins by observing how they behave in the presence of a slight electric charge. Such techniques opened entirely new ways to study evolution. For the first time geneticists could quantitatively determine, for example, the genetic change that occurs during the formation of new species.

Electrophoresis and other biochemical techniques also proved to geneticists that populations varied extensively at the molecular level. They learned that much of the population variation at the molecular or biochemical level has no apparent benefit. In 1968 Japanese geneticist Motoo Kimura proposed that much of the variation at the molecular level results not from the forces of natural selection, but from chance mutations that do not affect an organism's fitness. Not all scientists agree with the neutral gene theory.

In recent decades, another branch of evolutionary theory has appeared, as researchers have explored the possibility that not only physical traits but behaviour itself might be inherited. Behavioural geneticists have studied how genes influence behaviour, and more recently, the role of biology in social behaviour has been explored. This field of investigation, known as sociobiology, was inaugurated in 1975 with the publication of the book Sociobiology: The New Synthesis by American evolutionary biologist Edward O. Wilson. In this book, Wilson proposed that genes influence much of animal and human behaviour, and that these behavioural characteristics are also subject to natural selection.

Sociobiologists examine animal behaviours called altruistic, that is, unselfish, or demonstrating concern for the welfare of others. When birds feed on the ground, for example, one individual may notice a predator and sound an alarm. In so doing, the bird also calls the predator’s attention to itself. What can account for the behaviour of such a sentry, which seems to derive no evolutionary benefit from its unselfish behaviour and so seems to defy the laws of natural selection?

Darwin was aware of altruistic social behaviour in animals, and of how this phenomenon challenged his theory of natural selection. Among the different types of bees in a colony, for example, worker bees are responsible for collecting food, defending the colony, and caring for the nest and the young, but they are sterile and produce no offspring. In the hive, only the queen bee reproduces. If natural selection rewards those who have the highest reproductive success, how could sterile worker bees, which devote themselves to others and do not reproduce, come about by natural selection?

Scientists now recognize that among social insects, such as bees, wasps, and ants, the sterile workers are more closely related genetically to one another and to their fertile sisters, the queens, than brothers and sisters are among other organisms. By helping to protect or nurture their sisters, the sterile worker bees preserve their own genes more effectively than if they reproduced themselves. Thus, the altruistic behaviour evolved by natural selection.

Evolutionary theory has undergone many further refinements in recent years. One such theory challenges the central idea that evolution proceeds by gradual change. In 1972 the American paleontologists Stephen Jay Gould and Niles Eldredge proposed the theory of punctuated equilibria. According to this theory, trends in the fossil record cannot be attributed to gradual transformation within a lineage, but result from quick bursts of rapid evolutionary change. In Darwinian theory, new species arise by gradual, but not necessarily uniform, accumulation of many small genetic changes over long periods of geologic time. In the fossil record, however, new species generally appear suddenly after long periods of stasis, that is, no change. Gould and Eldredge recognized that speciation more likely occurs in small, isolated, peripheral populations than in the main population of the species, and that the unchanging nature of large populations contributes to the stasis of most fossil species over millions of years. Occasionally, when conditions are right, the equilibrium state becomes ‘punctuated’ by one or more speciation events. While these events probably require thousands or tens of thousands of years to establish effective reproductive isolation and distinctive characteristics, this is but an instant in geologic time compared with an average life span of more than ten million years for most fossil species. Proponents of this theory envision evolutionary development as more like climbing a flight of stairs (punctuations followed by stasis) than rolling up an inclined plane.

In the last several decades, scientists have questioned the role of extinction in evolution. Of the millions of species that have existed on this planet, more than 99 percent are extinct. Historically, biologists regarded extinction as a natural outcome of competition between newly evolved, adaptively superior species and their older, more primitive ancestors. Recently, however, paleontologists have discovered that many different, unrelated species living in large ecosystems tend to become extinct at nearly the same time. The cause is typically some sort of climate change or catastrophic event that produces conditions too severe for most organisms to endure. Moreover, new species evolve after the wave of extinction removes many species that had occupied a region for millions of years. Thus extinction does not result from evolution, but causes it.

Scientists have identified several instances of mass extinction, when species apparently died out on a huge scale. The greatest of these episodes occurred at the end of the Permian Period, some 245 million years ago. Then, according to estimates, more than 95 percent of species, nearly all life on the planet, died out. Another extensively studied mass extinction took place at the boundary of the Cretaceous Period and the Tertiary Period, roughly sixty-five million years ago, when the dinosaurs disappeared. In all, more than twenty global mass extinctions have been identified. Some scientists theorize that such events may even be cyclical, occurring at regular intervals.

One theory holds that something damaged the genetic material itself, the DNA carried in the chromosomes, inhibiting the transmission of the hereditary information necessary for cell growth.

Other theories have centred on abrupt changes in the levels of the world’s oceans, for example, or on the effect of changing salinity on early sea life. Another theory blames catastrophic events for mass extinction. Strong evidence, for example, supports the theory that a meteorite some 10 km (6 mi) in diameter struck the Earth 65 million years ago. The dust cloud from the collision, according to this impact theory, shrouded the Earth for months, blocking the sunlight that plants need to survive. Without plants to eat, the dinosaurs and many other species of land animals were wiped out.

Extinction as a cause of evolution, rather than a result of it, is perhaps best illustrated by our own ancestors, the ancient mammals. During the time of the dinosaurs, mammals made up only a small fraction of the animals that roamed the planet. The demise of the dinosaurs provided an opportunity for mammals to expand their numbers and ultimately to become the dominant land animals. Without the catastrophe that took place sixty-five million years ago, mammals might have remained in the shadow of the dinosaurs. Extinction, however, is not exclusively a natural phenomenon. For thousands of years, as the human species has grown in number and technological sophistication, we have shown our power to cause extinction and to upset the world's ecological balance. In North America alone, for example, about forty species of birds and more than thirty-five species of mammals have become extinct in the last few hundred years, mostly from human activity. Humans have exterminated plants and animals by relentlessly hunting or harvesting them, by destroying and replacing their habitat with farms and other forms of development, by introducing foreign species that hunt or compete with local species, and by poisoning them with chemicals and other pollutants.

The rain forests of South America and other tropical regions offer a particularly troubling scenario. Upwards of fifty million acres of rain forest disappear every year as humans raze trees to make room for agriculture and livestock. Given that a single acre of rain forest may contain thousands of irreplaceable species of plant and animal life, the threat to biodiversity is severe. The conservation of wildlife is now an international concern, as evidenced by treaties and agreements enacted at the 1992 Earth Summit in Rio de Janeiro, Brazil. In the United States, federal laws protect endangered species. The problem of dwindling biodiversity nonetheless seems certain to worsen as the human population continues to expand, and no one knows for sure how it will affect evolution.

Advances in medical technology may also affect natural selection. The mid-20th-century study showing that babies of medium birth weight were more likely to survive than their heavier or lighter counterparts would be difficult to reproduce today. Advances in neonatal medical technology have made it possible for small or premature babies to survive in much greater numbers.

Recent genetic analysis shows that the human population carries harmful mutations at unprecedented levels. Researchers attribute this to genetic drift acting on small human populations throughout history. They also expect that improved medical technology may exacerbate the problem. Better medicine enables more people to survive to reproductive age, even if they carry mutations that in past generations would have caused their early death. The genetic repercussions of this are still unknown, but biologists speculate that many minor problems, such as poor eyesight, headaches, and stomach upsets, may be attributable to our collection of harmful mutations.

Humans have also developed the potential to affect evolution at the most basic level: the genes. The techniques of genetic engineering have become commonplace. Scientists can extract genes from living things, alter them by combining them with another segment of DNA, and then place this recombinant DNA back inside the organism. Genetic engineering has produced pest-resistant crops and larger cows and other livestock. To an increasing extent, genetic engineers fight human diseases such as cancer and heart disease. The investigation of gene therapy, in which scientists substitute functioning copies of a given gene for a defective gene, is an active field of medicine. Whether such tinkering with genetic material will affect evolution remains to be determined.

The most contentious debates over evolution have involved religion. From Darwin's day to the present, members of some religious faiths have perceived the scientific theory of evolution to be in direct and objectionable conflict with religious doctrine regarding the creation of the world. Most religious denominations, however, see no conflict between the scientific study of evolution and religious teachings about creation. Christian Fundamentalists and others who believe literally in the biblical story of creation reject evolutionary theory because it contradicts the book of Genesis, which describes how God created the world and all its plant and animal life in six days. Many such people maintain that the Earth is comparatively young, perhaps 6,000 to 8,000 years old, and that humans and all the world’s species have remained unchanged since their recent creation by a divine hand.

Opponents of evolution argue that only a divine intelligence, and not some comparatively random, undirected process, could have created the variety of the world's species, not to mention an organism as complex as a human being. Some people are upset by the oversimplification that humans evolved from monkeys. In the eyes of some, a divine being placed humans apart from the animal world. Proponents of this view find any attempt to place humans within the context of natural history deeply insulting.

For decades, the teaching of evolution in schools has been a flash point in the conflict between religious fundamentalism and science. During the 1920s, Fundamentalists lobbied against the teaching of evolution in public schools. Four states (Arkansas, Mississippi, Oklahoma, and Tennessee) passed laws outlawing public-school instruction in the principles of Darwinian evolution. In 1925 John Scopes, a biology teacher in Dayton, Tennessee, assigned his students readings about Darwinism, in direct violation of state law. Scopes was arrested and placed on trial. In what was the major trial of its time, American defence attorney Clarence Darrow represented Scopes, while American politician William Jennings Bryan argued for the prosecution. Ultimately, Scopes was convicted and received a small fine. However, the ‘Monkey Trial’, as it became known, was seen as a victory for evolution, since Darrow, in cross-examining Bryan, succeeded in pointing out several serious inconsistencies in Fundamentalist belief.

Laws against the teaching of evolution were upheld for another forty years, until the Supreme Court of the United States, in its 1968 decision in Epperson v. Arkansas, ruled that such laws were an unconstitutional violation of the legally required separation of church and state. Over the next few years, Fundamentalists responded by de-emphasizing the religious content of their doctrine and instead casting their arguments as a scientific alternative to evolution called creation science, now also called intelligent design theory. In response to Fundamentalist pressure, twenty-six states debated laws that would require teachers to spend equal amounts of time teaching creation science and evolution. Only two states, Arkansas and Louisiana, passed such laws. The Arkansas law was struck down in federal district court, while proponents of the Louisiana law appealed all the way to the Supreme Court. In its 1987 decision in Edwards v. Aguillard, the Court struck down such equal-time laws, ruling that creation science is a religious idea and thus an illegal violation of the church-state separation. Despite these rulings, school board members and other government officials continue to grapple with the long-standing debate between creation and evolution. Even so, efforts to permit the teaching of intelligent design theory in public schools have been unsuccessful, as scientists have sought, and found, evidence for evolution. The fossil record demonstrates that life on this planet was vastly different millions of years ago. Fossils, furthermore, provide evidence of how species change over time. The study of comparative anatomy has highlighted physical similarities in the features of widely different species, proof of common ancestry. Bacteria that mutate and develop resistance to antibiotics, along with other observable instances of adaptation, demonstrate evolutionary principles at work.
The study of genes, proteins, and other molecular evidence has added to the understanding of evolutionary descent and the relationship among all living things. Research in all these areas has led to overwhelming support for evolution among scientists.

Nevertheless, evolutionary theory is still, in some cases, the cause of misconception or misunderstanding. People often misconstrue the phrase 'survival of the fittest'. Some people interpret this to mean that survival is the reward for the strongest, the most vigorous, or the most dominant. In the Darwinian sense, however, fitness does not necessarily mean strength so much as the capacity to adapt successfully. This might mean developing adaptations for more efficiently obtaining food, or escaping predators, or enduring climate change; in short, for thriving in a given set of circumstances.

Yet it bears repeating that organisms do not change their characteristics in direct response to the environment. The key is genetic variation within a population, and the potential for new combinations of traits. Nature will select those individuals that have developed the ideal characteristics with which to flourish in a given environment or niche. These individuals will have the greatest degree of reproductive success, passing their successful traits on to their descendants.

Another misconception is that evolution always progresses to better creatures. In fact, if species become too narrowly adapted to a given environment, they may ultimately lose the genetic variation necessary to survive sudden changes. Evolution, in such cases, will lead to extinction.

Human evolution is the lengthy process of change by which people originated from apelike ancestors. Scientific evidence shows that the physical and behavioural traits shared by all people evolved over a period of at least six million years.

One of the earliest defining human traits, bipedalism, walking on two legs as the primary form of locomotion, evolved more than four million years ago. Other important human characteristics, such as a large and complex brain, the ability to make and use tools, and the capacity for language, developed more recently. Many advanced traits, including complex symbolic expression, such as art, and elaborate cultural diversity, emerged mainly during the past 100,000 years.

Our closest living relatives are three surviving species of great apes: the gorilla, the common chimpanzee, and the pygmy chimpanzee (also known as the bonobo). Their confinement to Africa, along with abundant fossil evidence, suggests that the earliest stages of human evolution were also played out in Africa; human history, as something separate from the history of animals, began there about seven million years ago (estimates range from five to nine million years ago). Around that time, a population of African apes split into several populations, of which one went on to evolve into modern gorillas, a second into the two modern chimps, and the third into humans. The gorilla line apparently split off before the split between the chimp and the human lines.

Fossils indicate that the evolutionary line leading to us had achieved an upright posture by around four million years ago, then began to increase in body size and in relative brain size around 2.5 million years ago. Those protohumans are generally known as Australopithecus africanus, Homo habilis, and Homo erectus, which apparently evolved into each other in that sequence. Although Homo erectus, the stage reached around 1.7 million years ago, was close to us modern humans in body size, its brain size was still barely half of ours. Stone tools became common around 2.5 million years ago, but they were merely the crudest of flaked or battered stones. In zoological significance and distinction, Homo erectus was more than an ape, but still much less than a modern human.

All of that human history, for the first five or six million years after our origins about seven million years ago, remained confined to Africa. The first human ancestor to spread beyond Africa was Homo erectus, as attested by fossils discovered on the Southeast Asian island of Java and conventionally known as Java man. The oldest Java 'man' remains (of course, they may have belonged to a Java woman) have usually been held to date from about a million years ago. However, it has recently been argued that they date from 1.8 million years ago. (Strictly speaking, the name Homo erectus belongs to these Javan fossils, and the African fossils classified as Homo erectus may warrant a different name.) At present, the earliest unquestioned evidence for humans in Europe stems from around half a million years ago, but there are claims of an earlier presence. One would assume that the colonization of Asia also permitted the simultaneous colonization of Europe, since Eurasia is a single landmass not bisected by major barriers.

By about half a million years ago, human fossils had diverged from older Homo erectus skeletons in their enlarged, rounder, and less angular skulls. African and European skulls of half a million years ago were sufficiently similar to skulls of modern humans that they are classified in our species, Homo sapiens, instead of in Homo erectus. This distinction is arbitrary, since Homo erectus evolved into Homo sapiens. However, these early Homo sapiens still differed from us in skeletal details, had brains significantly smaller than ours, and were grossly different from us in their artifacts and behaviour. Modern stone-tool-making peoples, such as Yali's great-grandparents, would have scorned the stone tools of half a million years ago as very crude. The only significant addition to our ancestors' cultural repertoire that can be documented with confidence around that time was the use of fire.

No art, bone tools, or anything else has come down to us from early Homo sapiens except their skeletal remains and those crude stone tools. There were still no humans in Australia, because it would have taken boats to get there from Southeast Asia. There were also no humans anywhere in the Americas, because that would have required the occupation of the nearest part of the Eurasian continent (Siberia), and possibly boat-building skills as well. (The present shallow Bering Strait, separating Siberia from Alaska, alternated between a strait and a broad intercontinental bridge of dry land, as sea level repeatedly rose and fell during the Ice Ages.) However, boat building and survival in cold Siberia were both far beyond the capabilities of early Homo sapiens. After half a million years ago, the human populations of Africa and western Eurasia proceeded to diverge from each other and from East Asian populations in skeletal details. The population of Europe and western Asia between 130,000 and 40,000 years ago is represented by especially many skeletons, known as Neanderthals and sometimes classified as a separate species.

Yet their stone tools were still crude by comparison with modern New Guineans’ polished stone axes and were usually not yet made in standardized diverse shapes, each with a clearly recognizable function.

The few preserved African skeletal fragments contemporary with the Neanderthals are more similar to our modern skeletons than are Neanderthal skeletons. Even fewer preserved East Asian skeletal fragments are known, but they appear different again from both Africans and Neanderthals. As for the lifestyle of that time, the best-preserved evidence comes from stone artifacts and animal bones accumulated at southern African sites. Although those Africans of 100,000 years ago had more modern skeletons than did their Neanderthal contemporaries, they made essentially the same crude stone tools as Neanderthals, still lacking standardized shapes. They had no preserved art. To judge from the bones of the animal species they preyed on, their hunting skills were unimpressive and mainly directed at easy-to-kill, not-at-all-dangerous animals. They were not yet in the business of slaughtering buffalo, pigs, and other dangerous prey. They could not even catch fish: their sites immediately on the seacoast lack fish bones and fishhooks. They and their Neanderthal contemporaries still rank as less than fully human.

Although Neanderthals lived in glacial times and were adapted to the cold, they penetrated no farther north than northern Germany and Kiev. That is not surprising, for Neanderthals apparently lacked needles, sewn clothing, warm houses, and other technology essential to survival in the coldest climates. Anatomically modern peoples who did possess such technology had expanded into Siberia by around 20,000 years ago (there are the usual much older disputed claims). That expansion may have been responsible for the extinction of Eurasia's woolly mammoth and woolly rhinoceros. With the settlement of Australia/New Guinea, humans now occupied three of the five habitable continents. (We omit Antarctica because it was not reached by humans until the 19th century and has never had any self-supporting human population.) That left only two continents, North America and South America. They were reached last, for the obvious reason that reaching the Americas from the Old World required either boats (for which there is no evidence even in Indonesia until 40,000 years ago and none in Europe until much later) to cross by sea, or else the occupation of Siberia (unoccupied until about 20,000 years ago) to cross the Bering Strait. However, it is uncertain when, between about 14,000 and 35,000 years ago, the Americas were first colonized.

Meanwhile, human history at last took off around 50,000 years ago, with the earliest definite signs coming from East African sites with standardized stone tools and the first preserved jewellery (ostrich-shell beads). Similar developments soon appear in the Near East and in southeastern Europe, then (some 40,000 years ago) in southwestern Europe, where abundant artefacts are associated with fully modern skeletons of people termed Cro-Magnons. Thereafter, the garbage preserved at archaeological sites rapidly becomes ever more interesting and leaves no doubt that we are dealing with biologically and behaviourally modern humans.

Cro-Magnons' garbage heaps yield not only stone tools but also tools of bone, whose suitability for shaping (for instance, into fish hooks) had apparently gone unrecognized by previous humans. Tools were produced in such diverse and distinctive shapes that their functions as needles, awls, engraving tools, and so on are obvious to us. Instead of only single-piece tools such as hand-held scrapers, multi-piece tools made their appearance. Recognizable multi-piece weapons at Cro-Magnon sites include harpoons, spear-throwers, and eventually bows and arrows, the precursors of rifles and other multi-piece modern weapons. Those efficient means of killing at a safe distance permitted the hunting of such dangerous prey as rhinos and elephants, while the invention of rope for nets, lines, and snares allowed the addition of fish and birds to our diet. Remains of houses and sewn clothing testify to a greatly improved ability to survive in cold climates, and remains of jewellery and carefully buried skeletons indicate revolutionary aesthetic and spiritual developments.

Of the Cro-Magnons' preserved products, the best known are their artworks: their magnificent cave paintings, statues, and musical instruments, which we still appreciate as art today. Anyone who has experienced firsthand the overwhelming power of the life-sized painted bulls and horses in the Lascaux Cave of southern France will understand at once that their creators must have been as modern in their minds as they were in their skeletons.

Obviously, some momentous change took place in our ancestors' capabilities between about 100,000 and 50,000 years ago, presenting us with two major unresolved questions: its triggering cause and its geographic location. As for its cause, some argue for the perfection of the voice box, and hence for the anatomical basis of modern language, on which the exercise of human creativity is so dependent. Others have suggested instead that a change in brain organization around that time, without a change in brain size, made modern language possible.

As for the location of the leap, did it take place primarily in one geographic area, in one group of humans, who were thereby enabled to expand and replace the former human populations of other parts of the world? Or did it occur in parallel in different regions, in each of which the human populations living today would be descendants of the populations living there before the leap? The relatively advanced-looking human skull from Africa around 100,000 years ago has been taken to support the former view, with the leap occurring specifically in Africa. Molecular studies (of so-called mitochondrial DNA) were initially also interpreted as pointing to an African origin of modern humans, though the meaning of those molecular findings is currently in doubt. On the other hand, skulls of humans living in China and Indonesia hundreds of thousands of years ago are considered by some physical anthropologists to exhibit features still found in modern Chinese and in Aboriginal Australians, respectively. If true, that finding would suggest parallel evolution and multiregional origins of modern humans, rather than origins in a single Garden of Eden. The issue remains unresolved.

The evidence for a localized origin of modern humans, followed by their spread and then their replacement of other types of humans elsewhere, seems strongest for Europe. Some 40,000 years ago, into Europe came the Cro-Magnons, with their modern skeletons, superior weapons, and other advanced cultural traits. Within a few thousand years there were no more Neanderthals, who had been evolving as the sole occupants of Europe for hundreds of thousands of years. The sequence strongly suggests that the modern Cro-Magnons somehow used their far superior technology, and their language skills or brains, to infect, kill, or displace the Neanderthals, leaving behind no evidence of hybridization between Neanderthals and Cro-Magnons.

Physical and genetic similarities show that the modern human species, Homo sapiens, has a very close relationship to another group of primate species, the apes. Humans and the so-called great apes of Africa, chimpanzees (including bonobos, or so-called pygmy chimpanzees) and gorillas, share a common ancestor that lived sometime between eight million and six million years ago. The earliest humans evolved in Africa, and much of human evolution occurred on that continent. The fossils of early humans who lived between six million and two million years ago come entirely from Africa.

We should be reminded of the ways in which big domestic mammals were crucial to those human societies possessing them. Most notably, they provided meat, milk products, fertilizer, land transportation, leather, military assault vehicles, plow traction, and wool, as well as germs that killed previously unexposed peoples.

In addition, of course, small domestic mammals and domestic birds and insects have also been useful to humans. Many birds were domesticated for meat, eggs, and feathers: the chicken in China, various duck and goose species in parts of Eurasia, turkeys in Mesoamerica, guinea fowl in Africa, and the Muscovy duck in South America. Wolves were domesticated in Eurasia and North America to become our dogs, used as hunting companions, sentinels, pets, and, in some societies, food. Rodents and other small mammals domesticated for food include the rabbit in Europe, the guinea pig in the Andes, a giant rat in West Africa, and possibly a rodent called the hutia on Caribbean islands. Ferrets were domesticated in Europe to hunt rabbits, and cats were domesticated in North Africa and Southwest Asia to hunt rodent pests. Small mammals domesticated as recently as the 19th and 20th centuries include foxes, mink, and chinchillas grown for fur, and hamsters as pets. Even some insects have been domesticated, notably Europe's honeybee and China's silkworm moth, kept for honey and silk, respectively.

Many of these small animals thus yielded food, clothing, or warmth, but none of them pulled plows or wagons, none bore riders, none except dogs pulled sleds or became war machines, and none of them have been as important for food as have big domesticated mammals.

Most scientists distinguish among twelve to nineteen different species of early humans. Scientists do not all agree, however, about how the species are related or which ones simply died out. Many early human species, probably most of them, left no descendants. Scientists also debate how to identify and classify particular species of early humans, and what factors influenced the evolution and extinction of each species.

Early humans first migrated out of Africa into Asia probably between two million and 1.7 million years ago. They entered Europe later, generally within the past one million years. Species of modern humans populated many parts of the world much later. For instance, people first came to Australia probably within the past 60,000 years, and to the Americas within the past 35,000 years. The beginnings of agriculture and the rise of the first civilizations occurred within the past 10,000 years.

The scientific study of human evolution is called paleoanthropology, a subfield of anthropology, the study of human culture, society, and biology. Paleoanthropologists search for the roots of human physical traits and behaviour. They seek to discover how evolution has shaped the potentials, tendencies, and limitations of all people. For many people, paleoanthropology is an exciting scientific field because it illuminates the origins of the defining traits of the human species, as well as the fundamental connections between humans and other living organisms on Earth. Scientists have abundant evidence of human evolution from fossils, artifacts, and genetic studies. However, some people find the concept of human evolution troubling because it can seem to conflict with religious and other traditional beliefs about how people, other living things, and the world came to be. Yet many people have come to reconcile such beliefs with the scientific evidence.

All species of organisms originate through the process of biological evolution. In this process, new species arise from a series of natural changes. In animals that reproduce sexually, including humans, the term species refers to a group whose adult members regularly interbreed, resulting in fertile offspring, that is, offspring themselves capable of reproducing. Scientists classify each species with a unique two-part scientific name. In this system, modern humans are classified as Homo sapiens.

The mechanism for evolutionary change resides in genes, the basic units of heredity. Genes affect how the body and behaviour of an organism develop during its life. The information contained within genes can change through a process known as mutation. The way particular genes are expressed, that is, how they affect the body or behaviour of an organism, can also change. Over time, genetic change can alter a species' overall way of life, such as what it eats, how it grows, and where it can live.

Genetic changes can improve the ability of organisms to survive, reproduce, and, in animals, raise offspring. This process is called adaptation. Parents pass adaptive genetic changes to their offspring, and ultimately these changes become common throughout a population, a group of organisms of the same species that share a particular local habitat. Many factors can favour new adaptations, but changes in the environment often play a role. Ancestral human species adapted to new environments as their genes changed, altering their anatomy (physical body structure), physiology (bodily functions, such as digestion), and behaviour. Over long periods, evolution dramatically transformed humans and their ways of life.

Geneticists estimate that the human line began to diverge from that of the African apes between eight million and five million years ago (paleontologists have dated the earliest human fossils to at least six million years ago). This figure comes from comparing differences in the genetic makeup of humans and apes, and then calculating how long it probably took for those differences to develop. Using similar techniques and comparing the genetic variations among human populations around the world, scientists have calculated that all people may share common genetic ancestors that lived sometime between 290,000 and 130,000 years ago.
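The "genetic clock" reasoning in the paragraph above can be sketched in a few lines of code. This is an illustrative simplification, not the geneticists' actual method: the function name, the 1.5 percent difference figure, and the substitution rate are hypothetical round values chosen only to show how a divergence date falls out of an observed difference and an assumed rate.

```python
def divergence_time(observed_difference, rate_per_site_per_year):
    """Estimate years since two lineages split, assuming a constant
    molecular clock: differences accumulate independently on both
    branches, so total divergence grows at twice the per-lineage rate."""
    return observed_difference / (2.0 * rate_per_site_per_year)

# Hypothetical inputs: ~1.5% neutral sequence difference between two
# lineages, and a rate of 1e-9 substitutions per site per year.
t = divergence_time(0.015, 1e-9)
print(f"Estimated divergence: {t / 1e6:.1f} million years ago")
```

With these made-up numbers the estimate lands at 7.5 million years, inside the five-to-nine-million-year range quoted earlier; real studies calibrate the rate against fossils and use far more sophisticated models.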

Humans belong to the scientific order named Primates, a group of more than 230 species of mammals that also includes lemurs, lorises, tarsiers, monkeys, and apes. Modern humans, early humans, and other species of primates all have many similarities and some important differences. Knowledge of these similarities and differences helps scientists to understand the roots of many human traits, and the significance of each step in human evolution.

All primates, including humans, share at least part of a set of common characteristics that distinguish them from other mammals. Many of these characteristics evolved as adaptations for life in the trees, the environment in which earlier primates evolved. These include a greater reliance on sight than on smell; overlapping fields of vision, allowing stereoscopic (three-dimensional) vision; limbs and hands adapted for clinging to, leaping from, and swinging on tree trunks and branches; the ability to grasp and manipulate small objects (using fingers with nails instead of claws); large brains in relation to body size; and complex social lives.

The scientific classification of primates reflects evolutionary relationships between individual species and groups of species. Strepsirhine (meaning 'turned-nosed') primates, of which the living representatives include lemurs, lorises, and other groups of species all commonly known as prosimians, evolved earliest and are the most primitive forms of primates. The earliest monkeys and apes evolved from ancestral haplorhine (meaning 'simple-nosed') primates, of which the most primitive living representative is the tarsier. Humans evolved from ape ancestors.

Tarsiers have traditionally been grouped with prosimians, but many scientists now recognize that tarsiers, monkeys, and apes share some distinct traits, and group the three together. Monkeys, apes, and humans, who share many traits not found in other primates, together make up the suborder Anthropoidea. Apes and humans together make up the superfamily Hominoidea, a grouping that emphasizes the close relationship among the species of these two groups.

Strepsirhines are the most primitive types of living primates. The last common ancestors of strepsirhines and other mammals, creatures similar to tree shrews and classified as Plesiadapiformes, evolved at least sixty-five million years ago. The earliest primates evolved about fifty-five million years ago, and fossil species similar to lemurs evolved during the Eocene Epoch (about fifty-five million to thirty-eight million years ago). Strepsirhines share all of the basic characteristics of primates, although their brains are not particularly large or complex and they have a more elaborate and sensitive olfactory system (sense of smell) than do other primates.

Tarsiers are the only living representatives of a primitive group of primates that ultimately led to monkeys, apes, and humans. Fossil species called omomyids, with some traits similar to those of tarsiers, evolved near the beginning of the Eocene, followed by early tarsier-like primates. While the omomyids and tarsiers are separate evolutionary branches (and there are no living omomyids), they share a reduced olfactory system, a trait common to all haplorhine primates, including humans.

The anthropoid primates are divided into New World (South America, Central America, and the Caribbean Islands) and Old World (Africa and Asia) groups. New World monkeys, such as marmosets, capuchins, and spider monkeys, belong to the infraorder of platyrrhine (broad-nosed) anthropoids. Old World monkeys and apes belong to the infraorder of catarrhine (downward-nosed) anthropoids. Since humans and apes together make up the hominoids, humans are also catarrhine anthropoids.

The first catarrhine primates evolved between fifty million and thirty-three million years ago. Most primate fossils from this period have been found in a region of northern Egypt known as Al Fayyum (or the Fayum). A primate group known as Propliopithecus, one lineage of which is sometimes called Aegyptopithecus, had primitive catarrhine features, that is, it had many basic features that Old World monkeys, apes, and humans share today. Scientists believe, therefore, that Propliopithecus resembles the common ancestor of all later Old World monkeys and apes. Thus, Propliopithecus may also be considered an ancestor or a close relative of an ancestor of humans. The first apes evolved during the Miocene Epoch (twenty-four million to five million years ago). Among the oldest known hominoids is a group of primates known by its genus name, Proconsul. Species of Proconsul had features that suggest a close link to the common ancestor of apes and humans, for example, the lack of a tail. The species Proconsul heseloni lived in the trees of dense forests in eastern Africa about twenty million years ago. An agile climber, it had the flexible backbone and narrow chest characteristic of monkeys, but also a wide range of movement in the hip and thumb, traits characteristic of apes and humans.

Large ape species had originated in Africa by twenty-three million or twenty-two million years ago. By fifteen million years ago, some of these species had migrated to Asia and Europe over a land bridge formed between the African-Arabian and Eurasian continents, which had previously been separated.

Early in their evolution, the large apes underwent several radiations, periods when new and diverse species branched off from common ancestors. Following Proconsul, the ape genus Afropithecus evolved about eighteen million years ago in Arabia and Africa and diversified into several species. Soon afterward, three other ape genera evolved: Griphopithecus of western Asia about 16.5 million years ago, the earliest ape to have spread from Africa; Kenyapithecus of Africa about fifteen million years ago; and Dryopithecus of Europe about twelve million years ago. Scientists have not yet determined which of these groups of apes may have given rise to the common ancestor of modern African apes and humans.

Scientists do not all agree about the appropriate classification of hominoids. They group the living hominoids into either two or three families: Hylobatidae, Hominidae, and sometimes Pongidae. Hylobatidae consists of the small, so-called lesser apes of Southeast Asia, commonly known as gibbons and siamangs. The Hominidae (hominids) includes humans and, according to some scientists, the great apes. For those who include only humans in the Hominidae, all of the great apes, including the orangutans of Southeast Asia, belong to the family Pongidae.

In the past only humans were considered to belong to the family Hominidae, and the term hominid referred only to species of humans. Today, however, genetic studies support placing all of the great apes and humans together in this family, and placing the African apes (chimpanzees and gorillas) together with humans at an even lower level, or subfamily.

According to this reasoning, the evolutionary branch of Asian apes leading to orangutans, which separated from the other hominid branches nearly thirteen million years ago, belongs to the subfamily Ponginae. The ancestral and living representatives of the African ape and human branches together belong to the subfamily Homininae (sometimes called Hominines). Lastly, the line of early and modern humans belongs to the tribe (classificatory level above genus) Hominini, or hominins.

This classification corresponds with the genetic relationships between ape and human species. It groups humans and the African apes together at the same level at which scientists group together, for example, all types of foxes, all buffalo, or all flying squirrels. Within each of these groups, the species are very closely related. However, in the classification of apes and humans, the similar-sounding terms hominoid, hominid, hominine, and hominin can cause confusion. In this context, the term early human refers to all species of the human family tree since the divergence from a common ancestor with the African apes. Popular writing often still uses the term hominid to mean the same thing.

About 98.5 percent of the genes in people and chimpanzees are identical, making chimps the closest living biological relatives of humans. This does not mean that humans evolved from chimpanzees, but it does indicate that both species evolved from a common ape ancestor. Orangutans, the great apes of Southeast Asia, differ much more from humans genetically, indicating a more distant evolutionary relationship.

Modern humans have several physical characteristics reflective of an ape ancestry. For instance, people have shoulders with a wide range of movement and fingers capable of strong grasping. In apes, these characteristics are highly developed as adaptations for brachiation, swinging from branch to branch in trees. Although humans do not brachiate, the general anatomy from that earlier adaptation remains. Both people and apes also have larger brains and greater cognitive abilities than do most other mammals.

Human social life, too, shares similarities with that of African apes and other primates, such as baboons and rhesus monkeys, that live in large and complex social groups. Group behaviour among chimpanzees, in particular, strongly resembles that of humans. For instance, chimps form long-lasting attachments with each other; participate in social bonding activities, such as grooming, feeding, and hunting; and form strategic coalitions with each other in order to increase their status and power. Early humans also probably had this kind of elaborate social life.

Nevertheless, modern humans differ fundamentally from apes in many significant ways. For example, as intelligent as apes are, people's brains are much larger and more complex, and people have a unique intellectual capacity and elaborate forms of culture and communication. In addition, only people habitually walk upright, can precisely manipulate very small objects, and have a throat structure that makes speech possible.

By around six million years ago in Africa, an apelike species had evolved with two important traits that distinguished it from apes: (1) small canine, or eye, teeth (teeth next to the four incisors, or front teeth) and (2) bipedalism, that is, walking on two legs as the primary form of locomotion. Scientists refer to these earliest human species as australopithecines, or Australopiths for short. The earliest Australopith species known today belong to three genera: Sahelanthropus, Orrorin, and Ardipithecus. Other species belong to the genus Australopithecus and, by some classifications, Paranthropus. The name australopithecine translates literally as 'southern ape', in reference to South Africa, where the first known Australopith fossils were found.

The Great Rift Valley, a region in eastern Africa in which past movements in Earth's crust have exposed ancient deposits of fossils, has become famous for its Australopith finds. Countries in which scientists have found Australopith fossils include Ethiopia, Tanzania, Kenya, South Africa, and Chad. Thus, Australopiths ranged widely over the African continent.

Fossils from several different early Australopiths species that lived between four million and two million years ago clearly show a variety of adaptations that mark the transition from ape to human. The very early period of this transition, before four million years ago, remains poorly documented in the fossil record, but the fossils that do exist show the most primitive combinations of ape and human features.

Fossils reveal much about the physical build and activities of early Australopiths, but not everything about outward physical features such as the colour and texture of skin and hair, or about certain behaviours, such as methods of obtaining food or patterns of social interaction. For these reasons, scientists study the living great apes, particularly the African apes, to understand better how early Australopiths might have looked and behaved, and how the transition from ape to human might have occurred. For example, Australopiths probably resembled the great apes in characteristics such as the shape of the face and the hair on the body. Australopiths also had brains roughly equal in size to those of the great apes, so they probably had apelike mental abilities. Their social life probably resembled that of chimpanzees.

Most of the distinctly human physical qualities in Australopiths related to their bipedal stance. Before Australopiths, no mammal had ever evolved an anatomy for habitual upright walking. Australopiths also had small canine teeth, as compared with the long canines found in most other catarrhine primates.

Other characteristics of Australopiths reflected their ape ancestry. They had a low cranium behind a projecting face, and a brain size of 390 to 550 cu cm (24 to 34 cu in), within the range of an ape’s brain. The body weight of Australopiths, as estimated from their bones, ranged from 27 to 49 kg (60 to 108 lb), and they stood 1.1 to 1.5 m (3.5 to 5 ft) tall. Their weight and height compare closely to those of chimpanzees (chimp height measured standing). Some Australopiths species had a large degree of sexual dimorphism, males being much larger than females, a trait also found in gorillas, orangutans, and other primates.

Australopiths also had curved fingers and long thumbs with a wide range of movement. In comparison, the fingers of apes are longer, more powerful, and more curved, making them extremely well adapted for hanging and swinging from branches. Apes also have very short thumbs, which limits their ability to manipulate small objects. Paleoanthropologists speculate about whether the long and dexterous thumbs of Australopiths allowed them to use tools more efficiently than do apes.

The anatomy of Australopiths shows several adaptations for bipedalism, in both the upper and lower body. Adaptations in the lower body included the following: The ilium, or pelvic bone, which rises above the hip joint, was much shorter and broader in Australopiths than it is in apes. This shape enabled the hip muscles to steady the body during each step. The Australopiths pelvis also had a bowl-like shape, which supported the internal organs in an upright stance. The upper legs angled inward from the hip joints, which positioned the knees better to support the body during upright walking. The legs of apes, on the other hand, are positioned almost straight down from the hip, so that when an ape walks upright for a short distance, its body sways from side to side. Australopiths also had shorter, less flexible toes than do apes. The toes worked as rigid levers for pushing off the ground during each bipedal step.

Other adaptations occurred above the pelvis. The Australopiths spine had an S-shaped curve, which shortened the overall length of the torso and gave it rigidity and balance when standing. By contrast, apes have a straight spine. The Australopiths skull also had an important adaptation related to bipedalism. The opening at the bottom of the skull through which the spinal cord attaches to the brain, called the foramen magnum, was positioned farther forward than it is in apes. This position set the head in balance over the upright spine.

Australopiths clearly walked upright on the ground, but paleoanthropologists debate whether the earliest humans also spent a significant amount of time in the trees. Certain physical features indicate that they spent at least some of their time climbing in trees. Such features included their curved and elongated fingers and their elongated arms. However, their fingers, unlike those of apes, may not have been long enough to allow them to brachiate through the treetops. Study of fossil wrist bones suggests that early Australopiths could lock their wrists, preventing backward bending at the wrist when the body weight was placed on the knuckles of the hand. This could mean that the earliest bipeds had an ancestor that walked on its knuckles, as African apes do.

Compared with apes, humans have very small canine teeth. Apes, particularly males, have thick, projecting, sharp canines that they use for displays of aggression and as weapons to defend themselves. The oldest known bipeds, who lived at least six million years ago, still had large canines by human standards, though not as large as in apes. By four million years ago Australopiths had developed the human characteristic of having smaller, flatter canines. Canine reduction might have related to an increase in social cooperation among early humans and an accompanying decrease in the need for males to make aggressive displays.

The Australopiths can be divided into an early group of species, known as gracile Australopiths, which arose before three million years ago, and a later group, known as robust Australopiths, which evolved after three million years ago. The gracile Australopiths, of which several species evolved between 4.5 million and three million years ago, generally had smaller teeth and jaws. The later-evolving robust Australopiths had larger faces with large jaws and molars (cheek teeth). These traits indicate powerful and prolonged chewing of food, and analyses of wear on the chewing surfaces of robust Australopiths molar teeth support this idea. Some fossils of early Australopiths have features resembling those of the later species, suggesting that the robust forms evolved from one or more gracile ancestors.

Paleoanthropologists recognize at least eight species of early Australopiths. These include the three earliest established species, which belong to the genera Sahelanthropus, Orrorin, and Ardipithecus; a species of the genus Kenyanthropus; and four species of the genus Australopithecus.

The oldest known Australopiths species is Sahelanthropus tchadensis. Fossils of this species were first discovered in 2001 in northern Chad, Central Africa, by a research team led by French paleontologist Michel Brunet. The researchers estimated the fossils to be between seven million and six million years old. One of the fossils is a fractured but nearly complete cranium that shows a combination of apelike and humanlike features. Apelike features include small brain size, an elongated brain case, and areas of bone where strong neck muscles would have attached. Humanlike features include small, flat canine teeth, a short middle part of the face, and a massive brow ridge (a bony, protruding ridge above the eyes) similar to that of later human fossils. The opening where the spinal cord attaches to the brain is tucked under the brain case, which suggests that the head was balanced on an upright body. It is not certain that Sahelanthropus walked bipedally, however, because bones from the rest of its skeleton have yet to be discovered. Nonetheless, its age and humanlike characteristics suggest that the human and African ape lineages had diverged from one another by at least six million years ago.

In addition to reigniting debate about human origins, the discovery of Sahelanthropus in Chad significantly expanded the known geographic range of the earliest humans. The Great Rift Valley and South Africa, from which most other discoveries of early human fossils came, are apparently not the only regions of the continent that preserve the oldest clues of human evolution.

Orrorin tugenensis lived about six million years ago. This species was discovered in 2000 by a research team led by French paleontologist Brigitte Senut and French geologist Martin Pickford in the Tugen Hills region of central Kenya. The researchers found more than a dozen early human fossils dating between 6.2 million and six million years old. Among the finds were two thighbones that possess a groove indicative of an upright stance and bipedal walking. Although the finds are still being studied, the researchers consider these thighbones to be the oldest evidence of habitual two-legged walking. Fossilized bones from other parts of the skeleton show apelike features, including long, curved finger bones useful for strong grasping and movement through trees, and apelike canine and premolar teeth. Because of this distinctive combination of ape and human traits, the researchers gave a new genus and species name to these fossils, Orrorin tugenensis, which in the local language means ‘original man in the Tugen region’. The age of these fossils suggests that the divergence of humans from our common ancestor with chimpanzees occurred before six million years ago.

In 1994 an Ethiopian member of a research team led by American paleoanthropologist Tim White discovered human fossils estimated to be about 4.4 million years old. White and his colleagues gave their discovery the name Ardipithecus ramidus. Ramid means ‘root’ in the Afar language of Ethiopia and refers to the closeness of this new species to the roots of humanity. At the time of discovery, the genus Australopithecus was scientifically well established. White devised the genus name Ardipithecus to distinguish this new species from other Australopiths because its fossils had a very ancient combination of apelike and humanlike traits. More recent finds indicate that this species may have lived as early as 5.8 million to 5.2 million years ago.

The teeth of Ardipithecus ramidus had a thin outer layer of enamel, a trait also seen in the African apes but not in other Australopiths species or older fossil apes. This trait suggests a close relationship with an ancestor of the African apes. In addition, the skeleton shows strong similarities to that of a chimpanzee but has slightly reduced canine teeth and adaptations for bipedalism.

In 1965 a research team from Harvard University discovered a single arm bone of an early human at the site of Kanapoi in northern Kenya. The researchers estimated this bone to be four million years old, but could not identify the species to which it belonged or return at the time to look for related fossils. It was not until 1994 that a research team, led by British-born Kenyan paleoanthropologist Meave Leakey, found numerous teeth and fragments of bone at the site that could be linked to the previously discovered fossil. Leakey and her colleagues determined that the fossils were those of a very primitive species of Australopiths, which was given the name Australopithecus anamensis. Researchers have since found other A. anamensis fossils at nearby sites, dating between about 4.2 million and 3.9 million years old. The skull of this species appears apelike, while its enlarged tibia (lower leg bone) indicates that it supported its full body weight on one leg at a time, as in regular bipedal walking.

Australopithecus anamensis was quite similar to another, much better-known species, A. afarensis, a gracile Australopiths that thrived in eastern Africa between about 3.9 million and three million years ago. The most celebrated fossil of this species, known as Lucy, is a partial skeleton of a female discovered by American paleoanthropologist Donald Johanson in 1974 at Hadar, Ethiopia. Lucy lived 3.2 million years ago. Scientists have identified several hundred fossils of A. afarensis from Hadar, including a collection representing at least thirteen individuals of both sexes and various ages, all from a single site.

Researchers working in northern Tanzania have also found fossilized bones of A. afarensis at Laetoli. This site, dated at 3.6 million years old, is best known for its spectacular trails of bipedal human footprints. Preserved in hardened volcanic ash, these footprints were discovered in 1978 by a research team led by British paleoanthropologist Mary Leakey. They provide irrefutable evidence that Australopiths regularly walked bipedally.

Paleoanthropologists have debated interpretations of the characteristics of A. afarensis and its place in the human family tree. One controversy centres on the Laetoli footprints, which some scientists believe show that the foot anatomy and gait of A. afarensis did not exactly match those of modern humans. This observation may indicate that early Australopiths did not live primarily on the ground, or at least spent a significant amount of time in the trees. The skeleton of Lucy also indicates that A. afarensis had longer, more powerful arms than most later human species, suggesting that this species was adept at climbing trees. Another controversy relates to the scientific classification of the A. afarensis fossils. Compared with Lucy, who stood only 1.1 m (3.5 ft) tall, other fossils identified as A. afarensis from Hadar and Laetoli came from individuals who stood up to 1.5 m (5 ft) tall. This great difference in size leads some scientists to suggest that the entire set of fossils now classified as A. afarensis represents two species. Most scientists, however, believe the fossils represent one highly dimorphic species, that is, a species that has two distinct forms (in this case, two sizes). Supporters of this view note that large (presumably male) and small (presumably female) adults occur together at one site at Hadar.

A third controversy arises from the claim that A. afarensis was the common ancestor of both later Australopiths and the modern human genus, Homo. While this idea remains a strong possibility, the similarity between this and another Australopiths species-one from southern Africa, named Australopithecus africanus-makes it difficult to decide which of the two species led to the genus Homo.

Australopithecus africanus thrived in the Transvaal region of what is now South Africa between about 3.3 million and 2.5 million years ago. Australian-born anatomist Raymond Dart discovered this species, the first known Australopiths, in 1924 at Taung, South Africa. The specimen, that of a young child, became known as the Taung Child. For decades after this discovery, almost no one in the scientific community believed Dart’s claim that the skull came from an ancestral human. In the late 1930s teams led by Scottish-born South African paleontologist Robert Broom unearthed many more A. africanus skulls and other bones from the Transvaal site of Sterkfontein.

A. africanus generally had a more globular braincase and less primitive-looking face and teeth than did A. afarensis. Thus, some scientists consider the southern species of early Australopiths to be a likely ancestor of the genus Homo. According to other scientists, however, certain heavily built facial and cranial features of A. africanus from Sterkfontein identify it as an ancestor of the robust Australopiths that lived later in the same region. In 1998 a research team led by South African paleoanthropologist Ronald Clarke discovered an almost complete early Australopiths skeleton at Sterkfontein. This important find may resolve some of the questions about where A. africanus fits in the story of human evolution.

Working in the Lake Turkana region of northern Kenya, a research team led by paleontologist Meave Leakey uncovered in 1999 a cranium and other bone remains of an early human that showed a mixture of features unseen in previous discoveries of early human fossils. The remains were estimated to be 3.5 million years old, and the cranium’s small brain and earhole were similar to those of the earliest humans. Its cheekbone, however, joined the rest of the face in a forward position, and the region beneath the nose opening was flat. These are traits found in later human fossils from around two million years ago, typically those classified in the genus Homo. Noting this unusual combination of traits, researchers named a new genus and species, Kenyanthropus platyops, or ‘flat-faced human from Kenya’. Before this discovery, it seemed that only a single early human species, Australopithecus afarensis, lived in East Africa between four million and three million years ago. Yet Kenyanthropus indicates that a diversity of species, including a more humanlike lineage than A. afarensis, lived in this period, just as in most other eras in human prehistory.

The human fossil record is poorly known between three million and two million years ago, which makes recent finds from the site of Bouri, Ethiopia, particularly important. From 1996 to 1998, a research team led by Ethiopian paleontologist Berhane Asfaw and American paleontologist Tim White found the skull and other skeletal remains of an early human specimen about 2.5 million years old. The researchers named it Australopithecus garhi; the word garhi means ‘surprise’ in the Afar language. The specimen is unique in having large incisors and molars in combination with an elongated forearm and thighbone. Its powerful arm bones suggest a tree-living ancestry, but its longer legs indicate the ability to walk upright on the ground. Fossils of A. garhi are associated with some of the oldest known stone tools, along with animal bones that were cut and cracked with tools. It is possible, then, that this species was among the first to make the transition to stone toolmaking and to eating meat and bone marrow from large animals.

By 2.7 million years ago the later, robust Australopiths had evolved. These species had what scientists refer to as megadont cheek teeth: wide molars and premolars coated with thick enamel. Their incisors, by contrast, were small. The robusts also had an expanded, flattened, and more vertical face than did gracile Australopiths. This face shape helped to absorb the stresses of strong chewing. On the top of the head, robust Australopiths had a sagittal crest (a ridge of bone along the top of the skull from front to back) to which thick jaw muscles attached. The zygomatic arches (which extend back from the cheekbones to the ears) curved out wide from the side of the face and cranium, forming very large openings for the massive chewing muscles to pass through near their attachment to the lower jaw. Together, these traits indicate that the robust Australopiths chewed their food powerfully and for long periods.

Other ancient animal species that specialized in eating plants, such as some types of wild pigs, had similar adaptations in their facial, dental, and cranial anatomy. Thus, scientists think that the robust Australopiths had a diet consisting partly of tough, fibrous plant foods, such as seed pods and underground tubers. Analyses of microscopic wear on the teeth of some robust Australopiths specimens appear to support the idea of a vegetarian diet, although chemical studies of fossils suggest that the southern robust species may also have eaten meat.

Scientists originally used the word robust to refer to the late Australopiths out of the belief that they had much larger bodies than did the early, gracile Australopiths. However, further research has revealed that the robust Australopiths stood about the same height and weighed roughly the same amount as Australopithecus afarensis and A. africanus.

The earliest known robust species, Australopithecus aethiopicus, lived in eastern Africa by 2.7 million years ago. In 1985 at West Turkana, Kenya, American paleoanthropologist Alan Walker discovered a 2.5-million-year-old fossil skull that helped to define this species. It became known as the ‘black skull’ because of the colour it had absorbed from minerals in the ground. The skull had a tall sagittal crest toward the back of its cranium and a face that projected far outward from the forehead. A. aethiopicus shared some primitive features with A. afarensis, that is, features that originated in the earlier East African Australopiths. This may indicate that A. aethiopicus evolved from A. afarensis.

Australopithecus boisei, the other well-known East African robust Australopiths, lived over a long period, between about 2.3 million and 1.2 million years ago. In 1959 Mary Leakey discovered the original fossil of this species, a nearly complete skull, at the site of Olduvai Gorge in Tanzania. Kenyan-born paleoanthropologist Louis Leakey, husband of Mary, originally named the new species Zinjanthropus boisei (Zinjanthropus translates as ‘East African man’). This skull, dating from 1.8 million years ago, has the most specialized features of all the robust species: a cranium built to withstand extreme chewing forces, and molars four times the size of those in modern humans. Since the discovery of Zinjanthropus, now recognized as an Australopiths, scientists have found many A. boisei fossils in Tanzania, Kenya, and Ethiopia.

The southern robust species, called Australopithecus robustus, lived between about 1.8 million and 1.3 million years ago in Transvaal, the same region that was home to A. africanus. In 1938 Robert Broom, who had found many A. africanus fossils, bought a fossil jaw and molar that looked distinctly different from those in A. africanus. After finding the site of Kromdraai, from which the fossil had come, Broom collected many more bones and teeth that together convinced him to name a new species, which he called Paranthropus robustus (Paranthropus meaning ‘beside man’). Later scientists dated this skull at about 1.5 million years old. In the late 1940s and 1950s Broom discovered many more fossils of this species at the Transvaal site of Swartkrans.

Many scientists believe that robust Australopiths represent a distinct evolutionary group of early humans because these species share features associated with heavy chewing. According to this view, Australopithecus aethiopicus diverged from other Australopiths and later produced A. boisei and A. robustus. Paleoanthropologists who strongly support this view think that the robusts should be classified in the genus Paranthropus, the original name given to the southern species. Thus, these three species are sometimes called P. aethiopicus, P. boisei, and P. robustus.

Other paleoanthropologists believe that the eastern robust species, A. aethiopicus and A. boisei, may have evolved from an early Australopiths of the same region, perhaps A. afarensis. According to this view, A. africanus gave rise only to the southern species, A. robustus. Scientists refer to such a case, in which two or more independent species evolve similar characteristics in different places or at different times, as parallel evolution. If parallel evolution occurred in Australopiths, the robust species would make up two separate branches of the human family tree.

The last robust Australopiths died out about 1.2 million years ago. At about this time, climate patterns around the world entered a period of fluctuation, and these changes may have reduced the food supply on which robusts depended. Interaction with larger-brained members of the genus Homo, such as Homo erectus, may also have contributed to the decline of late Australopiths, although no compelling evidence exists of such direct contact. Competition with several other species of plant-eating monkeys and pigs, which thrived in Africa at the time, may have been an even more important factor. Nevertheless, the reasons why the robust Australopiths became extinct after flourishing for such a long time are not yet known for sure.

Scientists have several ideas about why Australopiths first split from the apes, initiating the course of human evolution. Nearly all hypotheses suggest that environmental change was an important factor, specifically in influencing the evolution of bipedalism. Some well-established ideas about why humans first evolved include (1) the savanna hypothesis, (2) the woodland-mosaic hypothesis, and (3) the variability hypothesis.

The global climate cooled and became drier between eight million and five million years ago, near the end of the Miocene Epoch. According to the savanna hypothesis, this climate change broke up and reduced the area of African forests. As the forests shrank, an ape population in eastern Africa became separated from other populations of apes in the more heavily forested areas of western Africa. The eastern population had to adapt to its drier environment, which contained larger areas of grassy savanna.

The expansion of dry terrain favoured the evolution of terrestrial living, and made it more difficult to survive by living in trees. Terrestrial apes might have formed large social groups in order to improve their ability to find and collect food and to fend off predators-activities that also may have required the ability to communicate well. The challenges of savanna life might also have promoted the rise of tool use, for purposes such as scavenging meat from the kills of predators. These important evolutionary changes would have depended on increased mental abilities and, therefore, may have correlated with the development of larger brains in early humans.

Critics of the savanna hypothesis argue against it on several grounds, but particularly for two reasons. First, discoveries by a French scientific team of Australopiths fossils in Chad, in Central Africa, suggest that the environments of East Africa may not have been fully separated from those farther west. Second, recent research suggests that open savannas were not prominent in Africa until sometime after two million years ago.

Criticism of the savanna hypothesis has spawned alternative ideas about early human evolution. The woodland-mosaic hypothesis proposes that the early Australopiths evolved in patchily wooded areas, a mosaic of woodland and grassland, that offered opportunities for feeding both on the ground and in the trees, and that ground feeding favoured bipedalism.

The variability hypothesis suggests that early Australopiths experienced many changes in environment and ended up living in a range of habitats, including forests, open-canopy woodlands, and savannas. In response, their populations became adapted to a variety of surroundings. Scientists have found that this range of habitats existed at the time when the early Australopiths evolved. So the development of new anatomical characteristics, particularly bipedalism, combined with an ability to climb trees, may have given early humans the versatility to live in a variety of habitats.

Scientists also have many ideas about which benefits of bipedalism may have influenced its evolution. Ideas about the benefits of regular bipedalism include that it freed the hands, making it easier to carry food and tools; allowed early humans to see over tall grass to watch for predators; reduced the amount of the body exposed to the hot sun while increasing its exposure to cooling winds; improved the ability to hunt or use weapons, which became easier with an upright posture; and made extensive feeding from bushes and low branches easier than it would have been for a quadruped. Scientists do not overwhelmingly support any one of these ideas. Recent studies of chimpanzees suggest, though, that the ability to feed more easily might have particular relevance. Chimps walk on two legs most often when they feed from the ground on the leaves and fruits of bushes and low branches. Chimps cannot, however, walk in this way over long distances.

Bipedalism in early humans would have enabled them to travel efficiently over long distances, giving them an advantage over quadrupedal apes in moving across barren open terrain between groves of trees. In addition, the earliest humans continued to have the advantage from their ape ancestry of being able to escape into the trees to avoid predators. The benefits of both bipedalism and agility in the trees may explain the unique anatomy of Australopiths: their long, powerful arms and curved fingers probably made them good climbers, while their pelvis and lower limb structure were reshaped for upright walking. Modern people belong to the genus Homo, which first evolved at least 2.3 million to 2.5 million years ago. The earliest members of this genus differed from the Australopiths in at least one important respect: they had larger brains than did their predecessors.

The evolution of the modern human genus can be divided roughly into three periods: early, middle, and late. Species of early Homo resembled gracile Australopiths in many ways. Some early Homo species lived until possibly 1.6 million years ago. The period of middle Homo began perhaps between two million and 1.8 million years ago, overlapping with the end of early Homo. Species of middle Homo evolved an anatomy much more similar to that of modern humans but had comparatively small brains. The transition from middle to late Homo probably occurred sometime around 200,000 years ago. Species of late Homo evolved large and complex brains and eventually language. Culture also became an increasingly important part of human life during the most recent period of evolution.

The origin of the genus Homo has long intrigued paleoanthropologists and prompted much debate. One of several known species of Australopiths, or one not yet discovered, could have given rise to the first species of Homo. Scientists also do not know exactly what factors favoured the evolution of a larger and more complex brain, the defining physical trait of modern humans.

Louis Leakey originally argued that the origin of Homo related directly to the development of toolmaking, specifically, the making of stone tools. Toolmaking requires certain mental skills and fine hand manipulation that may exist only in members of our own genus. The name Homo habilis (meaning ‘handy man’) refers directly to the making and use of tools.

However, several species of Australopiths lived at the same time as early Homo, making it unclear which species produced the earliest stone tools. Recent studies of Australopiths hand bones have suggested that at least one robust species, Australopithecus robustus, could have made tools. In addition, during the 1960s and 1970s researchers first observed that some nonhuman primates, such as chimpanzees, make and use tools, suggesting that Australopiths and the apes that preceded them probably also made some kinds of tools.

Scientists began to notice a high degree of variability in body size as they discovered more early Homo fossils. This could have indicated that H. habilis had a large amount of sexual dimorphism. For instance, the Olduvai female skeleton was dwarfed in comparison with other fossils, such as a sizable early Homo cranium from East Turkana in northern Kenya. However, the differences in size exceeded those expected between males and females of the same species, and this finding later helped convince scientists that another species of early Homo had lived in eastern Africa.

This second species of early Homo was given the name Homo rudolfensis, after Lake Rudolf (now Lake Turkana). The best-known fossils of H. rudolfensis come from the area surrounding this lake and date from about 1.9 million years ago. Paleoanthropologists have not determined the entire time range during which H. rudolfensis may have lived.

This species had a larger face and body than did H. habilis. The cranial capacity of H. rudolfensis averaged about 750 cu cm (46 cu in). Scientists need more evidence to know whether the brain of H. rudolfensis, in relation to its body size, was larger than the corresponding proportion in H. habilis. A larger brain-to-body-size ratio can indicate increased mental abilities. H. rudolfensis also had large teeth, approaching the size of those in robust Australopiths. The discovery of even a partial fossil skeleton would reveal whether this larger form of early Homo had apelike or more modern body proportions. Scientists have found several modern-looking thighbones that date from between two million and 1.8 million years ago and may belong to H. rudolfensis. These bones suggest a body about 1.5 m (5 ft) tall and 52 kg (114 lb) in weight.

By about 1.9 million years ago, the period of middle Homo had begun in Africa. Until recently, paleoanthropologists recognized one species in this period, Homo erectus. Many now recognize three species of middle Homo: Homo ergaster, Homo erectus, and Homo heidelbergensis. However, some still think H. ergaster is an early African form of H. erectus, or that H. heidelbergensis is a late form of H. erectus.

The skulls and teeth of early African populations of middle Homo differed subtly from those of later H. erectus populations from China and the island of Java in Indonesia. H. ergaster makes a better candidate for an ancestor of the modern human line because Asian H. erectus has some specialized features not seen in later humans, including our own species. H. heidelbergensis has similarities to both H. erectus and the later species H. neanderthalensis, and it may have been a transitional species between middle Homo and the line to which modern humans belong.

Homo ergaster probably first evolved in Africa around two million years ago. This species had a rounded cranium with a brain size of between 700 and 850 cu. cm. (about 43 to 52 cu. in.), a prominent brow ridge, small teeth, and many other features that it shared with the later H. erectus. Many paleoanthropologists consider H. ergaster a good candidate for an ancestor of modern humans because it had several modern skull features, including thin cranial bones. Most H. ergaster fossils come from the time range of 1.8 million to 1.5 million years ago.

The most important fossil of this species yet found is a nearly complete skeleton of a young male from West Turkana, Kenya, which dates from about 1.55 million years ago. Scientists determined the sex of the skeleton from the shape of its pelvis. They also determined from patterns of tooth eruption and bone growth that the boy had died when he was between nine and twelve years old. The oldest humanlike fossils outside Africa have also been classified as H. ergaster, dated at around 1.75 million years old. These finds, from the Dmanisi site in the southern Caucasus Mountains of Georgia, consist of several crania, jaws, and other fossilized bones. Some of these are strikingly like East African H. ergaster, but others are smaller or larger than H. ergaster, suggesting a high degree of variation within a single population.

H. ergaster, H. rudolfensis, and H. habilis, in addition to possibly two robust Australopiths, all might have coexisted in Africa around 1.9 million years ago. This finding goes against a traditional paleoanthropological view that human evolution consisted of a single line that evolved progressively over time: an Australopith species followed by early Homo, then middle Homo, and finally H. sapiens. It appears that periods of species diversity and extinction have been common during human evolution, and that modern H. sapiens has the rare distinction of being the only living human species today.

Although H. ergaster appears to have coexisted with several other human species, they probably did not interbreed. Mating rarely succeeds between two species with significant skeletal differences, such as H. ergaster and H. habilis. Many paleoanthropologists now believe that H. ergaster descended from an earlier population of Homo, perhaps one of the two known species of early Homo, and that the modern human line descended from H. ergaster.

Paleoanthropologists now know that humans first evolved in Africa and lived only on that continent for a few million years. The earliest human species known to have spread in large numbers beyond the African continent was first discovered in Southeast Asia. In 1891 Dutch physician Eugene Dubois found the cranium of an early human on the Indonesian island of Java. He named this early human Pithecanthropus erectus, or ‘erect ape-man.’ Today paleoanthropologists call this species Homo erectus.

H. erectus appears to have evolved in Africa from earlier populations of H. ergaster, and then spread to Asia sometime between 1.8 million and 1.5 million years ago. The youngest known fossils of this species, from the Solo River in Java, may date from about 50,000 years ago (although that dating is controversial). H. erectus was thus a very successful species: both widespread, having lived in Africa and much of Asia, and long-lived, having survived for possibly more than 1.5 million years.

H. erectus had a low and rounded braincase that was elongated from front to back, a prominent brow ridge, and an adult cranial capacity of 800 to 1,250 cu. cm. (about 50 to 76 cu. in.), an average twice that of the Australopiths. Its bones, including the cranium, were thicker than those of earlier species. Prominent muscle markings and thick, reinforced areas on the bones of H. erectus indicate that its body could withstand powerful movements and stresses. Although it had much smaller teeth than did the Australopiths, it had a heavy and strong jaw.

In the 1920s and 1930s German anatomist and physical anthropologist Franz Weidenreich excavated the most famous collection of H. erectus fossils from a cave at the site of Zhoukoudian (Chou-k'ou-tien), China, near Beijing (Peking). Scientists dubbed these fossil humans Sinanthropus pekinensis, or Peking Man, but others later reclassified them as H. erectus. The Zhoukoudian cave yielded the fragmentary remains of more than 30 individuals, ranging from about 500,000 to 250,000 years old. These fossils were lost near the outbreak of World War II, but Weidenreich had made excellent casts of his finds. Further studies at the cave site have yielded more H. erectus remains.

Other important fossil sites for this species in China include Lantian, Yuanmou, Yunxian, and Hexian. Researchers have also recovered many tools made by H. erectus in China at sites such as Nihewan and Bose, and other sites of similar age (at least one million to 250,000 years old).

Ever since the discovery of H. erectus, scientists have debated whether this species was a direct ancestor of later humans, including H. sapiens. The last populations of H. erectus, such as those from the Solo River in Java, may have lived as recently as 50,000 years ago, at the same time as populations of H. sapiens. Modern humans could not have evolved from these late populations of H. erectus, a much more primitive type of human. However, earlier East Asian populations could have given rise to Homo sapiens.

Many paleoanthropologists believe that early humans migrated into Europe by 800,000 years ago, and that these populations were not Homo erectus. Most scientists refer to these early migrants into Europe, who predated both Neanderthals and H. sapiens in the region, as H. heidelbergensis. The species name comes from a 500,000-year-old jaw found near Heidelberg, Germany.

Scientists have found few human fossils in Africa for the period between 1.2 million and 600,000 years ago, during which H. heidelbergensis or its ancestors first migrated into Europe. Populations of H. ergaster (or possibly H. erectus) appear to have lived until at least 800,000 years ago in Africa, and possibly until 500,000 years ago in northern Africa. When these populations disappeared, other massive-boned and larger-brained humans, possibly H. heidelbergensis, appear to have replaced them. Scientists have found fossils of these stockier humans at sites including Bodo, Ethiopia; Saldanha (also known as Elandsfontein), South Africa; Ndutu, Tanzania; and Kabwe, Zambia.

Scientists have come up with at least three different interpretations of these African fossils. Some scientists place the fossils in the species H. heidelbergensis and think that this species led to both the Neanderthals (in Europe) and H. sapiens (in Africa). Others think that the European and African fossils belong to two distinct species, and that the African population, which in this view was not H. heidelbergensis but a separate species, produced Homo sapiens. Yet other scientists advocate the long-held view that H. erectus and Homo sapiens belong to a single evolving lineage, and that the African fossils belong in the category of archaic H. sapiens (archaic meaning not fully anatomically modern).

The fossil evidence does not clearly favour any of these three interpretations over another. Several fossils from Asia, Africa, and Europe have features that are intermediate between early H. ergaster and H. sapiens. This kind of variation makes it hard to decide how to identify distinct species and to determine which group of fossils represents the most likely ancestor of later humans.

Scientists once thought that advances in stone tools could have enabled early humans such as Homo erectus to move into Asia and Europe, perhaps by helping them to obtain new kinds of food, such as the meat of large mammals. If African human populations had developed tools that allowed them to hunt large game effectively, they would have had a good source of food wherever they went. In this view, humans first migrated into Eurasia based on a unique cultural adaptation.

By 1.5 million years ago, early humans had begun to make new kinds of tools, which scientists call Acheulean. Common Acheulean tools included large hand axes and cleavers. While these new tools might have helped early humans to hunt, the first known Acheulean tools in Africa date from later than the earliest known human presence in Asia. Also, most East Asian sites more than 200,000 years old contain only simply shaped cobble and flake tools. In contrast, Acheulean tools were more finely crafted, larger, and more symmetrical. Thus, the earliest settlers of Eurasia did not have a true Acheulean technology, and advances in toolmaking alone cannot explain the spread out of Africa.

Another possibility is that the early spread of humans to Eurasia was not unique, but part of a wider migration of meat-eating animals, such as lions and hyenas. The human migration out of Africa occurred during the early part of the Pleistocene Epoch, between 1.8 million and 780,000 years ago. Many African carnivores spread to Eurasia during the early Pleistocene, and humans could have moved along with them. In this view, H. erectus was one of many meat-eating species to expand into Eurasia from Africa, rather than a uniquely adapted species. Relying on meat as a primary food source might have allowed many meat-eating species, including humans, to move through many different environments without having to quickly learn about unfamiliar and potentially poisonous plants.

However, the migration of humans to eastern Asia may have occurred gradually and through lower latitudes and environments similar to those of Africa. If East African populations of H. erectus moved at only 1.6 km. (1 mi.) every twenty years, they could have reached Southeast Asia in 150,000 years. Over this amount of time, humans could have learned about and begun relying on edible plant foods. Thus, eating meat may not have played a crucial role in the first human migrations to new continents. Careful comparison of animal fossils, stone tools, and early human fossils from Africa, Asia, and Europe will help scientists better determine what factors motivated and allowed humans to venture out of Africa for the first time.
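The migration arithmetic above can be checked with a quick back-of-the-envelope calculation. A minimal sketch, assuming an overland distance of roughly 12,000 km from East Africa to Southeast Asia (the distance figure is an illustrative assumption, chosen to be consistent with the numbers in the text, not a value given by the original author):

```python
# Back-of-the-envelope check of the migration-rate arithmetic.
# Assumed values: the 12,000 km distance is illustrative only.
distance_km = 12_000      # rough East Africa -> Southeast Asia overland distance
km_per_step = 1.6         # 1.6 km (1 mi) covered per step
years_per_step = 20       # one step every twenty years

# Total time = (number of 1.6 km steps) * (years per step)
total_years = distance_km / km_per_step * years_per_step
print(f"about {total_years:,.0f} years")  # prints "about 150,000 years"
```

At this very slow pace the journey still fits comfortably within the fossil timeline, which is the point the passage is making.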

The origin of our own species, Homo sapiens, is one of the most hotly debated topics in paleoanthropology. This debate centres on whether modern humans are directly related to H. erectus or to the Neanderthals, and on the origin of the more modern group of humans who evolved within the past 250,000 years. Paleoanthropologists commonly use the term anatomically modern Homo sapiens to distinguish people of today from these similar predecessors.

Traditionally, paleoanthropologists classified as Homo sapiens any fossil human younger than 500,000 years old with a braincase larger than that of H. erectus. Thus, many scientists who believe that modern humans descend from a single line dating back to H. erectus use the name archaic Homo sapiens to refer to a variety of fossil humans that predate anatomically modern H. sapiens. The designation archaic denotes a set of physical features typical of Neanderthals and other species of late Homo before modern Homo sapiens. These features include a combination of a robust skeleton, a large but low braincase (positioned partly behind, rather than over, the face), and a lower jaw lacking a prominent chin. In this sense, Neanderthals are sometimes classified as a subspecies of archaic H. sapiens, H. sapiens neanderthalensis. Other scientists think that the variation in archaic fossils falls into clearly identifiable sets of traits, and that any type of human fossil exhibiting a unique set of traits should have a new species name. According to this view, the Neanderthals belong to their own species, H. neanderthalensis.

The Neanderthals lived in areas ranging from western Europe through central Asia from about 200,000 to about 28,000 years ago. The name Neanderthal (sometimes spelled Neandertal) comes from fossils found in 1856 in the Feldhofer Cave of the Neander Valley in Germany (Tal, a modern spelling of Thal, means 'valley' in German). Scientists realized several years later that prior discoveries, at Engis, Belgium, in 1829 and at Forbes Quarry, Gibraltar, in 1848, also represented Neanderthals. These two earlier discoveries were the first early human fossils ever found.

In the past, scientists claimed that Neanderthals differed greatly from modern humans. However, the basis for this claim came from a faulty reconstruction of a Neanderthal skeleton that showed it with bent knees and a slouching gait. This reconstruction gave the common but mistaken impression that Neanderthals were dim-witted brutes who lived a crude lifestyle. On the contrary, Neanderthals, like the species that preceded them, walked fully upright without a slouch or bent knees. In addition, their cranial capacity was quite large at about 1,500 cu. cm. (about 90 cu. in.), larger on average than that of modern humans. (The difference probably relates to the greater muscle mass of Neanderthals as compared with modern humans, which usually correlates with a larger brain size.)

Compared with earlier humans, Neanderthals had a high degree of cultural sophistication. They appear to have performed symbolic rituals, such as the burial of their dead. Neanderthal fossils, including some complete skeletons, are quite common compared with those of earlier forms of Homo, in part because of the Neanderthal practice of intentional burial. Neanderthals also produced sophisticated types of stone tools known as Mousterian, which involved creating blanks (rough forms) from which several types of tools could be made. Along with many physical similarities, Neanderthals differed from modern humans in several ways. The typical Neanderthal skull had a low forehead, a large nasal area (suggesting a large nose), a forward-projecting nasal and cheek region, a prominent brow ridge with a bony arch over each eye, a non-projecting chin, and an obvious space behind the third molar (in front of the upward turn of the lower jaw).

In addition, Neanderthals had a more heavily built and larger-boned skeleton than modern humans do. Other Neanderthal skeletal features included a bowing of the limb bones in some individuals, broad scapulae (shoulder blades), hip joints turned outward, a long and thin pubic bone, short lower leg and arm bones relative to the upper bones, and large surfaces on the joints of the toes and limb bones. Together, these traits made for a powerful, compact body of short stature: males averaged 1.7 m. (5 ft. 5 in.) tall and 84 kg. (185 lb.), and females averaged 1.5 m. (5 ft.) tall and 80 kg. (176 lb.).

The short, stocky build of Neanderthals conserved heat and helped them withstand extremely cold conditions that prevailed in temperate regions beginning about 70,000 years ago. The last known Neanderthal fossils come from western Europe and date from approximately 36,000 years ago.

While Neanderthal populations grew in number in Europe and parts of Asia, other populations of nearly modern humans arose in Africa and Asia. Scientists also commonly refer to these fossils, which are distinct from but similar to those of Neanderthals, as archaic. Fossils from the Chinese sites of Dali, Maba, and Xujiayao display the long, low cranium and large face typical of archaic humans, yet they also have features similar to those of modern people in the region. At the cave site of Jebel Irhoud, Morocco, scientists have found fossils with the long skull typical of archaic humans but also the modern traits of a moderately higher forehead and flatter midface. Fossils of humans from East African sites older than 100,000 years, such as Ngaloba in Tanzania and Eliye Springs in Kenya, also seem to show a mixture of archaic and modern traits.

The oldest known fossils that possess skeletal features typical of modern humans date from between 130,000 and 90,000 years ago. Several key features distinguish the skulls of modern humans from those of archaic species. These features include a much smaller brow ridge, if any; a globe-shaped braincase; and a flat or only slightly projecting face of reduced size, located under the front of the braincase. Among all mammals, only humans have a face positioned directly beneath the frontal lobe (forward-most area) of the brain. As a result, modern humans tend to have a higher forehead than did Neanderthals and other archaic humans. The cranial capacity of modern humans ranges from about 1,000 to 2,000 cu. cm. (60 to 120 cu. in.), with the average being about 1,350 cu. cm. (80 cu. in.).

Scientists have found both fragmentary and nearly complete cranial fossils of early anatomically modern Homo sapiens at the sites of Singa, Sudan; Omo, Ethiopia; Klasies River Mouth, South Africa; and Skhul Cave, Israel. Based on these fossils, many scientists conclude that modern H. sapiens had evolved in Africa by 130,000 years ago and started spreading to diverse parts of the world, beginning on a route through the Near East, sometime before 90,000 years ago.

Paleoanthropologists are engaged in an ongoing debate about where modern humans evolved and how they spread around the world. Differences in opinion rest on the question of whether the evolution of modern humans took place in a small region of Africa or over a broad area of Africa and Eurasia. By extension, opinions differ as to whether modern human populations from Africa displaced all existing populations of earlier humans, eventually resulting in their extinction.

Those who think modern humans originated only in Africa and then spread around the world support the out of Africa hypothesis. Those who think modern humans evolved over a large region of Eurasia and Africa support the so-called multi-regional hypothesis.

Richard Leakey's work at Omo-Kibish gave scientists a fresh start in their study of the African origins of Homo sapiens. In fact, his finds gave them two beginnings. First, they led a few researchers in the 1970s to conclude that the Kibish man was a far more likely ancestor for the Cro-Magnons, a race of early Europeans who thrived about 25,000 years ago, than their immediate predecessors in Europe, the heavyset Neanderthals. Then in the 1980s, a new reconstruction and study of the Kibish man revealed an even more startling possibility: that he was a far better candidate as the forebear not just of the Cro-Magnons but of every one of us alive today, not just Europeans but all the other peoples of the world, from the Eskimos of Greenland to the Twa people of Africa, and from Australian Aborigines to Native Americans. In other words, the Kibish man acted as pathfinder for a new genesis for the human species.

In the past few years, many paleontologists, anthropologists, and geneticists have come to agree that this ancient resident of the riverbanks of Ethiopia and all his Kibish kin, both far and near, could be among our ancestors. However, it has also become clear that the evolutionary pathway of these fledgling modern humans was not an easy one. At one stage, according to genetic data, our species became as endangered as the mountain gorilla is today, its population reduced to only about 10,000 adults. Restricted to one region of Africa, but tempered in the flames of near extinction, this population went on to make a remarkable comeback. It then spread across Africa until, nearly 100,000 years ago, it had colonized much of the continent's savannas and woodlands. We see the imprint of this spread in biological studies revealing that populations within Africa are genetically the most disparate on the planet, indicating that modern humans have existed there in larger numbers for a longer time than anywhere else.

We can also observe intriguing clues about our African origins in other less obvious but equally exciting arenas. One example comes from Congo-Kinshasa. This huge tropical African country has never assumed much importance in the field of paleoanthropology, the branch of anthropology concerned with the investigation of ancient humans. Unlike the countries to the east, Ethiopia, Kenya, and Tanzania, Congo-Kinshasa had provided few exciting fossil sites until recently.

In the neglected western branch of the African Rift Valley, that giant geological slash that has played such a pivotal role in human evolution, the Semliki River runs northward between two large lakes, and its waters eventually form part of the source of the Nile. Along its banks, sediments are being exposed that were laid down 90,000 years ago, just as Homo sapiens was making its mark across Africa.

At the town of Katanda, researchers uncovered an archaeological treasure trove: thousands of artifacts, mostly stone tools, and a few bone implements that quite astonished the archaeologists, led by the husband-and-wife team of John Yellen, of the National Science Foundation, Washington, and Alison Brooks, of George Washington University. Among the wonders they have uncovered are sophisticated bone harpoons and knives. Previously it was thought that the Cro-Magnons were the first humans to develop such delicate carving skills. Yet this very much older group of Homo sapiens, living in the heartland of Africa, displayed the same extraordinary skills as craftworkers. It was as if, said one observer, a prototype Pontiac car had been found in the attic of Leonardo da Vinci.

There were other surprises for researchers, however. Apart from the finely carved implements, they found fish bones, including some from two-metre-long catfish. It seems the Katanda people were efficiently and repeatedly catching catfish during their spawning season, indicating that systematic fishing is quite an ancient human skill and not some recently acquired expertise, as many archaeologists had previously thought. In addition, the team found evidence that a Katanda site had at least two separate but similar clusters of stones and debris that looked like the residue of two distinct neighbouring groupings, signs of the possible impact of the nuclear family on society, a phenomenon that now defines the fabric of our lives.

Clearly, our African forbears were sophisticated people. Bands of them, armed with new proficiencies, like those men and women who had flourished on the banks of the Semliki, began an exodus from their African homeland. Slowly they trickled northward, and into the Levant, the region bordering the eastern Mediterranean. Then, by 80,000 years ago, small groups began spreading across the globe, via the Middle East, planting the seeds of modern humanity in Asia and later in Europe and Australia.

Today men and women conduct themselves in highly complex ways: some are uncovering the strange, indeterminate nature of matter, with its building blocks of quarks and leptons; some are probing the first few seconds of the origins of the universe fifteen billion years ago; while others are trying to develop artificial brains capable of staggering feats of calculation. Yet the intellectual tools that allow us to investigate the deepest secrets of our world are the ones that were forged during our fight for survival, in a very different set of circumstances from those that prevail today. How on earth could an animal that struggled for survival like any other creature, whose time was absorbed in a constant search for meat, nuts, and tubers, and who had to maintain constant vigilance against predators, develop the mental hardwiring needed by a nuclear physicist or an astronomer? This is a vexing issue that takes us to the very heart of our African exodus, to the journey that brought us from precarious survival on a single continent to global control.

If we can ever hope to understand the special attributes that delineate a modern human being, we have to attempt to solve such puzzles. How was the Kibish man different from his Neanderthal cousins in Europe, and what evolutionary pressures led the Katanda people to develop in such crucially different ways, ironically in the heart of a continent that has for far too long been stigmatized as backward?

The picture grew still more complex when French researchers announced at a press conference on May 22, 1996, the discovery of a new fossil hominid species in central Chad, estimated to have lived between 3 million and 3.5 million years ago. The fossilized remains of a lower jaw and seven teeth were found in 1995 near Koro Toro, in the desert about 2,500 km (about 1,500 mi) east of the Great Rift Valley in Africa, the site of many major hominid fossil finds. The leader of the French team that discovered the fossils at Bahr-el-Ghazal, Chad, paleontologist Michel Brunet of the University of Poitiers, named the species Australopithecus bahrelghazali (from the Arabic name of the nearby River of the Gazelles). The research team published its findings in the May 20 bulletin of the French Academy of Sciences.

In a letter to the journal Nature published November 16, 1995, the researchers initially classified the fossil as an example of Australopithecus afarensis, the 3.4-million-year-old species that walked upright in eastern Africa. In the letter, Brunet said that more detailed comparisons with other fossils were necessary before he could determine whether the jaw came from another species, and he noted that geographic separation can produce differences among animals of the same species. After the letter was published, Brunet travelled to museums to compare the jaw with other hominid bones.

The fossil combines both primitive and modern hominid features. The jaw includes the right and left premolars, both canines, and the right lateral incisor. Brunet said the strong canine teeth and the shape of the incisor resemble human teeth more than ape teeth. The chin area is more vertical than the backward-sloping chin of A. afarensis, and it lacks the strong reinforcement for chewing power found among other early hominids. However, the premolars retain primitive characteristics, such as three roots, whereas modern human premolars have only one root.
Scientists said they needed more fossil material before they can place the species on the evolutionary tree. Brunet cited the find as the first evidence of hominid occupation of areas outside the Great Rift Valley and South Africa, where anthropologists have concentrated their search for hominid fossils. Other experts noted that the eroding volcanic soils in the Great Rift Valley are simply better for preserving and exposing fossils than the soils in most other regions in Africa. Although many digs have occurred in the Great Rift Valley, most scientists believe that hominids existed throughout Africa.

Researchers have conducted many genetic studies and carefully assessed fossils to determine which of these hypotheses agrees more with scientific evidence. The results of this research do not entirely confirm or reject either one, so some scientists think a compromise between the two hypotheses is the best explanation. The debate between these views has implications for how scientists understand the concept of race in humans: the question is whether the physical differences among modern humans evolved deep in the past or more recently.

According to the out of Africa hypothesis, also known as the replacement hypothesis, early populations of modern humans from Africa migrated to other regions and entirely replaced existing populations of archaic humans. The replaced populations would have included the Neanderthals and any surviving groups of Homo erectus. Supporters of this view note that many modern human skeletal traits evolved recently, within the past 200,000 years or so, suggesting a single, common origin. In addition, the anatomical similarities shared by all modern human populations far outweigh those shared by premodern and modern humans within particular geographic regions. Furthermore, biological research indicates that most new species of organisms, including mammals, arose from small, geographically isolated populations.

According to the multi-regional hypothesis, also known as the continuity hypothesis, the evolution of modern humans began when Homo erectus spread throughout much of Eurasia around one million years ago. Regional populations retained some unique anatomical features for hundreds of thousands of years, but they also mated with populations from neighbouring regions, exchanging heritable traits with each other. This exchange of heritable traits is known as gene flow.

Through gene flow, populations of H. erectus passed on a variety of increasingly modern characteristics, such as increases in brain size, across their geographic range. Gradually this would have resulted in the evolution of more modern-looking humans throughout Africa and Eurasia. The substantial differences among human populations today would then result from hundreds of thousands of years of regional evolution. This is the concept of continuity. For instance, modern East Asian populations have some skull features that scientists also see in H. erectus fossils from that region.

Some critics of the multi-regional hypothesis claim that it wrongly advocates a scientific belief in race and could be used to encourage racism. Supporters of the theory point out, however, that their position does not imply that modern races evolved in isolation from each other, or that racial differences justify racism. Instead, the theory holds that gene flow linked different populations together. These links allowed progressively more modern features, no matter where they arose, to spread from region to region and eventually become universal among humans.

Scientists have weighed the out of Africa and multi-regional hypotheses against both genetic and fossil evidence. The results do not unanimously support either one, but weigh more heavily in favour of the out of Africa hypothesis.

Geneticists have studied differences in the DNA (deoxyribonucleic acid) of different populations of humans. DNA is the molecule that contains our heritable genetic code. Differences in human DNA result from mutations in DNA structure. Some mutations result from exposure to external agents such as solar radiation or certain chemical compounds, while others occur naturally at random.

Geneticists have calculated rates at which mutations can be expected to occur over time. Dividing the total number of genetic differences between two populations by an expected rate of mutation provides an estimate of the time when the two shared a common ancestor. Many estimates of evolutionary ancestry rely on studies of the DNA in cell structures called mitochondria. This DNA is called mtDNA (mitochondrial DNA). Unlike DNA from the nucleus of a cell, which codes for most of the traits an organism inherits from both parents, mtDNA passes only from a mother to her offspring. MtDNA also accumulates mutations about ten times faster than does DNA in the cell nucleus (the location of most DNA). The structure of mtDNA changes so quickly that scientists can easily measure the differences between one human population and another. Two closely related populations should have only minor differences in their mtDNA. Conversely, two very distantly related populations should have large differences in their mtDNA.
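The divergence-time estimate described above is a simple division: elapsed time equals observed differences divided by the expected mutation rate. A minimal sketch of the calculation, using made-up numbers chosen purely for illustration (the counts and rate below are hypothetical, not values from the original text):

```python
# Molecular-clock estimate:
#   time since common ancestor = genetic differences / mutation rate
# All numbers below are hypothetical, for illustration only.

differences = 40              # observed mtDNA differences between two populations
mutations_per_year = 0.0002   # assumed rate: differences accumulated per year

divergence_years = differences / mutations_per_year
print(f"estimated time to common ancestor: {divergence_years:,.0f} years")
# prints "estimated time to common ancestor: 200,000 years"
```

Real studies calibrate the rate against events of known age and must account for back-mutations and rate variation, which is one reason estimates differ so widely between studies.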

MtDNA research into modern human origins has produced two major findings. First, the entire amount of variation in mtDNA across human populations is small in comparison with that of other animal species. This means that all human mtDNA originated from a single ancestral lineage, specifically from a single female, and has been accumulating mutations ever since. Most estimates of the mutation rate of mtDNA suggest that this female ancestor lived about 200,000 years ago. In addition, the mtDNA of African populations varies more than that of peoples on other continents. This suggests that the mtDNA of African populations has been accumulating changes for a longer time than that of populations in any other region. It appears, then, that all living people inherited their mtDNA from one woman in Africa, who is sometimes called the Mitochondrial Eve. Some geneticists and anthropologists have concluded from this evidence that modern humans originated in a small population in Africa and spread from there.
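The within-population variation compared above boils down to counting pairwise sequence differences. The following sketch uses tiny invented sequence fragments, purely to illustrate the calculation rather than any real data:

```python
from itertools import combinations

def pairwise_differences(seq_a, seq_b):
    """Count positions at which two aligned DNA sequences differ."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def mean_diversity(sequences):
    """Average number of pairwise differences within a set of aligned sequences."""
    pairs = list(combinations(sequences, 2))
    return sum(pairwise_differences(a, b) for a, b in pairs) / len(pairs)

# Toy aligned mtDNA fragments, invented purely for illustration:
pop_african = ["ACGTACGT", "ACGAACGT", "TCGTACGA"]
pop_other = ["ACGTACGT", "ACGTACGA", "ACGTACGT"]

print(mean_diversity(pop_african))  # greater within-group variation
print(mean_diversity(pop_other))
```

A population whose lineages have been diverging longer shows a higher average pairwise difference, which is the logic behind inferring that African mtDNA lineages are the oldest.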

MtDNA studies have weaknesses, however, including the following four. First, the estimated rate of mtDNA mutation varies from study to study, and some estimates put the date of origin closer to 850,000 years ago, the time of Homo erectus. Second, mtDNA makes up a small part of the total genetic material that humans inherit. The rest of our genetic material, about 400,000 times more than the mtDNA, came from many individuals living at the time of the African Eve, conceivably from many different regions. Third, the time at which modern mtDNA began to diversify does not necessarily coincide with the origin of modern human biological traits and cultural abilities. Fourth, the smaller amount of modern mtDNA diversity outside Africa could result from times when European and Asian populations declined in numbers, perhaps due to climate changes.

Despite these criticisms, many geneticists continue to favour the out of Africa hypothesis of modern human origins. Studies of nuclear DNA also suggest an African origin for a variety of genes. Furthermore, in a remarkable series of studies in the late 1990's, scientists recovered mtDNA from the first Neanderthal fossil found in Germany and two other Neanderthal fossils. In each case, the mtDNA does not closely match that of modern humans. This finding suggests that at least some Neanderthal populations had diverged from the line leading to modern humans by 500,000 to 600,000 years ago. This also suggests that Neanderthals represent a separate species from modern H. sapiens. In another study, however, mtDNA extracted from a 62,000-year-old Australian H. sapiens fossil was found to differ significantly from modern human mtDNA, suggesting a much wider range of mtDNA variation within H. sapiens than was previously believed. According to the Australian researchers, this finding lends support to the multi-regional hypothesis because it shows that different populations of H. sapiens, possibly including Neanderthals, could have evolved independently in different parts of the world.

As with genetic research, fossil evidence also does not entirely support or refute either of the competing hypotheses of modern human origins. However, many scientists see the balance of evidence favouring an African origin of modern H. sapiens within the past 200,000 years. The oldest known modern-looking skulls come from Africa and date from perhaps 130,000 years ago. The next oldest come from the Near East, where they date from about 90,000 years ago. Fossils of modern humans in Europe date back no further than about 40,000 years ago. In addition, the first modern humans in Europe, often called Cro-Magnon people, had elongated lower leg bones, as did African populations adapted to warm, tropical climates. This suggests that populations from warmer regions replaced those in colder European regions, such as the Neanderthals.

Fossils also show that populations of modern humans lived at the same time and in the same regions as did populations of Neanderthals and Homo erectus, but that each retained its distinctive physical features. The different groups overlapped in the Near East and Southeast Asia for between about 30,000 and 50,000 years. The maintenance of physical differences for this amount of time implies that archaic and modern humans either could not or generally did not interbreed. To some scientists, this also means that the Neanderthals belong to a separate species, H. neanderthalensis, and that migratory populations of modern humans entirely replaced archaic humans in both Europe and eastern Asia.

On the other hand, fossils of archaic and modern humans in some regions show continuity in certain physical characteristics. These similarities may indicate multi-regional evolution. For example, both archaic and modern skulls of eastern Asia have flatter cheek and nasal areas than do skulls from other regions. By contrast, the same parts of the face project forward in the skulls of both archaic and modern humans of Europe. If these traits were influenced primarily by genetic inheritance rather than environmental factors, archaic humans may have produced modern humans in some regions or at least interbred with migrant modern-looking humans.

Each of the competing major hypotheses of modern human origins has its strengths and weaknesses. Genetic evidence appears to support the out of Africa hypothesis. In the western half of Eurasia and in Africa, this hypothesis also seems the better explanation, particularly for the apparent replacement of Neanderthals by modern populations. However, the multi-regional hypothesis appears to explain some of the regional continuity found in East Asian populations.

Therefore, many paleoanthropologists advocate a theory of modern human origins that combines elements of the out of Africa and multi-regional hypotheses. Humans with modern features may have first emerged in Africa or come together there as a result of gene flow with populations from other regions. These African populations may then have replaced archaic humans in certain regions, such as western Europe and the Near East. Elsewhere, especially in East Asia, gene flow may have occurred among local populations of archaic and modern humans, resulting in distinct and enduring regional characteristics.

All three of these views, the two competing positions and the compromise, acknowledge the strong biological unity of all people. In the multi-regional hypothesis, this unity results from hundreds of thousands of years of continued gene flow among all human populations. According to the out of Africa hypothesis, on the other hand, similarities among all living human populations result from a recent common origin. The compromise position accepts both as reasonable and compatible explanations of modern human origins.

The story of human evolution is as much about the development of cultural behaviour as it is about changes in physical appearance. The term culture, in anthropology, traditionally refers to all human creations and activities governed by social customs and rules. It includes elements such as technology, language, and art. Human cultural behaviour depends on the social transfer of information from one generation to the next, which in turn depends on a sophisticated system of communication, such as language.

The term culture has often been used to distinguish the behaviour of humans from that of other animals. However, some nonhuman animals also appear to have forms of learned cultural behaviour. For instance, different groups of chimpanzees use sticks in different ways to capture termites for food. Also, in some regions chimps use stones or pieces of wood for cracking open nuts. Chimps in other regions do not practice this behaviour, although their forests have similar nut trees and materials for making tools. These regional differences resemble traditions that people pass from generation to generation. Traditions are a fundamental aspect of culture, and paleoanthropologists assume that the earliest humans also had some types of traditions.

However, modern humans differ from other animals, and probably from many early human species, in that they actively teach each other and can pass on an accumulating body of knowledge. People also have a uniquely long period of learning before adulthood, and the physical and mental capacity for language. Language in all its forms, spoken, signed, and written, provides a medium for communicating vast amounts of information, much more than any other animal could probably transmit through gestures and vocalizations.

Scientists have traced the evolution of human cultural behaviour through the study of archaeological artifacts, such as tools, and related evidence, such as the charred remains of cooked food. Artifacts show that throughout much of human evolution, culture developed slowly. During the Palaeolithic, or early Stone Age, basic techniques for making stone tools changed very little over periods of well more than a million years.

Human fossils also provide information about how culture has evolved and what effects it has had on human life. For example, over the past 30,000 years, the basic anatomy of humans has undergone only one prominent change: the bones of the average human skeleton have become much smaller and thinner. Innovations in the making and use of tools and in the obtaining of food, themselves results of cultural evolution, may have led to more efficient and less physically taxing lifestyles, and thus caused changes in the skeleton.

Culture has played a prominent role in the evolution of Homo sapiens. Within the last 60,000 years, people have migrated to settle most unoccupied regions of the world, such as small island chains and the continents of Australia and the Americas. These migrations depended on developments in transportation, hunting and fishing tools, shelter, and clothing. Within the past 30,000 years, cultural evolution has sped up dramatically. This change shows up in the archaeological record as a rapid expansion of stone tool types and toolmaking techniques, and in works of art and indications of evolving religion, such as burials. By 10,000 years ago, people first began to harvest and cultivate grains and to domesticate animals, a fundamental change in the ecological relationship between human beings and other life on Earth. The development of agriculture gave people larger quantities and more stable supplies of food, which set the stage for the rise of the first civilizations. Today culture, and particularly technology, dominates human life.

Paleoanthropologists and archaeologists have studied many topics in the evolution of human cultural behaviour. These have included the evolution of (1) social life; (2) subsistence (the acquisition and production of food); (3) the making and using of tools; (4) environmental adaptation; (5) symbolic thought and its expression through language, art, and religion; and (6) the development of agriculture and the rise of civilizations.

Most primate species, including the African apes, live in social groups of varying size and complexity. Within their groups, individuals often have multifaceted roles, based on age, sex, status, social skills, and personality. The discovery in 1975 at Hadar, Ethiopia, of a group of several Australopithecus afarensis individuals who died together 3.2 million years ago appears to confirm that early humans lived in social groups. Scientists have referred to this collection of fossils as The First Family.

One of the first physical changes in the evolution of humans from apes, a decrease in the size of male canine teeth, also indicates a change in social relations. Male apes sometimes use their large canines to threaten (or sometimes fight with) other males of their species, usually over access to females, territory, or food. The evolution of small canines in Australopiths implies that males had either developed other methods of threatening each other or become more cooperative. In addition, both male and female Australopiths had small canines, indicating a reduction of sexual dimorphism from that in apes. Yet, although sexual dimorphism in canine size decreased in Australopiths, males were still much larger than females. Thus, male Australopiths might have competed aggressively with each other based on sheer size and strength, and the social life of humans may not have differed much from that of apes until later times.

Scientists believe that several of the most important changes from apelike to characteristically human social life occurred in species of the genus Homo, whose members show even less sexual dimorphism. These changes, which may have occurred at different times, included: (1) prolonged maturation of infants, including an extended period during which they required intensive care from their parents; (2) special bonds of sharing and exclusive mating between particular males and females, called pair-bonding; and (3) the focus of social activity at a home base, a safe refuge in a special location known to family or group members.

Humans, who have a large brain, have a prolonged period of infant development and childhood because the brain takes a long time to mature. Since the Australopith brain was not much larger than that of a chimp, some scientists think that the earliest humans had a more apelike rate of growth, which is far more rapid than that of modern humans. This view is supported by studies of Australopith fossils looking at tooth development, a good indicator of overall body development.

In addition, the human brain becomes very large as it develops, so a woman must give birth at an early stage of development in order for the infant's head to fit through her birth canal. Thus, human babies require a long period of care to reach a stage of development at which they depend less on their parents. In contrast with a modern female, a female Australopith could give birth to a baby at an advanced stage of development because its brain would not be too large to pass through the birth canal. The need to give birth early, and therefore to provide more infant care, may have evolved around the time of the middle Homo species H. ergaster. This species had a brain significantly larger than that of the Australopiths, but a narrow birth canal.

Pair-bonding, usually of a short duration, occurs in a variety of primate species. Some scientists speculate that prolonged bonds developed in humans along with increased sharing of food. Among primates, humans have a distinct type of food-sharing behaviour. People will delay eating food until they have returned with it to the location of other members of their social group. This type of food sharing may have arisen at the same time as the need for intensive infant care, probably by the time of H. ergaster. By devoting himself to a particular female and sharing food with her, a male could increase the chances of survival for his own offspring.

Humans have lived as foragers for millions of years. Foragers obtain food when and where it is available over a broad territory. Modern-day foragers (also known as hunter-gatherers), such as the San people in the Kalahari Desert of southern Africa, also set up central campsites, or home bases, and divide work duties between men and women. Women gather readily available plant and animal foods, while men take on the often less successful task of hunting. Female and male family members and relatives bring together their food to share at their home base. The modern form of the home base, which also serves as a haven for raising children and caring for the sick and elderly, may have first developed with middle Homo after about 1.7 million years ago. However, the first evidence of hearths and shelters, common to all modern home bases, comes from only after 500,000 years ago. Thus, a modern form of social life may not have developed until late in human evolution.

Human subsistence refers to the types of food humans eat, the technology used in and methods of obtaining or producing food, and the ways in which social groups or societies organize themselves for getting, making, and distributing food. For millions of years, humans probably fed on-the-go, much as other primates do. The lifestyle associated with this feeding strategy is generally organized around small, family-based social groups that take advantage of different food sources at different times of year.

The early human diet probably resembled that of closely related primate species. The great apes eat mostly plant foods. Many primates also eat easily obtained animal foods such as insects and bird eggs. Among the few primates that hunt, chimpanzees will prey on monkeys and even small gazelles. The first humans probably also had a diet based mostly on plant foods. In addition, they undoubtedly ate some animal foods and might have done some hunting. Human subsistence began to diverge from that of other primates with the production and use of the first stone tools. With this development, the meat and marrow (the inner, fat-rich tissue of bones) of large mammals became a part of the human diet. Thus, with the advent of stone tools, the diet of early humans became distinguished in an important way from that of apes.

Scientists have found broken and butchered fossil bones of antelopes, zebras, and other comparably sized animals at the oldest archaeological sites, which date from some 2.5 million years ago. With the evolution of late Homo, humans began to hunt even the largest animals on Earth, including mastodons and mammoths, members of the elephant family. Agriculture and the domestication of animals arose only in the recent past, with H. sapiens.

Paleoanthropologists have debated whether early members of the modern human genus were aggressive hunters, peaceful plant gatherers, or opportunistic scavengers. Many scientists once thought that predation and the eating of meat had strong effects on early human evolution. This hunting hypothesis suggested that early humans in Africa survived particularly arid periods by aggressively hunting animals with primitive stone or bone tools. Supporters of this hypothesis thought that hunting and competition with carnivores powerfully influenced the evolution of human social organization and behaviour; toolmaking; anatomy, such as the unique structure of the human hand; and intelligence.

Beginning in the 1960's, studies of apes cast doubt on the hunting hypothesis. Researchers discovered that chimpanzees cooperate in hunts of at least small animals, such as monkeys. Hunting did not, therefore, entirely distinguish early humans from apes, and therefore hunting alone may not have determined the path of early human evolution. Some scientists instead argued in favour of the importance of food-sharing in early human life. According to a food-sharing hypothesis, cooperation and sharing within family groups-instead of aggressive hunting-strongly influenced the path of human evolution.

Scientists once thought that archaeological sites as much as two million years old provided evidence to support the food-sharing hypothesis. Some of the oldest archaeological sites were places where humans brought food and stone tools together. Scientists thought that these sites represented home bases, with many social features of modern hunter-gatherer campsites, including the sharing of food between pair-bonded males and females.

Critique of the food-sharing hypothesis resulted from more careful study of animal bones from the early archaeological sites. Microscopic analysis of these bones revealed the marks of human tools and carnivore teeth, indicating that both humans and potential predators, such as hyenas, cats, and jackals, were active at these sites. This evidence suggested that what scientists had thought were home bases where early humans shared food were in fact food-processing sites that humans abandoned to predators. Thus, evidence did not clearly support the idea of food-sharing among early humans.

The new research also suggested a different view of early human subsistence-that early humans scavenged meat and bone marrow from dead animals and did little hunting. According to this scavenging hypothesis, early humans opportunistically took parts of animal carcasses left by predators, and then used stone tools to remove marrow from the bones.

Observations that many animals, such as antelope, often die off in the dry season make the scavenging hypothesis quite plausible. Early toolmakers would have had plenty of opportunity to scavenge animal fat and meat during dry times of the year. However, other archaeological studies, and a better appreciation of the importance of hunting among chimpanzees, suggest that the scavenging hypothesis is too narrow. Many scientists now believe that early humans both scavenged and hunted. Evidence of carnivore tooth marks on bones cut by early human toolmakers suggests that the humans scavenged at least the larger of the animals they ate. They also ate a variety of plant foods. Some disagreement remains, however, about how much early humans relied on hunting, especially the hunting of smaller animals.

Scientists debate when humans first began hunting on a regular basis. For instance, the discovery of elephant fossils alongside tools made by middle Homo once led researchers to believe that members of this species were hunters of big game. However, the simple association of animal bones and tools at the same site does not necessarily mean that early humans had killed the animals or eaten their meat. Animals may die in many ways, and natural forces can accidentally place fossils next to tools. Recent excavations at Olorgesailie, Kenya, show that H. erectus cut meat from elephant carcasses but do not reveal whether these humans were regular or specialized hunters.

Humans who lived outside Africa, especially in colder temperate climates, almost certainly needed to eat more meat than their African counterparts. Humans in temperate Eurasia would have had to learn which plants they could safely eat, and the number of available plant foods would drop significantly during the winter. Still, although scientists have found very few fossils of edible or eaten plants at early human sites, early inhabitants of Europe and Asia probably did eat plant foods besides meat.

Sites that provide the clearest evidence of early hunting include Boxgrove, England, where about 500,000 years ago people trapped several large game animals between a watering hole and the side of a cliff and then slaughtered them. At Schöningen, Germany, a site about 400,000 years old, scientists have found wooden spears with sharp ends that were well designed for throwing and probably used in hunting large animals.

Neanderthals and other archaic humans seem to have eaten whatever animals were available at a particular time and place. So, for example, in European Neanderthal sites, the number of bones of reindeer (a cold-weather animal) and red deer (a warm-weather animal) changed depending on what the climate had been like. Neanderthals probably also combined hunting and scavenging to obtain animal protein and fat.

For at least the past 100,000 years, various human groups have eaten foods from the ocean or coast, such as shellfish and some sea mammals and birds. Others began fishing in interior rivers and lakes. Between probably 90,000 and 80,000 years ago people in Katanda, in what is now the Democratic Republic of the Congo, caught large catfish using a set of barbed bone points, the oldest known specialized fishing implements. The oldest stone tips for arrows or spears date from about 50,000 to 40,000 years ago. These technological advances, probably first developed by early modern humans, indicate an expansion in the kinds of foods humans could obtain. Beginning 40,000 years ago humans began making even more significant advances in hunting dangerous animals and large herds, and in exploiting ocean resources. People cooperated in large hunting expeditions in which they killed many reindeer, bison, horses, and other animals of the expansive grasslands that existed at that time. In some regions, people became specialists in hunting certain kinds of animals. The familiarity these people had with the animals they hunted appears in sketches and paintings on cave walls, dating from as much as 32,000 years ago. Hunters also used the bones, ivory, and antlers of their prey to create art and beautiful tools. In some areas, such as the central plains of North America that once teemed with a now-extinct type of large bison (Bison occidentalis), hunting may have contributed to the extinction of entire species.

The making and use of tools alone probably did not distinguish early humans from their ape predecessors. Instead, humans made the important breakthrough of using one tool to make another. Specifically, they developed the technique of precisely hitting one stone against another, known as knapping. Stone toolmaking characterized the period known as the Stone Age, which began at least 2.5 million years ago in Africa and lasted until the development of metal tools within the last 7,000 years (at different times in different parts of the world). Although early humans may have made stone tools before 2.5 million years ago, toolmakers may not have remained long enough in one spot to leave clusters of tools that an archaeologist would notice today.

The earliest simple form of stone toolmaking involved breaking and shaping an angular rock by hitting it with a palm-sized round rock known as a hammerstone. Scientists refer to tools made in this way as Oldowan, after Olduvai Gorge in Tanzania, a site from which many such tools have come. The Oldowan tradition lasted for about one million years. Oldowan tools include large stones with a chopping edge, and small, sharp flakes that could be used to scrape and slice. Sometimes Oldowan toolmakers used anvil stones (flat rocks found or placed on the ground) on which hard fruits or nuts could be broken open. Chimpanzees are known to do this today.

Humans have always adapted to their environments by adjusting their behaviour. For instance, early Australopiths moved both in the trees and on the ground, which probably helped them survive environmental fluctuations between wooded and more open habitats. Early Homo adapted by making stone tools and transporting their food over long distances, thereby increasing the variety and quantities of different foods they could eat. An expanded and flexible diet would have helped these toolmakers survive unexpected changes in their environment and food supply.

When populations of H. erectus moved into the temperate regions of Eurasia, they faced new challenges to survival. During the colder seasons they had to either move away or seek shelter, such as in caves. Some of the earliest definitive evidence of cave dwellers dates from around 800,000 years ago at the site of Atapuerca in northern Spain. This site may have been home to early H. heidelbergensis populations. H. erectus also used caves for shelter.

Eventually, early humans learned to control fire and to use it to create warmth, cook food, and protect themselves from other animals. The oldest known fire hearths date from between 450,000 and 300,000 years ago, at sites such as Bilzingsleben, Germany; Vértesszőlős, Hungary; and Zhoukoudian (Chou-k'ou-tien), China. African sites as old as 1.6 million to 1.2 million years contain burned bones and reddened sediments, but many scientists find such evidence too ambiguous to prove that humans controlled fire. Early populations in Europe and Asia may also have worn animal hides for warmth during glacial periods. The oldest known bone needles, which indicate the development of sewing and tailored clothing, date from about 30,000 to 26,000 years ago.

Behaviour relates directly to the development of the human brain, and particularly the cerebral cortex, the part of the brain that allows abstract thought, beliefs, and expression through language. Humans communicate through the use of symbols, ways of referring to things, ideas, and feelings that communicate meaning from one individual to another but that need not have any direct connection to what they identify. For instance, a word is only one type of symbol; it need not resemble the thing it represents. English-speaking people use the word lion to describe a lion, not because a dangerous feline looks like the letters l-i-o-n, but because these letters together have a meaning created and understood by people.

People can also paint abstract pictures or play pieces of music that evoke emotions or ideas, even though emotions and ideas have no form or sound. In addition, people can conceive of and believe in supernatural beings and powers-abstract concepts that symbolize real-world events such as the creation of Earth and the universe, the weather, and the healing of the sick. Thus, symbolic thought lies at the heart of three hallmarks of modern human culture: language, art, and religion.

In language, people creatively join words together in an endless variety of sentences, each with a distinct meaning determined by mental rules, or grammar. Language provides the ability to communicate complex concepts. It also allows people to exchange information about both past and future events, about objects that are not present, and about complex philosophical or technical concepts.

Language gives people many adaptive advantages, including the ability to plan, to communicate the location of food or dangers to other members of a social group, and to tell stories that unify a group, such as mythologies and histories. However, words, sentences, and languages cannot be preserved like bones or tools, so the evolution of language is one of the most difficult topics to investigate through scientific study.

It appears that modern humans have an inborn instinct for language. Under normal conditions not developing language is almost impossible for a person, and people everywhere go through the same stages of increasing language skill at about the same ages. While people appear to have inborn genetic information for developing language, they learn specific languages based on the cultures from which they come and the experiences they have in life.

The ability of humans to have language depends on the complex structure of the modern brain, which has many interconnected, specific areas dedicated to the development and control of language. The complexity of the brain structures necessary for language suggests that it probably took a long time to evolve. While paleoanthropologists would like to know when these important parts of the brain evolved, endocasts (inside impressions) of early human skulls do not provide enough detail to show this.

Some scientists think that even the early Australopiths had some ability to understand and use symbols. Support for this view comes from studies with chimpanzees. A few chimps and other apes have been taught to use picture symbols or American Sign Language for simple communication. Nevertheless, it appears that language-as well as art and religious ritual-became vital aspects of human life only during the past 100,000 years, primarily within our own species.

Humans also express symbolic thought through many forms of art, including painting, sculpture, and music. The oldest known object of possible symbolic and artistic value dates from about 250,000 years ago and comes from the site of Berekhat Ram, Israel. Scientists have interpreted this object, a figure carved into a small piece of volcanic rock, as a representation of the outline of a female body. Only a few other possible art objects are known from between 200,000 and 50,000 years ago. These items, from western Europe and usually attributed to Neanderthals, include two simple pendants, a tooth and a bone with bored holes, and several grooved or polished fragments of tooth and bone.

Sites dating from at least 400,000 years ago contain fragments of red and black pigment. Humans might have used these pigments to decorate bodies or perishable items, such as wooden tools or clothing of animal hides, but this evidence would not have survived to today. Solid evidence of the sophisticated use of pigments for symbolic purposes, such as in religious rituals, comes only from after 40,000 years ago. From early in this period, researchers have found carefully made crayons used in painting and evidence that humans burned pigments to create a range of colours.

People began to create and use advanced types of symbolic objects between about 50,000 and 30,000 years ago. Much of this art appears to have been used in rituals, possibly ceremonies to ask spirit beings for a successful hunt. The archaeological record shows a tremendous blossoming of art between 30,000 and 15,000 years ago. During this period people adorned themselves with intricate jewellery of ivory, bone, and stone. They carved beautiful figurines representing animals and human forms. Many carvings, sculptures, and paintings depict stylized images of the female body. Some scientists think such female figurines represent fertility.

Early wall paintings made sophisticated use of texture and colour. The area of what is now southern France contains many famous sites of such paintings. These include the caves of Chauvet, which contain art more than 30,000 years old, and Lascaux, in which paintings date from as much as 18,000 years ago. In some cases, artists painted on walls that can be reached only with special effort, such as by crawling. The act of getting to these paintings gives them a sense of mystery and ritual, as it must have to the people who originally viewed them, and archaeologists refer to some of the most extraordinary painted chambers as sanctuaries. Yet no one knows for sure what meanings these early paintings and engravings had for the people who made them.

Graves from Europe and western Asia indicate that the Neanderthals were the first humans to bury their dead. Some sites contain very shallow graves, which group or family members may have dug simply to remove corpses from sight. In other cases it appears that groups may have observed rituals of grieving for the dead or communicating with spirits. Some researchers have claimed that grave goods, such as meaty animal bones or flowers, had been placed with buried bodies, suggesting that some Neanderthal groups might have believed in an afterlife. In a large proportion of Neanderthal burials, the corpse had its legs and arms drawn in close to its chest, which could indicate a ritual burial position.

Other researchers have challenged these interpretations, however. They suggest that perhaps the Neanderthals had practical rather than religious reasons for positioning dead bodies. For instance, a body manipulated into a fetal position would need only a small hole for burial, making the job of digging a grave easier. In addition, the animal bones and flower pollen near corpses could have been deposited by accident or without religious intention.

Many scientists once thought that fossilized bones of cave bears (a now-extinct species of large bear) found in Neanderthal caves indicated that these people had what has been referred to as a cave bear cult, in which they worshipped the bears as powerful spirits. However, after careful study researchers concluded that the cave bears probably died while hibernating and that Neanderthals did not collect their bones or worship them. Considering current evidence, the case for religion among Neanderthals remains in dispute.

One of the most important developments in human cultural behaviours occurred when people began to domesticate (control the breeding of) plants and animals. The advent of agriculture led to the development of dozens of staple crops (foods that form the basis of an entire diet) in temperate and tropical regions around the world. Almost the entire population of the world today depends on just four of these major crops: wheat, rice, corn, and potatoes.

The growth of farming and animal herding initiated one of the most remarkable changes ever in the relationship between humans and the natural environment. The change first began just 10,000 years ago in the Near East and has accelerated very rapidly since then. It also occurred independently in other places, including areas of Mexico, China, and South America. Since the first domestication of plants and animals, many species over large areas of the planet have come under human control. The overall number of plant and animal species has decreased, while the populations of a few species needed to support large human populations have grown immensely. In areas dominated by people, interactions between plants and animals usually fall under the control of a single species: Homo sapiens.

By the time of the initial transition to plant and animal domestication, the cold, glacial landscapes of 18,000 years ago had long since given way to warmer and wetter environments. At first, people adapted to these changes by using a wider range of natural resources. Later they began to focus on a few of the most abundant and hardy types of plants and animals. The plants people began to use in large quantities included cereal grains, such as wheat, in western Asia; wild varieties of rice in eastern Asia; and maize, of which corn is one variety, in what is now Mexico. Some of the animals people first began to herd included wild goats in western Asia, wild ancestors of chickens in eastern Asia, and llamas in South America.

By carefully collecting plants and controlling wild herd animals, people encouraged the development of species with characteristics favourable for growing, herding, and eating. This process of selecting certain species and controlling their breeding eventually created new species of plants, such as oats, barley, and potatoes, and domesticated animals, including cattle, sheep, and pigs. From these domesticated plant and animal species, people obtained important products, such as flour, milk, and wool.

By harvesting and herding domesticated species, people could store large quantities of plant foods, such as seeds and tubers, and have a ready supply of meat and milk. These readily available supplies gave people what is termed food security. In contrast, the foraging lifestyle of earlier human populations never provided them with a significant store of food. With increased food supplies, agricultural peoples could settle into villages and have more children. The new reliance on agriculture and change to settled village life also had some negative effects. As the average diet became more dependent on large quantities of one or a few staple crops, people became more susceptible to diseases brought on by a lack of certain nutrients. A settled lifestyle also increased contact between people and between people and their refuse and waste matter, both of which acted to increase the incidence and transmission of disease.

People responded to the increasing population density, and a resulting overuse of farming and grazing lands, in several ways. Some people moved to settle entirely new regions. Others devised ways of producing food in larger quantities and more quickly. The simplest way was to expand onto new fields for planting and new pastures to support growing herds of livestock. Many populations also developed systems of irrigation and fertilization that allowed them to reuse cropland and to produce greater amounts of food on existing fields.

The rise of civilizations, the large and complex types of societies in which most people still live today, developed along with surplus food production. People of high status eventually used food surpluses as a way to pay for labour and to create alliances among groups, often against other groups. In this way, large villages could grow into city-states (urban centres and the territories they governed) and eventually empires covering vast territories. With surplus food production, many people could work exclusively in political, religious, or military positions, or in artistic and various skilled vocations. Command of food surpluses also enabled rulers to control labourers, as in slavery. All civilizations developed based on such hierarchical divisions of status and vocation.

The earliest civilization arose more than 7,000 years ago in Sumer, in what is now Iraq. Sumer grew powerful and prosperous by 5,000 years ago, when it centred on the city-state of Ur. The region containing Sumer, known as Mesopotamia, was the same area in which people had first domesticated animals and plants. Other centres of early civilization include the Nile Valley of Northeast Africa, the Indus Valley of South Asia, the Yellow River Valley of East Asia, the Oaxaca and Mexico valleys and the Yucatán region of Central America, and the Andean region of South America.

All early civilizations had some common features. These included a bureaucratic political body, a military, a body of religious leadership, large urban centres, monumental buildings and other works of architecture, networks of trade, and food surpluses created through extensive systems of farming. Many early civilizations also had systems of writing, numbers and mathematics, and astronomy (with calendars); road systems; a formalized body of law; and facilities for education and the punishment of crimes. With the rise of civilizations, human evolution entered a phase vastly different from all that came before. Before this time, humans had lived in small, family-centred groups essentially exposed to and controlled by forces of nature. Several thousand years after the rise of the first civilizations, most people now live in societies of millions of unrelated people, all separated from the natural environment by houses, buildings, automobiles, and numerous other inventions and technologies. Culture will continue to evolve quickly and in unforeseen directions, and these changes will, in turn, influence the physical evolution of Homo sapiens and any other human species to come.

During the first two billion years of evolution, bacteria were the sole inhabitants of the earth, and the emergence of more complex forms of life is associated with networking and symbiosis. During these two billion years, prokaryotes, organisms composed of cells with no nucleus (namely bacteria), transformed the earth’s surface and atmosphere. It was the interaction of these simple organisms that resulted in the complex processes of fermentation, photosynthesis, oxygen breathing, and the removal of nitrogen gas from the air. Such processes would not have evolved, however, if these organisms were atomized in the Darwinian sense or if the force of interaction between parts existed only outside the parts.

In the life of bacteria, bits of genetic material within organisms are routinely and rapidly transferred to other organisms. At any given time, an individual bacterium has the use of accessory genes, often from very different strains, which can perform functions not carried out by its own DNA. Some of this genetic material can be incorporated into the DNA of the bacterium, and some may be passed on to other bacteria. What this picture indicates, as Margulis and Sagan put it, is that “all the world’s bacteria have access to a single gene pool and hence to the adaptive mechanisms of the entire bacterial kingdom.”

Since the whole of this gene pool operates in some sense within the parts, the speed of recombination is much greater than that allowed by mutation alone, or by random changes inside parts that alter interaction between parts. The existence of the whole within parts explains why bacteria can accommodate change on a worldwide scale in a few years. If the only mechanism at work were mutation inside organisms, millions of years would be required for bacteria to adapt to a global change in the conditions for survival. “By constantly and rapidly adapting to environmental conditions,” wrote Margulis and Sagan, “the organisms of the microcosm support the entire biota, their global exchange network ultimately affecting every living plant and animal.”

The discovery of symbiotic alliances between organisms that become permanent is another aspect of the modern understanding of evolution that appears to challenge Darwin’s view of universal struggle between atomized individual organisms. For example, the mitochondria found outside the nucleus of modern cells allow the cell to utilize oxygen and to exist in an oxygen-rich environment. Although mitochondria perform integral and essential functions in the life of the cell, they have their own genes composed of DNA, reproduce by simple division, and do so at times different from the rest of the cell.

The most reasonable explanation for this extraordinary alliance between mitochondria and the rest of the cell is that oxygen-breathing bacteria in primeval seas combined with other organisms. These ancestors of modern mitochondria provided waste disposal and oxygen-derived energy in exchange for food and shelter, and out of this symbiosis evolved more complex forms of oxygen-breathing life. Since the whole of these organisms was larger than the sum of their symbiotic parts, life functions became possible that could not be carried out by the mere collection of parts. The existence of the whole within the parts coordinates metabolic functions and overall organization.

The modern understanding of the relationship between mind and world must be framed within the larger context of the history of mathematical physics, the origins and extensions of the classical view of scientific knowledge, and the various ways that physics has attempted to meet previous challenges to the efficacy of classical epistemology. There is no basis in contemporary physics or biology for believing in the stark Cartesian division between mind and world that some have described as ‘the disease of the Western mind.’ This background will serve for understanding a new relationship between parts and wholes in physics, together with the similar view of that relationship that has emerged in the so-called ‘new biology’ and in recent studies of the evolution of human consciousness.

Recent studies on the manner in which the brains of our ancestors evolved the capacity to acquire and use complex language systems also present us with a new view of the relationship between parts and wholes in the evolution of human consciousness. These studies suggest that the experience of consciousness cannot be fully explained through the physical substrates of consciousness, or that the whole that corresponds with any moment of conscious awareness is an emergent phenomenon that cannot be fully explained as the sum of its constituent parts. This also suggests that the pre-adaptive changes in the hominid brain that enhanced the capacity to use symbolic communication over a period of 2.5 million years cannot be fully explained by the usual dynamics of Darwinian evolution.

Parts and wholes in Darwinian theory cannot reveal the actual character of a living organism because that organism exists only in relation to the whole of biological life. What Darwin did not anticipate, however, is that the whole that is a living organism appears to exist in some sense within the parts, and that more complex life forms evolved in processes in which synergy and cooperation between parts (organisms) resulted in new wholes (more complex organisms) with emergent properties that do not exist in the collection of parts. More remarkably, this new understanding of the relationship between part and whole in biology seems closely analogous to that disclosed by the discovery of non-locality in physics. We should stress, however, that this view of the relationship between parts and wholes in biological reality is not orthodox and may occasion some controversy in the community of biological scientists.

Since Darwin’s understanding of the relationship between part and whole was essentially classical and mechanistic, the new understanding of this relationship is occasioning some revision of his theory of evolution. Darwin made his theory public for the first time in a paper delivered to the Linnean Society in 1858. The paper began, ‘All nature is at war, one organism with another, or with external nature.’ In the Origin of Species, Darwin speaks more specifically about the character of this war: ‘There must be in every case a struggle for existence, either one individual with another of the same species, or with the individuals of distinct species, or with the physical conditions of life.’ All these assumptions are apparent in Darwin’s definition of natural selection: ‘If under changing conditions of life organic beings present individual differences in almost every part of their structure, and this cannot be disputed; if there be, owing to their geometrical rate of increase, a severe struggle for life at some age, season, or year, and this certainly cannot be disputed; then, considering the infinite complexity of the relations of all organic beings to each other and to their conditions of life, causing an infinite diversity in structure, constitution, and habits, to be advantageous to them, it would be a most extraordinary fact if no variations had ever occurred useful to each being’s own welfare. But if variations useful to any organic being ever do occur, assuredly individuals thus characterized will have the best chance of being preserved in the struggle for life; and from the strong principle of inheritance, these will tend to produce offspring similarly characterized. This principle of preservation, or the survival of the fittest, I have called Natural Selection.’

Darwin based his theory partly on the assumption that the study of variation in domestic animals and plants ‘afforded the best and safest clue’ to understanding evolution. Domestication had consequences of its own: the humans who domesticated animals were the first to fall victim to the newly evolved germs, but those humans then evolved substantial resistance to the new diseases. When such partly immune people came into contact with others who had no previous exposure to the germs, epidemics resulted in which up to 99 percent of the previously unexposed population was killed. Germs thus acquired ultimately from domestic animals played decisive roles in the European conquests of Native Americans, Australians, South Africans, and Pacific islanders.

The same pattern repeated itself elsewhere in the world, whenever peoples lacking native wild mammal species suitable for domestication finally had the opportunity to acquire Eurasian domestic animals. Horses were eagerly adopted by Native Americans in both North and South America within a generation of the escape of horses from European settlements. For example, by the 19th century North America’s Great Plains Indians were famous as expert horse-mounted warriors and bison hunters, but they did not even obtain horses until the late 17th century. Sheep acquired from Spaniards similarly transformed Navajo Indian society and led to, among other things, the weaving of the beautiful woolen blankets for which the Navajo have become renowned. Within a decade of Tasmania’s settlement by Europeans with dogs, Aboriginal Tasmanians, who had never before seen dogs, began to breed them in large numbers for use in hunting. Thus, among the thousands of culturally diverse native peoples of Australia, the Americas, and Africa, no universal cultural taboo stood in the way of animal domestication.

Surely, if some local wild mammal species of those continents had been domesticable, some Australian, American, and African peoples would have domesticated them and gained great advantage from them, just as they benefited from the Eurasian domestic animals that they immediately adopted when those became available. For instance, consider all the peoples of sub-Saharan Africa living within the range of wild zebras and buffalo. Why wasn’t there at least one African hunter-gatherer tribe that domesticated those zebras and buffalo and thereby gained sway over other Africans, without having to await the arrival of Eurasian horses and cattle? All these facts show that the explanation for the lack of native mammal domestication outside Eurasia lay with the locally available wild mammals themselves, not with the local people.

Further evidence for the same interpretation comes from pets. Keeping wild animals as pets, and taming them, constitutes an initial stage in domestication. Pets have been reported from virtually all traditional human societies on all continents. The variety of wild animals thus tamed is far greater than the variety eventually domesticated, and includes some species that we would scarcely have imagined as pets.

Given our proximity to the animals we love, we must be getting constantly bombarded by their microbes. Those invaders get winnowed by natural selection, and only a few of them succeed in establishing themselves as human diseases.

The first stage is illustrated by dozens of diseases that we now and then pick up directly from our pets and domestic animals. They include cat-scratch fever from our cats, leptospirosis from our dogs, psittacosis from our chickens and parrots, and brucellosis from our cattle. We’re similarly liable to pick up diseases from wild animals, such as the tularaemia that hunters can get from skinning wild rabbits. All those microbes are still at an early stage in their evolution into specialized human pathogens. They still don’t get transmitted directly from one person to another, and even their transfer to us from animals remains uncommon.

In the second stage a former animal pathogen evolves to the point where it does get transmitted directly between people and causes epidemics. However, the epidemic dies out for any of several reasons, such as being cured by modern medicine, or being stopped when everybody around has already been infected and either becomes immune or dies. For example, a previously unknown fever termed O’nyong-nyong fever appeared in East Africa in 1959 and proceeded to infect several million Africans. It probably arose from a virus of monkeys and was transmitted to humans by mosquitoes. The fact that patients recovered quickly and became immune to further attack helped the new disease die out quickly. Closer to home for Americans, Fort Bragg fever was the name applied to a new leptospiral disease that broke out in the United States in the summer of 1942 and soon disappeared.

A third stage in the evolution of our major diseases is represented by former animal pathogens that did establish themselves in humans, that have not (not yet?) died out, and that may or may not still become major killers of humanity. The future remains very uncertain for Lassa fever, caused by a virus probably derived from rodents. Lassa fever was first observed in 1969 in Nigeria, where it causes a fatal illness so contagious that Nigerian hospitals have been closed down if even a single case appears. Better established is Lyme disease, caused by a spirochete that we get from the bite of ticks carried by mice and deer. Although the first known human cases in the United States appeared only as recently as 1962, Lyme disease is already reaching epidemic proportions in many parts of our country. The future of AIDS, derived from monkey viruses and first documented in humans around 1959, is even more secure (from the virus’s perspective).

The final stage of this evolution is represented by the major, long-established epidemic diseases confined to humans. These diseases must have been the evolutionary survivors of far more pathogens that tried to make the jump to us from animals, and mostly failed.

In short, diseases represent evolution in progress, and microbes adapt by natural selection to new hosts and vectors. Compared with cows’ bodies, for instance, ours offer different immune defences, lice, faeces, and chemistries. In that new environment, a microbe must evolve new ways to live and to propagate itself. In several instructive cases doctors or veterinarians have been able to observe microbes evolving those new ways.

Darwin concluded that nature could, by crossbreeding and selection of traits, produce new species. His explanation of the mechanism in nature that results in a new species took the form of a syllogism: (1) the principle of geometric increase indicates that more individuals in each species will be produced than can survive; (2) a struggle for existence occurs as one organism competes with another; (3) in this struggle for existence, slight variations, if they prove advantageous, will accumulate to produce new species. In analogy with the animal breeder’s artificial selection of traits, Darwin termed the elimination of the disadvantaged and the promotion of the advantaged natural selection.
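The ‘geometric increase’ in premise (1) can be made concrete with a toy calculation; the starting population, brood size, and habitat capacity in this sketch are invented purely for illustration:

```python
# Toy illustration of Darwin's 'geometric rate of increase' (invented numbers):
# an unchecked population multiplies every generation, so it soon exceeds any
# fixed number of survivable places, and most offspring must die unreproduced.

def population_after(generations, start=2, offspring_per_pair=6):
    """Size of a population that multiplies unchecked each generation."""
    pop = start
    for _ in range(generations):
        pop = (pop // 2) * offspring_per_pair  # every pair leaves 6 offspring
    return pop

capacity = 10_000  # a fixed number of places the habitat can support
for g in range(1, 20):
    if population_after(g) > capacity:
        print(f"Unchecked growth exceeds the habitat within {g} generations")
        break
```

Because multiplication compounds, even a modest brood size overwhelms any fixed habitat within a handful of generations; premises (1) and (2) together are what guarantee a struggle for existence.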

In Darwin’s view, the struggle for existence occurs ‘between’ an atomized individual organism and other atomized individual organisms of the same species, ‘between’ an atomized individual organism and the organisms of a different species, or ‘between’ an atomized individual organism and the physical conditions of life. The whole, as Darwin conceived it, is the collection of all atomized individual organisms, or parts. The struggle for survival occurs ‘between’ or ‘outside’ the parts. Since Darwin viewed this struggle as the only condition limiting the rate of increase of organisms, he assumed that the rate would be geometrical when the force of the struggle between parts was weak and that the rate would decline as that force became stronger.

Natural selection occurred, said Darwin, when variations ‘useful to each being’s own welfare,’ or useful to the welfare of an atomized individual organism, provide a survival advantage, and the organism produces ‘offspring similarly characterized.’ The force that makes this selection operates ‘outside’ the totality of parts. For example, the ‘infinite complexity of the relations of all organic beings to each other and to their conditions of life’ refers to relations between parts, and the ‘infinite diversity in structure, constitution, and habits’ refers to traits within the atomized part. It seems clear, in our view, that the atomized individual organism in Darwin’s biological machine resembles classical atoms, and that the force that drives the interactions of the atomized parts, the ‘struggle for life,’ resembles Newton’s force of universal gravity. Although Darwin parted company with classical determinism in the claim that changes, or mutations, within organisms occurred randomly, his view of the relationship between parts and wholes remained essentially mechanistic.

Darwin’s theory of evolution by natural selection took its original form partly from the observations of Malthus. Although Malthus’s Essay on Population (1798) belongs principally to the history of social thought, it was philosophically influential in undermining the Enlightenment belief in unlimited possibilities of human progress and perfection. The Origin of Species was more successful in marshalling the evidence for evolution than in providing a convincing mechanism for genetic change; Darwin himself remained open to the search for additional mechanisms, while also remaining convinced that natural selection was at the heart of it. It was only with the later discovery of the ‘gene’ as the unit of inheritance that the synthesis known as ‘neo-Darwinism’ became the orthodox theory of evolution in the life sciences.

Human evolution is the lengthy process of change through which people originated from apelike ancestors. Scientific evidence shows that the physical and behavioural traits shared by all people evolved over a period of at least six million years.

One of the earliest defining human traits, bipedalism, walking on two legs as the primary form of locomotion, evolved more than four million years ago. Other important human characteristics, such as a large and complex brain, the ability to make and use tools, and the capacity for language, developed more recently. Many advanced traits, including complex symbolic expression, such as art, and elaborate cultural diversity, emerged mainly during the past 100,000 years.

Humans are primates. Physical and genetic similarities show that the modern human species, Homo sapiens, has a very close relationship to another group of primate species, the apes. Humans and the so-called great apes (large apes) of Africa, chimpanzees (including bonobos, or so-called pygmy chimpanzees) and gorillas, share a common ancestor that lived sometime between eight million and six million years ago. The earliest humans evolved in Africa, and much of human evolution occurred on that continent. The fossils of early humans who lived between six million and two million years ago come entirely from Africa.

Early humans first migrated out of Africa into Asia probably between two million and 1.7 million years ago. They entered Europe somewhat later, generally within the past one million years. Species of modern humans populated many parts of the world much later. For instance, people first came to Australia probably within the past 60,000 years, and to the Americas within the past 35,000 years. The beginnings of agriculture and the rise of the first civilizations occurred within the past 10,000 years.

The scientific study of human evolution is called paleoanthropology, a subfield of anthropology, the study of human culture, society, and biology. Paleoanthropologists search for the roots of human physical traits and behaviour. They seek to discover how evolution has shaped the potentials, tendencies, and limitations of all people. For many people, paleoanthropology is an exciting scientific field because it illuminates the origins of the defining traits of the human species, as well as the fundamental connections between humans and other living organisms on Earth. Scientists have abundant evidence of human evolution from fossils, artifacts, and genetic studies. However, some people find the concept of human evolution troubling because it can seem to conflict with religious and other traditional beliefs about how people, other living things, and the world came to be. Yet many people have come to reconcile such beliefs with the scientific evidence.

All species of organisms originate through the process of biological evolution. In this process, new species arise from a series of natural changes. In animals that reproduce sexually, including humans, the term species refers to a group whose adult members regularly interbreed, resulting in fertile offspring, that is, offspring themselves capable of reproducing. Scientists classify each species with a unique two-part scientific name. In this system, modern humans are classified as Homo sapiens.

The mechanism for evolutionary change resides in genes, the basic units of heredity. Genes affect how the body and behaviour of an organism develop during its life. The information contained within genes can change through a process known as mutation. The way particular genes are expressed, that is, how they affect the body or behaviour of an organism, can also change. Over time, genetic change can alter a species' overall way of life, such as what it eats, how it grows, and where it can live.

Genetic changes can improve the ability of organisms to survive, reproduce, and, in animals, raise offspring. This process is called adaptation. Parents pass adaptive genetic changes to their offspring, and ultimately these changes become common throughout a population, a group of organisms of the same species that share a particular local habitat. Many factors can favour new adaptations, but changes in the environment often play a role. Ancestral human species adapted to new environments as their genes changed, altering their anatomy (physical body structure), physiology (bodily functions, such as digestion), and behaviour. Over long periods, evolution dramatically transformed humans and their ways of life.

Geneticists estimate that the human line began to diverge from that of the African apes between eight million and five million years ago (paleontologists have dated the earliest human fossils to at least six million years ago). This figure comes from comparing differences in the genetic makeup of humans and apes, and then calculating how long it probably took for those differences to develop. Using similar techniques and comparing the genetic variations among human populations around the world, scientists have calculated that all people may share common genetic ancestors that lived sometime between 290,000 and 130,000 years ago.
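The molecular-clock reasoning behind such estimates can be sketched numerically. The figures below (a roughly 1.5 percent human-chimpanzee sequence difference and a rate of about one substitution per billion sites per year) are illustrative assumptions, not values from the text, and real analyses are considerably more involved:

```python
# Illustrative molecular-clock sketch. The genetic distance and substitution
# rate used here are assumed round numbers, not measured values.

def divergence_time(genetic_distance, rate_per_site_per_year):
    """Estimate years since two lineages diverged.

    If each lineage accumulates substitutions independently at the given
    rate, the observed fraction of differing sites d after time T is
    roughly d = 2 * r * T, so T = d / (2 * r).
    """
    return genetic_distance / (2.0 * rate_per_site_per_year)

# Assumption: ~1.5% sequence difference and a rate of 1e-9
# substitutions per site per year.
t = divergence_time(0.015, 1e-9)
print(f"about {t / 1e6:.1f} million years")  # about 7.5 million years
```

This toy calculation lands within the eight-million-to-five-million-year window quoted above; in practice, rates are calibrated against fossil dates and vary across the genome and between lineages.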

Humans belong to the scientific order named Primates, a group of more than 230 species of mammals that also includes lemurs, lorises, tarsiers, monkeys, and apes. Modern humans, early humans, and other species of primates all have many similarities as well as some important differences. Knowledge of these similarities and differences helps scientists to understand the roots of many human traits, as well as the significance of each step in human evolution.

All primates, including humans, share at least part of a set of common characteristics that distinguish them from other mammals. Many of these characteristics evolved as adaptations for life in the trees, the environment in which earlier primates evolved. These include more reliance on sight than smell; overlapping fields of vision, allowing stereoscopic (three-dimensional) sight; limbs and hands adapted for clinging to, leaping from, and swinging on tree trunks and branches; the ability to grasp and manipulate small objects (using fingers with nails instead of claws); large brains in relation to body size; and complex social lives.

The scientific classification of primates reflects evolutionary relationships between individual species and groups of species. Strepsirhine (meaning ‘turned-nosed’) primates, whose living representatives include lemurs, lorises, and other groups of species commonly known as prosimians, evolved earliest and are the most primitive forms of primates. The earliest monkeys and apes evolved from ancestral haplorhine (meaning ‘simple-nosed’) primates, of which the most primitive living representative is the tarsier. Humans evolved from ape ancestors.

Tarsiers have traditionally been grouped with prosimians, but many scientists now recognize that tarsiers, monkeys, and apes share some distinct traits, and group the three together. Monkeys, apes, and humans, which share many traits not found in other primates, together make up the suborder Anthropoidea. Apes and humans together make up the superfamily Hominoidea, a grouping that emphasizes the close relationship among the species of these two groups.

Strepsirhines are the most primitive types of living primates. The last common ancestors of Strepsirhines and other mammals, creatures similar to tree shrews and classified as Plesiadapiformes, evolved at least sixty-five million years ago. The earliest primates evolved by about fifty-five million years ago, and fossil species similar to lemurs evolved during the Eocene Epoch (about fifty-five million to thirty-eight million years ago). Strepsirhines share all of the basic characteristics of primates, although their brains are not particularly large or complex and they have a more elaborate and sensitive olfactory system (sense of smell) than do other primates. Tarsiers are the only living representatives of a primitive group of primates that ultimately led to monkeys, apes, and humans. Fossil species called omomyids, with some traits similar to those of tarsiers, evolved near the beginning of the Eocene, followed by early tarsier-like primates. While the omomyids and tarsiers are separate evolutionary branches (and there are no living omomyids), they both share features having to do with a reduction in the olfactory system, a trait shared by all haplorhine primates, including humans.

The anthropoid primates are divided into New World (South America, Central America, and the Caribbean Islands) and Old World (Africa and Asia) groups. New World monkeys, such as marmosets, capuchins, and spider monkeys, belong to the infraorder of platyrrhine (broad-nosed) anthropoids. Old World monkeys and apes belong to the infraorder of catarrhine (downward-nosed) anthropoids. Since humans and apes together make up the hominoids, humans are also catarrhine anthropoids.

The first catarrhine primates evolved between fifty million and thirty-three million years ago. Most primate fossils from this period have been found in a region of northern Egypt known as Al Fayyum (the Fayum). A primate group known as Propliopithecus, one lineage of which is sometimes called Aegyptopithecus, had primitive catarrhine features; that is, it had many of the basic features that Old World monkeys, apes, and humans share today. Scientists believe, therefore, that Propliopithecus resembles the common ancestor of all later Old World monkeys and apes. Thus, Propliopithecus may also be considered an ancestor, or a close relative of an ancestor, of humans. The first hominoids evolved during the Miocene Epoch (24 million to five million years ago). Among the oldest known hominoids is a group of primates known by its genus name, Proconsul. Species of Proconsul had features that suggest a close link to the common ancestor of apes and humans, for example, the lack of a tail. The species Proconsul heseloni lived in the trees of dense forests in eastern Africa about twenty million years ago. An agile climber, it had the flexible backbone and narrow chest characteristic of monkeys, but also a wide range of movement in the hip and thumb, traits characteristic of apes and humans.

Early in their evolution, the large apes underwent several radiations, periods when new and diverse species branched off from common ancestors. Following Proconsul, the ape genus Afropithecus evolved about eighteen million years ago in Arabia and Africa and diversified into several species. Soon afterward, three other ape genera evolved: Griphopithecus of western Asia about 16.5 million years ago, the earliest ape to have spread from Africa; Kenyapithecus of Africa about fifteen million years ago; and Dryopithecus of Europe more than twelve million years ago. Scientists have not yet determined which of these groups of apes may have given rise to the common ancestor of modern African apes and humans.

Scientists do not all agree about the appropriate classification of hominoids. They group the living hominoids into either two or three families: Hylobatidae, Hominidae, and sometimes Pongidae. Hylobatidae consists of the small or so-called lesser apes of Southeast Asia, commonly known as gibbons and siamangs. The Hominidae (hominids) includes humans and, according to some scientists, the great apes. Those who count only humans among the Hominidae place the great apes, including the orangutans of Southeast Asia, in the separate family Pongidae.

In the past only humans were considered to belong to the family Hominidae, and the term hominid referred only to species of humans. Today, however, genetic studies support placing all of the great apes and humans together in this family and the placing of African apes-chimpanzees and gorillas-together with humans at an even lower level, or subfamily.

According to this reasoning, the evolutionary branch of Asian apes leading to orangutans, which separated from the other hominid branches by about thirteen million years ago, belongs to the subfamily Ponginae. The ancestral and living representatives of the African ape and human branches together belong to the subfamily Homininae (sometimes called Hominines). Lastly, the line of early and modern humans belongs to the tribe (classificatory level above genus) Hominini, or hominins.

This order of classification corresponds with the genetic relationships between ape and human species. It groups humans and the African apes together at the same level at which scientists group together, for example, all types of foxes, all buffalo, or all flying squirrels. Within each of these groups, the species are very closely related. However, in the classification of apes and humans the similarities among the names hominoid, hominid, hominine, and hominin can be confusing. In this article the term early human refers to all species of the human family tree since the divergence from a common ancestor with the African apes. Popular writing often still uses the term hominid to mean the same thing.

About 98.5 percent of the genes in people and chimpanzees are identical, making chimps the closest living biological relatives of humans. This does not mean that humans evolved from chimpanzees, but it does indicate that both species evolved from a common ape ancestor. Orangutans, the great apes of Southeast Asia, differ much more from humans genetically, indicating a more distant evolutionary relationship.

Modern humans have a number of physical characteristics reflective of an ape ancestry. For instance, people have shoulders with a wide range of movement and fingers capable of strong grasping. In apes, these characteristics are highly developed as adaptations for brachiation, swinging from branch to branch in trees. Although humans do not brachiate, the general anatomy from that earlier adaptation remains. Both people and apes also have larger brains and greater cognitive abilities than do most other mammals.

Human social life, too, shares similarities with that of African apes and other primates, such as baboons and rhesus monkeys, that live in large and complex social groups. Group behaviour among chimpanzees, in particular, strongly resembles that of humans. For instance, chimps form long-lasting attachments with each other; participate in social bonding activities, such as grooming, feeding, and hunting; and form strategic coalitions with each other in order to increase their status and power. Early humans also probably had this kind of elaborate social life.

However, modern humans fundamentally differ from apes in many significant ways. For example, as intelligent as apes are, people’s brains are much larger and more complex, and people have a unique intellectual capacity and elaborate forms of culture and communication. In addition, only people habitually walk upright, can precisely manipulate very small objects, and have a throat structure that makes speech possible.

By around six million years ago in Africa, an apelike species had evolved with two important traits that distinguished it from apes: (1) small canine, or eye, teeth (teeth next to the four incisors, or front teeth) and (2) bipedalism, that is, walking on two legs as the primary form of locomotion. Scientists refer to these earliest human species as australopithecines, or Australopiths for short. The earliest Australopiths species known today belong to three genera: Sahelanthropus, Orrorin, and Ardipithecus. Other species belong to the genus Australopithecus and, by some classifications, Paranthropus. The name australopithecine translates literally as ‘southern ape,’ in reference to South Africa, where the first known Australopiths fossils were found.

The Great Rift Valley, a region in eastern Africa in which past movements in Earth’s crust have exposed ancient deposits of fossils, has become famous for its Australopiths finds. Countries in which scientists have found Australopiths fossils include Ethiopia, Tanzania, Kenya, South Africa, and Chad. Thus, Australopiths ranged widely over the African continent.

Fossils from several different early Australopiths species that lived between four million and two million years ago clearly show a variety of adaptations that mark the transition from ape to human. The very early period of this transition, before four million years ago, remains poorly documented in the fossil record, but those fossils that do exist show the most primitive combinations of ape and human features.

Fossils reveal much about the physical build and activities of early Australopiths, but not everything about outward physical features such as the colour and texture of skin and hair, or about certain behaviours, such as methods of obtaining food or patterns of social interaction. For these reasons, scientists study the living great apes, particularly the African apes, to gain a better understanding of how early Australopiths might have looked and behaved and of how the transition from ape to human might have occurred. For example, Australopiths probably resembled the great apes in characteristics such as the shape of the face and the amount of hair on the body. Australopiths also had brains roughly equal in size to those of the great apes, so they probably had apelike mental abilities. Their social life probably resembled that of chimpanzees.

Most of the distinctly human physical qualities of Australopiths were related to their bipedal stance. Before Australopiths, no mammal had ever evolved an anatomy for habitual upright walking. Australopiths also had small canine teeth, as compared with the long canines found in almost all other catarrhine primates.

Other characteristics of Australopiths reflected their ape ancestry. They had a low cranium behind a projecting face, and a brain size of 390 to 550 cu. cm. (24 to 34 cu. in.), in the range of an ape's brain. The body weight of Australopiths, as estimated from their bones, ranged from 27 to 49 kg. (60 to 108 lb.), and they stood 1.1 to 1.5 m. (3.5 to 5 ft.) tall. Their weight and height compare closely to those of chimpanzees (chimp height measured standing). Some Australopiths species had a large degree of sexual dimorphism, with males much larger than females, a trait also found in gorillas, orangutans, and other primates.

Australopiths also had curved fingers and long thumbs with a wide range of movement. In comparison, the fingers of apes are longer, more powerful, and more curved, making them extremely well adapted for hanging and swinging from branches. Apes also have very short thumbs, which limits their ability to manipulate small objects. Paleoanthropologists speculate as to whether the long and dexterous thumbs of Australopiths allowed them to use tools more efficiently than do apes.

The anatomy of Australopiths shows a number of adaptations for bipedalism, in both the upper and lower body. Adaptations in the lower body included the following: The Australopiths ilium, or pelvic bone, which rises above the hip joint, was much shorter and broader than it is in apes. This shape enabled the hip muscles to steady the body during each step. The Australopiths pelvis also had a bowl-like shape, which supported the internal organs in an upright stance. The upper legs angled inward from the hip joints, which positioned the knees better to support the body during upright walking. The legs of apes, on the other hand, are positioned almost straight down from the hip, so that when an ape walks upright for a short distance, its body sways from side to side. Australopiths also had shorter and less flexible toes than do apes. The toes worked as rigid levers for pushing off the ground during each bipedal step.

Other adaptations occurred above the pelvis. The Australopiths spine had an S-shaped curve, which shortened the overall length of the torso and gave it rigidity and balance when standing. By contrast, apes have a straight spine. The Australopiths skull also had an important adaptation related to bipedalism. The opening at the bottom of the skull through which the spinal cord attaches to the brain, called the foramen magnum, was positioned farther forward than it is in apes. This position set the head in balance over the upright spine.

Australopiths clearly walked upright on the ground, but paleoanthropologists debate whether the earliest humans also spent a significant amount of time in the trees. Certain physical features indicate that they spent at least some of their time climbing in trees. Such features included their curved and elongated fingers and elongated arms. However, their fingers, unlike those of apes, may not have been long enough to allow them to brachiate through the treetops. Study of fossil wrist bones suggests that early Australopiths had the ability to lock their wrists, preventing backward bending at the wrist when the body weight was placed on the knuckles of the hand. This could mean that the earliest bipeds had an ancestor that walked on its knuckles, as African apes do.

Compared with apes, humans have very small canine teeth. Apes, particularly males, have thick, projecting, sharp canines that they use for displays of aggression and as weapons to defend themselves. The oldest known bipeds, who lived at least six million years ago, still had large canines by human standards, though not as large as in apes. By four million years ago Australopiths had developed the human characteristic of having smaller, flatter canines. Canine reduction might have been related to an increase in social cooperation among early humans and an accompanying decrease in the need for males to make aggressive displays.

The Australopiths can be divided into an early group of species, known as gracile Australopiths, which arose before three million years ago, and a later group, known as robust Australopiths, which evolved after three million years ago. The gracile Australopiths, of which several species evolved between 4.5 million and three million years ago, generally had smaller teeth and jaws. The later-evolving robusts had larger faces with large jaws and molars (cheek teeth). These traits indicate powerful and prolonged chewing of food, and analyses of wear on the chewing surfaces of robust Australopiths molar teeth support this idea. Some fossils of early Australopiths have features resembling those of the later species, suggesting that the robusts evolved from one or more gracile ancestors.

Paleoanthropologists recognize at least eight species of early Australopiths. These include the three earliest established species, which belong to the genera Sahelanthropus, Orrorin, and Ardipithecus, a species of the genus Kenyanthropus, and four species of the genus Australopithecus.

The oldest known Australopiths species is Sahelanthropus tchadensis. Fossils of this species were first discovered in 2001 in northern Chad, Central Africa, by a research team led by French paleontologist Michel Brunet. The researchers estimated the fossils to be between seven million and six million years old. One of the fossils is a fractured yet nearly complete cranium that shows a combination of apelike and humanlike features. Apelike features include small brain size, an elongated braincase, and areas of bone where strong neck muscles would have attached. Humanlike features include small, flat canine teeth, a short middle part of the face, and a massive brow ridge (a bony, protruding ridge above the eyes) similar to that of later human fossils. The opening where the spinal cord attaches to the brain is tucked under the braincase, which suggests that the head was balanced on an upright body. It is not certain that Sahelanthropus walked bipedally, however, because bones from the rest of its skeleton have yet to be discovered. Nonetheless, its age and humanlike characteristics suggest that the human and African ape lineages had divided from one another by at least six million years ago.

In addition to reigniting debate about human origins, the discovery of Sahelanthropus in Chad significantly expanded the known geographic range of the earliest humans. The Great Rift Valley and South Africa, from which almost all other discoveries of early human fossils came, are apparently not the only regions of the continent that preserve the oldest clues of human evolution.

Orrorin tugenensis lived about six million years ago. This species was discovered in 2000 by a research team led by French paleontologist Brigitte Senut and French geologist Martin Pickford in the Tugen Hills region of central Kenya. The researchers found more than a dozen early human fossils dating between 6.2 million and six million years old. Among the finds were two thighbones that possess a groove indicative of an upright stance and bipedal walking. Although the finds are still being studied, the researchers consider these thighbones to be the oldest evidence of habitual two-legged walking. Fossilized bones from other parts of the skeleton show apelike features, including long, curved finger bones useful for strong grasping and movement through trees, and apelike canine and premolar teeth. Because of this distinctive combination of ape and human traits, the researchers gave a new genus and species name to these fossils, Orrorin tugenensis, which in the local language means ‘original man in the Tugen region.’ The age of these fossils suggests that the divergence of humans from our common ancestor with chimpanzees occurred before six million years ago.

In 1994 an Ethiopian member of a research team led by American paleoanthropologist Tim White discovered human fossils estimated to be about 4.4 million years old. White and his colleagues gave their discovery the name Ardipithecus ramidus. Ramid means ‘root’ in the Afar language of Ethiopia and refers to the closeness of this new species to the roots of humanity. At the time of this discovery, the genus Australopithecus was scientifically well established. White devised the genus name Ardipithecus to distinguish this new species from other Australopiths because its fossils had a very ancient combination of apelike and humanlike traits. More recent finds indicate that this species may have lived as early as 5.8 million to 5.2 million years ago.

The teeth of Ardipithecus ramidus had a thin outer layer of enamel, a trait also seen in the African apes but not in other Australopiths species or older fossil apes. This trait suggests a close relationship with an ancestor of the African apes. In addition, the skeleton shows strong similarities to that of a chimpanzee but has slightly reduced canine teeth and adaptations for bipedalism.

In 1965 a research team from Harvard University discovered a single arm bone of an early human at the site of Kanapoi in northern Kenya. The researchers estimated this bone to be four million years old, but could not identify the species to which it belonged or return at the time to look for related fossils. It was not until 1994 that a research team, led by British-born Kenyan paleoanthropologist Meave Leakey, found numerous teeth and fragments of bone at the site that could be linked to the previously discovered fossil. Leakey and her colleagues determined that the fossils belonged to a very primitive Australopiths species, which was given the name Australopithecus anamensis. Researchers have since found other A. anamensis fossils at nearby sites, dating between about 4.2 million and 3.9 million years old. The skull of this species appears apelike, while its enlarged tibia (lower leg bone) indicates that it supported its full body weight on one leg at a time, as in regular bipedal walking.

Australopithecus anamensis was quite similar to another, much better-known species, A. afarensis, a gracile Australopiths that thrived in eastern Africa between about 3.9 million and three million years ago. The most celebrated fossil of this species, known as Lucy, is a partial skeleton of a female discovered by American paleoanthropologist Donald Johanson in 1974 at Hadar, Ethiopia. Lucy lived 3.2 million years ago. Scientists have identified several hundred fossils of A. afarensis from Hadar, including a collection representing at least thirteen individuals of both sexes and various ages, all from a single site.

Researchers working in northern Tanzania have also found fossilized bones of A. afarensis at Laetoli. This site, dated at 3.6 million years old, is best known for its spectacular trails of bipedal human footprints. Preserved in hardened volcanic ash, these footprints were discovered in 1978 by a research team led by British paleoanthropologist Mary Leakey. They provide irrefutable evidence that Australopiths regularly walked bipedally.

Paleoanthropologists have debated interpretations of the characteristics of A. afarensis and its place in the human family tree. One controversy centres on the Laetoli footprints, which some scientists believe show that the foot anatomy and gait of A. afarensis did not exactly match those of modern humans. This observation may suggest that early Australopiths did not live primarily on the ground, or at least spent a significant amount of time in the trees. The skeleton of Lucy also suggests that A. afarensis had longer, more powerful arms than most later human species, suggesting that this species was adept at climbing trees.

Another controversy arises from the claim that A. afarensis was the common ancestor of both later Australopiths and the modern human genus, Homo. While this idea remains a strong possibility, the similarity between this and another Australopiths species, one from southern Africa named Australopithecus africanus, makes it difficult to decide which of the two species gave rise to the genus Homo.

Australopithecus africanus thrived in the Transvaal region of what is now South Africa between about 3.3 million and 2.5 million years ago. Australian-born anatomist Raymond Dart discovered this species, the first known Australopiths, in 1924 at Taung, South Africa. The specimen, that of a young child, came to be known as the Taung Child. For decades after this discovery, almost no one in the scientific community believed Dart's claim that the skull came from an ancestral human. In the late 1930s teams led by Scottish-born South African paleontologist Robert Broom unearthed many more A. africanus skulls and other bones from the Transvaal site of Sterkfontein.

A. africanus generally had a more globular braincase and less primitive-looking face and teeth than did A. afarensis. Thus, some scientists consider the southern species of early Australopiths to be a likely ancestor of the genus Homo. According to other scientists, however, certain heavily built facial and cranial features of A. africanus from Sterkfontein identify it as an ancestor of the robust Australopiths that lived later in the same region. In 1998 a research team led by South African paleoanthropologist Ronald Clarke discovered an almost complete early Australopiths skeleton at Sterkfontein. This important find may resolve some of the questions about where A. africanus fits in the story of human evolution.

Working in the Lake Turkana region of northern Kenya, a research team led by paleontologist Meave Leakey uncovered in 1999 a cranium and other bone remains of an early human that showed a mixture of features unseen in previous discoveries of early human fossils. The remains were estimated to be 3.5 million years old, and the cranium's small brain and earhole were similar to those of the earliest humans. Its cheekbone, however, joined the rest of the face in a forward position, and the region beneath the nose opening was flat. These are traits found in later human fossils from around two million years ago, typically those classified in the genus Homo. Noting this unusual combination of traits, researchers named a new genus and species, Kenyanthropus platyops, or ‘flat-faced human from Kenya.’ Before this discovery, it seemed that only a single early human species, Australopithecus afarensis, lived in East Africa between four million and three million years ago. Yet Kenyanthropus suggests that a diversity of species, including a more humanlike lineage than A. afarensis, lived in this time, just as in most other eras in human prehistory.

The human fossil record is poorly known between three million and two million years ago, which makes recent finds from the site of Bouri, Ethiopia, particularly important. From 1996 to 1998, a research team led by Ethiopian paleontologist Berhane Asfaw and American paleontologist Tim White found the skull and other skeletal remains of an early human specimen about 2.5 million years old. The researchers named it Australopithecus garhi; the word garhi means ‘surprise’ in the Afar language. The specimen is unique in having large incisors and molars in combination with an elongated forearm and thighbone. Its powerful arm bones suggest a tree-living ancestry, but its longer legs show the ability to walk upright on the ground. Fossils of A. garhi are associated with some of the oldest known stone tools, along with animal bones that were cut and cracked with tools. It is possible, then, that this species was among the first to make the transition to stone toolmaking and to eating meat and bone marrow from large animals.

By 2.7 million years ago the later, robust Australopiths had evolved. These species had what scientists refer to as megadont cheek teeth: wide molars and premolars coated with thick enamel. Their incisors, by contrast, were small. The robusts also had an expanded, flattened, and more vertical face than did gracile Australopiths. This face shape helped to absorb the stresses of strong chewing. On the top of the head, robust Australopiths had a sagittal crest (a ridge of bone along the top of the skull from front to back) to which thick jaw muscles attached. The zygomatic arches (which extend back from the cheekbones to the ears) curved out wide from the side of the face and cranium, forming very large openings for the massive chewing muscles to pass through near their attachment to the lower jaw. Together, these traits indicate that the robust Australopiths chewed their food powerfully and for long periods.

Other ancient animal species that specialized in eating plants, such as some types of wild pigs, had similar adaptations in their facial, dental, and cranial anatomy. Thus, scientists think that the robust Australopiths had a diet consisting partly of tough, fibrous plant foods, such as seed pods and underground tubers. Analyses of microscopic wear on the teeth of some robust Australopiths specimens appear to support the idea of a vegetarian diet, although chemical studies of fossils suggest that the southern robust species may also have eaten meat.

Scientists originally used the word robust to refer to the late Australopiths out of the belief that they had much larger bodies than did the early, gracile Australopiths. However, further research has revealed that the robust Australopiths stood about the same height and weighed roughly the same amount as Australopithecus afarensis and A. africanus.

The earliest known robust species, Australopithecus aethiopicus, lived in eastern Africa by 2.7 million years ago. In 1985 at West Turkana, Kenya, American paleoanthropologist Alan Walker discovered a 2.5-million-year-old fossil skull that helped to define this species. It became known as the ‘black skull’ because of the colour it had absorbed from minerals in the ground. The skull had a tall sagittal crest toward the back of its cranium and a face that projected far outward from the forehead. A. aethiopicus shared some primitive features with A. afarensis-that is, features that originated in the earlier East African Australopiths. This may suggest that A. aethiopicus evolved from A. afarensis.

Australopithecus boisei, the other well-known East African robust Australopith, lived over a long period of time, between about 2.3 million and 1.2 million years ago. In 1959 Mary Leakey discovered the original fossil of this species-a nearly complete skull-at the site of Olduvai Gorge in Tanzania. Kenyan-born paleoanthropologist Louis Leakey, husband of Mary, originally named the new species Zinjanthropus boisei (Zinjanthropus translates as ‘East African man’). This skull-dating from 1.8 million years ago-has the most specialized features of all the robust species. It has a massive, wide and dished-in face capable of withstanding extreme chewing forces, and molars four times the size of those in modern humans. Since the discovery of Zinjanthropus, now recognized as an Australopith, scientists have found great numbers of A. boisei fossils in Tanzania, Kenya, and Ethiopia.

The southern robust species, called Australopithecus robustus, lived between about 1.8 million and 1.3 million years ago in the Transvaal, the same region that was home to A. africanus. In 1938 Robert Broom, who had found many A. africanus fossils, bought a fossil jaw and molar that looked distinctly different from those in A. africanus. After finding the site of Kromdraai, from which the fossil had come, Broom collected many more bones and teeth that together convinced him to name a new species, which he called Paranthropus robustus (Paranthropus meaning ‘beside man’). Later scientists dated this skull at about 1.5 million years old. In the late 1940's and 1950's Broom discovered many more fossils of this species at the Transvaal site of Swartkrans.

Paleoanthropologists believe that the eastern robust species, A. aethiopicus and A. boisei, may have evolved from an early Australopith of the same region, perhaps A. afarensis. According to this view, A. africanus gave rise only to the southern species A. robustus. Scientists refer to such a case-the independent evolution of similar characteristics in different places or at different times-as parallel evolution. If parallel evolution occurred in Australopiths, the robust species would make up two separate branches of the human family tree.

The last robust Australopiths died out about 1.2 million years ago. At about this time, climate patterns around the world entered a period of fluctuation, and these changes may have reduced the food supply on which robusts depended. Interaction with larger-brained members of the genus Homo, such as Homo erectus, may also have contributed to the decline of late Australopiths, although no compelling evidence exists of such direct contact. Competition with several other species of plant-eating monkeys and pigs, which thrived in Africa at the time, may have been an even more important factor. Nevertheless, the reason that the robust Australopiths became extinct after flourishing for such a long time is not yet known for sure.

Scientists have several ideas about why Australopiths first split off from the apes, initiating the course of human evolution. Virtually all hypotheses suggest that environmental change was an important factor, specifically in influencing the evolution of bipedalism. Some well-established ideas about why humans first evolved include (1) the savanna hypothesis, (2) the woodland-mosaic hypothesis, and (3) the variability hypothesis.

The global climate cooled and became drier between eight million and five million years ago, near the end of the Miocene Epoch. According to the savanna hypothesis, this climate change broke up and reduced the area of African forests. As the forests shrank, an ape population in eastern Africa became separated from other populations of apes in the more heavily forested areas of western Africa. The eastern population had to adapt to its drier environment, which contained larger areas of grassy savanna.

The expansion of dry terrain favoured the evolution of terrestrial living, and made it more difficult to survive by living in trees. Terrestrial apes might have formed large social groups in order to improve their ability to find and collect food and to fend off predators-activities that also may have required the ability to communicate well. The challenges of savanna life might also have promoted the rise of tool use, for purposes such as scavenging meat from the kills of predators. These important evolutionary changes would have depended on increased mental abilities and, therefore, may have correlated with the development of larger brains in early humans.

Critics of the savanna hypothesis argue against it on several grounds, but particularly for two reasons. First, discoveries by a French scientific team of Australopith fossils in Chad, in Central Africa, suggest that the environments of East Africa may not have been fully separated from those farther west. Second, recent research suggests that open savannas were not prominent in Africa until sometime after two million years ago.

Criticism of the savanna hypothesis has spawned alternative ideas about early human evolution. The woodland-mosaic hypothesis proposes that the early Australopiths evolved in patchily wooded areas-a mosaic of woodland and grassland-that offered opportunities for feeding both on the ground and in the trees, and that ground feeding favoured bipedalism.

The variability hypothesis suggests that early Australopiths experienced many changes in environment and ended up living in a range of habitats, including forests, open-canopy woodlands, and savannas. In response, their populations became adapted to a variety of surroundings. Scientists have found that this range of habitats existed at the time when the early Australopiths evolved. So the development of new anatomical characteristics-particularly bipedalism-combined with an ability to climb trees, may have given early humans the versatility to live in a variety of habitats.

Bipedalism in early humans would have enabled them to travel efficiently over long distances, giving them an advantage over quadrupedal apes in moving across barren open terrain between groves of trees. In addition, the earliest humans continued to have the advantage from their ape ancestry of being able to escape into the trees to avoid predators. The benefits of both bipedalism and agility in the trees may explain the unique anatomy of Australopiths. Their long, powerful arms and curved fingers probably made them good climbers, while their pelvis and lower limb structure were reshaped for upright walking. Modern people belong to the genus Homo, which first evolved at least 2.3 million to 2.5 million years ago. The earliest members of this genus differed from the Australopiths in at least one important respect-they had larger brains than did their predecessors.

The evolution of the modern human genus can be divided roughly into three periods: early, middle, and late. Species of early Homo resembled gracile Australopiths in many ways. Some early Homo species lived until possibly 1.6 million years ago. The period of middle Homo began perhaps between two million and 1.8 million years ago, overlapping with the end of early Homo. Species of middle Homo evolved an anatomy much more similar to that of modern humans but had comparatively small brains. The transition from middle to late Homo probably occurred sometime around 200,000 years ago. Species of late Homo evolved large and complex brains and eventually language. Culture also became an increasingly important part of human life during the most recent period of evolution.

The origin of the genus Homo has long intrigued paleoanthropologists and prompted much debate. One of several known species of Australopiths, or one not yet discovered, could have given rise to the first species of Homo. Scientists also do not know exactly what factors favoured the evolution of a larger and more complex brain-the defining physical trait of modern humans.

Louis Leakey originally argued that the origin of Homo related directly to the development of toolmaking-specifically, the making of stone tools. Toolmaking requires certain mental skills and fine hand manipulation that may exist only in members of our own genus. Indeed, the name Homo habilis (meaning ‘handy man’) refers directly to the making and use of tools.

However, several species of Australopiths lived at the same time as early Homo, making it unclear which species produced the earliest stone tools. Recent studies of Australopith hand bones have suggested that at least one of the robust species, Australopithecus robustus, could have made tools. In addition, during the 1960's and 1970's researchers first observed that some nonhuman primates, such as chimpanzees, make and use tools, suggesting that Australopiths and the apes that preceded them probably also made some kinds of tools.

According to some scientists, however, early Homo probably did make the first stone tools. The ability to cut and pound foods would have been most useful to these smaller-toothed humans, whereas the robust Australopiths could chew even very tough foods. Furthermore, early humans continued to make stone tools similar to the oldest known kinds for a time long after the gracile Australopiths died out. Some scientists think that a period of environmental cooling and drying in Africa set the stage for the evolution of Homo. According to this idea, many types of animals suited to the challenges of a drier environment originated during the period between about 2.8 million and 2.4 million years ago, including the first species of Homo.

A toolmaking human might have had an advantage in obtaining alternative food sources as vegetation became sparse in increasingly dry environments. The new foods might have included underground roots and tubers, as well as meat obtained through scavenging or hunting. However, some scientists disagree with this idea, arguing that the period during which Homo evolved fluctuated between drier and wetter conditions, rather than just becoming dry. In this case, the making and use of stone tools and an expansion of the diet in early Homo-as well as an increase in brain size-may all have been adaptations to unpredictable and fluctuating environments. In either case, more scientific documentation is necessary to strongly support or refute the idea that early Homo arose as part of a larger trend of rapid species extinction and the evolution of many new species during a period of environmental change.

Paleoanthropologists generally recognize two species of early Homo-Homo habilis and H. rudolfensis (although other species may also have existed). The record is unclear because most of the early fossils that scientists have identified as species of Homo-rather than as the robust Australopiths who lived at the same time-occur as isolated fragments. In many places, only teeth, jawbones, and pieces of skull-without any other skeletal remains-suggest that new species of smaller-toothed humans had evolved as early as 2.5 million years ago. Scientists cannot always tell whether these fossils belong to late-surviving gracile Australopiths or early representatives of Homo. The two groups resemble each other because Homo likely descended directly from a species of gracile Australopiths.

In the early 1960's, at Olduvai Gorge, Tanzania, Louis Leakey, British primate researcher John Napier, and South African paleoanthropologist Philip Tobias discovered a group of early human fossils that showed a cranial capacity from 590 to 690 cu. cm. (36 to 42 cu. in.). Based on this brain size, well above the range of known Australopiths, the scientists argued that a new genus, Homo, and a new species, Homo habilis, should be recognized. Other scientists questioned whether this amount of brain enlargement was sufficient for defining a new genus, and even whether H. habilis was different from Australopithecus africanus, as the teeth of the two species look similar. However, scientists now widely accept both the genus and species names designated by the Olduvai team.

H. habilis lived in eastern and possibly southern Africa between about 1.9 million and 1.6 million years ago, and maybe as early as 2.4 million years ago. Although the fossils of this species moderately resemble those of Australopiths, H. habilis had smaller and narrower molar teeth, premolar teeth, and jaws than did its predecessors and contemporary robust Australopiths.

A fragmentary skeleton of a female from Olduvai shows that she stood only about 1 m. (3.3 ft.) tall, and the ratio of the length of her arms to her legs was greater than that in the Australopith Lucy. At least in the case of this individual, therefore, H. habilis had very apelike body proportions. However, H. habilis had more modern-looking feet and hands capable of producing tools. Some of the earliest stone tools from Olduvai have been found with H. habilis fossils, suggesting that this species made and used the tools at this site.

Scientists began to notice a high degree of variability in body size as they discovered more early Homo fossils. This could have suggested that H. habilis had a large amount of sexual dimorphism. For instance, the Olduvai female skeleton was dwarfed in comparison with other fossils-exemplified by a sizable early Homo cranium from East Turkana in northern Kenya. However, the differences in size exceeded those expected between males and females of the same species, and this finding later helped convince scientists that another species of early Homo had lived in eastern Africa.

This second species of early Homo was given the name Homo rudolfensis, after Lake Rudolf (now Lake Turkana). The best-known fossils of H. rudolfensis come from the area surrounding this lake and date from about 1.9 million years ago. Paleoanthropologists have not determined the entire time range during which H. rudolfensis may have lived.

This species had a larger face and body than did H. habilis. The cranial capacity of H. rudolfensis averaged about 750 cu. cm. (46 cu. in.). Scientists need more evidence to know whether the brain of H. rudolfensis in relation to its body size was larger than that proportion in H. habilis. A larger brain-to-body-size ratio can suggest increased mental abilities. H. rudolfensis also had large teeth, approaching the size of those in robust Australopiths. The discovery of even a partial fossil skeleton would reveal whether this larger form of early Homo had apelike or more modern body proportions. Scientists have found several modern-looking thighbones that date from between two million and 1.8 million years ago and may belong to H. rudolfensis. These bones suggest a body size of 1.5 m. (5 ft.) and 52 kg. (114 lb.).

The skulls and teeth of early African populations of middle Homo differed subtly from those of later H. erectus populations from China and the island of Java in Indonesia. H. ergaster makes a better candidate for an ancestor of the modern human line because Asian H. erectus has some specialized features not seen in some later humans, including our own species. H. heidelbergensis has similarities to both H. erectus and the later species H. neanderthalensis, and it may have been a transitional species between middle Homo and the line to which modern humans belong.

Homo ergaster probably first evolved in Africa around two million years ago. This species had a rounded cranium with a brain size of between 700 and 850 cu. cm. (43 to 52 cu. in.), a prominent brow ridge, small teeth, and many other features that it shared with the later H. erectus. Many paleoanthropologists consider H. ergaster a good candidate for an ancestor of modern humans because it had several modern skull features, including thin cranial bones. Most H. ergaster fossils come from the time range of 1.8 million to 1.5 million years ago.

The most important fossil of this species yet found is a nearly complete skeleton of a young male from West Turkana, Kenya, which dates from about 1.55 million years ago. Scientists determined the sex of the skeleton from the shape of its pelvis. They also found out from patterns of tooth eruption and bone growth that the boy had died when he was between nine and twelve years old.

The Turkana boy, as the skeleton is known, had elongated leg bones, and arm, leg, and trunk proportions that essentially match those of modern humans, in sharp contrast with the apelike proportions of H. habilis and Australopithecus afarensis. He appears to have been quite tall and slender. Scientists estimate that, had he grown into adulthood, the boy would have reached a height of 1.8 m. (6 ft.) and a weight of 68 kg (150 lb.). The anatomy of the Turkana boy shows that H. ergaster was particularly well adapted for walking and perhaps for running long distances in a hot environment (a tall and slender body dissipates heat well) but not for any significant amount of tree climbing.

The oldest humanlike fossils outside of Africa have also been classified as H. ergaster, dated around 1.75 million years old. These finds, from the Dmanisi site in the southern Caucasus Mountains of Georgia, consist of several crania, jaws, and other fossilized bones. Some of these are strikingly like East African H. ergaster, but others are smaller or larger than H. ergaster, suggesting a high degree of variation within a single population.

H. ergaster, H. rudolfensis, and H. habilis, in addition to possibly two robust Australopiths, all might have coexisted in Africa around 1.9 million years ago. This finding goes against a traditional paleoanthropological view that human evolution consisted of a single line that evolved progressively over time-an Australopith species followed by early Homo, then middle Homo, and finally H. sapiens. It appears that periods of species diversity and extinction have been common during human evolution, and that modern H. sapiens has the rare distinction of being the only living human species today.

Although H. ergaster appears to have coexisted with several other human species, they probably did not interbreed. Mating rarely succeeds between two species with significant skeletal differences, such as H. ergaster and H. habilis. Many paleoanthropologists now believe that H. ergaster descended from an earlier population of Homo-perhaps one of the two known species of early Homo-and that the modern human line descended from H. ergaster.

Paleoanthropologists now know that humans first evolved in Africa and lived only on that continent for a few million years. The earliest human species known to have spread in large numbers beyond the African continent was first discovered in Southeast Asia. In 1891 Dutch physician Eugene Dubois found the cranium of an early human on the Indonesian island of Java. He named this early human Pithecanthropus erectus, or ‘erect ape-man.’ Today paleoanthropologists refer to this species as Homo erectus.

H. erectus appears to have evolved in Africa from earlier populations of H. ergaster, and then spread to Asia sometime between 1.8 million and 1.5 million years ago. The youngest known fossils of this species, from the Solo River in Java, may date from about 50,000 years ago (although that dating is controversial). So H. erectus was a very successful species-both widespread, having lived in Africa and much of Asia, and long-lived, having survived for possibly more than 1.5 million years.

Homo erectus had a low and rounded braincase that was elongated from front to back, a prominent brow ridge, and an adult cranial capacity of 800 to 1,250 cu. cm. (49 to 76 cu. in.), on average twice that of the Australopiths. Its bones, including the cranium, were thicker than those of earlier species. Prominent muscle markings and thick, reinforced areas on the bones of H. erectus indicate that its body could withstand powerful movements and stresses. Although it had much smaller teeth than did the Australopiths, it had a heavy and strong jaw.
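The cranial capacities quoted throughout this section can be cross-checked with a simple unit conversion (1 cu. in. = 2.54³ ≈ 16.39 cu. cm.). A minimal sketch using the H. erectus range given above:

```python
# Convert cranial capacity from cubic centimetres to cubic inches.
# 1 cubic inch = 2.54**3 = 16.387064 cubic centimetres.
CC_PER_CU_IN = 2.54 ** 3

def cc_to_cu_in(cc):
    """Convert a volume in cubic centimetres to cubic inches."""
    return cc / CC_PER_CU_IN

# Adult H. erectus range quoted in the text: 800 to 1,250 cu. cm.
low, high = 800, 1250
print(round(cc_to_cu_in(low)), round(cc_to_cu_in(high)))  # → 49 76
```

The same conversion can be applied to any of the other figures in the text, such as the 1,500 cu. cm. Neanderthal average.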

In the 1920's and 1930's German anatomist and physical anthropologist Franz Weidenreich excavated the most famous collections of H. erectus fossils from a cave at the site of Zhoukoudian (Chou-k’ou-tien), China, near Beijing (Peking). Scientists dubbed these fossil humans Sinanthropus pekinensis, or Peking Man, but others later reclassified them as H. erectus. The Zhoukoudian cave yielded the fragmentary remains of more than thirty individuals, ranging from about 500,000 to 250,000 years old. These fossils were lost near the outbreak of World War II, but Weidenreich had made excellent casts of his finds. Further studies at the cave site have yielded more H. erectus remains.

Other important fossil sites for this species in China include Lantian, Yuanmou, Yunxian, and Hexian. Researchers have also recovered many tools made by H. erectus in China at sites such as Nihewan and Bose, and other sites of similar age (at least one million to 250,000 years old).

Ever since the discovery of Homo erectus, scientists have debated whether this species was a direct ancestor of later humans, including H. sapiens. The last populations of H. erectus-such as those from the Solo River in Java-may have lived as recently as 50,000 years ago, at the same time as did populations of H. sapiens. Modern humans could not have evolved from these late populations of H. erectus, a much more primitive type of human. However, earlier East Asian populations could have given rise to H. sapiens.

Many paleoanthropologists believe that early humans migrated into Europe by 800,000 years ago, and that these populations were not Homo erectus. A growing number of scientists refer to these early migrants into Europe-who predated both Neanderthals and H. sapiens in the region-as H. heidelbergensis. The species name comes from a 500,000-year-old jaw found near Heidelberg, Germany.

Scientists have found few human fossils in Africa for the period between 1.2 million and 600,000 years ago, during which H. heidelbergensis or its ancestors first migrated into Europe. Populations of H. ergaster (or possibly H. erectus) appear to have lived until at least 800,000 years ago in Africa, and possibly until 500,000 years ago in northern Africa. When these populations disappeared, other massive-boned and larger-brained humans-possibly H. heidelbergensis-appear to have replaced them. Scientists have found fossils of these stockier humans at sites in Bodo, Ethiopia; Saldanha (also known as Elandsfontein), South Africa; Ndutu, Tanzania; and Kabwe, Zambia.

Scientists have come up with at least three different interpretations of these African fossils. Some scientists place the fossils in the species H. heidelbergensis and think that this species gave rise to both the Neanderthals (in Europe) and H. sapiens (in Africa). Others think that the European and African fossils belong to two distinct species, and that the African populations-in this view not H. heidelbergensis but a separate species-gave rise to H. sapiens. Yet other scientists advocate a long-held view that H. erectus and H. sapiens belong to a single evolving lineage, and that the African fossils belong in the category of archaic H. sapiens (archaic meaning not fully anatomically modern).

The fossil evidence does not clearly favour any of these three interpretations over another. A growing number of fossils from Asia, Africa, and Europe have features that are intermediate between early H. ergaster and H. sapiens. This kind of variation makes it hard to decide how to identify distinct species and to find out which group of fossils represents the most likely ancestor of later humans.

Humans evolved in Africa and lived only on that continent for as long as four million years or more, so scientists wonder what finally triggered the first human migration out of Africa (a movement that coincided with the spread of early human populations throughout the African continent). The answer to this question depends, in part, on knowing exactly when that first migration occurred. Some studies claim that sites in Asia and Europe contain crude stone tools and fossilized fragments of humanlike teeth that date from more than 1.8 million years ago. Although these claims remain unconfirmed, small populations of humans may have entered Asia before 1.8 million years ago, followed by a more substantial spread between 1.6 million and one million years ago. Early humans reached northeastern Asia by around 1.4 million years ago, inhabiting a region close to the perpetually dry deserts of northern China. The first major habitation of central and western Europe, on the other hand, does not appear to have occurred until between one million and 500,000 years ago.

Scientists once thought that advances in stone tools could have enabled early humans such as Homo erectus to move into Asia and Europe, perhaps by helping them to obtain new kinds of food, such as the meat of large mammals. If African human populations had developed tools that allowed them to hunt large game effectively, they would have had a good source of food wherever they went. In this view, humans first migrated into Eurasia based on a unique cultural adaptation.

By 1.5 million years ago, early humans had begun to make new kinds of tools, which scientists call Acheulean. Common Acheulean tools included large hand axes and cleavers. While these new tools might have helped early humans to hunt, the first known Acheulean tools in Africa date from later than the earliest known human presence in Asia. Also, most East Asian sites more than 200,000 years old contain only simply shaped cobble and flake tools. In contrast, Acheulean tools were more finely crafted, larger, and more symmetrical. Thus, the earliest settlers of Eurasia did not have a true Acheulean technology, and advances in toolmaking alone cannot explain the spread out of Africa.

Another possibility is that the early spread of humans to Eurasia was not unique, but part of a wider migration of meat-eating animals, such as lions and hyenas. The human migration out of Africa occurred during the early part of the Pleistocene Epoch, between 1.8 million and 780,000 years ago. Many African carnivores spread to Eurasia during the early Pleistocene, and humans could have moved along with them. In this view, H. erectus was one of many meat-eating species to expand into Eurasia from Africa, rather than a uniquely adapted species. Relying on meat as a primary food source might have allowed many meat-eating species, including humans, to move through many different environments without having to quickly learn about unfamiliar and potentially poisonous plants.

However, the migration of humans to eastern Asia may have occurred gradually and through lower latitudes and environments similar to those of Africa. If East African populations of H. erectus moved at only 1.6 km. (1 mi.) every twenty years, they could have reached Southeast Asia in 150,000 years. Over this amount of time, humans could have learned about and begun relying on edible plant foods. Thus, eating meat may not have played a crucial role in the first human migrations to new continents. Careful comparison of animal fossils, stone tools, and early human fossils from Africa, Asia, and Europe will help scientists better determine what factors motivated and allowed humans to venture out of Africa for the first time.
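The dispersal estimate above is simple arithmetic: a front advancing 1.6 km every twenty years covers about 12,000 km in 150,000 years. The sketch below works through the calculation; reading 12,000 km as roughly the overland distance from East Africa to Southeast Asia is an assumption for illustration:

```python
# Dispersal rate quoted in the text: 1.6 km of range expansion every 20 years.
km_per_year = 1.6 / 20        # 0.08 km per year
years = 150_000               # duration quoted in the text
distance_km = round(km_per_year * years)
print(distance_km)            # → 12000, roughly East Africa to Southeast Asia
```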

The origin of our own species, Homo sapiens, is one of the most hotly debated topics in paleoanthropology. This debate centres on whether or not modern humans have a direct relationship to H. erectus or to the Neanderthals, and on how those groups relate to the more modern humans who evolved within the past 250,000 years. Paleoanthropologists commonly use the term anatomically modern Homo sapiens to distinguish people of today from these similar predecessors.

Traditionally, paleoanthropologists classified as Homo sapiens any fossil human younger than 500,000 years old with a braincase larger than that of H. erectus. Thus, many scientists who believe that modern humans descend from a single line dating back to H. erectus use the name archaic Homo sapiens to refer to a wide variety of fossil humans that predate anatomically modern H. sapiens. The archaic term denotes a set of physical features typical of Neanderthals and other species of late Homo before modern Homo sapiens. These features include a combination of a robust skeleton, a large but low braincase (positioned somewhat behind, rather than over, the face), and a lower jaw lacking a prominent chin. In this sense, Neanderthals are sometimes classified as a subspecies of archaic H. sapiens-H. sapiens neanderthalensis. Other scientists think that the variation in archaic fossils falls into clearly identifiable sets of traits, and that any type of human fossil exhibiting a unique set of traits should have a new species name. According to this view, the Neanderthals belong to their own species, H. neanderthalensis.

In the past, scientists claimed that Neanderthals differed greatly from modern humans. However, the basis for this claim came from a faulty reconstruction of a Neanderthal skeleton that showed it with bent knees and a slouching gait. This reconstruction gave the common but mistaken impression that Neanderthals were dim-witted brutes who lived a crude lifestyle. On the contrary, Neanderthals, like the species that preceded them, walked fully upright without a slouch or bent knees. In addition, their cranial capacity was quite large at about 1,500 cu. cm. (about 92 cu. in.), larger on average than that of modern humans. (The difference probably relates to the greater muscle mass of Neanderthals as compared with modern humans, which usually correlates with a larger brain size.)

Compared with earlier humans, Neanderthals had a high degree of cultural sophistication. They appear to have performed symbolic rituals, such as the burial of their dead. Neanderthal fossils-including a number of fairly complete skeletons-are quite common compared with those of earlier forms of Homo, in part because of the Neanderthal practice of intentional burial. Neanderthals also produced sophisticated types of stone tools known as Mousterian, which involved creating blanks (rough forms) from which several types of tools could be made.

Along with many physical similarities, Neanderthals differed from modern humans in several ways. The typical Neanderthal skull had a low forehead, a large nasal area (suggesting a large nose), a forward-projecting nasal and cheek region, a prominent brow ridge with a bony arch over each eye, a non-projecting chin, and an obvious space behind the third molar (in front of the upward turn of the lower jaw).

Neanderthals were heavily built and had more robust skeletons than do modern humans. Other Neanderthal skeletal features included a bowing of the limb bones in some individuals, broad scapulae (shoulder blades), hip joints turned outward, a long and thin pubic bone, short lower leg and arm bones relative to the upper bones, and large surfaces on the joints of the toes and limb bones. Together, these traits made for a powerful, compact body of short stature: males averaged 1.7 m. (5 ft. 5 in.) tall and 84 kg. (185 lb.), and females averaged 1.5 m. (5 ft.) tall and 80 kg. (176 lb.). The short, stocky build of Neanderthals conserved heat and helped them withstand the extremely cold conditions that prevailed in temperate regions beginning about 70,000 years ago. The last known Neanderthal fossils come from western Europe and date from approximately 36,000 years ago.

At the same time as Neanderthal populations grew in number in Europe and parts of Asia, other populations of nearly modern humans arose in Africa and Asia. Scientists also commonly refer to these fossils, which are distinct from but similar to those of Neanderthals, as archaic. Fossils from the Chinese sites of Dali, Maba, and Xujiayao display the long, low cranium and large face typical of archaic humans, yet they also have features similar to those of modern people in the region. At the cave site of Jebel Irhoud, Morocco, scientists have found fossils with the long skull typical of archaic humans but also the modern traits of a higher forehead and flatter midface. Fossils of humans from East African sites older than 100,000 years, such as Ngaloba in Tanzania and Eliye Springs in Kenya, also seem to show a mixture of archaic and modern traits.

The oldest known fossils that possess skeletal features typical of modern humans date from between 130,000 and 90,000 years ago. Several key features distinguish the skulls of modern humans from those of archaic species. These features include a much smaller brow ridge, if any; a globe-shaped braincase; and a flat or only slightly projecting face of reduced size, located under the front of the braincase. Among all mammals, only humans have a face positioned directly beneath the frontal lobe (forward-most area) of the brain. As a result, modern humans tend to have a higher forehead than did Neanderthals and other archaic humans. The cranial capacity of modern humans ranges from about 1,000 to 2,000 cu. cm. (60 to 120 cu. in.), with the average being about 1,350 cu. cm. (80 cu. in.).

Scientists have found both fragmentary and nearly complete cranial fossils of early anatomically modern Homo sapiens from the sites of Singha, Sudan; Omo, Ethiopia; Klasies River Mouth, South Africa; and Skhūl Cave, Israel. Based on these fossils, many scientists conclude that modern H. sapiens had evolved in Africa by 130,000 years ago and started spreading to diverse parts of the world, beginning with a route through the Near East, sometime before 90,000 years ago.

Paleoanthropologists are engaged in an ongoing debate about where modern humans evolved and how they spread around the world. Differences in opinion rest on the question of whether the evolution of modern humans took place in a small region of Africa or over a broad area of Africa and Eurasia. By extension, opinions differ as to whether modern human populations from Africa displaced all existing populations of earlier humans, eventually resulting in their extinction.

Those who think modern humans originated exclusively in Africa and then spread around the world support what is known as the out of Africa hypothesis. Those who think modern humans evolved over a large region of Eurasia and Africa support the so-called multi-regional hypothesis.

Researchers have conducted many genetic studies and carefully assessed fossils to figure out which of these hypotheses agrees more with scientific evidence. The results of this research do not entirely confirm or reject either one. Therefore, some scientists think a compromise between the two hypotheses is the best explanation. The debate between these views has implications for how scientists understand the concept of race in humans. The underlying question is whether the physical differences among modern humans evolved deep in the past or relatively recently. According to the out of Africa hypothesis, also known as the replacement hypothesis, early populations of modern humans migrated out of Africa to other regions and entirely replaced existing populations of archaic humans. The replaced populations would have included the Neanderthals and any surviving groups of Homo erectus. Supporters of this view note that many modern human skeletal traits evolved recently, within the past 200,000 years or so, suggesting a single, common origin. Additionally, the anatomical similarities shared by all modern human populations far outweigh those shared by premodern and modern humans within particular geographic regions. Furthermore, biological research suggests that most new species of organisms, including mammals, arose from small, geographically isolated populations.

According to the multi-regional hypothesis, also known as the continuity hypothesis, the evolution of modern humans began when Homo erectus spread throughout much of Eurasia around one million years ago. Regional populations retained some unique anatomical features for hundreds of thousands of years, but they also mated with populations from neighbouring regions, exchanging heritable traits with each other. This exchange of heritable traits is known as gene flow.

Through gene flow, populations of H. erectus passed on a variety of increasingly modern characteristics, such as increases in brain size, across their geographic range. Gradually this would have resulted in the evolution of more modern looking humans throughout Africa and Eurasia. The physical differences among people today, then, would result from hundreds of thousands of years of regional evolution. This is the concept of continuity. For instance, modern East Asian populations have some skull features that scientists also see in H. erectus fossils from that region.

Critics of the multi-regional hypothesis claim that it wrongly advocates a scientific belief in race and could be used to encourage racism. Supporters of the theory point out, however, that their position does not imply that modern races evolved in isolation from each other, or that racial differences justify racism. Instead, the theory holds that gene flow linked different populations together. These links allowed progressively more modern features, no matter where they arose, to spread from region to region and eventually become universal among humans.

Scientists have weighed the out of Africa and multi-regional hypotheses against both genetic and fossil evidence. The results do not unanimously support either one, but weigh more heavily in favour of the out of Africa hypothesis.

Geneticists have studied the amount of difference in the DNA (deoxyribonucleic acid) of different populations of humans. DNA is the molecule that contains our heritable genetic code. Differences in human DNA result from mutations in DNA structure. Mutations may result from exposure to external elements such as solar radiation or certain chemical compounds, while others occur naturally at random.

Geneticists have calculated rates at which mutations can be expected to occur over time. Dividing the total number of genetic differences between two populations by an expected rate of mutation provides an estimate of the time when the two shared a common ancestor. Many estimates of evolutionary ancestry rely on studies of the DNA in cell structures called mitochondria. This DNA is referred to as mtDNA (mitochondrial DNA). Unlike DNA from the nucleus of a cell, which codes for most of the traits an organism inherits from both parents, mtDNA passes only from a mother to her offspring. MtDNA also accumulates mutations about ten times faster than does DNA in the cell nucleus (the location of most DNA). The structure of mtDNA changes so quickly that scientists can easily measure the differences between one human population and another. Two closely related populations should have only minor differences in their mtDNA. Conversely, two very distantly related populations should have large differences in their mtDNA.
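The molecular-clock arithmetic described above can be sketched in a few lines of code. The counts, sequence length, and mutation rate below are purely illustrative assumptions chosen for round numbers, not values from any particular study; real analyses must also correct for factors such as multiple substitutions at the same site.

```python
# A minimal sketch of a molecular-clock divergence estimate.
# All numbers here are illustrative assumptions, not measured values.

def divergence_time(num_differences, sites_compared, rate_per_site_per_year):
    """Estimate years since two lineages shared a common ancestor.

    Mutations accumulate independently along BOTH lineages after the
    split, so the per-lineage divergence is half the observed pairwise
    difference; dividing by the mutation rate converts that to years.
    """
    pairwise_divergence = num_differences / sites_compared
    per_lineage_divergence = pairwise_divergence / 2
    return per_lineage_divergence / rate_per_site_per_year

# Hypothetical example: 40 differing sites out of 1,000 compared,
# at an assumed rate of 1e-7 substitutions per site per year.
years = divergence_time(40, 1000, 1e-7)
print(f"Estimated time to common ancestor: {years:,.0f} years")
```

Note how sensitive the estimate is to the assumed rate: halving the rate doubles the inferred age, which is one reason published mtDNA dates for a common ancestor range so widely.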

MtDNA research into modern human origins has produced two major findings. First, the entire amount of variation in mtDNA across human populations is small in comparison with that of other animal species. This suggests that all human mtDNA originated from a single ancestral lineage, specifically, a single female, and has been accumulating mutations ever since. Most estimates of the mutation rate of mtDNA suggest that this female ancestor lived about 200,000 years ago. In addition, the mtDNA of African populations varies more than that of peoples on other continents. This suggests that the mtDNA of African populations has been accumulating mutations for a longer time than that of populations in any other region. Because all living people apparently inherited their mtDNA from one woman in Africa, who is sometimes called the Mitochondrial Eve, geneticists and anthropologists have concluded from this evidence that modern humans originated in a small population in Africa and spread out from there.

MtDNA studies have weaknesses, however, including the following four. First, the estimated rate of mtDNA mutation varies from study to study, and some estimates put the date of origin closer to 850,000 years ago, the time of Homo erectus. Second, mtDNA makes up a small part of the total genetic material that humans inherit. The rest of our genetic material, about 400,000 times more than the amount of mtDNA, came from many individuals living at the time of the African Eve, conceivably from many different regions. Third, the time at which modern mtDNA began to diversify does not necessarily coincide with the origin of modern human biological traits and cultural abilities. Fourth, the smaller amount of modern mtDNA diversity outside of Africa could result from times when European and Asian populations declined in numbers, perhaps due to climate changes.

Regardless of these criticisms, many geneticists continue to favour the out of Africa hypothesis of modern human origins. Studies of nuclear DNA also suggest an African origin for a variety of genes. Furthermore, in a remarkable series of studies in the late 1990s, scientists recovered mtDNA from the first Neanderthal fossil found in Germany and from two other Neanderthal fossils. In each case, the mtDNA does not closely match that of modern humans. This finding suggests that at least some Neanderthal populations had diverged from the line leading to modern humans by 500,000 to 600,000 years ago, and that Neanderthals may represent a separate species from modern H. sapiens. In another study, however, mtDNA extracted from a 62,000-year-old Australian H. sapiens fossil was found to differ significantly from modern human mtDNA, suggesting a much wider range of mtDNA variation within H. sapiens than was previously believed. According to the Australian researchers, this finding lends support to the multi-regional hypothesis because it shows that different populations of H. sapiens, possibly including Neanderthals, could have evolved independently in different parts of the world.

As with genetic research, fossil evidence also does not entirely support or refute either of the competing hypotheses of modern human origins. However, many scientists see the balance of evidence favouring an African origin of modern H. sapiens within the past 200,000 years. The oldest known modern-looking skulls come from Africa and date from perhaps 130,000 years ago. The next oldest come from the Near East, where they date from about 90,000 years ago. Fossils of modern humans do not appear in Europe until about 40,000 years ago. In addition, the first modern humans in Europe, often referred to as Cro-Magnon people, had elongated lower leg bones, as did African populations that were adapted to warm, tropical climates. This suggests that populations from warmer regions replaced those in colder European regions, such as the Neanderthals.

Fossils also show that populations of modern humans lived at the same time and in the same regions as did populations of Neanderthals and Homo erectus, but that each retained its distinctive physical features. The different groups overlapped in the Near East and Southeast Asia for between about 30,000 and 50,000 years. The maintenance of physical differences for this amount of time implies that archaic and modern humans either could not or generally did not interbreed. To some scientists, this also means that the Neanderthals belong to a separate species, H. neanderthalensis, and that migratory populations of modern humans entirely replaced archaic humans in both Europe and eastern Asia.

On the other hand, fossils of archaic and modern humans in some regions show continuity in certain physical characteristics. These similarities may indicate multi-regional evolution. For example, both archaic and modern skulls of eastern Asia have flatter cheek and nasal areas than do skulls from other regions. By contrast, the same parts of the face project forward in the skulls of both archaic and modern humans of Europe. Assuming that these traits were influenced primarily by genetic inheritance rather than environmental factors, archaic humans may have given rise to modern humans in some regions or at least interbred with migrant modern-looking humans.

Each of the competing major hypotheses of modern human origins has its strengths and weaknesses. Genetic evidence appears to support the out of Africa hypothesis. In the western half of Eurasia and in Africa, this hypothesis also seems the better explanation, particularly in regard to the apparent replacement of Neanderthals by modern populations. At the same time, the multi-regional hypothesis appears to explain some of the regional continuity found in East Asian populations.

Therefore, many paleoanthropologists advocate a theory of modern human origins that combines elements of the out of Africa and multi-regional hypotheses. Humans with modern features may have first arisen in Africa or coalesced there as a result of gene flow with populations from other regions. These African populations may then have replaced archaic humans in certain regions, such as western Europe and the Near East. Elsewhere, however, especially in East Asia, gene flow may have occurred among local populations of archaic and modern humans, resulting in distinct and enduring regional characteristics.

All three of these views, the two competing positions and the compromise, acknowledge the strong biological unity of all people. In the multi-regional hypothesis, this unity results from hundreds of thousands of years of continued gene flow among all human populations. According to the out of Africa hypothesis, on the other hand, similarities among all living human populations result from a recent common origin. The compromise position accepts both of these as reasonable and compatible explanations of modern human origins.

The story of human evolution is as much about the development of cultural behaviour as it is about changes in physical appearance. The term culture, in anthropology, traditionally refers to all human creations and activities governed by social customs and rules. It includes elements such as technology, language, and art. Human cultural behaviour depends on the social transfer of information from one generation to the next, which in turn depends on a sophisticated system of communication, such as language.

The term culture has often been used to distinguish the behaviour of humans from that of other animals. However, some nonhuman animals also appear to have forms of learned cultural behaviours. For instance, different groups of chimpanzees use different techniques to capture termites for food using sticks. Also, in some regions chimps use stones or pieces of wood for cracking open nuts. Chimps in other regions do not practice this behaviour, although their forests have similar nut trees and materials for making tools. These regional differences resemble traditions that people pass from generation to generation. Traditions are a fundamental aspect of culture, and paleoanthropologists assume that the earliest humans also had some types of traditions.

Nonetheless, modern humans differ from other animals, and probably many earlier human species, in that they actively teach each other and can pass on and accumulate unusually large amounts of knowledge. People also have a uniquely long period of learning before adulthood, and the physical and mental capacity for language. Language in all its forms, spoken, signed, and written, provides a medium for communicating vast amounts of information, much more than any other animal appears to be able to transmit through gestures and vocalizations.

Scientists have traced the evolution of human cultural behaviour through the study of archaeological artifacts, such as tools, and related evidence, such as the charred remains of cooked food. Artifacts show that throughout much of human evolution, culture has developed slowly. During the Palaeolithic, or early Stone Age, basic techniques for making stone tools changed very little for periods of well more than a million years.

Human fossils also provide information about how culture has evolved and what effects it has had on human life. For example, over the past 30,000 years, the basic anatomy of humans has undergone only one prominent change: the bones of the average human skeleton have become much smaller and thinner. Innovations in the making and use of tools and in obtaining food, themselves results of cultural evolution, may have led to more efficient and less physically taxing lifestyles, and thus caused changes in the skeleton.

Paleoanthropologists and archaeologists have studied many topics in the evolution of human cultural behaviour. These have included the evolution of (1) social life; (2) subsistence (the acquisition and production of food); (3) the making and using of tools; (4) environmental adaptation; (5) symbolic thought and its expression through language, art, and religion; and (6) the development of agriculture and the rise of civilizations.

One of the first physical changes in the evolution of humans from apes, a decrease in the size of male canine teeth, also indicates a change in social relations. Male apes sometimes use their large canines to threaten (or sometimes fight with) other males of their species, usually over access to females, territory, or food. The evolution of small canines in Australopiths implies that males had either developed other methods of threatening each other or become more cooperative. In addition, both male and female Australopiths had small canines, indicating a reduction of sexual dimorphism from that in apes. Yet, although sexual dimorphism in canine size decreased in Australopiths, males were still much larger than females. Thus, male Australopiths might have competed aggressively with each other based on sheer size and strength, and the social life of humans may not have differed much from that of apes until later times.

Scientists believe that several of the most important changes from apelike to characteristically human social life occurred in species of the genus Homo, whose members show even less sexual dimorphism. These changes, which may have occurred at different times, included (1) prolonged maturation of infants, including an extended period during which they required intensive care from their parents; (2) special bonds of sharing and exclusive mating between particular males and females, called pair-bonding; and (3) the focus of social activity at a home base, a safe refuge in a special location known to family or group members.

Humans, who have a large brain, have a prolonged period of infant development and childhood because the brain takes a long time to mature. Since the Australopith brain was not much larger than that of a chimp, some scientists think that the earliest humans had a more apelike rate of growth, which is far more rapid than that of modern humans. This view is supported by studies of Australopith fossils looking at tooth development, a good indicator of overall body development.

In addition, the human brain becomes very large as it develops, so a woman must give birth to a baby at an early stage of development in order for the infant’s head to fit through her birth canal. Thus, human babies require a long period of care to reach a stage of development at which they depend less on their parents. In contrast with a modern female, a female Australopith could give birth to a baby at an advanced stage of development because its brain would not be too large to pass through the birth canal. The need to give birth early, and therefore to provide more infant care, may have evolved around the time of the middle Homo species Homo ergaster. This species had a brain significantly larger than that of the Australopiths, but a narrow birth canal.

Pair-bonding, usually of a short duration, occurs in a variety of primate species. Some scientists speculate that prolonged bonds developed in humans along with increased sharing of food. Among primates, humans have a distinct type of food-sharing behaviour. People will delay eating food until they have returned with it to the location of other members of their social group. This type of food sharing may have arisen at the same time as the need for intensive infant care, probably by the time of H. ergaster. By devoting himself to a particular female and sharing food with her, a male could increase the chances of survival for his own offspring.

Humans have lived as foragers for millions of years. Foragers obtain food when and where it is available over a broad territory. Modern-day foragers (also known as hunter-gatherers), such as the San people in the Kalahari Desert of southern Africa, also set up central campsites, or home bases, and divide work duties between men and women. Women gather readily available plant and animal foods, while men take on the often less successful task of hunting. For most of the time since the ancestors of modern humans diverged from the ancestors of the living great apes, around seven million years ago, all humans on Earth fed themselves exclusively by hunting wild animals and gathering wild plants, as the Blackfeet still did in the 19th century. It was only within the last 11,000 years that some peoples turned to what is termed food production: that is, domesticating wild animals and plants and eating the resulting livestock and crops. Today, most people on Earth consume food that they produced themselves or that someone else produced for them. At current rates of change, within the next decade the few remaining bands of hunter-gatherers will abandon their ways, disintegrate, or die out, thereby ending our millions of years of commitment to the hunter-gatherer lifestyle. Those few peoples who remained hunter-gatherers into the 20th century escaped replacement by food producers because they were confined to areas not fit for food production, especially deserts and Arctic regions. Within the present decade, even they will have been seduced by the attractions of civilization, settled down under pressure from bureaucrats or missionaries, or succumbed to germs.

Nevertheless, female and male family members and relatives bring together their food to share at their home base. The modern form of the home base, which also serves as a haven for raising children and caring for the sick and elderly, may have first developed with middle Homo after about 1.7 million years ago. However, the first evidence of hearths and shelters, common to all modern home bases, comes from only after 500,000 years ago. Thus, a modern form of social life may not have developed until late in human evolution.

Human subsistence refers to the types of food humans eat, the technology used in and methods of obtaining or producing food, and the ways in which social groups or societies organize themselves for getting, making, and distributing food. For millions of years, humans probably fed on the go, much as other primates do. The lifestyle associated with this feeding strategy is generally organized around small, family-based social groups that take advantage of different food sources at different times of year.

The early human diet probably resembled that of closely related primate species. The great apes eat mostly plant foods. Many primates also eat easily obtained animal foods such as insects and bird eggs. Among the few primates that hunt, chimpanzees will prey on monkeys and even small gazelles. The first humans probably also had a diet based mostly on plant foods. In addition, they undoubtedly ate some animal foods and might have done some hunting. Human subsistence began to diverge from that of other primates with the production and use of the first stone tools. With this development, the meat and marrow (the inner, fat-rich tissue of bones) of large mammals became a part of the human diet. Thus, with the advent of stone tools, the diet of early humans became distinguished in an important way from that of apes.

Scientists have found broken and butchered fossil bones of antelopes, zebras, and other comparably sized animals at the oldest archaeological sites, which date from about 2.5 million years ago. With the evolution of late Homo, humans began to hunt even the largest animals on Earth, including mastodons and mammoths, members of the elephant family. Agriculture and the domestication of animals arose only in the recent past, with H. sapiens.

Paleoanthropologists have debated whether early members of the modern human genus were aggressive hunters, peaceful plant gatherers, or opportunistic scavengers. Many scientists once thought that predation and the eating of meat had strong effects on early human evolution. This hunting hypothesis suggested that early humans in Africa survived particularly arid periods by aggressively hunting animals with primitive stone or bone tools. Supporters of this hypothesis thought that hunting and competition with carnivores powerfully influenced the evolution of human social organization and behaviour; toolmaking; anatomy, such as the unique structure of the human hand; and intelligence.

Beginning in the 1960s, studies of apes cast doubt on the hunting hypothesis. Researchers discovered that chimpanzees cooperate in hunts of at least small animals, such as monkeys. Hunting did not, therefore, entirely distinguish early humans from apes, and hunting alone may not have determined the path of early human evolution. Some scientists instead argued in favour of the importance of food-sharing in early human life. According to a food-sharing hypothesis, cooperation and sharing within family groups, rather than aggressive hunting, strongly influenced the path of human evolution.

Scientists once thought that archaeological sites as much as two million years old provided evidence to support the food-sharing hypothesis. Some of the oldest archaeological sites were places where humans brought food and stone tools together. Scientists thought that these sites represented home bases, with many social features of modern hunter-gatherer campsites, including the sharing of food between pair-bonded males and females.

A critique of the food-sharing hypothesis resulted from more careful study of animal bones from the early archaeological sites. Microscopic analysis of these bones revealed the marks of human tools and carnivore teeth, showing that both humans and potential predators, such as hyenas, cats, and jackals, were active at these sites. This evidence suggested that what scientists had thought were home bases where early humans shared food were in fact food-processing sites that humans abandoned to predators. Thus, evidence did not clearly support the idea of food-sharing among early humans.

The new research also suggested a different view of early human subsistence: that early humans scavenged meat and bone marrow from dead animals and did little hunting. According to this scavenging hypothesis, early humans opportunistically took parts of animal carcasses left by predators, and then used stone tools to remove marrow from the bones.

Observations that many animals, such as antelope, often die off in the dry season make the scavenging hypothesis quite plausible. Early toolmakers would have had plenty of opportunity to scavenge animal fat and meat during dry times of the year. However, other archaeological studies, and a better appreciation of the importance of hunting among chimpanzees, suggest that the scavenging hypothesis is too narrow. Many scientists now believe that early humans both scavenged and hunted. Evidence of carnivore tooth marks on bones cut by early human toolmakers suggests that the humans scavenged at least the larger of the animals they ate. They also ate a variety of plant foods. Some disagreement remains, however, as to how much early humans relied on hunting, especially the hunting of smaller animals.

Scientists debate when humans first began hunting on a regular basis. For instance, elephant fossils found with tools made by middle Homo once led researchers to the idea that members of this species were hunters of big game. However, the simple association of animal bones and tools at the same site does not necessarily mean that early humans had killed the animals or eaten their meat. Animals may die in many ways, and natural forces can accidentally place fossils next to tools. Recent excavations at Olorgesailie, Kenya, show that H. erectus cut meat from elephant carcasses but do not reveal whether these humans were regular or specialized hunters.

Humans who lived outside of Africa, especially in colder temperate climates, almost certainly needed to eat more meat than their African counterparts. Humans in temperate Eurasia would have had to learn which plants they could safely eat, and the number of available plant foods would drop significantly during the winter. Still, although scientists have found very few fossils of edible or eaten plants at early human sites, early inhabitants of Europe and Asia probably did eat plant foods in addition to meat.

Sites that provide the clearest evidence of early hunting include Boxgrove, England, where about 500,000 years ago people trapped a great number of large game animals between a watering hole and the side of a cliff and then slaughtered them. At Schöningen, Germany, a site about 400,000 years old, scientists have found wooden spears with sharp ends that were well designed for throwing and probably used in hunting large animals.

Neanderthals and other archaic humans seem to have eaten whatever animals were available at a particular time and place. So, for example, in European Neanderthal sites, the number of bones of reindeer (a cold-weather animal) and red deer (a warm-weather animal) changed depending on what the climate had been like. Neanderthals probably also combined hunting and scavenging to obtain animal protein and fat.

For at least the past 100,000 years, various human groups have eaten foods from the ocean or coast, such as shellfish and some sea mammals and birds. Others began fishing in interior rivers and lakes. Between probably 90,000 and 80,000 years ago people in Katanda, in what is now the Democratic Republic of the Congo, caught large catfish using a set of barbed bone points, the oldest known specialized fishing implements. The oldest stone tips for arrows or spears date from about 50,000 to 40,000 years ago. These technological advances, probably first developed by early modern humans, indicate an expansion in the kinds of foods humans could obtain.

Beginning 40,000 years ago humans began making even more significant advances in hunting dangerous animals and large herds, and in exploiting ocean resources. People cooperated in large hunting expeditions in which they killed great numbers of reindeer, bison, horses, and other animals of the expansive grasslands that existed at that time. In some regions, people became specialists in hunting certain kinds of animals. The familiarity these people had with the animals they hunted appears in sketches and paintings on cave walls, dating from as much as 32,000 years ago. Hunters also used the bones, ivory, and antlers of their prey to create art and beautiful tools. In some areas, such as the central plains of North America that once teemed with a now-extinct type of large bison (Bison occidentalis), hunting may have contributed to the extinction of entire species.

The making and use of tools alone probably did not distinguish early humans from their ape predecessors. Instead, humans made the important breakthrough of using one tool to make another. Specifically, they developed the technique of precisely hitting one stone against another, known as knapping. Stone toolmaking characterized the period sometimes referred to as the Stone Age, which began at least 2.5 million years ago in Africa and lasted until the development of metal tools within the last 7,000 years (at different times in different parts of the world). Although early humans may have made stone tools before 2.5 million years ago, toolmakers may not have remained long enough in one spot to leave clusters of tools that an archaeologist would notice today.

The earliest simple form of stone toolmaking involved breaking and shaping an angular rock by hitting it with a palm-sized round rock known as a hammerstone. Scientists refer to tools made in this way as Oldowan, after Olduvai Gorge in Tanzania, a site from which many such tools have come. The Oldowan tradition lasted for about one million years. Oldowan tools include large stones with a chopping edge, and small, sharp flakes that could be used to scrape and slice. Sometimes Oldowan toolmakers used anvil stones (flat rocks found or placed on the ground) on which hard fruits or nuts could be broken open. Chimpanzees are known to do this today.

Scientists once thought that Oldowan toolmakers intentionally produced several different types of tools. It now appears that differences in the shapes of larger tools were byproducts of detaching flakes from a variety of natural rock shapes. Learning the skill of Oldowan toolmaking required careful observation, but not necessarily instruction or language. Oldowan tools were thus simple, and their makers used them for such purposes as cutting up animal carcasses, breaking bones to obtain marrow, cleaning hides, and sharpening sticks for digging up edible roots and tubers.

Oldowan toolmakers sought out the best stones for making tools and carried them to food-processing sites. At these sites, the toolmakers would butcher carcasses and eat the meat and marrow, thus avoiding any predators that might return to a kill. This behaviour of bringing food and tools together contrasts with the eat-as-you-go strategy of feeding commonly seen in other primates.

The Acheulean toolmaking tradition, which began sometime between 1.7 million and 1.5 million years ago, produced increasingly symmetrical tools, most of which scientists refer to as hand-axes and cleavers. Acheulean toolmakers, such as Homo erectus, also worked with much larger pieces of stone than did Oldowan toolmakers. The symmetry and size of later Acheulean tools show increased planning and design, and thus probably increased intelligence, on the part of the toolmakers. The Acheulean tradition continued for more than 1.35 million years.

The next significant advances in stone toolmaking were made by at least 200,000 years ago. One of these methods, known as the prepared core technique (called Levallois in Europe), involved carefully and exactingly knocking off small flakes around one surface of a stone and then striking it from the side to produce a preformed tool blank, which could then be worked further. Within the past 40,000 years, modern humans developed the most advanced stone toolmaking techniques. The so-called prismatic-blade core technique involved removing the top from a stone, leaving a flat platform, and then breaking off multiple blades down the sides of the stone. Each blade had a triangular cross-section, giving it excellent strength. Using these blades as blanks, people made exquisitely shaped spearheads, knives, and numerous other kinds of tools. The most advanced stone tools also exhibit distinct and consistent regional differences in style, indicating a high degree of cultural diversity.

Early humans experienced dramatic shifts in their environments over time. Fossilized plant pollen and animal bones, along with the chemistry of soils and sediments, reveal much about the environmental conditions to which humans had to adapt.

By eight million years ago, the continents of the world, which move over very long periods, had come to the positions they now occupy. However, the crust of the Earth has continued to move since that time. These movements have dramatically altered landscapes around the world. Important geological changes that affected the course of human evolution include those in southern Asia that formed the Himalayan mountain chain and the Tibetan Plateau, and those in eastern Africa that formed the Great Rift Valley. The formation of major mountain ranges and valleys led to changes in wind and rainfall patterns. In many areas dry seasons became more pronounced, and in Africa conditions became generally cooler and drier.

By five million years ago, the amount of fluctuation in global climate had increased. Temperature fluctuations became quite pronounced during the Pliocene Epoch (five million to 1.6 million years ago). During this time the world entered a period of intense cooling called an ice age, which began about 2.8 million years ago. Ice ages cycle through colder phases known as glacials (times when glaciers form) and warmer phases known as interglacials (during which glaciers melt). During the Pliocene, glacial and interglacial phases each lasted about 40,000 years. The Pleistocene Epoch (1.6 million to 10,000 years ago), in contrast, had much larger and longer ice age fluctuations. For instance, beginning about 700,000 years ago, these fluctuations repeated roughly every 100,000 years.
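The cycle lengths quoted above can be turned into rough cycle counts. The following is only an illustrative back-of-the-envelope sketch using the rounded figures given in the text; the helper name `cycle_count` is invented here for the purpose:

```python
# Back-of-the-envelope cycle counts from the rounded figures above.
# These are illustrative calculations, not precise paleoclimate estimates.

def cycle_count(span_years, cycle_length_years):
    """Number of full climate cycles that fit into a span of time."""
    return span_years // cycle_length_years

# Pliocene-style cycling: a ~40,000-year glacial plus a ~40,000-year
# interglacial makes one full cycle of about 80,000 years, counted from
# the onset of the ice age (~2.8 million years ago) to the end of the
# Pliocene (~1.6 million years ago).
pliocene_cycles = cycle_count(2_800_000 - 1_600_000, 80_000)    # 15

# Late Pleistocene-style cycling: roughly one cycle per 100,000 years,
# from about 700,000 years ago to the end of the epoch (~10,000 years ago).
pleistocene_cycles = cycle_count(700_000 - 10_000, 100_000)     # 6
```

On these rounded figures, the late Pleistocene packed far fewer, but much longer and more extreme, swings into a comparable stretch of time.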

Between five million and two million years ago, a mixture of forests, woodlands, and grassy habitats covered most of Africa. Eastern Africa entered a significant drying period around 1.7 million years ago, and after one million years ago large parts of the African landscape turned to grassland. Thus the early Australopiths and early Homo lived in wooded places, whereas Homo ergaster and H. erectus lived in areas of Africa that were more open. Early human populations encountered many new and different environments when they spread beyond Africa, including colder temperatures in the Near East and bamboo forests in Southeast Asia. By about 1.4 million years ago, populations had moved into the temperate zone of northeast Asia, and by 800,000 years ago they had dispersed into the temperate latitudes of Europe. Although these first excursions to latitudes of 40° north and higher may have occurred during warm climate phases, these populations also must have encountered long seasons of cold weather.

All of these changes (dramatic shifts in the landscape, changing rainfall and drying patterns, and temperature fluctuations) posed challenges to the immediate and long-term survival of early human populations. Populations in different environments evolved different adaptations, which in part explains why more than one species existed at the same time during much of human evolution.

Some early human adaptations to new climates involved changes in physical (anatomical) form. For example, the physical adaptation of having a tall, lean body such as that of H. ergaster, with lots of skin exposed to cooling winds, would have dissipated heat very well. This adaptation probably helped the species to survive in the hotter, more open environments of Africa around 1.7 million years ago. Conversely, the short, wide bodies of the Neanderthals would have conserved heat, helping them to survive in the ice age climates of Europe and western Asia.
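The physics behind this can be sketched with a simple geometric model: a taller, leaner body has more surface area per unit of volume than a shorter, wider one, so it sheds heat faster. The cylinder model and the dimensions below are illustrative assumptions, not measurements of either species:

```python
import math

def surface_to_volume(height_m, radius_m):
    """Surface-area-to-volume ratio (1/m) of a body modelled as a cylinder."""
    area = 2 * math.pi * radius_m * height_m + 2 * math.pi * radius_m ** 2
    volume = math.pi * radius_m ** 2 * height_m
    return area / volume

# Illustrative builds: tall and lean vs. short and wide.
tall_lean = surface_to_volume(height_m=1.0, radius_m=0.12)   # H. ergaster-like
short_wide = surface_to_volume(height_m=0.8, radius_m=0.16)  # Neanderthal-like

# The lean build exposes more surface per unit of body volume, so it
# dissipates heat better; the stocky build conserves it.
assert tall_lean > short_wide
```

For a cylinder the ratio reduces to 2/r + 2/h, which makes clear that shrinking the radius (a leaner build) raises heat loss much faster than changing height does.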

Increases in the size and complexity of the brain, however, made early humans progressively better at adapting through changes in cultural behaviour. The largest of these brain-size increases occurred over the past 700,000 years, a period during which global climates and environments fluctuated dramatically. Human cultural behaviour also evolved more quickly during this period, most likely in response to the challenges of coping with unpredictable and changeable surroundings.

Humans have always adapted to their environments by adjusting their behaviour. For instance, early Australopiths moved both in the trees and on the ground, which probably helped them survive environmental fluctuations between wooded and more open habitats. Early Homo adapted by making stone tools and transporting their food over long distances, thereby increasing the variety and quantities of different foods they could eat. An expanded and flexible diet would have helped these toolmakers survive unexpected changes in their environment and food supply.

When populations of H. erectus moved into the temperate regions of Eurasia, they faced new challenges to survival. During the colder seasons they had to either move away or seek shelter, such as in caves. Some of the earliest definitive evidence of cave dwellers dates from around 800,000 years ago at the site of Atapuerca in northern Spain. This site may have been home to early H. heidelbergensis populations. H. erectus also used caves for shelter.

Eventually, early humans learned to control fire and to use it to create warmth, cook food, and protect themselves from other animals. The oldest known fire hearths date from between 450,000 and 300,000 years ago, at sites such as Bilzingsleben, Germany; Vértesszőlős, Hungary; and Zhoukoudian (Chou-k’ou-tien), China. African sites as old as 1.6 million to 1.2 million years contain burned bones and reddened sediments, but many scientists find such evidence too ambiguous to prove that humans controlled fire. Early populations in Europe and Asia may also have worn animal hides for warmth during glacial periods. The oldest known bone needles, which indicate the development of sewing and tailored clothing, date from about 30,000 to 26,000 years ago.

Behaviour relates directly to the development of the human brain, and particularly the cerebral cortex, the part of the brain that allows abstract thought, beliefs, and expression through language. Humans communicate through the use of symbols, ways of referring to things, ideas, and feelings that communicate meaning from one individual to another but that need not have any direct connection to what they identify. For instance, a word, one type of symbol, usually has no direct connection to the thing or idea it stands for; it represents that thing or idea only by abstract convention.

People can also paint abstract pictures or play pieces of music that evoke emotions or ideas, even though emotions and ideas have no form or sound. In addition, people can conceive of and believe in supernatural beings and powers, abstract concepts that symbolize real-world events such as the creation of Earth and the universe, the weather, and the healing of the sick. Thus, symbolic thought lies at the heart of three hallmarks of modern human culture: language, art, and religion.

In language, people creatively join words together in an endless variety of sentences, each with a distinct meaning, according to a set of mental rules, or grammar. Language provides the ability to communicate complex concepts. It also allows people to exchange information about both past and future events, about objects that are not present, and about complex philosophical or technical concepts.

Language gives people many adaptive advantages, including the ability to plan, to communicate the location of food or dangers to other members of a social group, and to tell stories that unify a group, such as mythologies and histories. However, words, sentences, and languages cannot be preserved like bones or tools, so the evolution of language is one of the most difficult topics to investigate through scientific study.

It appears that modern humans have an inborn instinct for language. Under normal conditions it is almost impossible for a person not to develop language, and people everywhere go through the same stages of increasing language skill at about the same ages. While people appear to have inborn genetic information for developing language, they learn specific languages based on the cultures from which they come and the experiences they have in life.

The ability of humans to have language depends on the complex structure of the modern brain, which has many interconnected, specific areas dedicated to the development and control of language. The complexity of the brain structures necessary for language suggests that it probably took a long time to evolve. While paleoanthropologists would like to know when these important parts of the brain evolved, endocasts (inside impressions) of early human skulls do not provide enough detail to show this.

Some scientists think that even the early Australopiths had some ability to understand and use symbols. Support for this view comes from studies with chimpanzees. A few chimps and other apes have been taught to use picture symbols or American Sign Language for simple communication. Nevertheless, it appears that language, as well as art and religious ritual, became vital aspects of human life only during the past 100,000 years, primarily within our own species.

Humans also express symbolic thought through many forms of art, including painting, sculpture, and music. The oldest known object of possible symbolic and artistic value dates from about 250,000 years ago and comes from the site of Berekhat Ram, Israel. Scientists have interpreted this object, a figure carved into a small piece of volcanic rock, as a representation of the outline of a female body. Only a few other possible art objects are known from between 200,000 and 50,000 years ago. These items, from western Europe and usually attributed to Neanderthals, include two simple pendants (a tooth and a bone with bored holes) and several grooved or polished fragments of tooth and bone.

Sites dating from at least 400,000 years ago contain fragments of red and black pigment. Humans might have used these pigments to decorate bodies or perishable items, such as wooden tools or clothing of animal hides, but this evidence would not have survived to today. Solid evidence of the sophisticated use of pigments for symbolic purposes, such as in religious rituals, comes only from after 40,000 years ago. From early in this period, researchers have found carefully made types of crayons used in painting and evidence that humans burned pigments to create a range of colours.

People began to create and use advanced types of symbolic objects between about 50,000 and 30,000 years ago. Much of this art appears to have been used in rituals, possibly ceremonies to ask spirit beings for a successful hunt. The archaeological record shows a tremendous blossoming of art between 30,000 and 15,000 years ago. During this period people adorned themselves with intricate jewellery of ivory, bone, and stone. They carved beautiful figurines representing animals and human forms. Many carvings, sculptures, and paintings depict stylized images of the female body. Some scientists think such female figurines represent fertility.

Early wall paintings made sophisticated use of texture and colour. The area that is now southern France contains many famous sites of such paintings. These include the caves of Chauvet, which contain art more than 30,000 years old, and Lascaux, in which paintings date from as much as 18,000 years ago. In some cases, artists painted on walls that can be reached only with special effort, such as by crawling. The act of getting to these paintings gives them a sense of mystery and ritual, as it must have to the people who originally viewed them, and archaeologists refer to some of the most extraordinary painted chambers as sanctuaries. Yet no one knows for sure what meanings these early paintings and engravings had for the people who made them.

Graves from Europe and western Asia indicate that the Neanderthals were the first humans to bury their dead. Some sites contain very shallow graves, which group or family members may have dug simply to remove corpses from sight. In other cases it appears that groups may have observed rituals of grieving for the dead or communicating with spirits. Some researchers have claimed that grave goods, such as meaty animal bones or flowers, had been placed with buried bodies, suggesting that some Neanderthal groups might have believed in an afterlife. In a large proportion of Neanderthal burials, the corpse had its legs and arms drawn in close to its chest, which could indicate a ritual burial position.

Other researchers have challenged these interpretations, however. They suggest that perhaps the Neanderthals had practical rather than religious reasons for positioning dead bodies. For instance, a body manipulated into a fetal position would need only a small hole for burial, making the job of digging a grave easier. In addition, the animal bones and flower pollen near corpses could have been deposited by accident or without religious intention.

Many scientists once thought that fossilized bones of cave bears (a now-extinct species of large bear) found in Neanderthal caves indicated that these people had what has been referred to as a cave bear cult, in which they worshipped the bears as powerful spirits. However, after careful study researchers concluded that the cave bears probably died while hibernating and that Neanderthals did not collect their bones or worship them. Considering current evidence, the case for religion among Neanderthals remains controversial.

One of the most important developments in human cultural behaviour occurred when people began to domesticate (control the breeding of) plants and animals. The advent of agriculture led to the development of dozens of staple crops (foods that form the basis of an entire diet) in temperate and tropical regions around the world. Almost the entire population of the world today depends on just four of these major crops: wheat, rice, corn, and potatoes.

The growth of farming and animal herding initiated one of the most remarkable changes ever in the relationship between humans and the natural environment. The change first began just 10,000 years ago in the Near East and has accelerated very rapidly since then. It also occurred independently in other places, including areas of Mexico, China, and South America. Since the first domestication of plants and animals, many species over large areas of the planet have come under human control. The overall number of plant and animal species has decreased, while the populations of a few species needed to support large human populations have grown immensely. In areas dominated by people, interactions between plants and animals usually fall under the control of a single species: Homo sapiens.

The rise of civilizations, the large and complex types of societies in which most people still live today, developed along with surplus food production. People of high status eventually used food surpluses as a way to pay for labour and to create alliances among groups, often against other groups. In this way, large villages could grow into city-states (urban centres and the territories they governed) and eventually empires covering vast territories. With surplus food production, many people could work exclusively in political, religious, or military positions, or in artistic and various skilled vocations. Command of food surpluses also enabled rulers to control labourers, such as in slavery. All civilizations developed based on such hierarchical divisions of status and vocation.

The earliest civilization arose more than 7,000 years ago in Sumer in what is now Iraq. Sumer grew powerful and prosperous by 5,000 years ago, when it centred on the city-state of Ur. The region containing Sumer, known as Mesopotamia, was the same area in which people had first domesticated animals and plants. Other centres of early civilization include the Nile Valley of Northeast Africa, the Indus Valley of South Asia, the Yellow River Valley of East Asia, the Oaxaca and Mexico valleys and the Yucatán region of Central America, and the Andean region of South America.

All early civilizations had some common features. Some of these included a bureaucratic political body, a military, a body of religious leadership, large urban centres, monumental buildings and other works of architecture, networks of trade, and food surpluses created through extensive systems of farming. Many early civilizations also had systems of writing, numbers and mathematics, and astronomy (with calendars); road systems; a formalized body of law; and facilities for education and the punishment of crimes. With the rise of civilizations, human evolution entered a phase vastly different from all that came before. Before this time, humans had lived in small, family-centred groups essentially exposed to and controlled by forces of nature. Several thousand years after the rise of the first civilizations, most people now live in societies of millions of unrelated people, all separated from the natural environment by houses, buildings, automobiles, and numerous other inventions and technologies. Culture will continue to evolve quickly and in unforeseen directions, and these changes will, in turn, influence the physical evolution of Homo sapiens and any other human species to come.

Evolutionary ethics attempts to base ethical reasoning on presumed facts about evolution. The movement is particularly associated with Herbert Spencer. Given the premise that later elements in an evolutionary path are better than earlier ones, the application of the principle then requires seeing Western society, laissez-faire capitalism, or another object of approval as more evolved than more ‘primitive’ social forms. Neither the premise nor the application commands much respect. The version of evolutionary ethics called ‘social Darwinism’ emphasised the struggle for natural selection, and drew the conclusion that we should glorify and assist such struggles, usually by enhancing competitive and aggressive relations between people in society, or between societies themselves.
More recently, philosophers have rethought the relations between evolution and ethics in the light of biological discoveries concerning altruism and kin selection.

Sociobiology is the academic discipline best known through the work of Edward O. Wilson, who coined the term in his Sociobiology: The New Synthesis (1975). The approach to human behaviour is based on the premise that all social behaviour has a biological basis, and it seeks to understand that basis in terms of genetic encoding for features that are themselves selected for through evolutionary history. The philosophical problem is one of methodology: finding criteria for identifying features that can usefully be explained in this way, and for assessing the various genetic stories that might provide such explanations. Among the features proposed for this kind of explanation are male dominance, male promiscuity versus female fidelity, propensities to sympathy and other emotions, and the limited altruism characteristic of human beings. The approach has been accused of ignoring the influence of environmental and social factors in moulding people’s characteristics (at the limit of silliness, by postulating a ‘gene for poverty’). However, there is no need for the approach to commit such errors, since the feature explained sociobiologically may be relative to the environment: for instance, it may be a propensity to develop some feature in some social or other environment, or even a propensity to develop propensities. The real problem is to separate genuine explanations from speculative stories, which may or may not identify real selective mechanisms.

According to a popular view, scientists are unbiased observers who use the scientific method to conclusively confirm and falsify various theories. These experts supposedly have no preconceptions in gathering data and logically derive theories from objective observations. One great strength of science, on this view, is that it is self-correcting, because scientists readily abandon theories once they are shown to be flawed. Although many people have accepted this view of science, it is almost completely untrue. Data can neither conclusively confirm nor conclusively falsify theories, there really is no such thing as the scientific method, data are often subjective in practice, and scientists have displayed surprisingly fierce loyalty to their theories. There have been many misconceptions about what science is and what it is not.

Science is the systematic study of anything that can be examined, tested, and verified. The word science is derived from the Latin verb scire, meaning ‘to know.’ From its beginnings, science has developed into one of the greatest and most influential fields of human endeavour. Today different branches of science investigate almost anything that can be observed or detected, and science as a whole shapes the way we understand the universe, our planet, ourselves, and other living things.

Science develops through objective analysis, instead of through personal belief. Knowledge gained in science accumulates as time goes by, building on the work that has gone before. Some of this knowledge, such as our understanding of numbers, stretches back to the time of ancient civilizations, when scientific thought first began. Other scientific knowledge, such as our understanding of genes that cause cancer or of quarks (the smallest known building blocks of matter), dates back less than 50 years. However, in all fields of science, old or new, researchers use the same systematic approach, known as the scientific method, to add to what is known.

During scientific investigations, scientists put together and compare new discoveries and existing knowledge. Commonly, new discoveries extend what is currently accepted, providing further evidence that existing ideas are correct. For example, in 1676 the English physicist Robert Hooke discovered that elastic objects, such as metal springs, stretch in proportion to the force that acts on them. Despite all the advances made in physics since 1676, this simple law still holds true.
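Hooke's proportionality is usually written F = kx, where k is a stiffness constant for the particular spring. A minimal sketch of the relationship follows; the spring constant and forces used here are illustrative values, not measured ones:

```python
# Hooke's law: extension x is proportional to the applied force F (F = k * x).

def extension_m(force_n, k_n_per_m=200.0):
    """Extension (metres) of an ideal spring with stiffness k under force F."""
    return force_n / k_n_per_m

# Doubling the force doubles the extension -- the proportionality Hooke found.
x1 = extension_m(10.0)  # 0.05 m
x2 = extension_m(20.0)  # 0.10 m
assert abs(x2 - 2 * x1) < 1e-12
```

Real springs obey this linear law only up to their elastic limit; beyond that, the material deforms permanently and the proportionality breaks down.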

Scientists use existing knowledge in new scientific investigations to predict how things will behave. For example, a scientist who knows the exact dimensions of a lens can predict how the lens will focus a beam of light. In the same way, by knowing the exact makeup and properties of two chemicals, a researcher can predict what will happen when they combine. Sometimes scientific predictions go much further by describing objects or events that are not yet known. An outstanding instance occurred in 1869, when the Russian chemist Dmitry Mendeleyev drew up a periodic table of the elements arranged to illustrate patterns of recurring chemical and physical properties. Mendeleyev used this table to predict the existence and describe the properties of several elements unknown in his day, and when those elements were discovered over the years that followed, his predictions proved correct.

In science, experiments that show current ideas to be wrong can be just as important as those that confirm them. A classic case of this occurred early in the 20th century, when the German geologist Alfred Wegener suggested that the continents had once been connected, a theory known as continental drift. At the time, most geologists discounted Wegener's ideas, because they believed the Earth's crust to be fixed. However, following the development of plate tectonics in the 1960s, in which scientists found that the Earth's crust is made of moving plates, continental drift became an important part of geology.

Through advances like these, scientific knowledge is constantly added to and refined. As a result, science gives us an ever more detailed insight into the way the world around us works.

For a large part of recorded history, science had little bearing on people's everyday lives. Scientific knowledge was gathered for its own sake, and it had few practical applications. However, with the dawn of the Industrial Revolution in the 18th century, this rapidly changed. Today, science affects the way we live, largely through technology-the use of scientific knowledge for practical purposes.

Some forms of technology have become so well established that forgetting the great scientific achievements that they represent is easy. The refrigerator, for example, owes its existence to a discovery that liquids take in energy when they evaporate, a phenomenon known as latent heat. The principle of latent heat was first exploited in a practical way in 1876, and the refrigerator has played a major role in maintaining public health ever since. The first automobile, dating from the 1880s, used many advances in physics and engineering, including reliable ways of generating high-voltage sparks, while the first computers emerged in the 1940s from simultaneous advances in electronics and mathematics.

Other fields of science also play an important role in the things we use or consume every day. Research in food technology has created new ways of preserving and flavouring what we eat. Research in industrial chemistry has created a vast range of plastics and other synthetic materials, which have thousands of uses in the home and in industry. Synthetic materials are easily formed into complex shapes and can be used to make machine, electrical, and automotive parts, scientific and industrial instruments, decorative objects, containers, and many other items. Alongside these achievements, science has also produced technology that helps save human life. The kidney dialysis machine enables many people to survive kidney diseases that would once have proved fatal, and artificial valves allow sufferers of coronary heart disease to return to active living. Biochemical research is responsible for the antibiotics and vaccinations that protect us from infectious diseases, and for a wide range of other drugs used to combat specific health problems. As a result, the majority of people on the planet now live longer and healthier lives than ever before.

However, scientific discoveries can also have a negative impact on human affairs. Over the last hundred years, some technological advances that make life easier or more enjoyable have proved to have unwanted and often unexpected long-term effects. Industrial and agricultural chemicals pollute the global environment, even in places as remote as Antarctica, and city air is contaminated by toxic gases from vehicle exhausts. The increasing pace of innovation means that products become rapidly obsolete, adding to a rising tide of waste. Most significantly of all, the burning of fossil fuels such as coal, oil, and natural gas releases into the atmosphere carbon dioxide and other substances known as greenhouse gases. These gases have altered the composition of the entire atmosphere, producing global warming and the prospect of major climate change in years to come.

Science has also been used to develop technology that raises complex ethical questions. This is particularly true in the fields of biology and medicine. Research involving genetic engineering, cloning, and in vitro fertilization gives scientists the unprecedented power to create new life, or to devise new forms of living things. At the other extreme, science can also generate technology that is deliberately designed to harm or to kill. The fruits of this research include chemical and biological weapons, and nuclear weapons, by far the most destructive weapons that the world has ever known.

Scientific research can be divided into basic science, also known as pure science, and applied science. In basic science, scientists working primarily at academic institutions pursue research simply to satisfy the thirst for knowledge. In applied science, scientists at industrial corporations conduct research to achieve some kind of practical or profitable gain.

In practice, the division between basic and applied science is not always clear-cut. This is because discoveries that initially seem to have no practical use often develop one as time goes by. For example, superconductivity, the ability to conduct electricity with no resistance, was little more than a laboratory curiosity when Dutch physicist Heike Kamerlingh Onnes discovered it in 1911. Today superconducting electromagnets are used in a number of important applications, from diagnostic medical equipment to powerful particle accelerators.

Scientists study the origin of the solar system by analysing meteorites and collecting data from satellites and space probes. They search for the secrets of life processes by observing the activity of individual molecules in living cells. They observe the patterns of human relationships in the customs of aboriginal tribes. In each of these varied investigations the questions asked and the means employed to find answers are different. All the inquiries, however, share a common approach to problem solving known as the scientific method. Scientists may work alone or they may collaborate with other scientists. Always, a scientist’s work must measure up to the standards of the scientific community. Scientists submit their findings to science forums, such as science journals and conferences, to subject the findings to the scrutiny of their peers.

Whatever the aim of their work, scientists use the same underlying steps to organize their research: (1) they make detailed observations about objects or processes, either as they occur in nature or as they take place during experiments; (2) they collect and analyse the information observed; and (3) they formulate a hypothesis that explains the behaviour of the phenomena observed.

A scientist begins an investigation by observing an object or an activity. Observations typically involve one or more of the human senses: hearing, sight, smell, taste, and touch. Scientists typically use tools to aid in their observations. For example, a microscope helps view objects too small to be seen with the unaided human eye, while a telescope helps view objects too far away to be seen by the unaided eye.

Scientists typically apply their observation skills in an experiment. An experiment is any kind of trial that enables scientists to control and change at will the conditions under which events occur. It can be something extremely simple, such as heating a solid to see when it melts, or something far more complex, such as bouncing a radio signal off the surface of a distant planet. Scientists typically repeat experiments, sometimes many times, in order to be sure that the results were not affected by unforeseen factors.
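As a rough illustration of why repetition matters, the sketch below simulates a simple experiment many times and summarizes the results. The melting-point value and noise level are invented for the example; the point is that averaging repeated trials reveals the underlying value despite random fluctuation in any single measurement.

```python
import random
import statistics

def measure_melting_point(true_value=327.5, noise=0.8):
    """Simulate one noisy measurement of a melting point (illustrative numbers)."""
    return true_value + random.gauss(0, noise)

random.seed(42)  # fixed seed so the run is repeatable

# Repeat the "experiment" 100 times and summarize the results.
trials = [measure_melting_point() for _ in range(100)]
mean = statistics.mean(trials)
spread = statistics.stdev(trials)
print(f"mean = {mean:.2f}, stdev = {spread:.2f}")
```

Any single trial can miss the true value by a full degree or more, but the mean of a hundred trials lands much closer, which is exactly the protection against chance effects that repetition provides.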

Most experiments involve real objects in the physical world, such as electric circuits, chemical compounds, or living organisms. However, with the rapid progress in electronics, computer simulations can now carry out some experiments instead. If they are carefully constructed, these simulations or models can accurately predict how real objects will behave.

One advantage of a simulation is that it allows experiments to be conducted without any risks. Another is that it can alter the apparent passage of time, speeding up or slowing natural processes. This enables scientists to investigate things that happen very gradually, such as evolution in simple organisms, or ones that happen almost instantaneously, such as collisions or explosions.
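A minimal sketch of the time-compression idea described above: a logistic growth model (a standard textbook formula, not taken from the original text) steps a hypothetical bacterial population forward one generation at a time, so that dozens of generations unfold in a fraction of a second of computer time. All numbers are illustrative.

```python
def simulate_growth(population, rate, capacity, generations):
    """Step a logistic growth model forward, one generation per iteration."""
    history = [population]
    for _ in range(generations):
        # Growth slows as the population approaches the carrying capacity.
        population += rate * population * (1 - population / capacity)
        history.append(population)
    return history

history = simulate_growth(population=10.0, rate=0.4,
                          capacity=10_000.0, generations=50)
print(f"start: {history[0]:.0f}, end: {history[-1]:.0f}")
```

Fifty simulated generations, which might take days in a real culture, run instantly, and the same model could just as easily be slowed down and examined one step at a time.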

During an experiment, scientists typically make measurements and collect results as they work. This information, known as data, can take many forms. Data may be a set of numbers, such as daily measurements of the temperature in a particular location, or a description of side effects in an animal that has been given an experimental drug. Scientists typically use computers to arrange data in ways that make the information easier to understand and analyse. Data may be arranged into a diagram such as a graph that shows how one quantity (body temperature, for instance) varies in relation to another quantity (days since starting a drug treatment). A scientist flying in a helicopter may collect information about the location of a migrating herd of elephants in Africa during different seasons of a year. The data collected may be in the form of geographic coordinates that can be plotted on a map to provide the position of the elephant herd at any given time during a year.

Scientists use mathematics to analyse the data and help them interpret their results. The types of mathematics used include statistics, which is the analysis of numerical data, and probability, which calculates the likelihood that any particular event will occur.
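Both kinds of analysis can be sketched in a few lines of Python. The temperature readings below are invented, and the probability example (the chance of rolling at least one six in four throws of a fair die) is a standard textbook calculation rather than anything from the original text.

```python
import statistics

# Statistics: summarize hypothetical daily temperature readings (deg C).
temps = [21.5, 22.1, 19.8, 23.4, 22.0, 20.7, 21.9]
mean = statistics.mean(temps)
stdev = statistics.stdev(temps)
print(f"mean = {mean:.2f} C, stdev = {stdev:.2f} C")

# Probability: chance of at least one six in four rolls of a fair die,
# computed as 1 minus the chance of rolling no six four times in a row.
p_at_least_one_six = 1 - (5 / 6) ** 4
print(f"P(at least one six in 4 rolls) = {p_at_least_one_six:.3f}")
```

The mean and standard deviation condense a week of readings into two numbers, while the probability calculation turns a question about chance into simple arithmetic.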

Once an experiment has been carried out and the data collected and analysed, scientists look for whatever patterns their results produce and try to formulate a hypothesis that explains all the facts observed in the experiment. In developing a hypothesis, scientists employ methods of induction to generalize from the experiment’s results to predict future outcomes, and deduction to infer new facts from experimental results.
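One concrete way to see induction and deduction at work is an ordinary least-squares line fit, sketched below with invented spring-extension data. Generalizing from the five measured points to a linear model is the inductive step; using that model to predict an extension that was never measured is the deductive one.

```python
# Hypothetical data: extension of a spring (cm) under increasing load (g).
loads = [100, 200, 300, 400, 500]
extensions = [2.1, 3.9, 6.2, 8.0, 10.1]

# Induction: fit a straight line y = slope * x + intercept by least squares.
n = len(loads)
mean_x = sum(loads) / n
mean_y = sum(extensions) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(loads, extensions))
         / sum((x - mean_x) ** 2 for x in loads))
intercept = mean_y - slope * mean_x

# Deduction: use the fitted model to predict an unmeasured outcome.
predicted = slope * 600 + intercept
print(f"predicted extension at 600 g = {predicted:.1f} cm")
```

If a later measurement at 600 g disagrees badly with the prediction, the hypothesis (a linear relationship) has to be revised, which is exactly how prediction tests a model.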

Formulating a hypothesis may be difficult for scientists because there may not be enough information provided by a single experiment, or the experiment’s conclusion may not fit old theories. Sometimes scientists do not have any prior idea of a hypothesis before they start their investigations, but often scientists start out with a working hypothesis that will be proved or disproved by the results of the experiment. Scientific hypotheses can be useful, just as hunches and intuition can be useful in everyday life. Still, they can also be problematic because they tempt scientists, either deliberately or unconsciously, to favour data that support their ideas. Scientists generally take great care to avoid bias, but it remains an ever-present threat. Throughout the history of science, numerous researchers have fallen into this trap, either because they perceived a promise of self-advancement or because they firmly believed their ideas to be true.

If a hypothesis is borne out by repeated experiments, it becomes a theory: an explanation that seems to fit the facts consistently. The ability to predict new facts or events is a key test of a scientific theory. In the 17th century German astronomer Johannes Kepler proposed three theories concerning the motions of planets. Kepler’s theories of planetary orbits were confirmed when they were used to predict the future paths of the planets. On the other hand, when theories fail to provide suitable predictions, these failures may suggest new experiments and new explanations that may lead to new discoveries. For instance, in 1928 British microbiologist Frederick Griffith discovered that the genes of dead virulent bacteria could transform harmless bacteria into virulent ones. The prevailing theory at the time was that genes were made of proteins. Nevertheless, studies conducted by Canadian-born American bacteriologist Oswald Avery and colleagues in the 1930s repeatedly showed that the transforming gene was active even in bacteria from which protein was removed. The failure to prove that genes were composed of proteins spurred Avery to construct different experiments, and by 1944 Avery and his colleagues had found that genes were composed of deoxyribonucleic acid (DNA), not proteins.

If other scientists do not have access to scientific results, the research might as well not have been carried out at all. Scientists need to share the results and conclusions of their work so that other scientists can debate the implications of the work and use it to spur new research. Scientists communicate their results with other scientists by publishing them in science journals and by networking with other scientists to discuss findings and debate issues.

In science, publication follows a formal procedure that has set rules of its own. Scientists describe research in a scientific paper, which explains the methods used, the data collected, and the conclusions that can be drawn. In theory, the paper should be detailed enough to enable any other scientist to repeat the research so that the findings can be independently checked.

Scientific papers usually begin with a brief summary, or abstract, that describes the findings that follow. Abstracts enable scientists to consult papers quickly, without having to read them in full. At the end of most papers is a list of citations: bibliographic references that acknowledge earlier work that has been drawn on in the course of the research. Citations enable readers to work backwards through a chain of research advancements to verify that each step is soundly based.

Scientists typically submit their papers to the editorial board of a journal specializing in a particular field of research. Before the paper is accepted for publication, the editorial board sends it out for peer review. During this procedure a panel of experts, or referees, assesses the paper, judging whether or not the research has been carried out in a fully scientific manner. If the referees are satisfied, publication goes ahead. If they have reservations, some of the research may have to be repeated, but if they identify serious flaws, the entire paper may be rejected outright.

The peer-review process plays a critical role because it ensures high standards of scientific method. However, it can be a contentious area, as it allows subjective views to become involved. Because scientists are human, they cannot avoid developing personal opinions about the value of each other’s work. Furthermore, because referees tend to be senior figures, they may be less than welcoming to new or unorthodox ideas.

Once a paper has been accepted and published, it becomes part of the vast and ever-expanding body of scientific knowledge. In the early days of science, new research was always published in printed form, but today scientific information spreads by many different means. Most major journals are now available via the Internet (a network of linked computers), which makes them quickly accessible to scientists all over the world.

When new research is published, it often acts as a springboard for further work. Its impact can then be gauged by seeing how often the published research appears as a cited work. Major scientific breakthroughs are cited thousands of times a year, but at the other extreme, obscure pieces of research may be cited rarely or not at all. However, citation is not always a reliable guide to the value of scientific work. Sometimes a piece of research will go largely unnoticed, only to be rediscovered in subsequent years. Such was the case for the work on genes done by American geneticist Barbara McClintock during the 1940s. McClintock discovered a new phenomenon in corn cells known as ‘transposable genes’, sometimes referred to as jumping genes. McClintock observed that a gene could move from one chromosome to another, where it would break the second chromosome at a particular site, insert itself there, and influence the function of an adjacent gene. Her work was largely ignored until the 1960s, when scientists found that transposable genes were a primary means for transferring genetic material in bacteria and more complex organisms. McClintock was awarded the 1983 Nobel Prize in physiology or medicine for her work on transposable genes, more than thirty-five years after doing the research.

In addition to publications, scientists form associations with other scientists from particular fields. Many scientific organizations arrange conferences that bring together scientists to share new ideas. At these conferences, scientists present research papers and discuss their implications. In addition, science organizations promote the work of their members by publishing newsletters and Web sites; networking with journalists at newspapers, magazines, and television stations to help them understand new findings; and lobbying lawmakers to promote government funding for research.

The oldest surviving science organization is the Accademia dei Lincei, in Italy, which was established in 1603. The same century also saw the inauguration of the Royal Society of London, founded in 1662, and the Académie des Sciences de Paris, founded in 1666. American scientific societies date back to the 18th century, when American scientist and diplomat Benjamin Franklin founded a philosophical club in 1727. In 1743 this organization became the American Philosophical Society, which still exists today.
