This is the first chapter from Dr. Qazi Ashraf’s forthcoming book, Religion: A Brief Story from Big Bang to Baghdad, which examines God, religion and history. It is the story of humankind’s long walk on the edge of uncertainty and of the invention of religion.
Religion – Its Divine Origin
The Universe is big. Very big. And old. Roughly 14 billion years. In comparison, our planet Earth is young. Only 4.5 billion years old. For the best part of those 4.5 billion years, Earth was undergoing Evolution. The conditions were harsh. Only rudimentary life forms could exist. The complex Life that Earth is teeming with evolved late, hardly a few hundred million years ago. We humans appeared later still, barely some 100,000–200,000 years ago. Finally, some 70,000 years ago, the first modern humans started migrating out of Africa, their homeland, to peregrinate far and wide as hunter-gatherers and foragers. It took our ancestors more than 50,000 years of living as hunter-gatherers before they finally stumbled on agriculture. Only some 12,000 years ago did humans begin to abandon the hunter-gatherer lifestyle and settle in agricultural communities. That is to say, human civilization is quite a recent phenomenon when we consider how long Earth has already existed. Civilization is a costly affair and, as such, requires adequate cultural, technological, and societal innovation to get going. By the time modern humans started laying the foundations of their civilization, they already possessed the required package of inventions and technologies. One such invention crucial to their success was religion.
Hunter-gatherer communities were comparatively small but well-knit, consisting of extended yet intimately related family groups. Cooperation came easy. In the settled villages, on the contrary, there could well be unrelated strangers living side by side, which was possible only when reasonably high levels of trust and cooperation existed among different groups and individuals. In the book The Origin of (almost) Everything, Graham Lawton writes, “In evolutionary biology, trust and cooperation are usually explained in one of the two ways: kin helping each other, and reciprocal altruism, ‘you scratch my back, and I’ll scratch yours.’ But neither of these easily explains cooperation among large groups of unrelated humans. With ever greater chances of encountering strangers, opportunities for cooperation among the kin decline. Reciprocal altruism also stops paying off.”
Here religion comes in handy, bonding strangers into shared communities that grow larger with time, surpassing neighboring ones that lack such social glue. The bigger communities expand and spread, carrying their religion – and, of course, their civilizational tools and technologies – to new places. It turns out, contrary to the commonly held notion, that religion rather than agriculture created the conditions for village settlements and larger societies, which in turn made the invention of agriculture necessary to feed the people.
When exactly religion appeared, or who created it in the first place, is hard to establish with certainty. The most plausible explanation seems to be that, evolutionarily, humans are “born believers,” naturally tipped toward God and religion. Religiously inclined people have no difficulty accepting that God sent religion from time to time to guide the progeny of Adam – the first man created by God. “Evolution,” Graham Lawton observes, “has endowed us with a default assumption that everything in our environment is caused by a sentient being.” We are more comfortable assuming that God created everything, religion included. That is a better bet, as the 17th-century French mathematician Blaise Pascal put it in his famous philosophical argument known as “Pascal’s wager.” Most religious people assume that humankind simply couldn’t have invented religion. Yet the evidence points more toward religion being one of the countless products of human cognition than of divine origin.
Human cognition was never a fixed thing. It evolved, like many other variables, as life and life forms attained structural complexity over eons. The evolution and perfection of cognition sparked the invention of religion, just as it did language, art, culture, agriculture, the wheel, and so on. Typically, when we talk of Evolution, we mean Darwinian Evolution. However, Evolution can also be thought of in a broader sense, in which case it has touched everything and put before us the vast Universe with all its diversity and awe-inspiring complexity that fascinates our imagination. How did this all happen? That’s a fascinating story in itself.
The Beginning
In the beginning, there was no beginning. In the no-beginning, there was nothing. In the nothing, there became something. That something was small, infinitesimally small, way smaller than the period at the end of this sentence (roughly 10⁻³³ centimeters across). That is another way of saying: Take 1 gram of a substance. Divide it into ten billion, billion, billion, billion parts. Take one part out of this heap of nothings. That’s your Universe – you can call it “Something” – “something out of nothing.” It appeared about 13.8 billion years ago, 10⁻⁴³ seconds after Time Zero, in an event we call the “Big Bang,” though there was neither a Bang nor anything Big. Rather, “Big Bang” was a term casually blurted out by Fred Hoyle in a radio talk to ridicule his opponents, particularly Georges Lemaître, the Belgian priest and physicist who toyed with the idea of a beginning of the universe. The name simply stuck.
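To spell out the arithmetic in that analogy – reading “ten billion, billion, billion, billion” literally as 10 × 10⁹ × 10⁹ × 10⁹ × 10⁹, which is my reading rather than the author’s explicit figure – the division works out to:

\[
10 \times \left(10^{9}\right)^{4} = 10^{37}
\qquad\Longrightarrow\qquad
\frac{1\ \text{gram}}{10^{37}\ \text{parts}} = 10^{-37}\ \text{grams per part}.
\]

One such part is the vanishingly small “something” the analogy gestures at.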
Another term often used in relation to the universe is “Singularity.” It is the small, dense point of distorted gravity, time, space, and energy, all packed together. Because of its ridiculously small size, the Universe of Time Zero is called the “Big Bang singularity.” The Big Bang roughly coincides with Time Zero from the point of view of physics. In mathematics speak, though, Time Zero represents a transition point – between real and imaginary times – the moment at which time began flowing. From the Big Bang onward, our “real” time flows forward (the arrow of time points ahead), and imaginary time flows backward (the arrow of time points back). At Time Zero, everything was held within the Singularity. Within a fraction of a second, just 3 × 10⁻⁴³ seconds, the Universe started expanding at breakneck speed. That was Inflation. Soon inflation ended (at about 10⁻³³ seconds), leaving the Universe to expand at a much slower rate. That created a universe much smoother than before but far from uniform – with tiny variations and quantum fluctuations. In broad terms, that is to say, specks of matter arose here and there due to the interplay of gravity, quantum effects, and thermodynamics. After the expansion of the Universe slowed down, the clumps and specks of matter were stretched further, as if in a matrix of space-time, colliding to form blobs. From these clumps, specks, and blobs, stars, planets, and galaxies came into being.
Some blobs came to contain more matter (at that time, only hydrogen and helium), and gravity clumped them into bigger objects – big enough that nuclear fusion in their cores made them shining stars. Nuclear fusion happens at extremely high temperatures (around 10 million degrees Celsius) within the core of a star. At these temperatures, hydrogen nuclei get crushed together to form helium nuclei, releasing energy that we see as light. The more and the faster the fusion happens, the brighter the star, and the hotter its core becomes. In these ferociously hot cores, further nuclear fusion creates a chain of reactions in which helium fuses into carbon, out of which neon, oxygen, silicon, and sulfur are born. Silicon and sulfur fuse to make iron. Here the chain stops. Iron doesn’t fuse further at these temperatures. As a result, the core becomes dense. Gravity squeezes the dense core further, causing tremendous compression of matter. The compression produces heat, raising the temperature still further. Eventually, the core collapses under the colossal heat, and the star explodes into space as a supernova, throwing its debris everywhere – practically a flood of neutrons – which condenses at the lower temperatures of the surrounding space and forms heavy elements. These flying elements get incorporated into other stars and planets, including Earth. Over billions of years of these supernova bombardments, Earth got all the component matter needed to get things going. But before Earth could form, other things needed to happen.
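For readers who want the nuclear bookkeeping behind “hydrogen nuclei get crushed to form helium nuclei,” the standard textbook summary of hydrogen burning in a Sun-like core – not a formula given in the text above, but the conventional net reaction – is:

\[
4\,{}^{1}\mathrm{H} \;\longrightarrow\; {}^{4}\mathrm{He} \;+\; 2e^{+} \;+\; 2\nu_{e} \;+\; \text{energy}\ (\approx 26.7\ \text{MeV}),
\]

with the released energy eventually escaping as the starlight described above.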
Let me repeat. During the first billion years after the Big Bang, massive concentrations of matter formed. We call them galaxies. There were a hundred billion of them. Nine billion years on, our own galaxy, the Milky Way, was well established, and the universe had expanded to a diameter close to 100 billion light years. Hydrogen and helium continued to gather as clouds of matter in galactic space. Under gravity, the condensed matter collapsed, generating heat and energy, raising the temperature further, and allowing thermonuclear fusion to ignite and proceed steadily. It was in that brewing confusion that our Sun, a big and scorching star, shot into existence. The Sun sucked in the surrounding matter, leaving behind a part of the matter-cloud – stardust – that gravity shaped into a disc orbiting this newly born star. Within the orbiting disc, the stardust coagulated into ever larger clumps and blobs, ultimately shaping them into planets revolving around the Sun. Depending on their distance from the Sun, the planets evolved into rocky (for example, Earth), gaseous, and icy types.
Which brings us to our Solar System. It began its formal existence 4.56 billion years ago. Earth was nowhere then. Half of Earth got assembled only in the next 10 million years, and over the following 20 million years it solidified and settled into orbit around the Sun. No sooner had that happened than a giant meteorite smashed into Earth, depositing layer upon layer of molten rock on its surface. Over the next nearly half a billion years, many more meteorite bombardments followed. That is to say, in its initial years – the Hadean (hellish) eon – Earth was going through hell. After emerging from this hellish eon, Earth was no longer a ball of molten rock, yet it was nothing like today. It was still in the process of being shaped into the future Earth. The next half billion years were comparatively better for this still-primitive Earth. There was water, interspersed here and there with volcanic eruptions, as the first continents began to take shape. They were not the continents we see today but simply rocky deposits of volcanic granite. Slowly, Earth was stepping into the next eon, the Archean. The water on Earth came from different sources across the solar system, delivered in bombardments of icy comets and asteroids. Once water appeared, Life was bound to start. And start it did, for the first time around 4.1 billion years ago. That is what the chemical signatures of organisms in zircon crystals have recently revealed. However, four billion years ago Earth still wasn’t a hospitable place for sustaining Life. It didn’t even have an atmosphere, let alone one suitable for complex life.
Rock, Water and CO2
How did life, after all, come to occupy every bit of this planet? We don’t know every detail, but we know a lot of them. The story of life in the Universe, as John Gribbin, for instance, tells us, is an example of “surface complexity built upon foundations of deep simplicity.” We know the four most common reactive elements in the Universe are carbon, hydrogen, oxygen, and nitrogen. The same four elements overwhelmingly contribute to the composition of living things on Earth. Carbon plays a key role in life because of its rich chemistry. It can combine with as many as four other atoms at a time, forming the rings and chains of organic compounds.
Recent spectroscopic analysis of interstellar material has revealed that it contains many carbon compounds, such as methane, carbon dioxide, formaldehyde, ethyl alcohol, and even the amino acid glycine. This eye-opening discovery points to the fact that some or all of this interstellar material is likely to have been present in the matter from which our Solar System was born roughly five billion years ago. The implications of this discovery for understanding the origin of life are humongous. Put simply, amino acids formed over very long periods of time in the depths of space and were brought down to the surface of young planets like Earth. And when conditions on such a planet are just right, they get the opportunity to organize themselves into living systems.
Long story short, as Graham Lawton concludes in his best-selling book, “the emergence of life is an almost inevitable consequence of interaction of three essential ingredients of the planetary system: rock, seawater, and carbon dioxide.” He explains that rocks in the seabed developed fissures due to underlying pressure, forming alkaline hydrothermal vents. Through these vents, CO2 and hydrogen seeped out. They reacted with seawater and minerals, creating complex organic compounds like sugars, amino acids, and even ribonucleic acid (RNA) – a basic building block of life. RNA can act both as an enzyme, the way a protein does, and as a self-replicating store of information – the essential prerequisites for qualifying as a living substance. The energy needed by RNA to replicate was supplied by the proton gradient at the mouth of the hydrothermal vent. “One of the best pieces of evidence,” Lawton says, “for this crucial step in evolution is that living cells are still powered by the proton gradient across cell membranes.” Water, it turns out, occupies the central place in the creation and evolution of life. It may seem like a coincidence, but the truth is that the ancient philosophers rightly intuited that life came from water, though they found themselves at a loss to explain how that could happen.
The Greek philosopher Thales of Miletus (in today’s Turkey), born around 624 BCE, was the first to claim, and even try to prove – unsuccessfully though – that all life came from water. His view was upheld by greats like Aristotle and others who followed him, making it the mainstream creation myth. Over time, this Greek myth became the celebrated narrative for more than two thousand years, finding its way into influential religious scriptures like the Bible and the Quran. Aristotle also formalized a model of the Universe based on Anaximander’s views, wherein Earth was the center of the Universe, totally unmoving – in line, too, with Parmenides, another Greek philosopher, who postulated the “static universe” that even Einstein feared to transgress – while the Sun, the Moon, and the stars revolved in the sky along predetermined, fixed paths. According to Aristotle, the Sun, the Moon, and the stars were nested within invisible crystal spheres and moved along set trajectories; the Sun was the source of light, and the Moon, the stars, and the planets were lit by reflected light. Gravity pulled everything toward Earth, and levity acted as a counterforce that pulled things away. He surmised that heavy objects would fly off into space if Earth were not the center of the Universe. And to keep everything in place – that is, spheres within spheres – a force greater than levity and gravity powered the whole Universe. He called this force the prime mover. It was this prime mover that gave life to everything there was. In the mind of a religious person, that prime mover is God.
Which brings us back to how Life, after all, began in the first place. For creationists, it is simple – God wanted it. He did it. Period. Religion consistently upholds the infallible view that God created everything. Simultaneously, though, the sacred scriptures don’t shy away from subscribing to the Thalesian premise that all life came from water – a clear vindication of the influence of Greek thought on the everyday life of ancient and medieval people. Rather than rejecting Thales’s views, or giving credit to Thales where it belonged, the scriptures carried his views as if revealed by God – in a way endorsing Greek thought as accepted folk wisdom. In science, contrary to religion, nothing is infallible. Any law, theory, or premise that is not corroborated by evidence is mercilessly chopped off and thrown away.
The origin question (of the Universe and of life) is fascinating and has given many a brilliant mind a headache. Many conjectures, hypotheses, and theories have come and gone. The question stands where it was, even though many of its details have been untangled conclusively. It is hoped that the creation puzzle will be conclusively cracked in the near future, maybe in the coming decades. Mathematicians, physicists, and astronomers are teaming up with biologists, geologists, paleoanthropologists, and others to solve the riddle of the creation of life. It remains to be seen how religion will face the new challenge posed by 21st-century science. Good luck to it.
Back to the Big Bang: with it, 13.8 billion years ago, “Deep History” not only happened but also left an imprint. Where did it leave its imprint? Nobody knows where exactly, but it is somewhere here – actually everywhere – in space, planets, stars, galaxies, black holes, Earth, the atmosphere, and so on. We may not be in a position to know, locate, or read every page of deep history yet, but that doesn’t mean it can’t be done in future. In fact, the day doesn’t seem too far off when we can perhaps experimentally demonstrate the exact chronology of events that shaped deep history. Perhaps it should come as no surprise, then, that geologists have meticulously pieced together a reasonably magisterial account of the history of Earth from the geological archive – rocks, fossils, water, ice, gases, and so on. Even sewer gases have a story to tell the geologist. Given that the geological archive comprises, practically speaking, tidbits and half-rotten, half-decayed leftovers scattered in the corners and crevices of the deeper layers of Earth’s crust, it is commendable that the science of geology has been able to connect the dots and put forward a robust body of verifiable data.
Geological Archive and Life
As mentioned a page or two earlier, Earth is a relatively young entrant into the drama of history, formed barely 4.5 billion years ago. In geology speak, Earth’s history spans four Eons – Hadean, Archean, Proterozoic, and Phanerozoic. This division into Eons rests on a geological principle called the “principle of faunal succession,” coined by an eighteenth-century British surveyor and geologist, William Smith. He was the first to note a peculiarly consistent ordering of fossils in the sections of rock he and others studied. When “something new,” say, a fossil, “appears in the geologic archive, it marks a new phase of history,” Smith observed. Using faunal succession alongside modern dating techniques, Earth’s history can be divided into definite slices: Eons, Eras, Periods, Epochs, and Ages. The Hadean (hellish) Eon was the dark, dead, and burning phase. The next Eon – the Archean – commenced when the first Life appeared, some five hundred million years after the formation of Earth (i.e., four billion years ago). Archean Life was essentially single-celled; from it the prokaryotes (the bacteria and the archaea) evolved. Prokaryotes have no separate nucleus and survive on chemical energy derived from the depths of the ocean, or on sunlight. It is a simple but recognizable form of life.
The Archean Eon lasted for about a billion years. Toward its end, another new event took place. Cyanobacteria, or blue-green algae, emerged on the scene through an evolutionary jump, heralding the third, or Proterozoic, Eon. These Proterozoic organisms (the cyanobacteria) possessed an improved set of tools for trapping sunlight and converting it into energy for their living. That is to say, they could do Photosynthesis to meet their nutrient demands. A billion years of natural selection had pushed these organisms a step ahead of their ancestors, the prokaryotes. Unfortunately for them, Photosynthesis came at a cost. It produced Oxygen as a by-product. Oxygen turned out to be a highly toxic waste, and so much of it accumulated that the Earth became saturated, the excess flowing into the atmosphere. This is known as the Great Oxidation Event. The huge quantities of Oxygen manufactured through cyanobacterial photosynthesis poisoned the Earth and its primitive atmosphere so thoroughly that almost all Life was wiped out.
With Oxygen everywhere and practically no carbon dioxide (CO2) left to cause a greenhouse effect, the temperature on Earth fell steeply. But that wasn’t all. There was more to come. Oxygen combined with methane, rarefying the primitive atmosphere to the extent that no scope for a greenhouse effect was left. The atmospheric temperature plunged further, and the Earth headed into a catastrophe – an Ice Age – that lasted half a billion years. All the simple life that had barely survived the initial Great Oxidation Event came to a standstill. Photosynthesis completely stopped. Now, at last, CO2 began accumulating. Slowly. After millions of years, when CO2 had gathered in sufficient quantities to cause a greenhouse effect, the atmospheric temperature rose a bit – just enough to melt the ice. As the ice melted, life rebounded. Once again, an Oxidation Event took place. Thankfully, by this time enough ozone had accumulated to establish a protective layer that helped maintain a greenhouse effect, mitigating the precipitous fall in atmospheric temperature. As a result, the second Oxidation Event couldn’t prove as catastrophic as the first. The ozone-induced greenhouse effect kept the atmospheric temperature stable enough for Life to continue and evolve. That allowed the atmosphere, over the next three billion years, to slowly attain the composition of gases we have today.
Had Evolution not invented a mechanism called respiration to utilize Oxygen, the Earth would probably have been left barren and frozen into a snowball. Respiration permanently changed the entire equation of Life on Earth. Life, in turn, created a favorable crust, atmosphere, and ecology for more Life to follow.
Now, with respiration in place and under favorable circumstances, oxygen became an excellent energy source for a new set of life forms – the Eukaryotes – that evolved to utilize this toxic gas. That was the Proterozoic Eon. Eukaryotes were definitely ahead of prokaryotes. Natural Selection gave them a better toolkit called the “Nucleus,” which housed chromosomes – the information reservoir of the cell and the organism. Chromosomes carry the genetic material, DNA, which holds the genes that code for proteins. The information contained in the chromosomes was transmitted to the next generation through a process called cell division. It is a complex, energy-requiring process. Eukaryotes had to manufacture enough energy to carry on cell division and the passage of information to the next generation. How could they do that? Natural Selection came to their rescue. They had evolved enough to retain, through endosymbiosis, prokaryotes – their ancestors – as powerhouses called mitochondria. These powerhouses manufacture energy by utilizing Oxygen from the environment. The manufactured energy (stored as adenosine triphosphate, ATP) is used for running life’s complex tasks.
While all this Evolution of life was happening, it was impacting Earth, which was also changing – its crust was becoming cooler and more rigid in consistency, able to accommodate more and more new forms of life.
Nearly four billion years into Earth’s age, an event called the Cambrian Explosion occurred. With it, Earth’s history transitioned from the Proterozoic into the last Eon, the Phanerozoic (we live in this Eon, which is to say, it is barely 500 million years old). In this Eon, multicellular organisms evolved. A remarkably diverse array of life forms – insects, worms, corals, marine and terrestrial animals, and plants – successively appeared on the scene, exhibiting a relatively quick transformation in the geological archive, one that took merely 70 to 80 million years to happen – a blip on the geologic time scale.
What is Life?
How did complex life forms evolve from simple one-celled life? It turns out that the Great Oxidation Event changed the entire equation by pushing Oxygen levels across a critical point. That is to say, when an environment with abundant Oxygen became available, Evolution drove the once simple Life to adapt to this toxic gas and utilize it for some good purpose, i.e., energy production. Once that happened – and given the abundance of Oxygen in the environment – it became cheaper to manufacture big energy using Oxygen. Big energy could then be easily and efficiently deployed for more complex functions – like processing and passing information to the next generation, which requires humongous amounts of energy in the form of ATP. As information could be passed on from generation to generation, the scope for errors, mutations, and genetic mistakes, as well as for improvements, selections, and adaptations, increased. The evolutionary process of Natural Selection facilitated the appearance of new and improved life forms. Slowly, many entirely new life forms and species sprang up following millions of years of reproductive crossover and hybridization.
After all, what is Life? Two of the world’s most influential minds, Simon Lewis and Mark Maslin, answer: Life is simply a collection of entities that undergo Evolution, grow and reproduce, passing copies of information to the next generation. The colossal energy requirement for information processing, transcribing, and transferring could only be met with adequate amounts of Oxygen.
Which brings me back to the Phanerozoic Eon. We live in the Cenozoic Era of this Eon, which began some 66 million years ago. The Paleozoic (roughly 290 million years) and the Mesozoic (roughly 185 million years) Eras of the current Eon (the Phanerozoic) saw many upheavals and transformations. During the Paleozoic Era, only ancient and simple life forms thrived. Climatic upheavals from volcanic eruptions, tectonic movements, meteorite strikes, methane release, and increased carbon dioxide killed much of life. Whatever survived the relentless and merciless assault of natural phenomena had to adapt to changed conditions.
Life then entered the Mesozoic Era, which saw the appearance of dinosaurs and other reptiles – hugely complex and sturdy life forms. That is the era when rifts in the Earth deepened, continents and smaller land masses began drifting apart, and a gigantic meteorite hit modern-day Mexico, triggering horrendous volcanic eruptions and eliminating much of the life from the Earth’s surface. The dinosaurs became completely extinct.
Over the sixty-six million years that followed, Life slowly crept back and caught real pace in the third and final era – the current, or Cenozoic, Era, also called the Era of Mammals. With the dinosaurs gone, life saw unexpected diversity (after the enormous trauma and loss): mammals and flowering plants sprang up everywhere to fill and feed on the vastness of the Earth.
The geological record of Life now shows the Age of Fish (the Devonian), followed by the Age of Reptiles, and then by the Age of Mammals, up to its culmination – the Age of Humans, or the God species!
Which brings me, meanderingly, back to History. The planet Earth holds within its layers of dust and debris the scattered pages of the Book of Life, which keep the secret of how History happened over eons, eras, and ages. Nowadays, History is being revealed as never before. The revelations are happening fast, aided by scientific advances. The accumulating data, which is huge, has the very real potential of endangering the whole edifice of knowledge and understanding that humans have so far held infallible.
Consider the Genome-wide Ancient DNA Project. Its revelations about human ancestry, racial make-up, and migrations have rattled far-right nationalists and populist groups who hold on tenaciously to their unfounded, mythical, and outrageously supremacist narratives. It turns out the human species emerged in Africa and spread in all directions thenceforth. DNA study is enabling detailed reconstruction of deep relationships amongst ancient human populations. “Human genome project has surpassed the traditional toolkit of archeology in what it can reveal of changes in human populations in the deep past,” writes David Reich, founder of the ancient DNA laboratory at Harvard, in his book Who We Are and How We Got Here. The DNA bombshell has upended many myths, half-truths, and pseudoscience, and is expected to uproot many more unfounded yet celebrated narratives in decades to come. Will the revelations of hard science ultimately lead to rethinking, redefining, or even permanently shifting the paradigm for the better? Or will they lead to ultimate frustration, chaos, and misunderstanding, thus paving the way for unseen violence? It can be both. Perhaps.
The second outcome would not be surprising, if it happens, given that violence has been a companion of history ever since life took root. One may disagree with Thomas Hobbes’s premise that “violence is deeply ingrained in human nature”; nonetheless, he does have a point. How far his conclusions are relevant in our postmodern, post-human world is a matter of debate.
Take the Harvard psychologist Steven Pinker. In his book Enlightenment Now, he brilliantly demonstrates – elaborating on Hans Rosling’s work, published in book form as Factfulness – that, barring a few unpleasant events of late on the world scene, tremendous good has touched human life, especially over the last seven decades. Pinker took the world almost by surprise when he enumerated how much improvement humanity has seen in nearly every aspect, from reason, science, and humanism to progress. The indices for violence, disease, and death have improved remarkably. Violence, though, hasn’t wholly vanished (Hobbes can’t be trounced). But then, that’s how our world works. Tides may turn, anytime. To avoid the unwanted fallouts of escalated violence, humankind has no option other than to shift the paradigm. The truth is that a world without violence is unimaginable. What we can try, though, is to keep violence at an acceptably low level.
First Humans Arrive on the Scene
With the commencement of the Phanerozoic Eon half a billion years ago, higher-order living organisms – plants and animals – began to appear on the Earth. Their appearance was made possible by the preceding four billion years of Natural Selection, which slowly and steadily organized elementary chemical molecules along the evolutionary path. The crowning moment of this evolutionary process was the emergence of the hominins. When exactly humans (Homo sapiens) appeared along this trajectory is hard to say with certainty, but rough estimates fall between 300,000 and 200,000 years ago.
Ten million years ago – i.e., during the Miocene Epoch of the Cenozoic Era in which we currently live – severe climate change hit Africa, Asia, and parts of Europe, almost wiping out the ape populations in these regions. Only one African ape lineage survived the catastrophe. Two million years later, the gorilla lineage emerged. Another two million years down the line (i.e., around six million years ago), an ape lineage called the Last Common Ancestor (LCA) diverged from the parent ape family. It was this LCA that, according to evolutionary biological evidence, gave rise to modern humans and chimpanzees. Since Homo sapiens (modern humans) share biological, genetic, and socio-ecological traits with the apes, they are grouped with them in the “great ape family.” Surprisingly, and perhaps ironically, humans are much closer genetically to the chimpanzee than to the gorilla or the orangutan.
Biologists call the great ape family, which includes humans, the hominid family; the members of the specifically human branch of that family are called hominins.
Around 1.8 million years ago, Homo erectus appeared on the scene and continued to exist until about sixty thousand years ago – the longest-surviving hominin, and one that dominated the landscape. These bipedal apes (they walked on their hind legs) spread out of Africa into Eurasia. They possessed bigger brains and used primitive stone tools. Around 60,000 years ago, they vanished. Probably – but not necessarily – they were eliminated by the archaic humans who had emerged in Africa around 1.3 million years after them.
The first archaic humans to show up on the scene are called Homo heidelbergensis. They possessed bigger brains than Homo erectus and made better tools, and thus could easily outmaneuver the latter. H. heidelbergensis didn’t stay in Africa for long. They too migrated out of Africa into Europe, where they evolved into Homo neanderthalensis (or simply the Neanderthals), adapting to life in the cold climate and high-latitude habitats of Europe during the glacial epoch, or Ice Age (the Pleistocene, the sixth epoch of the Cenozoic Era). The Ice Age lasted from 2.6 million years ago to 11,700 years ago. The epoch in which we now live is sometimes called the seventh, or Holocene, epoch of the Cenozoic Era. It began 11,700 years ago, when the climate became warmer, leading to an “interglacial period.”
The Neanderthals were bodily short and stout. Their body habitus helped them minimize heat loss in the cold climates of Europe, to the extent that these archaic humans could comfortably roam almost all of Europe and Eurasia, hunting and foraging in open lands and the wild. In all, they inhabited Europe for 250,000 years – a pretty good chunk of time on the evolutionary time scale – during which no rivals could stand up to their might.
But then history, as we know, takes no sides. It respects power. The Neanderthals were mighty creatures, but their power and might paled before the brain power of the new entrants onto the scene. Somewhere between 300,000 and 200,000 years ago, another transformation took place in the hominid family, in the same African landscape where earlier hominins had appeared. This time, archaic humans evolved into Homo sapiens, the anatomically modern humans (AMH). Little did the Neanderthals know that the new entrants, the AMH, would not only redefine power but reset the power equation to their own liking.
These newcomers had better and bigger brains (especially the frontal lobes) but less robust, relatively hairless bodies compared with the archaic humans. There were not a lot of them to begin with. Quite few, in fact. The genetic evidence (mitochondrial DNA analysis) tells us that a meager number of females – some 5,000 in all – contributed to the origin, or the gene pool, of the entire human population. The first humans, a few thousand in number, perhaps enjoyed life in the vastness of Northeast Africa, feasting on the abundance of nature and increasing their population exponentially. The resulting population explosion ultimately compelled some to seek food sources outside the African jungles and plains.
Finally, around 70,000 years ago, some four thousand modern humans (AMH) moved out of Africa and passed along the shore of the Red Sea to find themselves in what is now the Middle East. They couldn’t take a northern route to Europe, probably because of fear of the Neanderthals out there. The southern route brought them to the Arabian Peninsula, which they made their home. It was from this “second home” along the south coast of Arabia that subsequent batches of migrants spread out to finally reach India (55,000 years ago), Europe (40,000 years ago), America (15,000 years ago), and Australia. Wave upon wave, the migration continued out of Arabia until the rest of the world was populated. Like it or not, DNA analysis traces our origin from Africa to Arabia. We are migrants at our core – descended from bedouins, if you like. The environment shaped our color and phenotype in such a way that we forgot our origins and divided ourselves into proud races and nationalities, perpetually at war with one another.
Walkers, Makers, and Thinkers
If the Israeli historian Yuval Noah Harari is to be believed, it was primarily humans who displaced the Neanderthals, the undisputed masters of Europe – forcibly interbreeding with some, slaughtering others, and eventually pushing the rest to the brink – thereby leading to the Neanderthals’ extinction. All the blame rests on the shoulders of Europeans’ ancestors: they are guilty, and their progeny, the modern-day Europeans, should collectively seek atonement for their ancestors’ sins. However, to the chagrin of Harari and his like, there is not a shred of evidence, archeological or otherwise, to support this claim – except, of course, the argument made popular by H. G. Wells in his famous 1922 book, A Short History of the World. Wells’s argument caught on and refused to die down, even though the world of knowledge has transformed exponentially since then. The discipline of Geology progressed, and so did Science, at breakneck speed. The undeniable truth is that by 28,000 BCE, the Neanderthals had become completely extinct. Not a single soul was alive. All they left were traces of their story in the archives of Earth, which we dug out some 30,000 years later in the Neander Valley – hence the name Neanderthals. A chunk of their genes also remained well preserved in the gene pool of the immigrant hordes – the modern Europeans – who usurped their habitat. Nature has its way of keeping memories alive.
How did humans – bodily less robust, starting at the bottom of the food chain, and requiring at least 20 years of protective care to develop into productive adults – jump to the top of the food chain? How could they displace – or, if we believe Harari’s story, ethnically cleanse – the 250,000-year-old masters of Europe and go on to control the other continents as well?
The four crucial achievements of Natural Selection that made this bare-bodied, helpless AMH species master of the Earth are bipedalism, tools, brain size, and culture.
Only humans exhibit strict bipedalism. All other apes are essentially “knuckle-walkers,” though they can walk upright for short distances. They climb trees and heights deftly, using all four limbs in a coordinated manner, but they are not the good long-distance runners that humans are. Knuckle-walking doesn’t give them access to distant food sources as quickly as bipedalism (walking upright over long distances) gives walkers. Besides making humans efficient runners, bipedalism significantly increased their horizontal and vertical fields of vision, allowing them to survey larger areas for food and screen wider spaces for the presence of predators. Yet bipedalism alone wasn’t enough for AMH to score the decisive victory over their hominin and animal competitors.
Tools supplemented bipedalism. Together they helped AMH fare well in the struggle for survival. Tools no doubt helped modern humans kill and cut better, but archaic humans also used tools. Even crows and chimpanzees can make primitive tools, yet tool-making didn’t make the crows kings, not even kings of the birds. And chimpanzees – they are where they were millions of years ago. Subsisters. Not masters. Arguably, humans made better tools than the archaics, the primates, and the crows. True. But that raises the question: how could humans make better tools from the same raw material available in abundance to archaic humans and others as well? We will come to this question, but first this: Homo erectus was adept at long-distance running, hunting, and using good stone tools. They roamed the Earth for 2 million years. Quite a time! They even domesticated fire, obtaining it from wildfires and then maintaining it and using it for cooking, to get a better caloric supply from their food. Yet, despite being good toolmakers and efficient eaters, they never rose to the apex. Eventually, they succumbed to the pressure of natural selection.
The significant evolutionary transformation that tipped the balance in the hominins’ favor (and ultimately in AMH’s) was their progressively increasing brain size over the six million years since their appearance. First, Homo habilis saw a significant increase in brain size (as compared with other animals), followed by the archaic humans. Finally, a rapid increase occurred in the brains of the AMH (anatomically modern humans). The curious thing about this brain growth was that it was not a mere linear increase in overall size; instead, it was differential. Some specific brain areas grew more than others, which made all the difference. In terms of overall brain size, Homo erectus and Homo neanderthalensis don’t differ strikingly from Homo sapiens (AMH), yet when you consider the frontal lobes of the brain – yes. That changes everything.
The Neanderthals’ occipital lobes (the back portion of the brain) were better developed than humans’. Their skull took on a peculiar shape to accommodate the extra-sized lobes. The well-developed occipital lobes of the Neanderthal brain were a terrific evolutionary adaptation for handling the poor visibility caused by low sunlight levels at the high latitudes of Europe – a clear advantage in evolutionary terms. Yet this advantage came at a price. The Neanderthal frontal brain remained smaller. In comparison, the AMH brain is endowed with a more considerable frontal portion at the cost of other areas. As a consequence, eyesight, smell, and hearing are somewhat compromised – but “thinking”? You are right, no. Thinking is beyond compromise.
The bigger and better frontal lobes allowed AMH, and still do, to engage in abstract thinking. They could do things within their minds. This power of abstract thought and imagination, called “mentalization,” is what let “cumulative culture” (the storage, transmission, and expansion of knowledge) happen.
The historian Yuval Harari emphasizes that because humans could cooperate, they could quickly jump to the top of the food chain and become apex predators. However, that’s only half of the story. Many animals cooperate: they hunt in groups, they maintain animal communities, and some even play politics, as the primatologist Jane Goodall conclusively documented in chimpanzees and other primates. What all these lower animals, primates, apes, and archaic humans couldn’t do, though, was establish a “cumulative culture.” One of the things that prevented them from doing so was a less evolved frontal brain coupled with higher levels of circulating testosterone (the male sex hormone), which makes them exquisitely prone to “reactive violence.”
Humans produce comparatively lower amounts of testosterone, facilitating social tolerance, living in larger groups, and establishing a cumulative culture. The cumulative culture, in turn, permits valuable cooperation. Cumulative culture became possible when humans possessed the requisite evolutionary brain hardware for imagining, simulating, planning, drawing up future strategies, and even assessing possible outcomes – all in their minds. The well-developed frontal brain made that possible. Mentalizing allowed them to view cooperation as a logical necessity rather than an instinctive reaction, and to direct targeted violence at others in a systematic and organized manner. Cooperation alone, devoid of conscious awareness, didn’t help animals rise up the ladder from subsisters to masters; neither could it have helped humans. Logical cooperation and cumulative culture did. And culture is a remarkable product of “thought.”
Cumulative culture and Sacred History
With cumulative culture already in place, some 70,000 years ago the first batch of humans set out to emigrate from the African Rift Valley in search of food and fodder. Eventually, humans came to settle in all corners of the Earth. That first batch would hardly have thought that this small step of theirs would prove a giant leap, one that would take human culture to great heights of complexity. With this small step of the great migration, the seeds of “sacred history”* were sown far and wide.
What exactly led to the making and shaping of sacred history? Was it the result of a “quantum leap” of consciousness – a cognitive revolution? And if so, was this cognitive quantum leap a sudden occurrence or a protracted process – a consequence of cumulative knowledge gained through successive periods of historical time, coupled with a slow process of accumulating beneficial “variations” (as Darwin would have it in the evolution of life) over millions of years?
By the way, “cognitive revolution” is a catchy phrase. It has created a lot of buzz in academia, enthusing the humanities disciplines (e.g., history, archeology, philosophy, anthropology) to study their fields more or less like the biological sciences. With the increased availability of modern research tools, these disciplines are becoming increasingly technology-dependent. Unfortunately, the quick and unexpected proliferation of technology has a shady side. The more communication technology proliferates, the more the educated class transmogrifies into a purely professional labor class. Glamor and quick wealth acquisition have created a widespread public indifference toward scientific inquiry. Some quarters, under the garb of nationalism, compel science to conform to their politico-religious agendas. In the age of Google, WhatsApp, and Facebook, pseudoscience is spreading fast, even taking over hard science. That’s a matter of concern.
*(The dictionary definition of sacred history is history retold to instill religious faith, which may or may not be founded on fact. Throughout this book, however, the phrase “sacred history” is used to mean history shaped directly or indirectly by religious impulses.)
Back to the cognitive revolution. It tremendously helped the human species come out the winner in the struggle for survival. With well-developed frontal lobes, the human brain was better positioned to register, learn, remember, and accumulate information about the environment, and to respond to it accordingly. Humans were weaklings in the ruthless world of apes, predators, and the other big-bodied animals of Africa. They could either leave things to the mercy of the environment and lose the battle for survival, or use the cunning of their minds to devise ingenious defensive and offensive strategies to hold their ground. They chose the latter. It was a do-or-die situation. Humans handled a world of ruthless competition and constant danger lurking on all sides in an innovative manner, in contrast to other creatures, who remained wedded to instinctual behavior – fight or flee. Imagine – by the way, only humans can imagine things – our ancestors living in open spaces, deserts, or forests, with a limited defensive arsenal in their possession, perhaps only a primitive set of tools made of wood, flint, or stone. They would have been on their toes, perpetually fearful of a surrounding environment full of ruthless, unkind, cruel, and dangerous predators, skulking carnivores, and other humanoids. For a moment, slide back mentally into those “good old” historical times and ask yourself: what kind of life did early humans live?
For them, surviving meant struggling against the seen and the unseen, the known and the unknown. It was a struggle against the darkness of the night and the light of the day – a never-ending, perpetual battle against everything and everybody. The nighttime in particular increased manifold the odds of being suddenly charged by predators from any corner. It would simply be a matter of luck – tremendous luck – to live to see another day. The fear of death and the uncertainty of life loomed from all sides. And it was precisely this perpetual fear, coupled with this uncertainty, that ultimately led to the invention of religion and the creation of what we call the sacred history of humankind. Who else could invent abstract things if not a frontal-brained, thinking species like Homo sapiens?
The invention and subsequent evolution of sacred history has undoubtedly been a long-drawn-out, stepwise process. The “which led to what” cascade of steps is often difficult to trace in sacred history precisely because its deep past has left behind an indistinct, nay an imperceptible, wake. Identifying the exact sequence of events, and reconstructing the whole story to trace the birth and evolution of sacred history, is hard to do, given how difficult it is to decipher the deep history. If future technology helps us probe the deep history deeper than we can at the moment, that will be the ultimate feather in the cap of human intellect – a gratifying moment, given the complexity of the task.
With “genuine history,” things are comparatively more straightforward than with sacred history, though not entirely unproblematic. The leftover evidence of genuine history is more abundant and ubiquitous. That, however, doesn’t mean that interpreting this data is easy and straightforward. The mute remains – rocks, artifacts, and layers of dead tissue and fossils – are a “read-only” script that requires smart deciphering first and accurate corroboration and correlation next. That is hard. There are empty spaces, gaps, and deficiencies in the archeological and geological archives. Gaps need to be filled. And it is during this process of filling in the blanks that trouble arises. History becomes infested with bias. It turns out, to borrow from Will Durant, that most history is guessing, and the rest is prejudice.
History-telling is story-telling. Stories replace facts, and fiction becomes faith. Not always, but quite often. Hard history is as good a science as biology. Yet, as Will Durant observed in The Lessons of History, history is unfortunately usually “beclouded by ambivalent evidence and biased historians, and perhaps distorted by our own patriotic or religious partisanship.” By nature, humans love stories – stories that stir emotion. The more the stories tickle emotion, the more they power the narrative. Eventually, history is transformed into a living philosophy – not a text of biology – that stirs the passions of the human species. In truth, story, emotion, and narrative are extrinsic to history, and ideally should not bemire it. Yet they get laid over the facts. The mute rocks and fossils are – well, mute. When history-makers and history-writers add these extrinsic characteristics to the fossilized past, history becomes laden with all types of biases, some intentional and some unintentional. Deep history is more problematic than the recorded history of the recent past. The gaps in the sources leave vast scope for being overfilled or underfilled, depending on which direction the wind is blowing.
Nonetheless, there is no dearth of hard data and evidence on deep history that can tumble many a myth down to rubble. The same doesn’t hold for sacred history. Sacred history is more challenging to handle objectively in view of the paucity of hard data and fossilized evidence. Sacred history (religion) has mostly survived and propagated through the deep past as oral tradition, leaving behind hardly any fossils. All the same, it is pretty intriguing and, well, exciting to explore sacred history despite the limitations it poses. Proper exploration aids introspection.
For one thing, sacred history is cognition-dependent and draws much more of its strength from human imagination than from factual evidence. Cognition, as we know, is directly proportional to the size and architectural complexity of the brain (particularly the frontal lobes). Well-developed frontal lobes allow us to engage in abstract thinking, or imagining – a unique feature that differentiates Homo sapiens from other species – and therein lies the secret of our success as a species. Yet the undeniable scientific fact is that our big frontal lobes bring with them a subtle yet significant pitfall, which we almost invariably ignore: at the fundamental level, our brain doesn’t distinguish between the real and the imaginary – a fallacy that has cost us dearly in blood and flesh, as we shall discuss later.
Our brain can be thought of as hardware and the mind as software. The brain holds pictures and words as memory files, and the mind, like computer software, plays out these pictures and words on our mental screen as successive movies, of a kind. These movies we call thoughts. We can have thoughts about real things or imaginary ones. It doesn’t matter to our brain when it comes to reacting to such thoughts. We look, say, at our favourite food, and our mouth waters. Or we merely think about our favourite dish, and our mouth waters. Two types of stimuli bring about an identical result. Fundamentally, our brain ignores the difference between the real and the imaginary. It reacts to thoughts (picture-and-word movies). The repetitive playing of thoughts, as we follow and carry them out, creates patterns in us – our habits, beliefs, perspectives, and paradigms. These patterns then play out on autopilot – once set in place, the software automatically generates the same sequences repeatedly. That is how, over time, beliefs get established and firmly entrenched. In one of my previous books, Open Secret – A giant Leap to success, prosperity, and peace, I dealt with belief formation in detail. Beliefs don’t need to be realistic. In fact, they can be far removed from reality. Religious beliefs belong to this category. Some religious belief patterns can be outright dangerous.
Sense of Safety – A Key to Success
Our story is interesting. We are bodily weak. A newborn is entirely helpless; it needs constant care and protection for years, in contrast to the offspring of most other species, which are largely formed and functional at birth. As infants and children, we are powerless creatures. We require a long learning period to become capable and useful adults. And still, in brute nature, we as a species survived the hardships, calamities, and merciless disasters that made many living species extinct. Life in modern times has become far smoother and easier to live. That was not always the case. We have come a long way, and our perspective is so transformed that most of us have a hard time imagining how difficult life was for our ancestors. But that was that.
What we are today is the result of the cumulative legacy of thousands of years of sacrifice and forbearance by our ancestors through the ruthless circumstances of the past. We carry in our genes and blood the imprint of that struggle – the fear, the uncertainty, the anxiety – that our ancestors went through in an environment of “kill or get killed.” They perpetually found themselves at the edge of uncertainty. They lived in a state of heightened awareness of their surroundings. A tiger, a cobra, or a pack of hyenas could be creeping through the nearby bushes; one misstep and you ended up as lunch for a hungry predator. The nighttime in particular would have been a horrifying nightmare to live through in the midst of dangerous predators.
Living in a precarious situation is a trial for the weak. When it is impossible to fight off the enemy, the weak invent novel methods to ensure their safety and a “sense of safety.” The “state of safety” and the “sense of safety” are two different things. The former is a real, temporal phenomenon implying the relative absence of threat. The sense of safety, in contrast, is merely a mental construct with no temporal existence – not in the brain, the mind, the body, or the surrounding environment. It is purely a template, or perhaps a mental trick, not even a belief. Only the human brain is adept at performing such tricks: creating “something” out of nothing.
Ironically, the sense of safety, never mind its falsehood, was one of the driving engines of our success. Our brain routinely short-circuits the mental cascade by ignoring many irrelevant or relatively unimportant elements of the stimuli it receives during the course of the day. That is not to say that the human brain only seeks out relevant and potentially beneficial things all the time. Not at all. Many seemingly irrelevant and irrational things have served our species well along our historical journey to success. To adapt better to the changing circumstances of the struggle for survival, we simply couldn’t afford to chase after only strictly relevant and rational things. That wouldn’t be very smart. Foolishness pays when circumstances demand.
A hundred millennia ago, surviving in the unkind environment of the African forests was quite an ordeal. In that ruthless world, anything went, leaving little room for toying with concepts and theories to dissect what was relevant from what was not. The choices were limited. Security was supreme, and so was food. Whatever came to hand was devoured. No hygiene, no safe drinking water, no hand-washing mattered. And how could they, when humans had just set foot on the Earth, not knowing much about the environment, let alone doing Science?
What else could the first humans do if not eat plants and roots or devour insects and worms before they learned to hunt big game? And what better options did these weaklings have than breaking open dead animals’ bones and eating the marrow? Some researchers believe that breaking animal bones to eat the marrow was the original niche skill of humans. That may seem preposterous (and an extraordinarily irrelevant or foolish thing to do by today’s standards), but humans were not always as powerful or sophisticated as they are today. For thousands of years, they struggled to survive in a self-centric, individualistic environment in which groups were hostile to one another. That is not in the least surprising. Individualism is the rule rather than the exception in biological systems, and humans are no different. Only after early humans learned to live in groups and assemblies was strict individualism overshadowed by collectivism. Living in groups imparted to them a sense of safety. Belonging to a group became a matter of utmost importance and prestige – the bigger the group, the better the chances of survival and the better the sense of safety.
Big Brain to Big Groups
A spark ignited the fire of transformation: it dawned on early humans that instead of waiting for days for the leftovers of carnivores, they could themselves hunt successfully in groups, like lions, hyenas, and other predators. Hunting in coordinated groups was more profitable than stalking the big carnivores for their leftovers. A small behavioral change, it turned out, returned big dividends. For thousands of years, humans then hunted first small game and, over time, bigger and bigger game. Simultaneously, they learned to defend themselves better against stronger predators.
As humans learned to form bigger and bigger coordinated groups – some members focusing on strategy, some on attack, and others on logistics – the tide turned in their favor. In groups and assemblies, they proved a formidable force to reckon with, and big-game hunting became a regular affair, now posing a threat even to the predators. Fairly quickly, then, our species jumped from the bottom to the middle of the food chain around 100,000 years ago and began moving toward the apex by 50,000 years ago. Monkeys, gorillas, chimpanzees, dogs, buffaloes, sheep, and chickens also live in groups and assemblies, and lions, tigers, and hyenas hunt in packs. All of them live in communities or animal societies. But they stayed where they were on the food-chain ladder. How could humans (AMH) accomplish a “quantum leap” to the apex of the food chain? The truth is that there has been nothing like a leap or jump. It has been a slow and laborious process stretched over millennia. However, to fathom how this so-called “quantum leap” occurred, we need to look at the structure, formation, and maintenance of animal and human assemblies or communities.
Which brings us back, for a moment, to the brain. Brain imaging studies (fMRI and PET scans) show that the brain’s frontal lobes are directly responsible for how complex a species’ behavior can be and how big a social group a particular species can maintain. In other words, social behavior puts a limit on group size. And since social behavior is governed by the frontal brain, the bigger the frontal brain of a species, the bigger the groups it can maintain. In short, group size is an index of a species’ cognitive capacity, which underpins its relationship and adaptation to the environment.
Imaging studies have demonstrated that the volume of the orbitofrontal cortex (a part of the frontal lobe just behind the eyes) determines mentation, or abstract thinking – a prerequisite for maintaining a big group. Group size hinges on a species’ ability to control impulses – a function directly controlled by the frontal lobe. In our case, with well-developed frontal brains, it takes twenty to twenty-five years to become adept at handling the complexities of social behavior. And yet the learning process continues till late in life.
Primates and apes can maintain average group sizes of about 55 and 100 members, respectively, beyond which the group becomes unstable. In their case, all those falling outside the group are enemies – real enemies who deserve to be killed. Compare that with human groups. The smallest closely knit, functional human group comprises about 150 members, which can include relatives and unrelated friends held together as a community. People outside the 150-member group are regarded as others, not enemies, contrary to what is seen in primate populations. In times of need, these others can be approached for cooperation and help. In the case of non-humans, the cognitive limit imposed by the frontal lobe makes it impossible for them even to conceive of anything like systematic, broader, purpose-driven cooperation with those outside the group.
So a big brain is good. But why can’t non-humans afford brains as good as ours? The answer is that a good brain comes at an equally steep price: it demands considerable energy. Energy is costly. Evolution hasn’t endowed all life forms with the capacity to handle the extra costs that a big brain brings. How did Natural Selection solve the problem of extra costs for the big-brained hominins and for us? The solution lay in the gut (the intestine).
Typically, both the gut and the brain consume a great deal of energy. To handle the energy demands, one of these two organs needed to strike a compromise. To accommodate the big brain, the knife of Evolution (Natural Selection) fell on the gut. It was cut to size to save energy for the brain. But a smaller gut would naturally bring down the overall food and nutrient supply, and hence the energy supply to the rest of the body. What about that?
To handle the nutrition problem, the hominins had two options: either switch over to a different source of energy supply or come up with a better way to extract nutrients, particularly Niacin (vitamin B3), from the existing sources of food. They finally settled on the second option, and social evolution came to their rescue, albeit over a period stretching across thousands of centuries. Niacin is essential for proper brain development and growth. Meat is a rich source of Niacin, but it is hard to digest raw; it needs to be cooked to extract enough Niacin from it. AMH (our species) mastered the use of fire and learned cooking. Cooked foods provided more digestible fats, proteins, and Niacin, which obviated the need for a long gut. Cooking made it easy for Evolution to cut the intestine in favor of the brain.
Archaic humans learned – not mastered, unlike AMH – the use of fire around 1.8 million years ago, i.e., long before AMH (our species) had even evolved. But there is no evidence that the archaic humans were using fire for cooking; regular cooking is no older than 400,000 years. These archaic humans (H. ergaster/erectus and early Heidelbergs), despite being more highly evolved and sophisticated than contemporary apes, primates, and other mammals, were severely handicapped when it came to handling fire systematically and purposefully. It seems, however, that the later Heidelbergs (in contrast to the early ones) did a lot of cooking and even organized communal eating. That might have increased social bonding, group size (approximately 100–110 members – a reasonably good size), and cooperation among them. Neanderthal community sizes were more or less the same as those of the late Heidelbergs.
The early AMH also started out with small community sizes. As their population grew, they significantly impacted the mega-fauna, archaic humans included. In contrast to archaics and primates, AMH managed relatively easily to widen the circle of interaction (a minimum group size of 150 members) and to organize better, more concerted offense and defense strategies against enemies, predators, and competitors.
For some archaics, fire, cooking, communal eating, social bonding, kinship within the group, and other activities (big-game hunting and ambush hunting in the case of Neanderthals) were important means of increasing the group size to even more than 110 members, but that still couldn’t match AMH. None of them could come up with a civilization. None of them could control and master the environment as humans did. What did AMH do differently with their bigger brains that archaic humans couldn’t? AMH combined fire with tools like language, stories, and religion – a deadly combination that catapulted them to the apex of the food-chain ladder. They created history, both genuine and sacred. No non-human could match them. The big frontal lobes of the human brain changed everything. Abstract thinking fueled their power.
Talking out the Thoughts
Language, one of our most sophisticated tools, made us a “God species.” In his book The Unfolding of Language, Guy Deutscher notes: “Language is mankind’s greatest invention – except, of course, it was never invented.” He lists it at the top of the hierarchy of special tools. What is so remarkable about language that it merits the top slot in that hierarchy? At a glance, nothing! But a slightly deeper look reveals a sea of mysteries around this thing called language. For one, a stark question stares us in the face: how did human language originate and evolve? Whence did the faculty of language come to humans? We don’t have hard answers. We have speculations. Which leaves much to be desired.
The fantastic thing about language is its simplicity. There is hardly a score and a half of sounds, noises, mumblings, or splutters. That is all there is to language. And every animal, at least among mammalian species, possesses them. Yet when we humans put these sounds and noises in order, guess what? We only surprise ourselves. These meaningless noises achieve the unachievable: talk, gossip, songs, communication, the translation of thoughts – from banal signs of dissatisfaction to mind-boggling epic stories about the unseen realms of the Universe. The awesome thing about human language is that even non-sounds – the pauses and silences – are indispensable. Placed in just the right spot, they add to the beauty of the construction; without them, the bland tapestry of words could never be transformed into living expression, richly sprinkled with the colors of emotion. That is what language does every minute of the day!
We humans, in a sense, are lucky to enjoy nature’s generosity. The language faculty we are endowed with has had tremendously far-reaching consequences for our civilizational development. Language is undoubtedly a marvel of Evolution – if you are like me, raised in a deeply religious society, you may have trouble accepting this – traceable to a bewildering accumulation of small changes, variations, and improvements over a very long time scale. Religion, too, has been quite mystified by this remarkable human capacity for language. It is surprising, perhaps intriguing, that almost all the Holy scriptures except the Bible have avoided a direct discussion of language.
The Bible has gone on record to explain the variations and differences among the spoken languages. Without wasting time on the origin of language, the Bible comes straight to the point: “God invented language.” Yet it implicitly acknowledges in the same breath that language made humans too powerful, and that God rather regretted having given men this tool, as is evident in biblical statements to the effect that He (God), in His power, did His best to punish the people by scattering them all over the face of the Earth and confounding their languages. Much to God’s chagrin, it turned out that this seemingly simple gossiping tool was a devastatingly powerful weapon that even God had to think twice about. After all, isn’t God the all-knowing and almighty? What does a human being amount to when facing Him? But no! It turned out humans couldn’t be taken for granted; they could even change the Mind of God with their gossip – at least, that is what the Bible implies. One may agree or disagree with the biblical narration, but the fact remains that the story of the Tower of Babel, fictional or otherwise, is a remarkable testimony that language has been an outstandingly powerful weapon in the possession of human beings. No wonder the biblical authors, like the philosophers of old and the scientists of today, were genuinely troubled by the question of the origin of language. Modern science works with tools; ancient philosophies and religion had no such luxury. They had God.
Language, no doubt, is a unique tool. Why could only our species acquire this special tool? How could language come to us with so much ease? When did language reveal itself in history? There is no one-line answer to these questions. A serious look into the matter reveals that we were not the only species to possess language. A few other species, too, had language – at least some rudimentary form of it. Scientific estimates put the appearance of language at around 1.5 million years ago (Homo erectus was roaming the Earth approximately 1.5 million years ago, which is how we arrive at this stretch of time), though the exact moment of language’s appearance is difficult to pinpoint. Nonetheless, the clue that researchers find interesting is that Homo erectus already possessed a primitive form of language.
The hard question is to what extent Homo erectus’ language was structured compared with ours. However, there are plenty of clues pointing to some form of structure in their language. For example, Homo erectus’ large brain size, standardized stone tools, and use of fire all suggest that they used advanced communication methods to pass on information within groups and to their descendants. Only some form of structured language could have accomplished that.
In general, the learned consensus holds that language could not have emerged earlier than 150,000 years ago – the time when modern humans arrived on the scene. Proponents of a recent origin of language argue that only we humans possess an appropriately shaped and positioned larynx, and that only our brain houses the necessary infrastructure or hardware for mastering language. Though, so far as specific language hardware is concerned, nothing even remotely resembling it has been identified in any particular area of the human brain; the premise of dedicated brain hardware rests on assertion rather than evidence. There are, however, certain areas of the brain that are closely associated with different aspects of language.
Noam Chomsky, an influential linguist, subscribes to the view that humans are innately equipped to learn a language. He and his school argue that the elements of language structure are specified in the genes, so that the general rules of grammar are biologically predetermined. In other words, a newborn baby already possesses the necessary neuronal circuits to handle the complex grammatical structures of languages. Chomsky makes his point through a simple example: take a human baby from one part of the globe and raise it in another, and within only a few years it will grow up to speak the native language of that region fluently and flawlessly. The same is not true of other species. In this sense, humans are unique. It is hard to reject this observation of Chomsky’s, but does it prove that language hardware is innate only to the human brain? No.
We know that chimpanzees don’t exhibit the same ability to learn human language as human babies do; nonetheless, these poor fellows in captivity have demonstrated remarkable communication skills. In the 1980s, a baby bonobo (a pygmy chimpanzee) named Kanzi was raised at the Language Research Center of Georgia State University, USA. Kanzi was the first ape to learn to communicate with humans without undergoing formal training. He developed cognitive and communicative skills far surpassing those of any ape before him. He reportedly understood some simple sentences and more than 500 spoken words. No doubt, Kanzi never came close to anything like human speech, but the fact that he picked up some human language should awaken us to the possibility that chimps, too, are equipped with a toolkit more or less identical to ours for learning at least tidbits of human language. Where did the chimps get this brain hardware from? And maybe Kanzi didn’t want to learn human language, or wanted to learn only what he thought necessary?
We tend to look at everything from a human perspective. Just as it is difficult for us to look at life from the chimpanzees’ perspective, so it is difficult for chimps to adopt a human one. Perhaps a chimpanzee simply won’t want to learn human skills and language because doing so may be demeaning to its self-esteem. Just as prisoners don’t always follow their captors’ orders and behave defiantly at times, maybe Kanzi resented the human presence. In any case, Kanzi’s achievement has put to rest Chomsky’s claim that only humans possess language hardware. Humans are not that special.
Long story short: language is a species-specific trait. For chimps, their specific trait works well; for us, ours. Could human language serve chimps as well as it serves humans? Were behavior a simple arithmetic model, the answer might be yes. But behavior is not arithmetic; there are many parameters to it. Human language could well put chimps at a real disadvantage.
Chomsky and others argue that children acquire good linguistic skills even from scanty or insufficient input and that this, according to them, is sufficient proof of an innate capacity. They argue that children are not taught their mother tongue systematically; nevertheless, they acquire its grammatical rules. The only plausible explanation, Chomsky says, for the remarkable success of human babies in developing linguistic skills is that some of the rules of grammar are already hardwired in the brain, so humans never have to learn them in the first place. That sounds good, but…
Modern, artificial intelligence-based speech and language technology has destroyed the Chomskian argument. It has reduced language to a bunch of algorithms that learn their regularities from data rather than from any innate grammar. And much more is expected to follow in the near future. Science makes and breaks the edifice of knowledge. Hard science is ruthless.
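As a toy illustration of that claim – purely my own sketch, not any particular research system; the tiny corpus and the function name babble are invented for the purpose – a handful of lines of Python can generate passable word sequences from nothing but counted statistics, with no grammar wired in anywhere:

    import random
    from collections import defaultdict

    # A toy corpus; real systems learn from billions of words.
    corpus = ("the hunter tells a story . the story becomes a myth . "
              "the myth binds the group . the group tells the story again .").split()

    # Count which words follow which: the only "knowledge" the model has.
    follows = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev].append(nxt)

    def babble(start="the", length=12):
        """Generate a word sequence purely from observed statistics."""
        word, out = start, [start]
        for _ in range(length):
            word = random.choice(follows.get(word, corpus))
            out.append(word)
        return " ".join(out)

    print(babble())

The point is not that such a toy captures language, but that whatever regularities it does capture come entirely from data – the direction in which modern speech technology has moved.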
Well then, this brings us back to why Guy Deutscher called language humankind’s greatest invention. Guy Deutscher is not the only one fascinated by language. Celebrated historians like Yuval Harari and others have also talked about language and its power. Surprisingly, from a scientific perspective, language is simply a “thing” like many other things; language per se has nothing special about it. It is merely one component of the toolkit that humans possess – no puns intended here. As can be recalled from a page or two earlier, even Homo erectus possessed language, and even chimps have some coarse linguistic skills. But none of them went any further with it. We did. What made the real difference for us was that we used this simple tool uniquely. Herein lay our genius as a species. It is a fact that every living organism does some “thinking,” some more, some less, but what they can’t do is talk out their thoughts. We do. And that makes us different from all other creatures.
Which is to say, we can translate information: what happens inside our brain is transferred out into the environment in a way that influences the recipient.
Language is one of the best tools for conveying to others what happens inside our brains. It gives form to our imaginations, abstract thoughts, and memories hosted within each of us. Language made it possible for humans to carry knowledge from generation to generation. Stories, epics, myths, poetry, culture, society, history, science, and civilization came into existence because of language. No language, no civilization. Period.
Stories and Story-telling
Humans are political animals. Language made it possible for them to weave stories about almost everything – their immediate surroundings or the far-off world of the Sun, the Moon, the stars, and other heavenly bodies. Since the human brain and mind work simultaneously in two worlds – the real and the imaginary – it becomes difficult, at times, to draw a line between them. The real and the imaginary merge at the deeper levels of our subconscious, making us more often than not oblivious to the difference between the two. In trance states – which we often slip into without knowing it – the fuzzy boundary between fact and fiction is obscured, and we are lulled into accepting all sorts of suggestions, stories, myths, and fables as components of reality.
As you read this book, you can prove the above to yourself with a simple thought experiment: for a moment, think of a lemon. Imagine putting a slice of lemon in your mouth and recall its taste. In a moment, your mouth will water. Did you taste the lemon slice physically? No. Why did your mouth water? Simply because your brain responded to a picture you held in your thoughts (your mind)! The brain is fooled into treating it as reality, with all its temporal dimensions and qualities. At a fundamental (neuronal) level, the brain can’t differentiate between a physical thing (the real) and an imagined thing (a thought). That is innate to our brain. It serves good ends, but it can also be dangerous. When we consider the frustration, violence, oppression, and outright wars fought in the name of stories – then what? You are right. Dangerous is a mild word for that innate property of our brain.
The hunter-gatherer lifestyle brought our ancestors face-to-face with the challenges posed by Nature, demonstrating to them that the world belonged to none and that violence underpinned the struggle for survival. The only way to survive was to live with it.
In the wild, early humans couldn’t help but watch the daily spectacle of violence playing out in front of them. A giraffe or an elephant could only be brought down by a pride of lions attacking collectively; in the case of a gazelle, a lioness could do the job single-handed. The simple but significant lesson the early humans learned was that a predator’s strength and prowess defined its survival, and that with group effort hunting prowess rose exponentially – plain common sense, even for lions, jackals, hyenas, and other predators, assuming they think consciously rather than act purely on instinct.
So there was nothing special about cooperation. Nor did humans invent it. Cooperation and herd behavior are innate traits of the animal kingdom. Early humans, like animals, cooperated instinctively. But herd behavior alone wasn’t enough for our species, weak and bare-bodied, to handle the ruthless world of the jungle and the wild, red in tooth and claw. Something more was needed – something brilliant and remarkable, call it strategy, given the comparatively weaker physique of Homo sapiens.
Primatologists tell us that non-humans also plan and devise defense, offense, and survival strategies. Yet all these strategies haven’t helped the primates move beyond survival. How many non-humans have managed to rise above their previous level on the food-chain ladder? None. True, many living species have multiplied their numbers tremendously, but numbers alone don’t matter. The algae and the simplest eukaryotes are still ubiquitous, yet they remain where they were billions of years ago. Chickens, sheep, goats, and pigs outnumber almost any wild mammal species at any given time, but they get slaughtered in equally great numbers daily.
A strategy, plan, or scheme is basically an idea. Ideas are nothing but bunches of thoughts, useless unless communicated to others, discussed, improved upon, and acted upon. A species that can do that has a survival advantage. Lower animals, apes, and archaic humans couldn’t do that. Humans could. And they rose up the ladder of the food chain and of civilizational success.
It is impossible that humans could have succeeded as they did without a language to translate ideas into words. Otherwise, ideas, howsoever brilliant, would have remained buried in their minds. Through language, they could share them effectively with other group members. Thus, what was happening in one mind was transported into many minds. That is to say, ideas in one mind became collective property – the precursor to a collective consciousness. Once there were ideas and language, it was natural that the story would be born. Stories stretched the ideas, sprinkling them with spice and salt, and so made them spread quickly and decisively. Stories – the vehicles for carrying ideas – became the hallmark of our cumulative culture.
Myths, Culture, and Violence
Real and imagined, stories became an indispensable part of human culture. They gave rise to powerful myths that definitively shaped the outlook of humankind. Different groups, communities, and societies identified with their respective myths. People became so possessive about their abstractions, concepts, stories, and legends that they didn’t hesitate to write history with each other’s blood. The evolutionary struggle for survival transmogrified into a fierce battle for sacred history, pitting human groups against each other.
History, genuine and sacred, got written with the blood of the weak. Violence was, is, and will perhaps remain one of the significant hallmarks of human history (more so of sacred history). Strangely, an all-out nuclear war didn’t happen in the twentieth century, when two superpowers – one Christian and the other atheistic – had, for decades, ceased to see eye to eye.
It is wishful thinking to contemplate a world without violence. In the aftermath of the September 2001 terrorist attack on the twin towers in Manhattan, New York, violence got a new role to play: to establish peace. Ah! What a shameless struggle for supremacy wrapped in deception. True, we humans have survived by living in perpetual deception, because that is how our brains function. But there surely are more benign and acceptable ways and means to replace violence with peace. As Steven Pinker observes, we have already brought down overall violence to a great extent, but to eliminate it completely from everyday life – is that a realistic thing to strive for? No. Take, for instance, food procurement. It doesn’t take a rocket scientist to see how many sheep, goats, and pigs are slaughtered daily for meat. You kill a human being or a wild animal, especially of an endangered species, and that is violence. You cull chickens and turkeys by the tonne, and that is food procurement. And what about the countless tons of plants devoured daily by the vast populations of humans and herbivorous animals? After all, plants are also living creatures. You cut them down, cook them over fire, and eat them with curd containing billions of lactobacilli – living organisms. That’s not violence, because plants don’t cry and weep from pain! You justify one as violence and the other as lawful food procurement. From a biological point of view, both meat-eating and vegetarianism qualify as acts of violence – the basic principle underpinning the food chain.
Violence is the law of biology. All animals commit violence, but they kill to eat; humans do both – they kill to eat and, quite often, eat to kill. If Yuval Harari is to be believed, then humans are a disaster – they wiped out the Neanderthals and other competitors; they could have been a little kinder to the archaic humans, but they chose not to be. That, however, is not entirely true. Neanderthals became extinct because they lost the struggle for survival. They couldn’t adapt to the changing circumstances, or perhaps they succumbed to infectious diseases spread by peregrinating human populations. For a minute, let’s suppose Harari is right: humans killed the Neanderthals. Given the law of biology, what other option did humans have? You kill or get killed in the struggle for survival. Had humans not killed the Neanderthals, the Neanderthals would have killed them. In that case, there would be no Harari to castigate humans for being brutal toward Neanderthals. Perhaps.
Contrary to Harari’s view, the truth is that Neanderthals couldn’t view violence through the same prism as humans. That was their problem. We humans are remarkably ingenious when it comes to justifying violence against others. Our sleight of hand with logic and rationalization always comes in handy. Throughout history, we lived on slaughter, violence, crime, and the exploitation of others, yet we rationalized and justified our acts and moved on. How could we do that – justify the violence?
Again, language came in handy. It helped our species create a propaganda machine through word of mouth, or gossip. Underlying the drama of violence was a struggle for survival, which resulted in nasty things. The nasty things couldn’t be averted, perhaps, but it can be argued that the magnitude of the nastiness could well have been lessened. Perhaps. But that’s debatable. It can’t be said for sure what might have happened had humans behaved more mercifully, kindly, or peacefully. Did they have the luxury of being civil and polite in that ruthless world of the ancient and deep past? Can we project today’s perspective onto the past and demand that our ancestors fit into it? I, for one, think not.
To return to language: it birthed gossip, and gossip, in turn, perfected language as a tool. With gossip, sophisticated cumulative culture came to life. Both cooperation and conflict became manageable. Not that gossip didn’t fuel scandals, power struggles, and outright fights between individuals and groups, sometimes tipping the balance toward serious conflict. It did. Opposing confederacies, inflamed by gossip reports, would at times inflict unprecedented brutalities on each other. Gossip can be dangerous.
Yet without gossip and language, humans couldn’t have succeeded either. As can be recalled from a couple of pages earlier on group formation, humans, like birds of a feather, flock together. They create cohesive groups and live in organized societies. The mere capacity of a species to create groups, though, doesn’t guarantee its dominance over other species. The real genius lies in a species’ ability to maintain an organized group and set up regulating rules and conventions for everyone in the group to follow. Only humans can do that systematically and methodically. Language made it possible.
Ringleader and Alpha Male
Until recently, it was believed that the concept of a ringleader is a purely human invention. In the second half of the twentieth century, that belief was challenged. It turned out that this ringleader thing is a biological trait rather than a purely cultural one. Primatologists have documented that ringleaders exist in chimp and other ape societies too. Chimps, like humans, live in hierarchical organizations – call them societies – pivoting on friendships and social bonds. Their group leader is almost always a male, called the Alpha male. It seems that male dominance is an innate trait of the animal kingdom, inherited by humans through Evolution. Perhaps that is why, historically, male members of society felt discomfort whenever women assumed leadership roles. Even a cursory reading reveals that this discomfort is reflected in the bias against women in religious scriptures.
The Alpha male in a chimp community maintains order and is usually uncontested in his decisions. When two males aspiring to the top job of Alpha are in the fray, they go out and form coalitions of supporters within the group. We are told by Jane Goodall and others that all manner of campaigns, intrigues, and machinations are seen during such contests. Eventually, the winner isn’t the male who is more robust in physique but the one who has the larger and more stable coalition of supporters. That is the ultimate show of chimp oligarchy built on violence, coercion, and muscle power.
There’s nothing in the human social system that isn’t seen, in some form or other, in ape communities. The tussle for power and dominance can be observed in almost all life forms. Perhaps humans merely took it to new heights, with the attendant unnecessary bloodshed of their fellow humans. Here is the point: a typical chimpanzee society consists of fewer than 50 members. When it grows beyond fifty, the order breaks down and splinter groups are formed. These splinter groups grow independently, and different groups seldom cooperate; they only compete for territory and food. Primatologists have documented long-drawn-out wars between chimp groups. In some cases, chimp wars extend over many years.
Human behavior is almost identical to that of chimps. Despite language, gossip, and cumulative culture, human groups also reach a threshold number (150 members) beyond which their proper functioning is constrained. Organizations, institutions, and corporate companies can be observed to run into problems when the employee count crosses this threshold. The group number – 50 in chimps and 150 in humans – is simply an index of the cognitive capabilities of a species; it doesn’t guarantee a species its place in the hierarchical biological pyramid.
Numbers alone don’t matter. Dinosaurs were terrifyingly numerous; they got wiped out in the blink of an eye on the geological time scale. Similarly, there are today far more farm animals than humans – more than ten times our number on the planet, as Yuval Harari notes – and they get slaughtered day in and day out. On average, some fifty billion farm animals are put to the knife every year – roughly seven animals for every human being on the planet. How do numbers help sheep, chickens, or cattle? A linear or even exponential increase in population doesn’t matter, except to save a species from extinction. To reach the top in a dog-eat-dog world, brains matter, not bodies. Like it or not, the plain truth is that Life on Earth has sustained itself by violence. Thousands of years ago, when humans were struggling against enormous odds, they needed not only numbers, language, and gossip but something more – additional, efficient tools to maintain effective cooperation and unity. Gossip alone can’t hold together a group as big as 150 members.
The Trinity – Gossip, Story, and Religion
Gossip needed to be supplemented with auxiliaries. That was not difficult to do, given that humans possess the wherewithal to invent other uses for language. Deep thinking, imagination, and other abstract contemplative capacities are built into human nature, courtesy of the well-developed frontal lobes of the brain. Abstract thinking happens in two closely related formats: pictures and words. Words are supplied by language, and pictures are already there in abundance. All five sense organs, particularly the eyes, help record impressions of the environment in the brain as pictures. What remains for the mind to do is to associate words with the pictures. That is pretty easy. Our mind is a master at creating streams of thoughts from words and pictures, and those thoughts are then spoken out as poems, prose, and stories. Stories are a powerful tool of communication. We are inherently hardwired to tell and to listen to stories. We gather in groups, engage in gossip, and listen to stories. Good stories can hold together bigger groups, whose members own these stories, identify with them, celebrate them, and build their identity around them. In the process, we glue ourselves into a society through our shared stories, myths, and legends. It is stories, tales, and myths that strengthen and multiply our chains of communication. Idle gossip alone can’t achieve such a feat.
Which leads us to this: we create a story from an idea; one story leads to another, then another, and on and on, giving rise to myths; myths create fiction; fiction leads to the creation of more and more new fantasies – call them concepts; and the concept takes us back to the idea – the building block of everything we have. The cycle repeats itself ad infinitum. That is to say, one mental construct fuses with another, and another, on and on, giving rise to a tapestry of stories, myths, and fiction. In that tapestry, individual images, pictures, and memories get transmogrified – Sky, Heaven, Earth, and the Sun acquire meaning, and beauty, ugliness, love, and hate take on a life of their own. All of that is a trick of the mind.
Which brings me back to gossip and story: they are potent tools for social bonding. Telling stories at the fireside about the day’s experiences, about deceased relatives or other people, about nature and spirits, ghosts and demons, or about the past, present, and future created, in the case of early humans, a sense of community among those who shared a common worldview. For small communities, gossip and stories are good bonding tools. But for establishing and maintaining bigger community groups such as clans (150 or more members), mega-clans (500), and tribes (1,500), individual stories, howsoever powerful, wouldn’t help much. A composite tapestry blending gossip, stories, and myths – all in one, and something more – was needed.
The search for that “something more” led humankind to lay the foundations of Sacred History. Sacred History inevitably had to collide with Genuine History, and, as Reza Aslan says in No god but God, “precisely there, at that moment of collision, religion was born.” The rest, as they say, are details. The trinity of “gossip, story, and religion” altered the entire equation of the raw power of numbers, permanently tilting the balance in favor of Homo sapiens in that cruel game of nature – the struggle for survival.
Dr. Qazi Ashraf is a surgical oncologist and the chief spokesman of JK United Front, a Kashmir-based socio-political organisation.