Let’s say an asteroid or technological mishap wipes out 99.9 percent of humankind, and all of civilization’s inorganic material achievements along with it—computers, power lines, buildings, all of it. The seven million humans left on the planet, while a tiny fraction of what the species used to be, still vastly outnumber the endangered cheetahs or orangutans currently around. Also, let’s say that those surviving humans are not spread so thin that their reproductive success is at risk from the mere improbability of finding a mate. How long would humanity linger? “Not long” is a good bet. Our teeth and nails haven’t been deadly in ages. Scavenging for hunting implements might turn up a knife or gun, but blades rust and break. And gunpowder? What is gunpowder, again?
“That’s quite extreme,” you might say. So let’s try again, and this time around say it’s biological damage, due to a virus that only kills humans—99.9 percent of them, of all ages. The power plants, power lines, computers, etc. remain. They quickly become useless, though, because not enough people are left to operate them, much less have the knowledge to do so. After all, it does take more than flipping a switch to get a hydroelectric plant to spew voltage from its turbines. Some of the remaining humans know about solar cells, and are able to find a few solar panels with the required connectors to power gadgets. But other than home appliances, electronic gadgets are mostly unworkable, because, well, there is no internet, and no news stations are broadcasting signals, again owing to the shortage of able operators and of power.
In this doomsday scenario, computers can be turned on, but they can only access the limited information they stored themselves. With time, they stop working for no major reason other than those motherboard-meets-connector fatalities that happen as materials deteriorate. Maybe there is one human left who does know how to find the faulty circuit or contact and repair it, but she might well be on the other side of the globe (all long-distance communications are down, so you can’t find out), and you have no means to get to her other than on a bike (which soon will also break down or bust a tire that you won’t know how to fix or replace) or on a horse (that will eventually die, and, well, new horses aren’t born tamed. How does one do that again?). You don’t even think of resorting to gas, because once reserves of gasoline are exhausted, getting fossil fuels out of the ground—literally getting the earth to spit black oil—is now akin to doing magic. Soon, the elders are telling stories of pitch-black geysers that caught fire in the air, and metal birds that rained fire, water, or food and people on the ground. They might as well be speaking of unicorns.
Neurons: Size vs. Numbers
And yet, all those remaining humans would still have the same average 16 billion neurons in their cerebral cortex as they had before, a number so large, and so costly in energy to sustain, that no other animal on the planet can afford anything close; at best, gorillas and orangutans carry about half as many neurons in their cerebral cortices.1-3 So many cortical neurons endow humans with cognitive capabilities that are unparalleled in nature, but somehow are not enough to guarantee, in and of themselves, the amazing abilities accrued by humanity. What is it that allows biological capabilities, such as representing quantities and ideas, to get shaped into cognitive abilities such as multi-part mental problem-solving, strategizing, and creating contingency plans? Another human invention made not just possible but necessary by all the technologies those 16 billion cortical neurons have produced and accumulated over time: schooling.
Fortunately, the experiment of taking technological achievements away to separate human abilities from biological capabilities does not have to be performed, other than as a mental exercise in dystopian science fiction. Anthropologists and paleontologists have already revealed what human biology achieves without modern technology and cultural transmission. Modern humans have been around for at least 200,000 years; shaved and suited up, the sapiens variety of human that took over Europe after the last Ice Age would probably have looked very much like a modern businessman.4 The distribution of some genes in the population may have changed over the millennia, as foods underwent artificial selection and wheat and cheese were introduced as diet staples in some societies, and myopia and other biological shortcomings became fixable with the likes of glasses and surgery.4 The size of the modern human brain, however, has remained roughly the same, which, given what we have learned in my lab about how brain size relates to numbers of neurons within and across species, means that the first modern human of 200,000 years ago most likely already had the same 16 billion neurons in the cerebral cortex that we do today.3
There certainly are human-specific genes that code for human-specific features, just like there must be chimpanzee-specific genes, duck-specific genes, and hummingbird-specific genes. Our research has shown, however, that there doesn’t seem to be a distinctively human brain, but rather a primate-specific way of organizing neurons (much as there is, say, a rodent- or carnivoran-specific way of putting brains together)—and of those primate brains, ours happens to be the biggest, with the most neurons in the cerebral cortex.5 The biological grounds of human uniqueness might thus lie simply and foremost (even if not exclusively) in being the primate species with the most cortical neurons.6
Since neurons are the basic information-processing units of the nervous system, the 16 billion cortical neurons with which humans are endowed provide a uniquely large biological capability to process information. The cortical processing that finds patterns, infers conclusions, tells the good from the bad, remembers events, makes plans, changes plans as circumstances demand—it’s all there. Importantly, none of these capabilities is exclusively human. Brains as different as those of a pigeon, a mouse, a macaque, and a human all share similar layouts in how their neurons are connected: each version of a cortex has multiple sensory, motor, and associative zones that appear to function similarly in representing, cross-referencing, and storing information.7
From logical reasoning and understanding symbols to using and even making tools, recognizing itself in the mirror, or planning for the future, there doesn’t seem to be any fundamental functionality of the human brain that is not shared with other species.6 Thanks to many years of behavioral psychological studies based on the growing suspicion that non-human species might be more capable than human hubris once conceded, cognitive differences across species are now believed to be a matter of quantity, not quality: not whether a species can do something, but how well, and how flexibly, it can do it.8 If cortical neurons are like Lego blocks, we humans have the most to play with, which means that, to the extent that they can be rearranged while still obeying the same generic layout, the larger number of assembling blocks in the human cortex endows it with immensely more possibilities.
Because we are primates, not only do we have an enormous pile of Legos, we have fairly small Legos, which means that our brain can do a lot while still not being humongous. Incidentally, our neuronal Legos are not the smallest: even the largest crows and parrots as well as the smallest mammals have neurons that we calculate to be on average much smaller than human cortical neurons.9 What distinguishes humans from other species is not how small or large, dense or scarce our cortical neurons are, but simply how many we have to do the job of navigating through life.9
Cooking as Technology
Having lots of neurons comes at a high price, since the energetic cost of the cerebral cortex is proportional to its number of neurons.10 In that case, how did our species, and it alone, come to have the most neurons in the cerebral cortex? The reason may be quite prosaic: no other animal cooks its food as thoroughly as our ancestors of 1.5 million years ago learned to do, a technology that we keep passing down through generations. As gross as it may sound, cooking is tantamount to predigesting food before it enters the mouth, which increases immensely the number of calories that can be effectively transferred to the body, rather than processed from scratch in the enzymatic conveyor belt that is the GI tract.11,12 Soft, pre-digested foods can be completely turned into a pulp in the mouth, which guarantees that digestive enzymes will have access to every molecule that is swallowed, rather than just to the surface of barely broken-down chunks.
Modern humans don’t think of cooking as technology, but indeed it is: in its simplest definition, technology is any object, system, process, knowledge, or idea that facilitates solving a problem. The Homo variety that invented cooking was already bipedal, with the advantage over knuckle-walking apes of consuming less energy to go the same distance, expanding the range of foraging and thus increasing the likelihood of finding food.4 The process of cooking builds on the very first technological implement: the stone tool, not simply a serendipitous rock splinter but an implement fashioned systematically to be handled, and that could be applied to cut meat from animals, crush bone, or pound roots. Our human ancestors who first invented stone tools over two million years ago could use them to feed themselves more rapidly and efficiently, and could thus afford the time to approach and solve new problems.
With “cold” cooking (using knives, acidic marinades, or crushing tools like the first stone implements) or “hot” cooking (with fire), more energy also comes in less time, which by itself compensates for the liability of having an energy-guzzling, neuron-rich brain. And more: because food has to be chewed down into a wet mass before it can be swallowed, the time that is freed up by making foods soft through cooking can now be used for more interesting enterprises, like convincing others to come hunt with you, or trying a new method to make fire, rather than chasing one more hooved creature or digging up one more root.
Once obtaining enough calories per day to feed a large number of neurons was no longer a liability, humans could actually start benefiting from having them. What provides the energy to maintain more neurons while freeing up time also delivers more cognitive capability and the opportunity to use it. This offered an edge that must have been remarkable enough that, in little over 1.5 million years, the size of the brain of our ancestors, and ours alone, tripled, as those individuals with more neurons tended to fare ever better than the competition.6 And so there was the human species, in all its neuronal glory, but still limited in cognitive feats.
The problem is that a brain with 16 billion cortical neurons is still just that: a big pile of neuronal Legos assembled haphazardly. We have the energy to afford more cortical neurons than any other species, and that number is now presumably written in some still undiscovered form in our genome. But exactly how to arrange those blocks is not specified in our genes and, oddly enough, the lack of that information in our genomes is what makes a cerebral cortex with so many parts so powerful: it can self-organize according to how it is used. Neurons remain malleable even once arranged into the crude layout that is specified by the genes, like the main interstate highways in a country; as they start being used, they assimilate information from the environment that shapes neuronal roads, streets, and alleys as they are deemed useful. The more neurons that build a cerebral cortex, the more connections can be tried, tested against experience, and found useful or not, becoming either reinforced and strengthened, or weakened and eventually lost. And so our brains become shaped by what they do, what challenges they are faced with, which ones they manage to solve, and what others they endeavor to tackle next.
Like a plain cube of Legos assembled according to the simplest instructions, compared to a dazzlingly intricate model built from years of painstaking and progressively more complex instructions, the achievements of the first humans pale in comparison to what their descendants do today. Although we can’t know the thoughts and mental complexity of our ancestors, what human biology achieves before acquiring the technology that shapes its brain is demonstrated by a common experiment performed anew by every generation, in every single family: raising infants.
Educating the Mind
This brings us back to schooling. Turning those quantitatively remarkable biological capabilities of the human brain into the actual abilities of modern humans—doing mental math, using one or more languages and translating between them, elaborating a multi-part plan to navigate somewhere, deliver a checkmate, or build a new industry—is a whole other story: one of technological achievements and cultural transmission. Yet it too is made possible by those same 16 billion cortical neurons. Remove all technology, or one single generation of transmission to enough people to embody all of its diverse richness, and humanity would be reduced to its biological foundation: human capabilities, without the abilities.13
I may hold a Ph.D. in neuroscience, but were I one of the few to survive that viral apocalypse with which this discussion began, I still would not know how to make paper and pencil to commit to writing what I have learned about how brains work, and frankly, that would not matter very much. Those survivors who did have the know-how to craft pencil and paper most likely couldn’t put a bicycle together, much less a car, or even a toaster. I would also most likely fail at more down-to-earth tasks such as finding drinking water and non-poisonous plants to eat, navigating back to a safe shelter each day, predicting when to plant and when to harvest, when to slaughter and when to breed. Never mind calculating how many stones I can safely pile in a column or how many sticks I must tie together so they support a roof over my head.
What about coming up with a plan for sanitation lines and potable water, designing a multi-floor building, developing the concept of germs and antibodies and thus vaccines and remedies, conceiving of anesthesia to temporarily turn off pain and our very consciousness so that the body can be opened and operated upon, proposing to use little green pieces of paper as placeholders for work, coming up with intangible codices of what is right and what is not, thinking up a strategy for negotiating world peace? Either we can learn such things from those who came before us, or we have to figure them out all over again each time.
There are so many of us around now that each individual can shape the biological capabilities that come with those 16 billion cortical neurons with bespoke information according to his or her needs, wants, and likes, within the realm of one’s opportunities. No longer concerned with ensuring that every community has enough healers, hunters, builders, and enforcers, we now have so many brains whose cognitive abilities can be shaped that we take those abilities for granted and fool ourselves with the idea that schooling is optional: an exposure to the ideas of some who came before us so we can “stand on the shoulders of giants” and “not repeat the mistakes of the past.” As long as every child has the opportunity to go to school, it will do to keep believing that simple awareness of the past is the reason why they go to school.
But it is not. We need schooling because our 16 billion cortical neurons, the most of any species, are enough to make us biologically human, but not enough to make us modern humans. We need to be taught by those who came before us; we need exposure to their ways of thinking, knowledge, and technology, to assimilate into our cortices the know-what and the know-how of humanity as a whole, in a systematically curated program of ever-increasing complexity and duration that shapes our brains and keeps them ready to pass it on yet again. The more technologies there are to pass on, the more teaching technologies are needed—those systems and processes to transfer information systematically.
And because no single human can any longer hold in their brain all the knowledge accrued by our ancestors, we need as many brains as possible to be shaped by schooling, so that enough learn to make fire and pottery while others learn to cook meals for the masses or delicacies for the few; enough learn to make steel out of ore, while others learn to bend and assemble it into skyscrapers; enough learn to juggle the sound patterns that our tongues produce and weave their meanings into stories of where we came from and where we could go from here, and commit them to symbols that enough know to decipher back into meaning again; and enough learn to teach it all over again.
That, in a nutshell, is why every human generation needs to go to school: to keep alive the possibility that our descendants, too, will go on learning to shape their human biology into humanity, again, and again, and again.
Financial Disclosure: The author has no conflicts of interest to report.
1. Azevedo FAC, Carvalho LRB, Grinberg LT, Farfel JM, Ferretti REL, Leite REP, Jacob Filho W, Lent R, Herculano-Houzel S (2009) Equal numbers of neuronal and non-neuronal cells make the human brain an isometrically scaled-up primate brain. Journal of Comparative Neurology 513, 532-541.
2. Fonseca-Azevedo K, Herculano-Houzel S (2012) Metabolic constraint imposes trade-off between body size and number of brain neurons in human evolution. Proc Natl Acad Sci USA 109, 18571-18576.
3. Herculano-Houzel S, Kaas JH (2011) Gorilla and orangutan brains conform to the primate scaling rules: Implications for hominin evolution. Brain Behav Evol 77, 33-44.
4. Lieberman DE (2014) The story of the human body. Vintage, NY.
5. Herculano-Houzel S (2012) The remarkable, yet not extraordinary human brain as a scaled-up primate brain and its associated costs and advantages. Proc Natl Acad Sci USA 109, 10661-10668.
6. Herculano-Houzel S (2016) The Human Advantage: A new understanding of how our brain became remarkable. MIT Press, Cambridge.
7. Shanahan M, Bingman VP, Shimizu T, Wild M, Güntürkün O (2013) Large-scale network organization in the avian forebrain: a connectivity matrix and theoretical analysis. Front Comput Neurosci 7, 89.
8. Emery NJ, Clayton NS (2004) The mentality of crows: convergent evolution of intelligence in corvids and apes. Science 306, 1903-1907.
9. Herculano-Houzel S (2017) Numbers of neurons as biological correlates of cognitive capability. Curr Opin Behav Sci 16, 1-7.
10. Herculano-Houzel S (2011) Scaling of brain metabolism with a fixed energy budget per neuron: implications for neuronal activity, plasticity and evolution. PLoS One 6, e17514.
11. Wrangham RW (2009) Catching fire: How cooking made us human. Basic Books, NY.
12. Zink KD, Lieberman DE (2016) Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans. Nature 531, 500-503.
13. Henrich J (2015) The secret of our success: How culture is driving human evolution, domesticating our species, and making us smarter. Princeton University Press.