By Preston Estep, PhD, and Alexander Hoekstra | 24 April 2014
FQXi

Background
Humanity faces many great challenges. Some already appear imposing, yet grow relentlessly in seriousness and complexity. Critical resources are in decline in much of the world. Clean air, fresh water and topsoil (1) are being depleted and degraded faster than they can be renewed. Production of non-renewable resources such as oil has peaked in most of the world. Climate change and increasingly unpredictable weather patterns make regular news. Border skirmishes and wars still break out routinely in many areas of the world. Other notable challenges include environmental and habitat decline, the growing geographical spread and antibiotic resistance of pathogens, increasing burdens of disease (especially among growing numbers of elderly) and of health care expenditures, the possibility of a catastrophic asteroid strike, and so on.
Immature Science
A recent poll by The Pew Research Center shows that most people in the U.S. expect science and technology to come to the rescue—a view likely shared by an increasing number of people in other countries (2). Although those polled have a favorable view of technological progress generally, the poll also indicates that many specific advances are regarded with suspicion or even trepidation. This dichotomy reveals the uneasy historical relationship between a generally perceived need for betterment and the implementation of potentially disruptive specific ideas or technologies. Even the practice of science itself had trouble gaining initial traction, since it historically required that a single individual propose a new idea that challenged prevailing orthodoxy.
Modern discoveries in genetics tell us that human populations separated at least 50,000 years ago and have since lived in essential isolation from each other, and we know that people from all of the separated branches of the family tree are able to do science. It is very unlikely that these populations experienced universal convergent evolution toward scientific ability; it is much more likely that humans at the time of divergence were already capable of science. Yet modern science is probably less than 500 years old—only about 1% of the time since the populations split. Understanding why science is so unnatural, and why it took so long to emerge, tells us much about human nature and our inherent resistance to change. It also helps us chart our best possible course to the future.
Science and engineering are considered inseparably intertwined in the modern world, but things haven’t always been so. Engineering was quite advanced prior to modern science. For several thousand years, humans have been designing and building amazingly complex and sophisticated roads, bridges, aqueducts, buildings and amphitheaters. Consider the Egyptian pyramids—feats of exceptional engineering. They are over 4500 years old, and even far older monuments and artifacts stand as persuasive testimony to the very long history of engineering. Effective tools and weapons were being made well over 1 million years ago. So why is science so young? Let’s begin at the official beginning.
Though exact dates are disputed, it is a generally held convention that the year 1543 launched the Scientific Revolution. Andreas Vesalius published the first work of scientific human anatomy, and Nicolaus Copernicus published his revolutionary claim that the earth orbited the sun, rather than the other way round. Copernicus withheld publication of his heliocentric theory for many years—until 1543, the year of his death—because he feared the repercussions. Copernicus had very good reason to fear, and even if he’d lived another century he might have chosen the same course. Galileo Galilei, whose observations in the early 1600s supported the Copernican theory, was dealt with harshly by the Roman Catholic Church; he spent almost the last decade of his life under house arrest, dying in 1642. Important advances in science and mathematics were made throughout Europe for the remainder of the 17th century, most notably by Sir Isaac Newton, but Newton and other scientists were very guarded about their religious views and were very careful to explain away any possible contradictions their findings might present to accepted religious orthodoxy. In 1697 Thomas Aikenhead became the last person hanged for blasphemy in Britain. The 18th century brought further, but still slow and gradual, change in perceptions of science.
Over two centuries after Galileo’s death, and a century and a half after Aikenhead’s execution, Charles Darwin—like Copernicus three centuries before—feared the repercussions of his revolutionary ideas, and delayed publication for as long as possible. Darwin might have followed Copernicus’ example and waited until death was imminent to publish his theory, but a letter from Alfred Russel Wallace, describing his own formulation of essentially the same theory, compelled Darwin to publish. He did so fretfully, fully aware of the still-restrictive social climate and the history of persecution—and even execution—of those who dared contradict official church dogma. The newness of science can be more fully appreciated by another development during Darwin’s life: when Darwin began his famous voyage on the Beagle in 1831, the term scientist didn’t even exist; it was only in 1833 that William Whewell coined the term (3).
These historical details underscore the recency of modern science, and they strongly suggest at least one powerful reason why it took so long to take hold: people feared contradicting powerful religious dogma. But is that explanation fundamental, or is there a deeper level to this mystery? And why does opposition to certain scientific findings increase as supportive evidence does, as happened in the Galileo case, and as is happening even today in some areas, most notably evolution? Fundamental and retrospectively obvious discoveries are still made, and their apparent obviousness forces people to wonder how they remained undiscovered for so long. Many of those who fruitlessly prospected the same intellectual territories, yet habitually overlooked the now-obvious riches, are secular and even self-described atheists.
Is it possible that conventionalism, rather than religion per se, is the more fundamental problem? We can’t ignore such strong evidence—evidence that points not so much away from religion as toward more fundamental human limitations as the ultimate motivation for persecuting ideas that catalyze social upheaval. When important truths lie long undiscovered, and we are seduced into wondering how so many could have been so blind for so long, we should take a moment to realize that a vast treasure of undiscovered truth still lies in plain view before us all. The now obvious wasn’t at all obvious a short time ago, and the completely non-obvious will soon be obvious—that is, once someone has done the difficult work of overthrowing the conventionalism apparently innate to the human mind.
A Mind Lost In Time
The fact that science is so young has important implications for our future. Most importantly, human minds are not good at science. James Watson, the co-discoverer of the structure of DNA, is characteristically blunt on this point, saying “most scientists are stupid.” Watson explained further: “Yes, I think that’s a correct way of looking at it, because they don’t see the future.” Understanding the present well enough to predict the future with reasonable accuracy is an extremely important type of intelligence, and it contributes to good science. Nevertheless, Watson’s relativism, which grades scientists only against one another, excuses the failings of even the better ones. Again, humans are not good enough at science, and that means all humans. This point is sure to be contested, but alternative explanations are very weak or simply unacceptable.
Some minds are better than others at science, but the basis for a better future is the acknowledgement that the human mind in its current form is insufficient for certain critical challenges now facing humanity. Albert Einstein, considered one of the greatest scientists in history, remarked (in the year following the atomic devastation of Hiroshima and Nagasaki) that “a new type of thinking is essential if mankind is to survive and move toward higher levels.” Those who believe that some people are already sufficiently good at science must confront the ethical dilemma that accompanies such a belief: if sufficiently able minds exist and suffering persists, then either science lacks the power to fix human problems and assuage suffering, or those capable of fixing them don’t care to. A being capable of practicing science and engineering at the highest imaginable level (for argument, consider god-like abilities) would be capable of assuaging most or all human suffering in short order. This leaves us with two possible attributes to explain our current situation: “insufficiently able” versus “uncaring.” Generalizing from the abundance of caring scientists we know leaves only one explanation consistent with all evidence: human minds as they currently exist are not capable of effecting our most desirable present and future. When we consider that our future depends fundamentally on our minds, both the challenges and the most efficient solution become clear.
Here is a key question: why should we try to cope with modern, complex civilization using brains provided by nature for a simpler time; brains that have been shaped and constrained by forces that are either already irrelevant or quickly becoming so? For example, consider the expense of brains over evolutionary time. The human brain is very large for body size relative to other species, and countless women have died in childbirth (and still do) as the size of the brain increased well beyond the typical ratio found in other species. Both fetal head size and the additional food energy required in the mother’s diet ensured that brains developing in utero were under strict constraints, constraints that have since become far more relaxed.
Furthermore, the adult human brain is about 2% of total body weight, but generally consumes more than 20% of daily food energy intake. As a result, making a bigger brain has been very expensive over evolutionary time. Harvard anthropologist Richard Wrangham has put forth the compelling hypothesis that fire was of primary importance in human evolution because cooking allowed a quantum leap in the amount of energy obtained from a given piece of food (4). He suggests that this critical advance helped to launch a phase of rapid evolutionary change in the size and power of the brain. Several important elements needed to be in place in order to discover and exploit fire, but one of them was sufficient intelligence, and that type and level of intelligence was further amplified by a critical technology: the reliable domestication of fire.
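To give a rough sense of the scale of that energy budget, here is a minimal back-of-the-envelope sketch; the body mass, brain mass, and daily intake figures are illustrative assumptions, not values taken from the text, and only the roughly 2% and 20% shares come from the paragraph above.

```python
# Back-of-the-envelope brain energy budget.
# Input values are illustrative assumptions, except the ~20% share cited in the text.
body_mass_kg = 70.0          # assumed adult body mass
brain_mass_kg = 1.4          # assumed adult brain mass (roughly 2% of body mass)
daily_intake_kcal = 2000.0   # assumed daily food energy intake
brain_share = 0.20           # share of daily energy used by the brain, per the text

brain_mass_fraction = brain_mass_kg / body_mass_kg
brain_kcal_per_day = daily_intake_kcal * brain_share
kcal_per_kg_brain = brain_kcal_per_day / brain_mass_kg
kcal_per_kg_rest = (daily_intake_kcal * (1 - brain_share)) / (body_mass_kg - brain_mass_kg)

print(f"Brain is {brain_mass_fraction:.0%} of body mass")
print(f"Brain consumes about {brain_kcal_per_day:.0f} kcal per day")
print(f"Per kilogram: brain {kcal_per_kg_brain:.0f} kcal vs rest of body {kcal_per_kg_rest:.0f} kcal")
```

Under these assumed figures, a kilogram of brain tissue burns roughly ten times more energy than a kilogram of the rest of the body, which is why enlarging the brain was so expensive over evolutionary time.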
This general strategy of developing and using technologies has, over evolutionary time, repeatedly leveraged existing intelligence into incrementally higher types and levels of intelligence. Such “bootstrapping” has been selected for because reproductive rewards accrue to an organism able to adapt quickly to new niches, or even to create or modify existing niches to better suit its existing biological limits. These are essential features of what we think of as higher intelligence. So, even though this process requires expensive fuel for nature’s tinkering on the brain, sentient life’s most metabolically costly organ, the investment pays off by reducing the cost of useful information. In other words, there has been an inverse relationship between the expense of building a good brain and the expense of useful information. But is there a point where useful information becomes so costly that the price of building a better brain is too high? In some cases, the answer must be yes.
Even a large and powerful brain is confronted by challenges that are potentially rewarding, but for which optimal answers cannot be found quickly or in the local environment. Even for countless simpler problems, the set of possible solutions is infinite and only some are practical and efficient. Random trial-and-error exploration of an infinitely large “solution space” will not often be rewarded. There are many types of information that might benefit us, but many are extremely expensive to acquire and to maintain. Given that brains are expensive, and that information can be both difficult to acquire and extremely valuable for survival and reproduction, there will exist a constant tension—an unbridgeable gap—between what we have and what would benefit us more. UCLA anthropologist Rob Boyd and UC Davis evolutionary sociologist Pete Richerson have extended economic theory into the study of evolution, focusing primarily on the acquisition of knowledge. Boyd and Richerson’s “costly information hypothesis” is premised on the idea that when information is costly to acquire, it pays to rely upon cheaper ways of gaining it, and the cheapest are generally social interaction and instruction (5). Note that their hypothesis is essentially just another way of saying that brains are expensive, except that they focus on the cost of information rather than on the cost of information processing (brains).
In general, it is cheaper to learn from or mimic someone else’s sequence of words, actions or expressions than to learn a complex behavior by experimentation. These cheaper methods come with other costs not incurred by discovering the information for oneself, but the overall cost will be lower in simpler environments. In other words, when information is dangerous, time-consuming, or difficult to acquire and process, it will be learned indirectly through others, but the expected accuracy will then be much lower than that of information acquired directly. Such a strategy for acquiring new information has obvious implications for adherence to convention, and for constraining innovation, including in the sciences. Boyd and Richerson have built a very solid foundation for this theory, and they make a compelling case that it explains many apparently maladaptive behaviors. As we consider the evolutionary tradeoffs that have shaped the human mind, and acknowledge that essentially all the costs of building better brains and other thinking machines have declined or disappeared, we are left to ask again: why should we continue to struggle to get by with brains mismatched to the complex world we now inhabit?
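As a rough illustration of this tradeoff (a minimal sketch, not Boyd and Richerson’s actual model), the toy calculation below compares the expected payoff of discovering information firsthand against copying it from someone else; all numbers are assumed purely for illustration.

```python
# Toy sketch of the "costly information" tradeoff (illustrative only).
# Individual learning: accurate, but carries the full cost of discovery.
# Social learning: cheap to copy, but inherits whatever errors the copied model has.

def individual_payoff(benefit, acquisition_cost, accuracy=0.95):
    """Expected payoff of discovering the information yourself."""
    return accuracy * benefit - acquisition_cost

def social_payoff(benefit, copy_cost, model_accuracy):
    """Expected payoff of imitating someone else's behavior."""
    return model_accuracy * benefit - copy_cost

benefit = 10.0        # assumed value of acting on correct information
copy_cost = 0.5       # assumed (small) cost of imitation
model_accuracy = 0.7  # assumed chance the copied behavior is still correct

for acquisition_cost in (1.0, 2.0, 4.0, 8.0):
    ind = individual_payoff(benefit, acquisition_cost)
    soc = social_payoff(benefit, copy_cost, model_accuracy)
    winner = "individual" if ind > soc else "social"
    print(f"acquisition cost {acquisition_cost:>4}: individual {ind:.1f} "
          f"vs social {soc:.1f} -> {winner} learning pays more")
```

As the assumed cost of firsthand discovery rises, copying wins despite its lower accuracy, which is the point of the paragraph above: conformity is often the economical strategy, and innovation the expensive exception.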
A Fundamental And General Solution
The most efficient and generalizable solution to all human problems is to enhance our fundamental abilities to solve problems. A dizzying multitude of technologies has been developed for enhancing our physical selves and environments. Tools and techniques have been created to feed, clothe, and care for our material wants and needs. We have, with machines of human design, wrangled rivers and moved mountains; we have tapped the planet for its finite bounty to suit our immediate desires. But this enhancement of humankind’s physical abilities has expanded faster than our capacity to wield such power responsibly and to foresee its long-term consequences. Only recently—only through this young mode of problem solving that we call science—has a realistic approach to enhancing our innermost selves become conceivable.
Increasing and refining human abilities to solve problems is not a new endeavor. Modifying the mind is a practice visible in every classroom around the world. The act of instruction originated before recorded history, and indeed, before humanity. Learning through traditional means physically changes the structure of the brain, but is slow and inefficient. A complete professional education, from primary school through college and then graduate school, is expected to take well over two decades. Education is the best technology currently available to alter human minds, but it is demonstrably too slow and too narrow to address and surmount the complex threats we face. Education is alteration, but it is not enhancement; it falls short of fundamentally augmenting the evolved potential or upper limits of the mind.
Typical proposals for reducing the impact of problems faced routinely by people every day, in all parts of the world, focus on treating symptoms rather than root causes. There is often no commonality of goals, and no sharing of resources, among the separate efforts mounted against each of the litany of serious problems facing humanity today. In fact, the opposite is true: many strategies for solving disparate challenging problems compete for funding and attention. We need to begin thinking more efficiently, cooperatively and synergistically, and to seriously consider more fundamental solutions that can be applied to problems more generally. Better brains and other thinking machines are arguably the only technologies capable of counteracting the myriad complex obstacles, problems, and threats facing humanity (including, or especially, those to which humanity has contributed). Thus, better minds provide a truly fundamental and general solution, and to our knowledge no other problem-solving approach is worthy of such a claim.
The Path To The New Mind
Some of the most threatening global problems have remained tenaciously intractable over the past decades, irrespective of national wealth and technological achievement. Even developed nations suffer stubbornly stable levels of mental illness, poverty, crime, and homelessness in otherwise increasingly wealthy economies. Many interventions have been tried in an effort to reduce poverty and homelessness, including provisions of social services, food allowances, housing benefits, employment resources, various kinds of training and education for all age groups, so-called microloans and other loan guarantees, and so forth. But careful research shows that the primary driver of apparent cycles of social ills is the mind: mental health services improve social conditions, but improved social conditions do not improve mental health and functioning (6).
Mental health research and treatment represents a gateway to the unprecedented and uniquely important enhancement of our minds. Technologies spanning the fields of genetics and genomics, synthetic biology, neuroimaging, brain-machine interfaces, and others are becoming increasingly powerful, with immediate applications for understanding and treating mental dysfunction and disease. However, these discoveries and technologies are relevant beyond treating mental illness. Given that even the most “normal” human mind is in many ways disabled by naturally imposed limitations, mind research focused initially on disease can provide an entrée to a more general research platform for mind engineering. Such engineering offers a possible escape from outdated and destructive cognitive constructs, which produce and exacerbate human suffering and existential risk.
Minds are central; they are the foundation of humanity’s past, its present, and its future. Human minds are the root cause of all problem-solving inefficiencies, but they are also the only creative engines capable of taking on each of these challenges, and of designing and building a better future. Obvious and serious memory, behavioral, and cognitive limitations now plague people across the board, irrespective of their level of learning or their best efforts at self-improvement. If the information explosion and the complexity of the world at large overwhelm even extremely intelligent and capable people, the situation among others is even more dire, and the costs to all are enormous. The evolution of the human mind allowed us to rise to a position of dominance on our planet, but a rise to dominance in the past does not presage control over the future. As circumstances change dramatically, so must our thinking—and our ability to think—if we are to survive and thrive over the long term. Better minds are indispensable to our survival.
References
1. Paulson T. The lowdown on topsoil: It’s disappearing. Available at: seattlepi.com/national/article/The-lowdown-on-topsoil-It-s-disappearing-1262214.php [Accessed April 8, 2014].
2. Smith A. Future of Technology. Pew Research Center’s Internet & American Life Project. Available at: pewinternet.org/2014/04/17/us-views-of-technology-and-the-future [Accessed April 19, 2014].
3. William Whewell. Stanford Encyclopedia of Philosophy. Available at: plato.stanford.edu/entries/whewell [Accessed April 19, 2014].
4. Wrangham R (2009) Catching Fire: How Cooking Made Us Human (Basic Books, New York).
5. Richerson PJ, Boyd R (2005) Not by Genes Alone: How Culture Transformed Human Evolution (University of Chicago Press, Chicago).
6. Lund C, et al. (2011) Poverty and mental disorders: breaking the cycle in low-income and middle-income countries. Lancet 378:1502–1514.
Reprinted with permission.
Preston Estep is an American biologist and science and technology advocate. He did his doctoral research in the laboratory of genomics pioneer Professor George Church at Harvard Medical School. He is Director of Gerontology and an adviser to the Personal Genome Project (PGP), an open-source initiative based at Harvard Medical School that explores the roles of genomes and environments in human traits. He founded, and is the former Chairman and Chief Science Officer of, the Innerspace Foundation, a nonprofit organization focused on neuroengineering. He is the Chief Scientific Officer of Veritas Genetics. Follow him on Twitter @prestonwestep.
Alexander Hoekstra is a biotechnology contributor at the Institute for Ethics and Emerging Technologies and Project Coordinator for the Personal Genome Project.
"The Mindspan Diet: Reduce Alzheimer's Risk, Minimize Memory Loss, and Keep Your Brain Young" by Preston W. Estep. https://t.co/0ofdV8jPun pic.twitter.com/9bzmpoGKvN
— Church and State (@ChurchAndStateN) November 28, 2019