The Future Will Be Different, But Will We Be the Same?
The Future of Human Augmentation and Performance Enhancement
In most of our science fiction and our projections of the future, everything has changed—we have robots, flying cars, artificial intelligence, warp speed, laser swords—but we remain pretty much the same. Humans of the future are exactly the same physically and mentally as humans today.
In science fiction, this is probably necessary for dramatic purposes. You want the audience to identify with your characters, and that's harder to do if those characters are too strange and different. When science fiction does touch on the idea of genetic or cybernetic enhancements, it usually does so as a dystopian cautionary tale. Even the Star Trek franchise, our usual go-to source for an optimistic take on the future, becomes notably technophobic when it comes to human augmentation. The idea of cybernetic enhancements, of bionic limbs and brain implants, gave the Star Trek universe its most menacing antagonist: the Borg. As for the idea of genetic enhancements, well, let's just sum up Star Trek's attitude in one name: Khan.
Yet those enhancements are coming, and some of the technology is here or nearly here. We had better start giving serious, realistic consideration to how it can be used, how it will be used, and what we should think about it.
The concept of human augmentation, which is also called human performance enhancement or HPE, tends not to receive much attention because it is diffuse. It encompasses a range of technologies across very different disciplines. It's helpful to gather them together under one heading and survey the different ways in which we humans might potentially alter our own nature.
There are five main areas where we are currently pursuing human augmentation.
1. Bionics and Prosthetics
This is the form of human augmentation that is already being tested out for a small number of special users. You can even go to a Cyborg Olympics, a competition to test whose bionic limbs and robotic exoskeletons are the best.
[T]he Paralympics celebrates exclusively human performance: athletes must use commercially available devices that run on muscle power alone. But the Cybathlon honors technology and innovation. Its champions will use powered prostheses, often straight out of the lab, and are called pilots rather than athletes. The hope is that devices trialed in the games will accelerate technology development and eventually be used by people around the world.
Note that this is not about improving or enhancing existing human capabilities so much as it is about restoring capabilities to those who have lost them. But that is largely due to the limits of the technology, to the fact that a bionic arm is still much less dexterous than a normal human arm. At some point, however, this technology will become good enough that it will offer the prospect of enhancing existing human abilities.
We're a very long way from the point where anyone would be tempted to amputate a normal limb in order to replace it with a cybernetic version, though that is the ultimate vision of a few scientists and entrepreneurs, and it is already reflected in some fictional portrayals. Will Rossellini, the CEO of a neurotechnology company who also served as an adviser for the latest Deus Ex video game, predicts that "our bodies are going to look more like cars in the future, where we are making parts that will fit into anybody's system, where we are upgrading parts the way we upgrade a cell phone."
The more likely alternative that is already being implemented in real life is the use of robotic exoskeletons, which don't replace the normal human body but give it extra strength and in some cases extra dexterity.
They are already being used to help the paralyzed walk or as a robotic glove to help those with limited strength or range of motion in their hands. And exoskeletons are beginning to be used in industrial applications and in the military, which sees a lot of value in a system that could help a soldier travel farther and faster and carry heavier loads with less fatigue.
The ultimate goal for military applications is an armored robotic super-suit—yes, like in "Iron Man." As I quipped a number of years ago, the future is less likely to look like the Terminator movies, with soft, weak, unenhanced humans battling indestructible killer robots, and more likely to look like the climactic battle in Aliens, where Sigourney Weaver dons a massive industrial forklift suit to battle the alien monster.
Or, on a more prosaic level, it will look more like Keahi Seymour, the engineer who has created stilt-like bionic boots that allow him to run at astonishingly fast speeds. We think of industrial and military applications for bionic enhancements, but also think of, say, Ultimate Frisbee played with bionic extensions for the legs and hands.
As we contemplate integrating machines with the human body, we face a number of difficult hurdles. These machines need better power supplies so that bionically enhanced humans don't have to be tethered by power cords. For all of the recent advances in battery technology, the human body is still far more advanced when it comes to carrying its own fuel and power supply. Moreover, to operate like real human limbs, or alongside real human limbs, robotic enhancements need a sensitive sense of touch. Both of these problems are dealt with in a recent attempt to create an artificial skin that detects touch and derives power from sunlight. But there's still a very long way to go.
So let's sum up:
The Promise: An end to physical disabilities, with bionics replacing lost eyes, ears, and limbs—and the prospect of enhancing the able-bodied with super-human strength, speed, and stamina.
The Questions: Would people eventually feel pressure or temptation to remove healthy limbs or organs in favor of enhanced bionic replacements, and would that be a good thing?
One of the biggest challenges is to coordinate our bionic or robotic augmentations with the brain—to send signals back and forth across the barrier between our mechanical enhancements and our biological nervous system—so that you will be able to pilot a robotic exoskeleton with the same ease with which you move one of your own limbs.
The need for such an interface leads us to the next major category of human augmentation.
2. Brain-Computer Interfaces
Brain-computer interfaces, or BCIs, are being tested for use in controlling artificial limbs for the disabled, and for communication with those who are "locked in" due to spinal cord injuries—or even for reversing paralysis through a "neural bypass" that allows the brain to communicate directly with the muscles.
This is beginning on a small scale. Arguably, the world's first real cyborgs are people with cochlear implants to restore their hearing. Retinal implants are about 30 years behind but are improving.
Brain-computer interfaces have to overcome some basic problems. External systems, like brain-scanning headsets, should in theory be able to detect activity in the brain with enough detail to tell when you are thinking of a certain word, or when you are thinking about moving in a particular direction. This capability is being explored as a way of controlling your avatar in virtual reality, for example, and it has even spawned a Star Wars-themed "force training" toy that works by detecting a certain type of concentration. But these headsets can't project information back into the brain, and they still have very limited detail and resolution.
"The quality and fidelity of the data depend on how many EEG sensor contact points are able to make a direct connection to the skin on your scalp. More sensors provide better data, but may be more inconvenient to use. Since the most crucial contact points are in the same place as the VR straps, using EEG as an input to a VR experience may require a custom integrated headset."
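To make the detection problem concrete, here is a minimal sketch—not any real headset's API; the signal, bands, and threshold are all invented for illustration—of how a single-channel "concentration" detector like that Star Wars toy can work: compare signal power in the beta band (associated with focus) against the alpha band (associated with relaxation).

```python
import math
import random

def tone_power(signal, fs, freq):
    """Power of `signal` at `freq` Hz via correlation with sine and cosine
    (a single-bin DFT). Illustrative only; real EEG pipelines do much more."""
    n = len(signal)
    s = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (s * s + c * c) / n

fs = 256  # samples per second
random.seed(1)
# Synthetic one-second "EEG" trace: a strong 20 Hz (beta-band) rhythm plus noise
signal = [2.0 * math.sin(2 * math.pi * 20 * i / fs) + random.gauss(0, 0.1)
          for i in range(fs)]

beta = tone_power(signal, fs, 20)   # sample frequency in the beta band (~13-30 Hz)
alpha = tone_power(signal, fs, 10)  # sample frequency in the alpha band (~8-12 Hz)
focused = beta > alpha              # crude "concentration" flag
print("focused:", focused)
```

With one noisy electrode this is about all a consumer headset can extract, which is exactly the resolution limit the quote describes: a coarse band-power ratio, not a readout of words or intentions.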
Internal implants face their own set of limitations. Metal and electronics don't tend to mix well with flesh and chemicals, and the formation of scar tissue over implanted electrodes degrades their function over time. There have been recent experiments with implants that sit on top of the brain and project magnetic fields into it, which can be focused very precisely, sending signals from an artificial retina into the brain.
Project this out far enough and you might get virtual reality and augmented reality coming from inside your head. Or phone calls coming to you as voices in your head. Or you will find yourself talking to someone and notice that they get a far-off look in their eyes for a moment, and that's because they're looking up a fact on Google so they can bring it back into your conversation.
This technology, whenever it arrives, will revolutionize how we interface with our devices and with the growing digital world around us. The best interface with our technology is not to point and click and scroll through menus. The best interface is no interface. It's to interact with our devices the way we interact with our hands and feet and eyelids. We just think it and it happens.
As Elon Musk explains it, "We're already cyborgs. Your phone and your computer are extensions of you, but the interface is through finger movements or speech, which are very slow." The prospect is to augment our thinking more swiftly and directly by connecting it to the cloud and even to the Internet.
But could this also change the very way we think? In augmenting our brains, will we alter them? What if you could use all of the Internet as an extended memory bank—and would that be a particularly wise thing to do? Certainly, getting ranked first in Google search results would become even more valuable if companies knew the results were being beamed directly into people's heads. And imagine the fierce editing wars when people use Wikipedia as a kind of collective memory. That is, more so than we do already.
These are issues we are already struggling with just because people spend a lot of time squinting at hand-held electronic devices. Bringing the devices into our brains magnifies the issues in scope and intensity.
Let's sum up:
The Promise: Faster, seamless access to information and to interaction with our machines.
The Questions: Will this result in an even greater dependence on our devices? Could it lead to a real-life version of this cartoon?
Even when we want to, it will be harder to tune out our devices when they're a part of us. So will the future of BCIs make it even harder for people to slow down and interact with the real world?
These enhancements aren't ready yet, but a lot of capital is being poured into them. Elon Musk just announced the launch of Neuralink, which is working on a "neural lace" brain-machine interface. He joins another Silicon Valley entrepreneur, Bryan Johnson, whose Kernel start-up is working on the same problem.
But Kernel isn't just trying to make neural interfaces for our machines. It's also experimenting with ways to change and enhance the functioning of our brains. That's an even more radical notion and leads us to the next form of future human enhancement.
3. Neuroprosthetics
This is the most speculative technology of all, because we still know so little about how the human brain works, which limits our ability to affect its function in a beneficial way.
Current efforts, particularly under Bryan Johnson's Kernel, are focusing on "neuroprosthetics" to enhance memory by breaking the code for the storage and retrieval of memories in a part of the brain called the hippocampus, which can then be augmented by an implant.
Notice that neuroprosthetics are following the same path as mechanical prosthetics: they are being proposed first as an attempt to restore normal functioning to the impaired. Which makes sense. If it is morally and practically questionable to remove a healthy limb in favor of an enhanced bionic replacement, think how much more questionable it would be to intervene in a healthy brain in pursuit of some speculative new enhancement. So it makes sense that this technology will be tested out first in patients who are already facing the progressive loss of their mental faculties and thus have less to lose by trying to stem the deterioration.
At some point, however, this technology is going to be perfected to the point where it will be considered a valuable enhancement. What if you could, for example, draw on perfect recall of all the events in your life—every meeting, every conversation, every piece of music? What if you could sort through data more rapidly and notice new connections?
Elon Musk has suggested that this is what we will need to do to keep artificial intelligence from making us obsolete, but I think Bryan Johnson's perspective is more interesting: that the difference between us and the machine will be moot. Johnson hails an era of HI, human intelligence, seamlessly augmented by AI, artificial intelligence.
[I]t is already obvious that humans and AIs will be able to form a dizzying variety of combinations to create new kinds of art, science, wealth and meaning. What could we do if the humans in the picture were enhanced in powerful ways? What might happen if every human had perfect memory, for instance?
In short, we are poised for an explosive, generative epoch of massively increased human capability through a Cambrian explosion of possibilities represented by the simple equation: HI+AI. When HI combines with AI, we will have the most significant advancement to our capabilities of thought, creativity and intelligence that we will have ever had in history.
To sum up:
The Promise: Building enhanced mental function, possibly even some form of super-intelligence, into our own brains instead of building it into a separate computer.
The Questions: What are the side-effects and potential long-term consequences of altering our brain functions with implants?
Consider one of the reasons steroids are banned in most sports. When one player begins taking steroids, this can enhance his performance so dramatically that everyone else feels the need to take them to remain competitive—but then everyone ends up experiencing the side effects, both immediately and in the long term. So one reason for banning steroids is to keep athletes from ruining their bodies in an attempt to keep up with their enhanced rivals.
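The incentive structure just described is a textbook prisoner's dilemma, and a toy payoff model makes it concrete. All the numbers below are illustrative assumptions, not measured data:

```python
# Toy payoff model of the doping arms race.
# All numbers are illustrative assumptions, not measured data.
EDGE = 3         # competitive advantage of doping against a clean rival
HEALTH_COST = 2  # long-term cost of side effects, paid by anyone who dopes

def payoff(me_dopes, rival_dopes):
    """My net outcome given both players' choices."""
    if me_dopes and not rival_dopes:
        edge = EDGE
    elif rival_dopes and not me_dopes:
        edge = -EDGE
    else:
        edge = 0
    return edge - (HEALTH_COST if me_dopes else 0)

# Doping is the better reply whatever the rival does...
assert payoff(True, False) > payoff(False, False)
assert payoff(True, True) > payoff(False, True)
# ...yet mutual doping leaves both worse off than mutual restraint.
assert payoff(True, True) < payoff(False, False)
```

Because doping is a dominant strategy in this model, unregulated competition drives everyone to the mutually worse outcome, which is precisely the rationale for a league-wide ban.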
When it comes to neurotechnology, we can call this the Total Recall Principle, after the 1990 film in which a friend advises Arnold Schwarzenegger's character (in somewhat earthier language), "Don't mess with your brain." If you think it's going to take us a long time to trust self-driving cars, it's going to take us much longer to trust machines to help us drive our brains.
Speaking of steroids, that brings us to our next form of enhancement. Using electronic implants to affect our brain functions is a radical and difficult step. Instead, we could just try doing it the old-fashioned way: with pharmaceuticals.
4. Nootropics
We are all familiar—some of us more so than others—with psychotropic drugs that affect mood and perception. Nootropic drugs (from nous, the Greek word for "mind") are drugs that affect and in theory enhance the process of thinking. The term was coined in 1972 by a Romanian chemist who sought drugs that would enhance learning and memory.
Nootropics have developed something of a fan base in Silicon Valley, where acute mental function is revered and rewarded as much as physical strength and stamina are in the NFL. Hence the Silicon Valley executive who takes "a mixture of exotic dietary supplements and research chemicals that he says gives him an edge in his job without ill effects: better memory, more clarity and focus and enhanced problem-solving abilities. 'I can keep a lot of things on my mind at once.'"
The problem: "long-term safety studies in healthy people have never been done. Most efficacy studies have only been short-term." What you tend to get is a lot of anecdotal evidence from committed enthusiasts, but it is hard to tell how much of this is just a placebo effect among people who would be motivated high achievers in any case.
In effect, nootropics are supposed to be steroids for the brain, and that comparison ought to prompt some cautious reflection, because the effect of steroids is well known—including a list of deleterious side effects, particularly from long-term use. This already includes impacts on mental function and mood. On a more everyday level, we can observe the effect of well-known drugs that boost mental stamina or alertness, from caffeine to amphetamines. A period of heightened alertness and concentration tends to be followed by a crash in which the same characteristics that were briefly enhanced are now depressed. What the drug giveth, the drug taketh away. And the body can develop tolerance to a drug's effects. Methamphetamine addicts who start taking the drug so they can party all night end up taking it just to be able to get out of bed in the morning.
A normal, healthy body and brain are delicately balanced self-regulating systems, and we have to be very cautious about interfering with that balance. Benefits gained at one place for one moment tend to be canceled out elsewhere.
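The giveth-and-taketh-away pattern can be expressed as a toy feedback model. This is purely illustrative, not real pharmacology: the body counteracts chronic exposure, so quitting exposes the accumulated adaptation as a crash below baseline.

```python
# Toy homeostasis model of drug tolerance. Purely illustrative:
# the body adapts against a stimulant, so stopping the drug
# leaves function below baseline until the adaptation fades.
def simulate(days_on, days_off, dose=1.0, adapt_rate=0.2):
    log = []
    adaptation = 0.0
    for day in range(days_on + days_off):
        drug = dose if day < days_on else 0.0
        alertness = 1.0 + drug - adaptation          # baseline alertness is 1.0
        adaptation += adapt_rate * (drug - adaptation)  # body pushes back on exposure
        log.append(alertness)
    return log

log = simulate(days_on=10, days_off=5)
print("first day on the drug:", log[0])    # boosted well above baseline
print("first day off the drug:", log[10])  # the crash: below baseline
```

In this model the early doses deliver nearly the full boost, the benefit erodes as adaptation accumulates, and every day after quitting sits below the un-drugged baseline—the self-regulating balance the paragraph above describes.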
Let's sum up:
The Promise: Super-smarts, like that pill in the movie Limitless (which is, in fact, based on a real drug).
The Questions: What's the catch? What are the side effects? And how big is the benefit, really? Is this just "brain training" all over again—a fad that thrives on anecdotal evidence but doesn't have much actual data to show its effectiveness?
One writer's experiment with Nuvigil, the real-life inspiration for Limitless, gives us an idea. He reported noticeably increased mental function—combined with extreme physical lethargy and difficulty sleeping—and in the end he found he didn't miss the drug when he stopped taking it.
All the augmentations we've discussed so far seek to enhance our natural endowments by adding on to them. But what if we can change our natural endowments themselves, at their source, on the genetic level?
That leads us to the last major area of human enhancement.
5. Gene Editing
Back in the 1980s, we first started hearing a lot about the promise of genetic engineering and gene therapy, the idea of being able to edit human genetic code and propagate the new code throughout the body. Reports at the time indicated that it might take another ten to fifteen years before the technology was practical. It turned out to be more like thirty years—but it's finally here, thanks to CRISPR.
A lot of other things had to happen, too. The speed and cost of sequencing the human genome needed to collapse, which they have. And we need an enormous amount of data about which genes code for which characteristics—something we're still working on. The final piece is the ability to edit DNA at will and propagate the change through multiple cells. And that's what CRISPR has just given us. This has led to speculation that in the future, "writing code" won't just mean software for computers. It will mean coding DNA.
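To give a concrete flavor of what "coding DNA" can mean, here is a toy sketch of the core pattern behind Cas9-style editing: find a roughly 20-letter target adjacent to an "NGG" PAM motif, then splice in a replacement. The sequence and the edit below are invented, and real guide design weighs many more factors (off-target matches, GC content, chromatin accessibility).

```python
import re

def find_cas9_sites(dna):
    """Return (position, protospacer) pairs: 20-nt targets followed by an
    NGG PAM motif. A toy model of SpCas9 target selection."""
    return [(m.start(), m.group(1))
            for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", dna)]

def edit(dna, target, replacement):
    """Simulate a precise edit: swap one target sequence for a repaired one."""
    return dna.replace(target, replacement)

# Invented sequence with one valid site (20 nt followed by the PAM "TGG")
dna = "TTAC" + "GATTACAGATTACAGATTAC" + "TGG" + "CCAA"
sites = find_cas9_sites(dna)
print(sites)  # one editable site, at position 4

# Hypothetical one-letter "repair" of that site
repaired = edit(dna, sites[0][1], "GATTACAGATTACAGATTAG")
print(repaired)
```

The hard part, of course, is not the string manipulation but knowing which substitution produces a desired trait without side effects elsewhere—the data problem described above.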
As usual, this technology is being used first for curing diseases, including personalized medicine for cancer treatment and repair of congenital defects.
But the wider possibility for augmentation is obvious. Humanity is the product of a multi-billion-year experiment in testing out genetic variations. We can survey our species and find people with genes that make them taller, smarter, faster, stronger, and so on—and not just by a little bit. There are people in the world who are genetic outliers, born with a capacity for truly extraordinary performance. If we can identify those extraordinary genetic endowments and figure out how to patch them into our existing genetic code, we could give ourselves these enhanced qualities.
It is not clear whether it will be possible to do that by changing an adult's existing DNA. The technology is likely to be applied first to human embryos, creating "designer babies" at the behest of ambitious parents who want their children to be ready to excel from the womb.
If we won't do it, the Chinese probably will.
The limitation of this technology is that DNA is incredibly complex. There are 3 billion base pairs in the human genome, and just reading and comparing the genomes from a significant number of individuals involves staggering amounts of data storage. Because each genetic change can potentially interact with the others, the effects of gene editing may not be predictable or immediately obvious, particularly as we progress beyond fixing a single mutation that is known to cause a specific disease.
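The scale claim is easy to sanity-check with back-of-envelope arithmetic: four bases fit in two bits each, so a single raw genome is around 750 MB, before sequencing coverage and quality data inflate it by an order of magnitude or more.

```python
# Back-of-envelope storage arithmetic for raw genome data.
BASE_PAIRS = 3_000_000_000  # ~3 billion base pairs in the human genome
BITS_PER_BASE = 2           # A, C, G, T -> 2 bits each

genome_bytes = BASE_PAIRS * BITS_PER_BASE // 8
print(f"one genome: ~{genome_bytes / 1e6:.0f} MB")                  # ~750 MB
print(f"100,000 genomes: ~{genome_bytes * 100_000 / 1e12:.0f} TB")  # ~75 TB
```

And that is just storage; comparing genomes pairwise to hunt for the variants behind a trait multiplies the computation far beyond these raw figures.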
Moreover, there is no ethical way to experiment with all of these combinations and their effects, which could involve creating a whole cohort of human children who are supposed to be enhanced but end up with genetic problems that degrade their bodies and shorten their lives. This is not something we want to be doing by trial and error.
To sum up:
The Promise: Picking and choosing extraordinary abilities out of the gene pool and putting them all together in ultra-capable new humans.
The Questions: What is the risk that in re-engineering our own DNA, we will introduce problems instead of improvements—and that we won't know it ahead of time?
We know that all of these enhancements are eventually going to become possible, in some form and to some degree. A normal, healthy person might be inclined to leave well enough alone rather than signing up as the guinea pig for an untested enhancement, but there are two circumstances where the incentives become much greater and the inhibitions lower.
First, we already see various forms of augmentation being tested on those who are injured, diseased, or disabled, where the purpose is not to enhance their performance, but to bring them back to normal. It's a matter of risk versus reward. If you have already lost a limb, or are suffering from a relentless degenerative disease, you have little to lose by trying a drug or an implant or gene therapy that promises to restore you to health. Yet if the technology succeeds at this goal, it will be natural for us to start asking how we can use it for enhancement above the normal.
Second, the main driver for the use of such enhancements will probably be its military application. It's a similar risk-reward calculation. For the soldier, an extra degree of strength, stamina, or alertness can be a matter of life and death. Combine that with the resources available to the US military, which will be willing to pay for the new augmentations early on, while they are still relatively expensive.
In addition to these two factors, we can also expect the technology to be brought into the mainstream by a small number of enthusiastic early adopters, like those Chinese parents or nootropic-obsessed Silicon Valley entrepreneurs. There will be people who are so excited by the possibility of enhanced function that they are willing—perhaps foolishly—to bear the risk and added expense.
Those are the reasons to think the advent of this technology is inevitable. But is it desirable?
For all of the concern that augmentation will change human nature, part of our nature has always been a desire to improve ourselves and to increase our power over the world. Humans have already been significantly "enhanced" by such technologies as language, culture, and medicine. Kernel's Bryan Johnson puts it this way:
A seemingly simple change 2.5 million years ago—using stone tools to butcher animals—led early hominids down the path to becoming modern humans.
From that modest starting point, throughout human history, we created tools that increased our individual and collective intelligence and became extensions of our natural selves. We started with crude functional tools such as hammers and axes. Over just a few tens of thousands of years, we progressed to more intelligent tools, such as thermostats, and governance technologies based on rule-of-law rather than despotism.
Or as Elon Musk put it, we are already practically cyborgs in the way that we have integrated technology and electronic computers into our lives. Certainly, the way we live now would be unrecognizable to a caveman or even a medieval peasant.
Moreover, the next stage of human development is likely to come slowly and gradually, over a period of many decades. Boosters of the new technology may predict that they're going to be downloading their minds into computers by the 2030s. If that ever does happen, it will almost certainly take much longer. This means we will have time to adjust, and in the meantime, we will enjoy so many other technological improvements and minor enhancements that what seems dangerous and unnatural today may not seem so radical by the time it actually becomes available.
If we have time to adjust, then we had better use it to begin recognizing the new technology as it arises and to prepare ourselves for the important decisions about whether and how to use it.