Alvin Toffler, Heidi Toffler
More Technology, Not Less
ALVIN AND HEIDI TOFFLER are authors of such influential books as Future Shock, The Third Wave, Powershift and Creating a New Civilization. They are, respectively, chairman and vice chairman of Toffler Associates, advisors to businesses and governments.
Los Angeles — One of America’s leading technologists has caused a great stir by provocatively admitting that he sees “some merit” in the Luddite reasoning of Theodore Kaczynski, the Unabomber, who once terrorized scientists whose inventions he thought were enslaving the human race.
Bill Joy, chief scientist at Sun Microsystems and co-chairman of the presidential commission on the future of information technology research, has gone so far as to call on the scientific community to “relinquish” research that might lead to the domination of the human species by the “destructive self-replication” of technologies made possible by genetics, nano-science and robotics.
According to Joy’s essay in the April issue of Wired magazine, titled “Why the Future Doesn’t Need Us,” the alarm bell went off when it recently became clear, thanks to new advances in molecular electronics, that computer processing power would match that of the human brain by the year 2030. Theoretically, that might make possible robots as smart as humans — and, as processing capacities develop further, smarter; indeed, smart enough to reproduce themselves.
Joy is not a Luddite; he is a serious man raising a responsible warning, and he certainly knows a lot about computers. He has spent 25 years working on computer networking, where, as he himself writes, “the sending and receiving of messages creates the opportunity for out-of-control replication.” All of us are now familiar with the damage that computer viruses can wreak.
But he too easily accepts the Unabomber’s argument that the human race “might easily permit itself to drift into a position of such dependence on machines that it would have no practical choice but to accept all of the machines’ decisions.” Or else that the need to control such processes would lead to a takeover of society by an elite that “domesticates” the masses like animals.
This reasoning is far too either-or. These are not the only alternatives. The scenarios that both Kaczynski and Joy present are essentially mechanistic and mono-causal. They ignore the rich complexity of the physical and social environment, which is filled with thousands, if not millions, of negative feedback loops that, in fact, damp down most runaway processes before they reach their ultimate limits. Moreover, both scenarios assume that computer capacity grows while the human brain remains static.
But must it? The very technologies they regard as most dangerous—robotics, genetics and nanotech—may very well help us expand the human brain’s capabilities and make it possible for us to use those technologies in completely new ways. Recent advances in stem cell research, for example, challenge the assumption that the brain’s capacity is fixed.
Like Kaczynski, Joy underestimates the ability of humans to mess things up, to rebel, to create counter-technologies, whether by chance or by design, and to step back from the brink of disaster.
Speaking of runaway processes: one of the most amazing facts about the 20th century is not the invention of the atomic bomb, but the fact that after its use at Hiroshima and Nagasaki, the human race managed for more than half a century never to use one in war again. That does not mean we won’t, and the dangers of proliferation are real. Nonetheless, the record so far has been remarkable. We managed to chain the chain reaction.
We, too, worry about some of the effects of technology, whether self-replicating or not, and raised warning flags long ago. In Future Shock (1970), we forecast cloning of mammals and, eventually, humans; we warned about the misuse of genetic engineering (even the possibility of race-selective genetic weapons); we discussed the danger of eugenic manipulation and a “biological Hiroshima.” In War and Anti-War (1994), we wrote about replication in the form of “self-reproducing war machines.”
Unfortunately, Joy’s proposed remedy is potentially more frightening than the disease. He recommends that we not only “relinquish” certain technologies (stuffing genies back into bottles, as it were), but that we limit the search for certain kinds of knowledge. Of course, the search for knowledge is always limited — by funds, by cultural blind spots, by political and other forces. Yet within this reality, the ethos of science has always included a belief in free, unhindered curiosity and research. Joy’s proposal strikes at the heart of that ethos, which has, for the last three centuries, deepened our knowledge of the universe we live in and made possible not merely atomic bombs and industrial pollution, but longer life spans, a reduction of pain and hunger, and the blessings of what limited democracy we have.
If, in fact, we set out to limit the range of scientific curiosity, the obvious question is: Who decides? The Ayatollah Ali Khamenei? The Chinese Communist Party (which regards even run-of-the-mill farm statistics as state secrets)? Saddam Hussein? The limitations will be imposed by those in power, and they may do exactly the reverse of what Joy proposes.
More fundamentally, trying to restrain the search for knowledge to stem the “destructive self-replication of technology” is an attempt to limit the self-replication of knowledge itself. That is no more possible for an American computer scientist than for a Chinese apparatchik. Fortunately, you just can’t turn off 6 billion brains. If you could, you would send the entire human race time-traveling back into the 12th century.
The solution? There isn’t an easy one. But the answer is probably more technology, not less. We will need new technologies that shut down systems on their way out of control. We already have such technologies in everything from jet planes to home space heaters. Pharmaceutical companies are on their way to discovering precisely how to use the proteins encoded in our DNA for such purposes.
Joy’s fear of “destructive self-replication” is warranted. His clarion call for responsible discussion of this issue is well taken. But his extreme pessimism is not. Cancer is an example of a runaway, self-replicating process. Yet do any of us still believe there is no cure for cancer?
Along with worrying about self-replication, it is worth thinking about the following lines from Future Shock:
“The incipient worldwide movement for control of technology, however, must not be permitted to fall into the hands of irresponsible technophobes, nihilists and Rousseauian romantics.... Reckless attempts to halt technology will produce results quite as destructive as reckless attempts to advance it.”
Kaczynski would never agree with that. But we’ll bet Bill Joy does.