John Meakin, Managing Editor of the International Journal of Algebra and Computation, and contributor Peter Roberts spoke to Sir Martin Rees about his book Our Final Century, which reviews the main threats facing humanity.
V) Why did you write this book?
MR) For many years I’ve been giving talks to students and other groups on scientific responsibility, on arms control, on the environment. I gathered notes from lectures and articles and tried to synthesize them into a publishable book. Obviously, I’m not an expert in many of the sciences relevant to these threats, and that’s why I tried, when I could, to quote people who were. When I write a book like this, I am writing as a generally concerned scientist, not specifically as an astronomer.
V) Are you predicting the end of the world?
MR) No, in my book I say that I think there is a 50 percent chance that in the coming century civilization will suffer a setback as bad as a catastrophic nuclear war. New science and new technology may open up risks that are at least as threatening and in some cases even more intractable. The future of science is very hard to predict. All we can say is that the advance of science in this century will be at least as dramatic as in the last century and, indeed, that the advances will be on a broader front.
Specifically, I am concerned that, for the very first time, human beings and human character may be changed by science. Over the last several thousand years almost everything in our world has changed except human nature and human beings themselves. But even that will be malleable in the next century. For all these reasons, I believe we have entered a century that is going to confront us with unprecedented ethical challenges, risks and opportunities. These are all things that a wide public has to be aware of. Although scientists may be responsible for these changes, it is not just scientists who will need to confront them, and certainly not scientists alone who must address the ethical issues.
V) Odds of survival of around 50-50 are rather horrendous. Would many scientists agree with your estimate?
MR) I mean odds of survival of civilization. I think it is unlikely that everyone is going to be wiped out. But you’d have to be optimistic to rate something that’s as bad as a catastrophic nuclear war as unlikely. I would say those who think I’m ultra-pessimistic are themselves in denial about the growing threat that is going to come from the greater empowerment of individuals and small groups by new technology.
V) Behind the threats that you talk about is the fear that someone somewhere, for ideological, political or other reasons, may decide to unleash mass destruction. What does that say about the stability of human nature, and why do we have this dark propensity?
MR) A small fraction of people have had those propensities. But the amount of damage that an individual could do in the past has been limited. What concerns me is that we may reach a stage when a single individual can unleash a real catastrophe. That means that the tiny fraction of individuals with those propensities will be harder for us to cope with in the future than in the past. A single individual empowered by new technology can have a far bigger effect because not only is there greater empowerment, but society is more interconnected and more vulnerable. And the media can amplify any local catastrophe so it affects the mindset of a whole nation or even the whole world.
V) And you think science can play a role in changing human nature?
MR) All I emphasize in my book is that new kinds of targeted drugs can have stronger effects on human character than traditional drugs have done. There’s even a possibility of changing human beings over a few generations genetically. And people talk about electronic implants in our brains so we can improve some aspects of mental capacity. All these things are at the moment science fiction but may not be 50 years from now. So that lends an extra uncertainty to all our predictions.
V) One of the themes of your book is the nuclear threat. There was a huge sigh of relief when the Cold War ended; a lot of people felt that the nuclear threat ended with it.
MR) Well, it’s alleviated temporarily in that the risk of a catastrophic nuclear exchange, involving letting off tens of thousands of nuclear weapons, has now diminished. But there is a greater risk than before of a few nuclear weapons going off in some regional conflict. If we look a hundred years ahead, there could be political realignments that lead to a standoff as serious as we had during the Cold War.
We can’t assume that the present disposition of international power, with one superpower, will remain for more than a few decades. As long as we have this vast arsenal of nuclear weapons and the knowledge to make them, we will always live under this threat. On a shorter timescale, I think we should be concerned about the proliferation of nuclear weapons to more than the 10 countries that now have them.
V) It is widely predicted that, in the years to come, the EU and China will rise to superpower status, challenging the supremacy of the United States. Some say superpower rivalry is probably the preeminent threat in the years to come.
MR) Well, I think it is likely. One of the possibilities is some drastic realignment, with a multipolar world, and those superpowers would have nuclear technology and perhaps other technology that could be even more destabilizing if misused.
V) Underlying many of these questions, as you say in your book, we have an unrestrained science that is moving ahead at an ever faster rate and turning out ever more exciting and wonderful things, but that also has the terrible potential for what you call the downside. Is science out of control, and if so, what can be done to contain it?
MR) I think the dilemma is twofold. First, when scientific research is carried out, the scientists themselves can’t predict how it will be applied. The same discovery has both benign and harmful applications. So we can’t avoid the risks without denying ourselves the benefits, which we don’t want to do. Second, even if we do feel that certain applications of science are unethical or dangerous and we try to regulate against them, enforcing such regulations worldwide is going to be very hard.
V) Doesn’t this logically point to the need for individual moral restraint and a sense of personal morality?
MR) Yes, certainly, and also, obviously, we must hope that society will evolve so that the number of people who are disaffected diminishes. I think scientists themselves can’t make these decisions alone, but they have an obligation to engage in dialogue with the wider public to make people aware of what the opportunities are, what the risks are, and how best to achieve the benefits and reduce the risks.
V) We are at a point in history where development is speeding on at a fantastic pace, and yet we are talking about whether we will survive or not. That’s quite a paradox, isn’t it?
MR) Well, those two things are connected, because it is the rapid advances of science and technology that have allowed human beings to actually change the global environment, and to have such a tremendous multiplier effect that they can affect whole societies. So it’s these rapid advances that have made us vulnerable and also have opened up new possibilities of changing human nature, controlling the way the future evolves, and even spreading beyond the earth.
What happens this century is crucial, even viewed on a time span of billions of years. It is the very first century in which human beings can change their planet. In fact, in the book I offer the image of an alien civilization watching the earth. They would have seen very slow changes over several billion years and then this sudden spasm of activity - in one hundredth of one millionth of the age of the earth - and they would wonder what is going to happen: will we survive this spasm or not?
V) Can we change human attitudes and the way we think quickly enough to survive? Are you more negative or more positive about the outcome?
MR) Well, as my 50-50 odds indicate, I’m very open-minded about whether the situation will improve in the next hundred years or whether it will lead to some kind of catastrophe. Obviously, we can foresee a very benign future if we live in a more sustainable way. And there are reasons for optimism, too, because the most rapidly developing technologies are miniaturization and information technology. Those are both benign because they allow an improvement in lifestyle without greater demands on raw materials.
So there’s a possibility that we can have an improvement in the quality of life, even in the developing world, without consuming more resources. We can have a more energy-efficient style of life for everyone using new technologies. One can imagine a future that is sustainable and is entirely benign. But the question is whether we can achieve that and whether it is stable.
V) What action would you like to see that would improve humanity’s odds of survival and tip the balance in the right direction?
MR) I believe that greater awareness of the opportunities and the risks is a first step toward focusing on how to optimize things, and I would hope that there will be a concern to minimize the risks. I would hope also that there would be some pressure to redirect technology so that the focus is on environmentally benign aspects, and that we’ll try to underplay those technologies that are most threatening.
V) There’s a fascinating quote in your book from someone who you said summed up your own thinking: “I don’t feel humbled by the vastness of the heavens, but by human qualities of thinking and love.” One could make a strong argument that what the world needs a lot more of, even more than tremendous scientific discoveries, is love. Do you see this as the quality that is needed for human survival?
MR) Certainly I think many of the problems confronting the world now require a better application of existing knowledge rather than new knowledge. New knowledge can sometimes help to alleviate the problems raised by science in the past, but obviously the most glaring injustices in the world - for example, the maldistribution of globalization’s benefits and of health care - don’t require new science. They just require a change in national attitudes so as to allow better use of existing knowledge, a different set of priorities, and so on. Many of the problems require, as you say, a change in attitude and not more knowledge.