Wednesday, 12 April 2017

Evaluating Existential Threats, Part 3: Anthropogenic Threats

Lake Huron Sunset, at the entrance to Kincardine Harbour, April 12, 2017

In my last post in this series I referred to a Wikipedia article on "Global catastrophic risk" which I had stumbled upon while researching existential risks for this series of posts. It's a good article, and it gave me the idea of dividing those risks into non-anthropogenic and anthropogenic varieties and spreading them over two posts. It has a pretty thorough list of non-anthropogenic threats (which I covered in that post), and it lists most of the major anthropogenic threats that I want to discuss in this post. But it fails to tie those risks together in the way that I think is necessary if one is going to arrive at any sort of deep understanding of what is going on in the world today. I'll be addressing that in my next post, "Evaluating Existential Threats, Conclusions."

Artificial intelligence

There are two different threats in this area.

1) A Technological Singularity

The idea here is that we will develop general artificial intelligence that is capable of improving on itself, and that it will do so at an ever-increasing rate, quickly outstripping human intelligence and continuing to grow beyond what we can even imagine (thus the term "singularity"). Such an intelligence could constitute an existential threat if it doesn't have our best interests at heart, or if it is confused about what those interests may be.

Based on this, several notable people (Stephen Hawking, Bill Gates, Elon Musk and others) have expressed concern about the dangers of developing artificial intelligence. This is probably not a bad thing, in that it will likely make AI researchers more cautious.

But I see several problems with the idea that general AI is a credible threat, mainly to do with the limits that exist in the real world.

But first, is general artificial intelligence even possible? Since I am a monistic materialist (there is only one thing, the material world that we see around us) I see my own consciousness as a software process running on the "meatware" computer between my ears. So, in principle at least, I can see no reason why similar software can't be developed to run on manmade hardware.

In practice, though, it will be neither simple nor easy. "Singularitarians" point to the "law of accelerating returns", and it is true that technological progress has been accelerating for the last couple of centuries. They assume that progress will continue at an exponential rate. But they are disregarding the "law of diminishing returns", which we talked about in my posts on "The Collapse of Complex Societies." (1 2).

The advances in computer science and AI research are clearly a case of problem solving by adding complexity, and with that come increases in cost which will eventually slow the process down and bring it to a halt. Essentially, the singularitarians are looking at the upward leg of a logistic curve and mistaking it for an exponential curve, jumping to the conclusion that it will continue upward forever instead of leveling off. In the real world, spurts of exponential growth always run out of steam and level off.

(Left: exponential curve --- Right: logistic curve)
[Image credits: exponential curve graphic by Peter John Acklam (own work); logistic curve graphic by Qef (created with gnuplot, public domain).]
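For those who prefer numbers to pictures, here is a minimal sketch in Python of the two curves (the growth rate and carrying capacity are made-up illustrative values, not predictions about any real technology):

    import math

    def exponential(t, a=1.0, k=0.5):
        # Pure exponential growth: keeps compounding forever.
        return a * math.exp(k * t)

    def logistic(t, cap=100.0, a=1.0, k=0.5):
        # Logistic growth: tracks the exponential early on,
        # then levels off at the carrying capacity 'cap'.
        return cap / (1.0 + (cap / a - 1.0) * math.exp(-k * t))

    for t in range(0, 21, 4):
        print(f"t={t:2d}  exponential={exponential(t):10.1f}  logistic={logistic(t):6.1f}")

Early on the two curves are nearly indistinguishable; by t=20 the logistic curve has flattened out just below its ceiling of 100 while the exponential has passed 22,000. If you only ever see the early data, mistaking one for the other is easy, which is exactly my point.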

We already see this happening with "Moore's Law", a favourite phenomenon of the technological optimists. This is the observation that the number of transistors in a dense integrated circuit doubles approximately every two years. That doubling is already slowing down as transistors approach physical limits, making it seem unlikely that there will be a singularity based on growth in the power of computing hardware.
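As a back-of-the-envelope illustration of what that doubling implied (a sketch, starting from the Intel 4004's roughly 2,300 transistors in 1971):

    transistors = 2300  # Intel 4004, 1971 (approximately)
    year = 1971
    while year < 2017:
        year += 2        # Moore's Law: double roughly every two years
        transistors *= 2
    print(f"{year}: ~{transistors:,} transistors per chip predicted")

Twenty-three doublings take us to about 19 billion transistors by 2017, which is in the neighbourhood of the largest chips actually being made as I write this. But each further doubling is getting harder and more expensive to achieve, which is the law of diminishing returns at work again.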

It will be interesting to see where the growth in the capability of software tops out, but I would say there is no guarantee we will achieve artificial intelligence even equaling human intelligence. And that is without considering the problems I expect over the next few decades as decreasing surplus energy and the resulting economic contraction reduce the resources available for research. That contraction will happen very unevenly, though, and I have no doubt that a few institutions (universities, corporations) will hang on for quite some time and keep working on AI.

So let's assume for the moment that we do develop general artificial intelligence that is capable of improving on itself, and that it does outstrip human intelligence by a significant amount before it encounters limits and tops out. Before such an intelligence could do much harm it would have to have accurate knowledge of the physical world and some agency to act in that world (through robotics, presumably). But even then, we must remember that intelligence is not a magic solution to all problems; it will not automatically be able to develop technology that can do whatever it and/or we want. It will be subject to the same limits as we are, since those limits are not some human failing, but are actually built into the nature of reality.

The main idea of this blog is that the problems we face are caused by our encountering those limits and cannot be solved, by intelligence or anything else, but are simply conditions that we must adapt to. I really do believe this is true, so I expect that a hypothetical artificial intelligence of a sufficiently high degree would reach the same conclusion.

So, it seems to me that the risks here are small, and the supposed negative consequences of AI are by no means certain. Accordingly, a technological singularity is just not a big worry for me.

2) An Economic Singularity

The whole history of the industrial revolution has been one of industries improving their productivity by replacing human labour with powered machinery. Over the last few decades, as the surplus energy available to our society has decreased, businesses desperate for productivity gains to maintain their profitability have taken this process to new heights. Long before general AI becomes a problem, "narrow" artificial intelligence and robotics will have replaced most human workers. Many of us are quite worried about what will happen to the consumer economy when most consumers cannot find jobs. And, more importantly, what will happen to the ex-consumers themselves.

Others argue that this is the beginning of a world without scarcity: that we will start with a guaranteed minimum income and go on from there to create a workers' paradise. I don't think this is likely, for a number of reasons.

First of all, scarcity exists because we live on a finite planet and there really is a limited supply of the resources (energy and materials) needed to run our industrial civilization. Scarcity is not caused by high labour costs, and is unlikely to be cured by automation. We are experiencing a shortage of energy, not labour, and we would do better to adopt a policy of "rehumanization", replacing automation with people rather than the reverse.

In any case, those who are running the world's industries do not, for the most part, care in the least about their unemployed former workers. If you don't have a part to play in the BAU (business as usual) world, or at the bare minimum money in your pocket to spend as a consumer, then you had best just go away. Of course, with fewer industrial workers there will be fewer consumers and less of a market for the products of those industries. It would appear that the industrial/consumer economy will continue to contract, flying backwards in ever decreasing circles until it disappears up its own.... And as the ranks of the unemployed and homeless swell, they will have more and more reason to take things into their own hands and resort to violence against the system.

This is especially true in countries where lowering of taxes on corporations and the rich is seen as the primary way of stimulating the economy. It is true that lowering taxes can somewhat protect corporations and the rich from economic contraction in the short run. But it leaves the rest of the population even more exposed. And it leaves the government no alternative but to go further into debt to maintain its commitments.

Progressive taxation to fund infrastructure and social programs can, it is pretty clear, slow economic contraction and shield more of the population from the initial effects of collapse. But this too is a limited solution, as economic contraction continues and the tax base grows ever smaller.

The "economic singularity" is something that is already happening and is almost certain to get worse. It is definitely something to worry about.

Biotechnology

I see biotech (genetic engineering) as an area with a lot of room for advancement and a good deal of promise in helping mankind cope with the challenges ahead of us. So it pains me considerably to say that it is also a technology with a large potential for harm. I say this in particular because advances in technique are making genetic engineering accessible to people who lack proper training and who work in an unregulated environment. This increases both the chance of accidents and the opportunities for deliberate misuse.

I have no doubt we will see the biotech equivalent of improvised explosive devices used in terrorist attacks at some point in the not too distant future. These could take many forms but basically what we are talking about is an engineered pandemic, with the potential to spread rapidly and harm far more people than a conventional IED.

The risk is high and the potential for harm is great, so this is a threat that I would definitely worry about. It will be difficult to make preventative measures effective as techniques like CRISPR become widely available. About all we can do is beef up the organizations that detect, quarantine and develop vaccines for communicable diseases.

Human Sourced Pandemic

Another similar threat not covered in the Wikipedia article is a pandemic originating from within human society. In my post on non-anthropogenic threats, I touched briefly on the possibility of a pandemic arising from nature, but concluded I wouldn't be worrying about it because the risk was small. There is a significantly larger risk of a pandemic originating in the disease factories of hospitals, refugee camps and cities.

According to the evolutionary epidemiologist Paul W Ewald of the University of Louisville, the most dangerous infectious diseases are almost always not animal diseases freshly broken into the human species, but diseases adapted to humanity over time: smallpox, malaria, tuberculosis, leprosy, typhus, yellow fever, polio. In order to adapt to the human species, a germ needs to cycle among people —from person to person to person. In each iteration, the strains best adapted to transmission will be the ones that spread. So natural selection will push circulating strains towards more and more effective transmission, and therefore towards increasing adaptation to human hosts. This process necessarily takes place among people....

Looking at epidemics and pandemics through this evolutionary lens makes it clear that the most important condition necessary for the evolution of virulent, transmissible disease is the existence of a human disease factory. Without social conditions that allow the evolution of virulent, transmissible disease, deadly outbreaks are unlikely to emerge.

In the next few decades we are going to see more refugee camps and cities (and slums) growing ever larger. This is one to worry about.

Global warming (anthropogenic climate change)

There is simply no doubt left that human activities releasing carbon dioxide (and other greenhouse gases) into the atmosphere are causing global warming. There was about 0.6 degrees of warming during the twentieth century, and since the turn of the century that has grown to a total of almost 1 degree. By the end of this century a total of between 2.4 and 6.4 degrees of warming is expected, depending on the level of emissions between now and then. Of course, those numbers are averages over the whole planet. In fact, the warming is taking place unevenly, with more warming at high latitudes than near the equator.

This warming is causing many changes:

  • more extremely hot days, fewer extremely cold days
  • currently wet areas getting more and heavier rain (flooding)
  • currently dry areas getting less rain (drought)
  • intensification of tropical storms
  • less winter snow pack
  • retreating mountain glaciers
  • melting polar ice caps
  • warming oceans
  • sea level rise
  • ocean acidification

These changes are already having disruptive effects on our global civilization, which will only get worse as they intensify:

  • agriculture grows less productive as the reliable weather it depends on disappears; in some areas it becomes impractical to continue farming
  • health effects of heat waves and the spread of tropical diseases into formerly temperate areas
  • damage to homes, businesses and infrastructure due to increasingly heavy weather and rising sea level

There are those who argue that the effects of anthropogenic climate change will be much more severe:

  • rendering much, if not all, of the planet unfit for human habitation
  • with further heating, much of the planet might become unfit for life of any kind
  • runaway climate change could eventually transform Earth into another Venus

There is lots here to worry about, even if (like me) you take those last three points as unlikely; the rest is almost certain to happen. Definitely another one to worry about.

Ecological disaster

This would consist of failure of ecosystems and loss of the services they provide us due to trends such as overpopulation, economic development, and non-sustainable agriculture.

This is already underway, and it is definitely something to worry about.

Mineral resource exhaustion

From Wikipedia:
Romanian American economist Nicholas Georgescu-Roegen, a progenitor in economics and the paradigm founder of ecological economics, has argued that the carrying capacity of Earth — that is, Earth's capacity to sustain human populations and consumption levels — is bound to decrease sometime in the future as Earth's finite stock of mineral resources is presently being extracted and put to use; and consequently, that the world economy as a whole is heading towards an inevitable future collapse, leading to the demise of human civilization itself.

Because we use the easy-to-access part of a resource first (the low-hanging fruit), by the time we get to the not-so-low-hanging fruit, it is not just less plentiful but harder to access as well. Even though we have developed techniques for refining minerals from much lower grade ores, it takes more energy to access each unit of such resources, making them prohibitively expensive long before they run out in any absolute sense.
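A toy example of that arithmetic (the energy figure below is invented purely for illustration; the point is the shape of the relationship, not the numbers). If you have to dig up, grind and process roughly the same tonnage of rock no matter what is in it, then the energy cost per tonne of metal rises in inverse proportion to the ore grade:

    ENERGY_PER_TONNE_ORE = 50.0  # GJ to mine and process one tonne of ore
                                 # (an invented, illustrative figure)

    def energy_per_tonne_metal(grade):
        # 'grade' is the metal fraction by mass, e.g. 0.02 for a 2% ore.
        # Halve the grade and you double the energy per tonne of metal.
        return ENERGY_PER_TONNE_ORE / grade

    for grade in (0.02, 0.01, 0.005, 0.001):
        print(f"{grade:.1%} ore: {energy_per_tonne_metal(grade):8,.0f} GJ per tonne of metal")

Real mining and refining are more complicated than this, of course, but the inverse relationship is the heart of the matter: each step down the grade ladder multiplies the energy bill.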

The term "mineral resources" includes the fossil fuels that power our civilization, and for which there really isn't any practical substitute. Unfortunately, the authors of the Wikipedia article do not seem to be aware of the critical link between energy and the economy.

Resource depletion is a serious problem that is actually happening and will have even more severe consequences down the road. Definitely something to worry about.

Experimental technology accident

I find this pretty hard to take seriously. Those working with such technologies are very aware of the risks involved and go to great lengths to avoid them, since they themselves would be at ground zero if there were an accident. And measures are in place to reduce the seriousness of such accidents if they do take place. A small risk is thus rendered much smaller: insignificant, to my way of thinking, provided we continue to take the appropriate precautions.

Nanotechnology

I read Eric Drexler's Engines of Creation: The Coming Era of Nanotechnology shortly after it came out in 1986. The heart of the idea was that we would soon develop nano-scale computers and robots which, working one atom at a time, could accomplish fantastic things that could not otherwise be done at all. Presumably such powerful machines could be used to do great harm, or could do great harm if they got out of control.

Here we are over 30 years later and this is an area of technology that has not lived up to what its promoters saw as its promise, for good or ill. I'm not worrying about this one.

Warfare and mass destruction

Since the end of the Cold War, we have had a few decades of relative relief from the fear of nuclear war. But the arsenals still exist and it is beginning to seem that international relations are deteriorating. With climate change and resource depletion as stressors they may continue to do so.

This is another one to worry about. We need to do whatever we can to discourage politicians with itchy trigger fingers.

World population and agricultural crisis

From Wikipedia:

The 20th century saw a rapid increase in human population due to medical developments and massive increases in agricultural productivity such as the Green Revolution. Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%. The Green Revolution in agriculture helped food production to keep pace with worldwide population growth or actually enabled population growth. The energy for the Green Revolution was provided by fossil fuels in the form of fertilizers (natural gas), pesticides (oil), and hydrocarbon fueled irrigation. David Pimentel, professor of ecology and agriculture at Cornell University, and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place in their 1994 study Food, Land, Population and the U.S. Economy the maximum U.S. population for a sustainable economy at 200 million. To achieve a sustainable economy and avert disaster, the United States must reduce its population by at least one-third, and world population will have to be reduced by two-thirds, says the study.

The authors of this study believe that the mentioned agricultural crisis will begin to impact us after 2020, and will become critical after 2050. Geologist Dale Allen Pfeiffer claims that coming decades could see spiraling food prices without relief and massive starvation on a global level such as never experienced before. Wheat is humanity's 3rd most produced cereal. Extant fungal infections such as Ug99 (a kind of stem rust) can cause 100% crop losses in most modern varieties. Little or no treatment is possible and infection spreads on the wind. Should the world's large grain producing areas become infected then there would be a crisis in wheat availability leading to price spikes and shortages in other food products.

Definitely one to worry about.

My Analysis

I will have a great deal more to say about what this all means in my next few posts. But I think it is clear from what we've looked at so far that the universe is a relatively benign place. True, there are a couple of non-anthropogenic threats that are worthy of our attention, but the threats we have created ourselves are numerous, they are not hypothetical (they are already happening and certain to continue), and they have as much potential for harm as anything nature seems likely to throw at us, or more.
