Thursday, January 16, 2014

Existential Risk and our Ticking Clock



Risk management, and awareness of 'existential risk' in particular, is something that interests me, and it should interest everyone.

A decade ago, I read Martin Rees' [the Astronomer Royal] book 'Our Final Century' [re-titled 'Our Final Hour' in the US], along with Nick Bostrom's work on existential risk and his 'Simulation Argument', and Vernor Vinge's 'Singularity' thesis [put forward before the Millennium].

These theories and debates are interlinked, and my view of them is rather depressing. With thousands of scientists working around the world, and Moore's Law still holding [i.e. computing power doubling roughly every 18 months], technologies are converging, and some of this convergence will lead [and already has led] to major technological innovations. Some of these may produce a singularity, a major structural change in what we "see around us", which could usher in 'trans-humanism', or perhaps indicate that the fabric of reality is a simulation, with past, present and future co-existing in a flux. This raises another dilemma: do we have free will? If not, then all this theorising is irrelevant, as the future[s] are set as solidly as our past[s]. The plural is there because, if multiverse theories are valid, our reality is highly likely to be one of an infinite number.
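
To give a feel for that doubling arithmetic, here is a rough sketch in Python [the numbers are illustrative, and it assumes the 18-month doubling simply continues, which it may not]:

    # Back-of-the-envelope arithmetic for a constant 18-month doubling.
    DOUBLING_PERIOD_YEARS = 1.5  # assumed doubling period

    def growth_factor(years: float) -> float:
        """Relative capability after `years` of constant doubling."""
        return 2 ** (years / DOUBLING_PERIOD_YEARS)

    for years in (5, 10, 20, 50):
        print(f"{years} years -> ~{growth_factor(years):,.0f}x")
    # Roughly 10x in 5 years, 100x in a decade, 10,000x in 20 years.

Fifty years of that compounding is a factor of roughly ten billion, which is why convergence between fast-moving fields can produce structural change rather than merely incremental change.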

The major problem with this is that we may never reach that 'trans-human' stage, due to existential risks. These could be external [e.g. a major meteorite collision with Earth], accidental [e.g. a major viral outbreak], or intentional [e.g. a nuclear war]. Thus scientific convergence will fuel the path to either [a] trans-humanism or [b] our destruction.

The 'simulation argument' is also related. If we are living in a simulation, it may well be positioned and programmed to observe either [a] or [b]. So even inside a simulation, there is no getting away from existential risk, or from a technological singularity that will change us radically.
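
Bostrom's argument [the paper is linked below] turns on one simple fraction; here is a minimal sketch of it, with purely illustrative input numbers:

    # Simplified core ratio of Bostrom's simulation argument:
    # if a fraction f_p of civilisations reach a post-human stage, and each
    # of those runs n ancestor-simulations on average, the fraction of all
    # human-like observers who are simulated is:
    def simulated_fraction(f_p: float, n: float) -> float:
        return (f_p * n) / (f_p * n + 1)

    # Illustrative [assumed] inputs: even a tiny f_p, multiplied by a large
    # number of simulations, drives the fraction towards 1.
    print(simulated_fraction(0.001, 1_000_000))  # ~0.999

Hence the trilemma: either almost no civilisations reach that stage, or almost none of them run such simulations, or we are almost certainly living in one.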

One such change was the discovery and exploitation of oil and gas at the turn of the 20th century - i.e. hydrocarbons as an energy source [and as a feedstock for pharmaceuticals, food, plastics and more]. So I consider the time period we are traversing to be a fascinating one, but also one loaded with existential risks, many fueled by the convergence of technologies.

If you are old enough, you will be aware of the 'Doomsday Clock' and its implications. Let's hope that, as humanity, we understand how we got to this position and do not destroy ourselves, intentionally or inadvertently. And if we do enter a trans-humanist phase, let it be for the betterment of us all.

It would be a monumental shame if all the development of humankind [everything that has brought us to this point in our history] were for nought, because we could not grasp what is at risk - a risk that is far from being purely existential.

Here are some links if you are interested in these ideas and the questions they pose for humankind today -

THE LATEST NEWS ON THE DOOMSDAY CLOCK'S POSITION
http://www.cbsnews.com/news/doomsday-clock-set-at-5-to-midnight/

THE CONCEPT OF EXISTENTIAL RISK
http://en.wikipedia.org/wiki/Existential_risk

TECHNOLOGICAL SINGULARITY
http://en.wikipedia.org/wiki/Technological_singularity

SIMULATION ARGUMENT 
http://www.simulation-argument.com/

