The idea in a nutshell: Any rational person would invest the ‘right’ proportion of time and money in mitigating the risks they face. Unfortunately, I don’t know anyone with a rational view of the biggest threats around. Researchers at Oxford University put the chances of us being wiped out by an ‘existential threat’ at 19%. What’s your plan to deal with the things we can influence, if the worst does occur?
Surviving the Zombie Apocalypse
A Zombie Apocalypse is what happens when the world ends because of an outbreak of zombies. I love zombie films and The Walking Dead, so I consider that I know as much about the subject as most people. Zombies are unlikely to occur in real life, however.
When you look into the threats which could pose a real-life Armageddon, you rapidly find that there are a lot of things which could go wrong. Many of them would end the world in the sense of killing every human on the planet. On Wikipedia, these possibilities are described as Global Catastrophic Risks.
From whom can we learn about this subject?
I stumbled across this subject reading literature by a man who has since become a bit of a personal hero, Nick Bostrom. Nick has written extensively on existential threats (as well as other subjects I am interested in) and his work makes for enlightening reading.
Historically, Bostrom argues, the cataclysmic events which hit the human race were relatively benign. Our species has learned from pandemics and wars through a reasonably effective process of trial and error. Now, he says, things are different. We face risks (born of technology and terrorism, amongst other things) which could wipe out not just sections of the population but human life as a whole. He goes on to point out the waste: these new risks could destroy humanity not just now but for the whole of the future. Such ‘existential’ risk, he argues, deprives all future generations of the good they would have experienced and the value they would have created.
Cambridge has an equivalent to Mr Bostrom’s group: the ‘Centre for the Study of Existential Risk’.
We have already nearly been wiped out – recently
The Cambridge guys express similar ideas in a more accessible way than Bostrom (to me, anyway). They give the example of researchers in Wisconsin who modified a virus to more closely match the 1918 flu, which killed 50 million people. In so doing, they show that existential risk is real, practical, immediate and could have a devastating effect.
The chance of it happening might be higher than you think
I love the percentages Bostrom and his counterparts bandy about. He thinks the risk from these threats within a lifetime is 25%. He is part of a group of analysts at Oxford who put the threat closer to 19% (source: Wikipedia). As he rightly points out, even if it’s just 1% we should be doing a lot about it – and we aren’t.
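The point about even a 1% risk can be made concrete with some back-of-envelope arithmetic. A minimal sketch, using an assumed world population of eight billion purely for illustration (and setting aside Bostrom’s further argument about all future generations, which makes the numbers far larger):

```python
# Illustrative expected-value arithmetic. The population figure is an
# assumption for this sketch, not a number from Bostrom or Oxford.
world_population = 8_000_000_000

# The 1%, 19% and 25% lifetime risk figures mentioned above.
for risk in (0.01, 0.19, 0.25):
    expected_deaths = risk * world_population
    print(f"{risk:.0%} lifetime risk -> {expected_deaths:,.0f} expected deaths")
```

Even at the lowest figure, the expected loss runs to tens of millions of lives, which is the sense in which a ‘mere’ 1% still justifies serious investment.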
Existential threat reduction is a Public Good
Bostrom’s arguments are so logical in the face of such emotional topics that I wonder if he is sometimes saying these things tongue in cheek. He suggests, for example, that investments in avoiding global existential threats are undersupplied because they are public goods.
(Public goods, if my years studying Economics are reliable, are undersupplied by the private sector because it’s not possible to exclude people who don’t pay from using them. Classic examples include street lighting and national defence investments like the army.)
Existential Risks you face which you can do nothing about
I split the long list of existential threats we face into two groups. The first are those about which we can do nothing. That includes:
- Comets crashing into the Earth.
- A nearby star exploding, its radiation travelling at the speed of light so that by the time we see it, it has already hit us.
- The concept that we’re living in a simulation and someone turns the computer off.
- Being killed by aliens.
- Being blown up by physicists trying an experiment we know nothing about – like those using particle accelerators to find the Higgs boson or similar.
This stuff is good for Science Fiction films but, since you can’t do anything about it, probably not worth worrying about.
Risks we can do something about at a national level
It wouldn’t be easy for me to negotiate agreements of the following sort personally.
- As a species, we can come to agreements on the atomic bomb and mitigate its threats.
- Similarly, governments can manage threats from terrorism in the form of nanotech which might turn us all into grey goo, or from other technologies such as genetic engineering.
- In separate work, Bostrom has also presented some ideas on what we can do about badly programmed superintelligence.
Most important: Risks we can personally do something about
My concern, as a married man who wants to live a long time, is what I can do to improve my chances of surviving an incident. I want to focus on the things we can do something about. Examples might include a pandemic, or a terrorist act which knocks out services like the army and the police.
Having spent a lot of time thinking about it, I feel that a sailing boat well stocked with food and water is about the best place to be in any of these situations. The obvious exception is a tsunami.
I love the stories, not the thing
I have always felt like we are close to the end of the world. I loved books about it when I was a kid – maybe because, as an introvert, I liked the idea of a lifetime of silence. I have thought long and hard about these things, partly because it’s amusing and interesting.
I wonder if the reason these risks are so threatening is that they have fallen through the cracks in our mental models: we have a normalcy bias – just because things have always been pretty normal, we think they will continue to be so. Risks like these are SEPs (‘Somebody Else’s Problems’) – a phrase popularised by one of my favourite authors, Douglas Adams. But we each have a personal responsibility to take the necessary steps to give ourselves a chance in the event that something does happen. And that’s at least one of the reasons I have a boat.
I really do feel like we are hanging by a thread in an increasingly intricate world. Some suggest that a disruption to transport (think truck deliveries into cities) might halt key industries, starting with food and healthcare, very quickly – within 72 hours.
Talk of a ‘Zombie Apocalypse’ (in the sense of an event which might end human life on the planet) sounds impossible, but it’s not. And numbers like 19% or 25% are non-trivial.
Even those who accept the threat but quantify it at lower levels can agree there is a non-zero risk. And there are things you can do, so it makes sense to do something. You’ll be kicking yourself if you don’t have a boat and they come to eat your brain.