Global Catastrophic Risks, edited by Nick Bostrom and Milan Ćirković

Rating: 7/10


High-Level Thoughts

An interesting though very dense and heavy book. If you want to get depressed about the possibility of human extinction, well, here you go!

Summary Notes

A “global catastrophic risk” is one that could inflict serious damage to human well-being on a global scale; the most extreme of these, “existential risks,” threaten the very existence of mankind.

“Current estimates indicate that our biosphere will be essentially sterilized in about 3.5 billion years, so this future time marks the end of life on Earth. The end of complex life may come sooner, in 0.9-1.5 billion years owing to the runaway greenhouse effect (e.g., Caldeira and Kasting, 1992).”

“Burton et al. (1978) report that when dams and levees are built, they reduce the frequency of floods, and thus apparently create a false sense of security, leading to reduced precautions. While building dams decreases the frequency of floods, damage per flood is so much greater afterwards that the average yearly damage increases.”

“More generally, people tend to overestimate conjunctive probabilities and underestimate disjunctive probabilities (Tversky and Kahneman, 1974). That is, people tend to overestimate the probability that, for example, seven events of 90% probability will all occur. Conversely, people tend to underestimate the probability that at least one of seven events of 10% probability will occur.”
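The asymmetry is easy to verify. A quick back-of-the-envelope check (my own, assuming the seven events are independent) shows that both intuitions land on the wrong side of 50%:

```python
# Conjunctive vs. disjunctive probabilities for seven independent events.
# (My own illustration of the Tversky & Kahneman point, not from the book.)

n = 7

# All seven events, each with probability 0.9, occur:
p_all = 0.9 ** n               # ~0.478 -- less likely than a coin flip

# At least one of seven events, each with probability 0.1, occurs.
# P(none occurs) = 0.9 ** 7, so:
p_at_least_one = 1 - 0.9 ** n  # ~0.522 -- more likely than a coin flip

print(f"P(all seven 90% events occur)    = {p_all:.3f}")
print(f"P(at least one 10% event occurs) = {p_at_least_one:.3f}")
```

Seven near-certainties compound to worse than a coin flip, while seven long shots together come out better than one.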

“If human beings did not age, so that 100-year-olds had the same death rate as 15-year-olds, we would not be immortal. We would last only until the probabilities caught up with us. To live even a million years, as an unaging human in a world as risky as our own, you must somehow drive your annual probability of accident down to nearly zero. You may not drive; you may not fly; you may not walk across the street even after looking both ways, for it is still too great a risk. Even if you abandoned all thoughts of fun, gave up living to preserve your life, you could not navigate a million-year obstacle course. It would be, not physically impossible, but cognitively impossible.”
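The arithmetic behind this is stark. Here is a minimal sketch (mine, not the book's, assuming a constant and independent annual accident risk) of how even tiny risks compound over a million years:

```python
# Probability of an unaging human surviving one million years, for a few
# assumed constant annual accident rates. (My illustration, not the book's.)

def survival_probability(annual_risk: float, years: int = 1_000_000) -> float:
    """P(no fatal accident over `years` years) = (1 - annual_risk) ** years."""
    return (1 - annual_risk) ** years

for annual_risk in (5e-4, 1e-6, 1e-8):
    p = survival_probability(annual_risk)
    print(f"annual risk {annual_risk:.0e} -> P(survive 1M years) = {p:.3g}")

# annual risk 5e-04 -> ~7e-218 (on the order of a 15-year-old's death rate
#                               today: effectively certain doom)
# annual risk 1e-06 -> ~0.368  (one-in-a-million yearly risk: a coin flip
#                               at best)
# annual risk 1e-08 -> ~0.990  (only here does survival become likely)
```

Even a one-in-a-million annual risk leaves only about a 37% chance of lasting the full million years, which is why the quote insists the risk must be driven to nearly zero.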

“Instead, one must accept that a very small human population would mostly have to retrace the growth path of our human ancestors; one hundred people cannot support an industrial society today, and perhaps not even a farming society. They might have to start with hunting and gathering, until they could reach a scale where simple farming was feasible. And only when their farming population was large and dense enough could they consider returning to industry. So it might make sense to stock a refuge with real hunter-gatherers and subsistence farmers, together with the tools they find useful. Of course such people would need to be disciplined enough to wait peacefully in the refuge until the time to emerge was right. Perhaps such people could be rotated periodically from a well-protected region where they practiced simple lifestyles, so they could keep their skills fresh. And perhaps we should test our refuge concepts, isolating real people near them for long periods to see how well particular sorts of refuges actually perform at returning their inhabitants to a simple sustainable lifestyle.”

“From this perspective, the most dangerous political development to avoid is world government. A world totalitarian government could permanently ignore the trade-off between stability and openness.”

“If parents had complete control over their babies’ genetic makeup, the end result would be a healthier, smarter, better-looking version of the diverse world of today. If governments had complete control over babies’ genetic makeup, the end result could easily be a population docile and conformist enough to make totalitarianism stable.”

Enjoyed this? Be sure to subscribe!