The vulnerable world hypothesis[1] or the "black ball" hypothesis[2] refers to the idea that civilizations are likely to be destroyed by some disruptive technology (a black ball) unless extraordinary measures are taken to prevent this from happening. The philosopher Nick Bostrom introduced the hypothesis in a 2019 publication in the journal Global Policy[3][1] and discussed it further in a 2022 essay published in Aeon, co-authored with Matthew van der Merwe.[4] The hypothesis is cited in discussions about the safety of advanced technologies.[5][6]
Bostrom illustrated the hypothesis with an urn analogy. He likened the process of technological invention to drawing balls from an urn, where the color of each ball represents its impact. White balls are beneficial and make up most of the balls drawn from the urn. Some balls are gray, representing technologies with mixed or moderate effects. Black balls represent hypothetical technologies that by default destroy the civilization that invents them. According to Bostrom, it is largely due to luck, rather than carefulness or wisdom, that humanity has not yet drawn a black ball.[5]
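The urn analogy can be read as a simple stochastic model in which each invention is an independent draw with a small chance of being a black ball. The following Python sketch is purely illustrative and is not part of Bostrom's argument; the per-draw probabilities are arbitrary assumptions chosen for demonstration, not values given in the source.

```python
import random

# Illustrative probabilities for each ball color (arbitrary assumptions,
# not figures from Bostrom's paper).
BALL_PROBABILITIES = {"white": 0.90, "gray": 0.099, "black": 0.001}


def draw_inventions(num_draws: int, rng: random.Random) -> bool:
    """Simulate num_draws inventions; return True if at least one black ball is drawn."""
    colors = list(BALL_PROBABILITIES)
    weights = list(BALL_PROBABILITIES.values())
    return "black" in rng.choices(colors, weights=weights, k=num_draws)


def estimate_black_ball_chance(num_draws: int, trials: int = 10_000, seed: int = 0) -> float:
    """Monte Carlo estimate of the probability that a black ball appears."""
    rng = random.Random(seed)
    return sum(draw_inventions(num_draws, rng) for _ in range(trials)) / trials


if __name__ == "__main__":
    # Even with a small per-invention risk, the chance of eventually drawing
    # a black ball grows toward certainty as the number of inventions increases.
    for n in (100, 1_000, 10_000):
        print(n, estimate_black_ball_chance(n))
```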
Bostrom defined the vulnerable world hypothesis as the possibility that "If technological development continues then a set of capabilities will at some point be attained that make the devastation of civilization extremely likely, unless civilization sufficiently exits the semi-anarchic default condition",[3] except in some specific cases.[a] The "semi-anarchic default condition" refers here to a world with a limited capacity for preventive policing, a limited capacity for global governance, and actors with diverse motivations.[3][7]
To exemplify the vulnerabilities, Bostrom proposed a classification system, gave examples of how technology could have gone wrong, and offered policy recommendations such as differential technological development.[5][3] If a technology entailing such a vulnerability is developed, the measures thought necessary for survival (effective global governance or preventive policing, depending on the type of vulnerability) are themselves controversial.[5][6][8] The classification includes type-1 vulnerabilities, in which a technology makes mass destruction accessible to individuals or small groups; type-2a vulnerabilities, in which a technology gives powerful actors the ability and incentive to cause devastation, for example through a first strike; type-2b vulnerabilities, in which a technology incentivizes many actors to cause individually small but collectively devastating harm; and type-0 vulnerabilities, in which a technology carries a hidden risk of devastating civilization by default, for example through accident.[3][1]
A proposed hypothetical example of such a vulnerability is the possibility that nuclear bombs could have ignited the atmosphere. Atmospheric ignition was predicted not to occur for the Trinity nuclear test in a report commissioned by Robert Oppenheimer, but the report's conclusion has been deemed shaky given the potential consequences: "One may conclude that the arguments of this paper make it unreasonable to expect that the N + N reaction could propagate. An unlimited propagation is even less likely. However, the complexity of the argument and the absence of satisfactory experimental foundation makes further work on the subject highly desirable."[5]
The "easy nukes" thought experiment proposed by Bostrom opens the question of what would have happened if nuclear chain reactions had been easier to produce, for example by "sending an electric current through a metal object placed between two sheets of glass."[5]
According to Bostrom, pausing technological progress may be neither possible nor desirable. An alternative is to prioritize technologies that are expected to have a positive impact and to delay those that may be catastrophic, a principle called differential technological development.[5]
The potential solutions vary depending on the type of vulnerability. Dealing with type-2 vulnerabilities may require highly effective governance and international cooperation. For type-1 vulnerabilities, if mass destruction ever becomes accessible to individuals, at least some small fraction of the population could be expected to use it.[5] In extreme cases, mass surveillance might be required to avoid the destruction of civilization, a controversial prospect that has received significant media coverage.[9][10][11][12][13]
Technologies that have been proposed as potential vulnerabilities include advanced artificial intelligence, nanotechnology, and synthetic biology (which may give the ability to easily create enhanced pandemics).[14][2][15][16]