Vulnerable world hypothesis


The vulnerable world hypothesis is the idea that there may be some technologies that, once discovered, destroy the civilization that invents them by default (except under certain conditions[note 1]). The concept was introduced by the philosopher Nick Bostrom.[1][2] He also proposes a classification of such vulnerabilities, counterfactual examples of how past technologies could have gone wrong, and policy recommendations such as differential technological development.[3][1] The measures supposedly needed to survive the development of such a technology (effective global governance or preventive policing, depending on the type of vulnerability) are controversial.[3][4][5]

The urn analogy

According to Bostrom:[3]

One way of looking at human creativity is as a process of pulling balls out of a giant urn. The balls represent possible ideas, discoveries, technological inventions. Over the course of history, we have extracted a great many balls – mostly white (beneficial) but also various shades of gray (moderately harmful ones and mixed blessings). The cumulative effect on the human condition has so far been overwhelmingly positive [...]. What we haven’t extracted, so far, is a black ball: a technology that invariably or by default destroys the civilization that invents it. The reason is not that we have been particularly careful or wise in our technology policy. We have just been lucky.

— Nick Bostrom, The Vulnerable World Hypothesis
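The analogy admits a simple probabilistic reading; the following formalization is an illustrative sketch and does not appear in Bostrom's paper. If each ball drawn from the urn independently has some fixed probability p > 0 of being black, then the chance that civilization survives n draws without extracting a black ball is

\[ \Pr[\text{no black ball in } n \text{ draws}] = (1 - p)^n \longrightarrow 0 \quad \text{as } n \to \infty. \]

On this reading, "we have just been lucky" means that the number of draws so far, or the share of black balls in the urn, has been small enough to keep (1 − p)^n close to one; if extraction continues, drawing a black ball eventually becomes almost certain unless draws can be stopped or their effects undone.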

Definitions

Bostrom defines the vulnerable world hypothesis as the possibility that “If technological development continues then a set of capabilities will at some point be attained that make the devastation of civilization extremely likely, unless civilization sufficiently exits the semi-anarchic default condition.”[1]

The "semi-anarchic default condition" refers here to having:[1]

  1. Limited capacity for preventive policing.
  2. Limited capacity for global governance.
  3. Diverse motivations.[note 2]

Types of vulnerability

Bostrom distinguishes several types of vulnerability:[1]

  • Type 0: a technology carries a hidden risk and inadvertently devastates civilization.
  • Type 1: a technology gives small groups of people the ability to cause mass destruction.
  • Type 2: a technology has the potential to devastate civilization, and powerful actors are incentivized to use it, whether because using it first seems to confer an advantage or because of a tragedy-of-the-commons dynamic.

Counterfactual examples with nuclear bombs

Atmospheric ignition

If nuclear bombs had been able to ignite the atmosphere, that would have been an example of a type-0 vulnerability. Such ignition was predicted not to occur for the Trinity nuclear test in a report commissioned by Robert Oppenheimer, but the report has been deemed shaky given the potential consequences: “One may conclude that the arguments of this paper make it unreasonable to expect that the N + N reaction could propagate. An unlimited propagation is even less likely. However, the complexity of the argument and the absence of satisfactory experimental foundation makes further work on the subject highly desirable.”[3]

Easy nukes

Fortunately, producing nuclear bombs is difficult, costly, and slow; even today, only a few states can afford them, owing to the difficulty of producing enriched uranium. The "easy nukes" thought experiment proposed by Nick Bostrom asks what would have happened if nuclear chain reactions had been easier to produce, for example by "sending an electric current through a metal object placed between two sheets of glass."[3]

This illustrates a situation in which individuals or small groups gain the ability to cause mass destruction[3] (a type-1 vulnerability).

Real-world candidates

While the matter is debated, technologies that have been suggested as potential vulnerabilities include advanced artificial intelligence, nanotechnology, and synthetic biology (the latter of which may make it easy to create enhanced pandemics).[6][7][8][9]

Implications

Pausing technological progress may be neither possible nor desirable. An alternative is to prioritize technologies expected to have a positive impact and to delay those that may be catastrophic, a principle called differential technological development.[3]

The potential solutions vary with the type of vulnerability. Dealing with type-2 vulnerabilities may require highly effective global governance and international cooperation. For type-1 vulnerabilities, if the means of mass destruction ever become accessible to individuals, at least some small fraction of the population could be expected to use them.[3] In extreme cases, mass surveillance might be required to avoid the destruction of civilization, a prospect that has received significant media coverage.[10][11][12][13][14]

Footnotes

  1. Whether such a technology destroys civilization depends, according to Nick Bostrom, on whether society remains in a "semi-anarchic default condition" (see § Definitions).
  2. In particular, the motivation of at least some small fraction of the population to destroy civilization even at a personal cost. According to Bostrom: “Given the diversity of human character and circumstance, for any ever so imprudent, immoral, or self-defeating action, there is some residual fraction of humans who would choose to take that action.”[3]

References

  1. Katte, Abhijeet (2018-12-25). "AI Doomsday Can Be Avoided If We Establish 'World Government': Nick Bostrom". Analytics India Magazine. Retrieved 2023-05-28.
  2. Bostrom, Nick (November 2019). "The Vulnerable World Hypothesis". Global Policy. 10 (4): 455–476. doi:10.1111/1758-5899.12718.
  3. Piper, Kelsey (2018-11-19). "How technological progress is making it likelier than ever that humans will destroy ourselves". Vox. Retrieved 2023-05-28.
  4. Finley, Klint. "Technology That Could End Humanity—and How to Stop It". Wired. ISSN 1059-1028. Retrieved 2023-11-07.
  5. "How to Protect Humanity From the Invention That Inadvertently Kills Us All". Inverse. 2019-04-18. Retrieved 2023-11-07.
  6. Walsh, Bryan (July 15, 2020). "The dire lessons of the first nuclear bomb test". Axios.
  7. Bilton, Nick (2018-11-28). "The "Black Ball" Hypothesis: Is Gene Editing More Dangerous Than Nuclear Weapons?". Vanity Fair. Retrieved 2023-11-07.
  8. Torres, Phil (2019-10-21). "Omniviolence Is Coming and the World Isn't Ready". Nautilus. Retrieved 2023-05-29.
  9. "AI-Powered Malware Holds Potential For Extreme Consequences - Could Artificial Intelligence Be a Black Ball From the Urn of Creativity?". Zvelo. 2023-04-26. Retrieved 2023-11-07.
  10. Houser, Kristin (19 April 2019). "Professor: Total surveillance is the only way to save humanity". Futurism. Retrieved 2023-05-28.
  11. Bendix, Aria. "An Oxford philosopher who's inspired Elon Musk thinks mass surveillance might be the only way to save humanity from doom". Business Insider. Retrieved 2023-05-28.
  12. Taggart, Dagny (2019-04-24). "Global Government and Surveillance May Be Needed to Save Humanity". The Organic Prepper. Retrieved 2023-10-16.
  13. Gheorghe, Ana (2019-04-27). "Mass surveillance could save us from extinction, claims Professor". Cherwell. Retrieved 2023-10-16.
  14. "None of our technologies has managed to destroy humanity – yet | Aeon Essays". Aeon. Retrieved 2023-05-28.

External link

The Vulnerable World Hypothesis
