Differential technological development

Strategy of technology governance

Differential technological development is a strategy of technology governance that aims to decrease risks from emerging technologies by influencing the sequence in which they are developed. Under this strategy, societies would strive to delay the development of harmful technologies and their applications, while accelerating the development of beneficial technologies, especially those that offer protection against the harmful ones.[1][2]

History of the idea

Differential technological development was first proposed by philosopher Nick Bostrom in 2002,[1] and he later applied the idea to the governance of artificial intelligence in his 2014 book Superintelligence: Paths, Dangers, Strategies.[3] The strategy was also endorsed by philosopher Toby Ord in his 2020 book The Precipice: Existential Risk and the Future of Humanity; Ord writes that "While it may be too difficult to prevent the development of a risky technology, we may be able to reduce existential risk by speeding up the development of protective technologies relative to dangerous ones."[2][4]

Informal discussion

Paul Christiano believes that while accelerating technological progress appears to be one of the best ways to improve human welfare in the next few decades,[5] a faster rate of growth cannot be equally important for the far future, because growth must eventually saturate due to physical limits. From the perspective of the far future, then, differential technological development appears more crucial.[6][unreliable source?]
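A minimal sketch of the saturation argument, assuming a logistic welfare trajectory (the functional form and the symbols $K$, $r$, and $t_0$ are illustrative assumptions, not taken from the cited sources):

$$W(t) = \frac{K}{1 + e^{-r(t - t_0)}}$$

Here $K$ is a ceiling set by physical limits. Uniformly accelerating progress corresponds to shifting $t_0$ earlier (or increasing $r$), which changes how soon welfare approaches the ceiling but leaves the long-run value $\lim_{t \to \infty} W(t) = K$ unchanged. By contrast, altering the order in which technologies arrive can affect whether the trajectory ever reaches $K$ at all, for instance if a dangerous technology precedes its protective counterpart.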

Inspired by Bostrom's proposal, Luke Muehlhauser and Anna Salamon suggested a more general project of "differential intellectual progress", in which society advances its wisdom, philosophical sophistication, and understanding of risks faster than its technological power.[7][unreliable source?][8][unreliable source?] Brian Tomasik has expanded on this notion.[9][unreliable source?]

References

  1. ^ a b Bostrom, Nick (2002). "Existential Risks: Analyzing Human Extinction Scenarios". Journal of Evolution and Technology. 9.
  2. ^ a b Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. United Kingdom: Bloomsbury Publishing. p. 200. ISBN 978-1526600219.
  3. ^ Bostrom, Nick (2014). Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press. pp. 229–237. ISBN 978-0199678112.
  4. ^ Purtill, Corinne (21 November 2020). "How Close Is Humanity to the Edge?". The New Yorker. Retrieved 27 November 2020.
  5. ^ Muehlhauser, Anna Salamon. "RFID solutions".
  6. ^ Christiano, Paul (15 October 2014). "On Progress and Prosperity". Effective Altruism Forum. Retrieved 21 October 2014.
  7. ^ Muehlhauser, Luke; Salamon, Anna (2012). "Intelligence Explosion: Evidence and Import" (PDF). pp. 18–19. Archived from the original (PDF) on 26 October 2014. Retrieved 29 November 2013.
  8. ^ Muehlhauser, Luke (2013). Facing the Intelligence Explosion. Machine Intelligence Research Institute. Retrieved 29 November 2013.
  9. ^ Tomasik, Brian (23 October 2013). "Differential Intellectual Progress as a Positive-Sum Project". Foundational Research Institute. Retrieved 18 February 2016.