Differential technological development is a strategy of technology governance aiming to decrease risks from
emerging technologies
by influencing the sequence in which they are developed. Using this strategy, societies would strive to delay the development of harmful technologies and their applications while accelerating the development of beneficial technologies, especially those that offer protection against harmful technologies.
History of the idea
Differential technological development was initially proposed by philosopher Nick Bostrom
in 2002, and he later applied the idea to the governance of artificial intelligence in his 2014 book Superintelligence: Paths, Dangers, Strategies. Philosopher Toby Ord also endorsed the strategy in his 2020 book The Precipice: Existential Risk and the Future of Humanity, writing that "While it may be too difficult to prevent the development of a risky technology, we may be able to reduce existential risk by speeding up the development of protective technologies relative to dangerous ones."
Informal discussion
Paul Christiano argues that while accelerating technological progress appears to be one of the best ways to improve human welfare over the next few decades, a faster rate of growth matters far less for the long-term future, because growth must eventually saturate due to physical limits. From the perspective of the far future, then, differential technological development appears more crucial than acceleration alone.
Inspired by Bostrom's proposal, Luke Muehlhauser and Anna Salamon suggested a more general project of "differential intellectual progress", in which society advances its wisdom, philosophical sophistication, and understanding of risks faster than its technological power. Brian Tomasik has expanded on this notion.
See also
* Existential risk