Singularitarians
Singularitarianism is a social movement defined by the belief that a technological singularity—the creation of superintelligence—will likely happen in the medium future, and that deliberate action ought to be taken to ensure that the singularity benefits humans. Singularitarians are distinguished from other futurists who speculate on a technological singularity by their belief that the singularity is not only possible, but desirable if guided prudently. Accordingly, they may sometimes dedicate their lives to acting in ways they believe will contribute to its rapid yet safe realization. American news magazine ''Time'' describes the worldview of Singularitarians by saying "even though it sounds like science fiction, it isn't, no more than a weather forecast is science fiction. It's not a fringe idea; it's a serious hypothesis about the future of life on Earth. There's an intellectual gag reflex that kicks in anytime you try to swallow an idea that in ...


Eliezer Yudkowsky
Eliezer S. Yudkowsky (born September 11, 1979) is an American artificial intelligence researcher and writer on decision theory and ethics, best known for popularizing ideas related to friendly artificial intelligence. He is the founder of and a research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion influenced philosopher Nick Bostrom's 2014 book ''Superintelligence: Paths, Dangers, Strategies''.

Work in artificial intelligence safety

Goal learning and incentives in software systems

Yudkowsky's views on the safety challenges future generations of AI systems pose are discussed in Stuart Russell and Peter Norvig's undergraduate textbook ''Artificial Intelligence: A Modern Approach''. Noting the difficulty of formally specifying general-purpose goals by hand, Russell and Norvig cite Yudkowsky's proposal that autonomous and adaptive systems ...