Superintelligence Explosion Period

From GM-RKB

A Superintelligence Explosion Period is an emergence period of superintelligences.



References

2017a

  • (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/technological_singularity#Manifestations Retrieved:2017-2-28.
    • I. J. Good speculated in 1965 that artificial general intelligence might bring about an intelligence explosion. Good's scenario runs as follows: as computers increase in power, it becomes possible for people to build a machine that is more intelligent than humanity; this superhuman intelligence possesses greater problem-solving and inventive skills than current humans are capable of. This superintelligent machine then designs an even more capable machine, or re-writes its own software to become even more intelligent; this (ever more capable) machine then goes on to design a machine of yet greater capability, and so on. These iterations of recursive self-improvement accelerate, allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.[1]

2017b

  • (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/intelligence_explosion Retrieved:2017-2-28.
    • The intelligence explosion is the expected outcome of the hypothetically forthcoming technological singularity, that is, the result of humanity building artificial general intelligence (AGI). AGI would be capable of recursive self-improvement leading to the emergence of ASI (artificial superintelligence), the limits of which are unknown.

      The notion of an "intelligence explosion" was first described by I. J. Good (1965), who speculated on the effects of superhuman machines, should they ever be invented.

      Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not, according to Paul R. Ehrlich, changed significantly for millennia.[2] However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is more intelligent than humans.[3] If a superhuman intelligence were to be invented, whether through the amplification of human intelligence or through artificial intelligence, it would bring to bear greater problem-solving and inventive skills than current humans are capable of. Such an AI is referred to as Seed AI[4][5] because if an AI were created with engineering capabilities that matched or surpassed those of its human creators, it would have the potential to autonomously improve its own software and hardware or design an even more capable machine. This more capable machine could then go on to design a machine of yet greater capability. These iterations of recursive self-improvement could accelerate, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in. It is speculated that over many iterations, such an AI would far surpass human cognitive abilities.
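The iterative dynamic described above, in which each generation designs a somewhat more capable successor until some physical ceiling is reached, can be sketched as a toy growth model. All quantities here (the initial capability, per-generation gain, and ceiling) are hypothetical illustration parameters, not claims from the source:

```python
# Toy model of recursive self-improvement (purely illustrative).
# Each generation designs a successor whose capability grows in
# proportion to its own, until a hypothetical physical ceiling.

def self_improvement_trajectory(c0=1.0, gain=0.5, ceiling=1e6, max_gens=100):
    """Return capability per generation: c_{n+1} = c_n * (1 + gain), capped at ceiling."""
    caps = [c0]
    while caps[-1] < ceiling and len(caps) < max_gens:
        caps.append(min(caps[-1] * (1.0 + gain), ceiling))
    return caps

traj = self_improvement_trajectory()
```

Because each step's improvement is proportional to current capability, growth is geometric: absolute gains per generation accelerate, matching the text's "iterations of recursive self-improvement accelerate," until the imposed ceiling (a stand-in for limits from physics or theoretical computation) halts the trajectory.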

Footnotes

  2. Ehrlich, Paul R. (2008). The Dominant Animal: Human Evolution and the Environment.
  3. "Superbrains born of silicon will change everything."
  4. Yampolskiy, Roman V. (2015). "Analysis of Types of Self-Improving Software." In: Artificial General Intelligence. Springer International Publishing. pp. 384-393.
  5. Yudkowsky, Eliezer (2001). General Intelligence and Seed AI - Creating Complete Minds Capable of Open-Ended Self-Improvement.
