Superintelligences Expansion Period


A Superintelligences Expansion Period is a time period during which superintelligences expand into the Universe.



References

2024

  • GPT-4
    • ASI Emergence Period (Hypothetical: Late 21st Century - Early 22nd Century):
      • It follows the advanced development of AGI, potentially occurring in the late 21st or early 22nd century.
      • It marks the transition to an intelligence far superior to human cognition.
      • It involves the development of entities that surpass human abilities in all domains.
      • It represents the initial phase of true Superintelligence.
    • ASI Expansion Period (Hypothetical: Early to Mid-22nd Century):
      • It involves the application of Superintelligence in global systems.
      • It aims to address complex global challenges such as climate change, poverty, or disease.
      • It raises significant concerns about control and safety due to its immense capabilities.
      • It highlights the potential misalignment between Superintelligence goals and human well-being.
    • ASI Explosion Period (Hypothetical: Mid-22nd Century and Beyond):
      • It is often associated with the concept of a technological "singularity."
      • It represents a period of unpredictable and rapid advancement in Superintelligence.
      • It could lead to a complete transformation of human society, technology, and possibly biology.
      • It presents a future where the outcomes and impacts of Superintelligence are beyond human comprehension.

2014

  • (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Technological_singularity#Intelligence_explosion Retrieved:2014-2-22.
    • The technological singularity, or simply the singularity, is a hypothetical moment in time when artificial intelligence will have progressed to the point of a greater-than-human intelligence, radically changing civilization, and perhaps human nature. Since the capabilities of such an intelligence may be difficult for a human to comprehend, the technological singularity is often seen as an occurrence (akin to a gravitational singularity) beyond which the future course of human history is unpredictable or even unfathomable. The first use of the term "singularity" in this context was by mathematician John von Neumann. In 1958, regarding a summary of a conversation with von Neumann, Stanislaw Ulam described "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". The term was popularized by science fiction writer Vernor Vinge, who argues that artificial intelligence, human biological enhancement, or brain-computer interfaces could be possible causes of the singularity. Futurist Ray Kurzweil cited von Neumann's use of the term in a foreword to von Neumann's classic The Computer and the Brain. Proponents of the singularity typically postulate an "intelligence explosion",[1] [2] where superintelligences design successive generations of increasingly powerful minds, that might occur very quickly and might not stop until the agent's cognitive abilities greatly surpass that of any human.

      Kurzweil predicts the singularity to occur around 2045 whereas Vinge predicts some time before 2030. At the 2012 Singularity Summit, Stuart Armstrong did a study of artificial general intelligence (AGI) predictions by experts and found a wide range of predicted dates, with a median value of 2040. His own prediction on reviewing the data is that there's an 80% probability that the singularity will occur between 2017 and 2112.

  1. David Chalmers on Singularity, Intelligence Explosion. April 8, 2010. Singularity Institute for Artificial Intelligence: http://singinst.org/blog/2010/04/08/david-chalmers-on-singularity-intelligence-explosion/
  2. "Why an Intelligence Explosion is Probable" by Richard Loosemore and Ben Goertzel. March 7, 2011. H+ Magazine Editor's Blog: http://hplusmagazine.com/2011/03/07/why-an-intelligence-explosion-is-probable/

2008

  • (Pinker, 2008) ⇒ Steven Pinker. http://spectrum.ieee.org/computing/hardware/tech-luminaries-address-singularity
    • QUOTE: There is not the slightest reason to believe in a coming singularity. The fact that you can visualize a future in your imagination is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings, and nuclear-powered automobiles--all staples of futuristic fantasies when I was a child that have never arrived. Sheer processing power is not a pixie dust that magically solves all your problems.

1999

  • (Kurzweil, 1999) ⇒ Ray Kurzweil. (1999). “The Age of Spiritual Machines: When Computers Exceed Human Intelligence.” Viking Press. ISBN:0-670-88217-8
