Singularity (technology)

The term "technological singularity" is used in futurist circles to refer to the phenomenon and resulting effects of reaching a critical threshold of positive-feedback technological change.

Three Models

Eliezer Yudkowsky, co-founder of the Singularity Institute, has suggested that the term 'technological singularity' encompasses three distinct concepts[1]:

  1. Ray Kurzweil's singularity refers to the concept of technology-driven accelerating change that significantly transforms society;
  2. Vernor Vinge's singularity refers to the concept of rapidly accelerating change creating an event horizon-like barrier to social prediction;
  3. I.J. Good's singularity refers to the concept of intelligent agents able to improve their own intelligence, causing an intelligence explosion (a toy sketch of this feedback loop appears below).
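
As a purely illustrative aid, Good's notion of recursive self-improvement can be read as a positive-feedback recurrence in which each round of self-improvement yields a gain that grows with the agent's current capability, so growth compounds. The Python sketch below is a toy model of that reading; the function name and parameters (initial, gain, rounds) are hypothetical choices for the illustration, not anything proposed by Good or Yudkowsky.

# Toy model of I.J. Good-style recursive self-improvement (illustrative only).
# Assumption: the gain achieved in each round scales with current capability,
# a crude stand-in for the idea that smarter agents are better at making
# themselves smarter.

def intelligence_explosion(initial=1.0, gain=0.1, rounds=10):
    """Return the capability level after each self-improvement round."""
    levels = [initial]
    capability = initial
    for _ in range(rounds):
        # The improvement this round grows with the square of current
        # capability, so the growth rate itself keeps rising (positive feedback).
        capability += gain * capability ** 2
        levels.append(capability)
    return levels

if __name__ == "__main__":
    for step, level in enumerate(intelligence_explosion()):
        print(f"round {step:2d}: capability = {level:,.2f}")

Whether real systems could sustain such a feedback loop is precisely what the three schools dispute; the sketch only makes the notion of compounding self-improvement concrete.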

References

  1. Yudkowsky, Eliezer. Introducing the "Singularity": Three Major Schools of Thought. Singularity Summit 2007.