Singularity (technology)
The term "technological singularity" is used in futurist circles to refer to the phenomenon and resulting effects of reaching a critical threshold of positive-feedback technological change.
Three Models
Eliezer Yudkowsky, co-founder of the Singularity Institute, has suggested that the term 'technological singularity' encompasses three distinct concepts[1]:
- Ray Kurzweil's singularity refers to the concept of technology-driven accelerating change significantly transforming society;
- Vernor Vinge's singularity refers to the concept of rapidly accelerating change creating an event-horizon-like barrier to social prediction;
- I.J. Good's singularity refers to the concept of intelligent agents able to improve their own intelligence, causing an intelligence explosion (see the sketch following this list).
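The distinction between these schools is easier to see with a toy growth model. The sketch below is not drawn from the article or its sources: the equation dx/dt = k·x^p and every constant in it are illustrative assumptions. It contrasts growth with no feedback, proportional feedback (exponential, accelerating change), and super-linear feedback, where only the last runs away within a finite horizon, which is the loose mathematical analogy behind the word "singularity" in the intelligence-explosion sense.

 # Toy sketch (illustrative assumptions only, not from the article or its sources):
 # contrast growth regimes often invoked in singularity discussions.

 def simulate(p, steps=60, dt=0.1, x0=1.0, k=0.5, cap=1e12):
     """Euler-integrate dx/dt = k * x**p and report when growth runs away.

     p = 0 -> steady, linear improvement (no feedback)
     p = 1 -> exponential growth, the shape behind accelerating-change claims
     p > 1 -> super-exponential growth whose exact solution diverges in finite
              time; this blow-up is the loose analogy behind 'singularity'.
     """
     x = x0
     for step in range(1, steps + 1):
         x += dt * k * x ** p      # capability feeds back into its own growth rate
         if x > cap:
             return step, x        # effectively diverged within the horizon
     return steps, x

 if __name__ == "__main__":
     for label, p in [("no feedback (p=0)", 0.0),
                      ("proportional feedback (p=1)", 1.0),
                      ("super-linear feedback (p=2)", 2.0)]:
         step, x = simulate(p)
         print(f"{label:>28}: reached {x:.3g} by step {step}")

With these arbitrary parameters, the no-feedback case creeps from 1 to 4, proportional feedback reaches roughly 19 after 60 steps, and super-linear feedback blows past the cap around step 28.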
References
- [1] Introducing the "Singularity": Three Major Schools of Thought. Singularity Summit 2007.