Technological singularity
The technological singularity—often called the singularity[1]—is a proposed future event in which technological growth accelerates beyond human control, producing unpredictable changes in human civilization.[2][3] According to the most popular version of the singularity hypothesis, I. J. Good's intelligence explosion model of 1965, an upgradable intelligent agent could eventually enter a positive feedback loop of successive self-improvement cycles; more intelligent generations would appear more and more rapidly, causing a rapid increase in intelligence that culminates in a powerful superintelligence, far surpassing human intelligence.[4]
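Good's feedback-loop argument can be illustrated with a toy numerical model (an illustrative sketch only, not anything proposed in Good's 1965 paper): suppose each self-improvement cycle raises capability by an amount that itself grows with current capability, so that smarter generations deliver larger improvements in less relative time.

```python
def intelligence_explosion(level=1.0, k=0.1, steps=8):
    """Toy recursion: each cycle's improvement scales with the square of
    current capability, so growth accelerates rather than staying merely
    exponential. Parameters are arbitrary illustrative choices."""
    history = [level]
    for _ in range(steps):
        level += k * level * level  # a smarter agent makes a bigger improvement
        history.append(level)
    return history

trajectory = intelligence_explosion()
# Successive growth ratios increase: each generation arrives "faster"
# than the last, which is the qualitative shape of an intelligence explosion.
ratios = [b / a for a, b in zip(trajectory, trajectory[1:])]
```

In continuous time this recursion corresponds to growth whose rate rises with the level already reached, which is what distinguishes the "explosion" scenario from ordinary exponential progress.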
Some scientists, including Stephen Hawking, have expressed concern that artificial superintelligence could result in human extinction.[5][6] The consequences of a technological singularity and its potential benefit or harm to the human race have been intensely debated.
Prominent technologists and academics dispute the plausibility of a technological singularity and the associated artificial intelligence "explosion", including Paul Allen,[7] Jeff Hawkins,[8] John Holland, Jaron Lanier, Steven Pinker,[8] Theodore Modis,[9] Gordon Moore,[8] and Roger Penrose.[10] One objection is that artificial intelligence growth is likely to run into diminishing returns rather than accelerating ones. Stuart J. Russell and Peter Norvig observe that in the history of technology, improvement in a particular area tends to follow an S-curve: it begins with accelerating improvement, then levels off rather than continuing upward into a hyperbolic singularity.[11]
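The S-curve objection can be illustrated with a standard logistic function (a generic sketch, not a model from Russell and Norvig): early on, each step's gain is larger than the last and looks like runaway acceleration, but the gains shrink as the curve approaches its ceiling instead of diverging.

```python
import math

def logistic(t, cap=100.0, rate=1.0, midpoint=5.0):
    """Standard logistic S-curve: improvement accelerates at first,
    then levels off as it approaches the ceiling `cap`.
    All parameter values here are arbitrary illustrative choices."""
    return cap / (1.0 + math.exp(-rate * (t - midpoint)))

# Per-step gains: they grow during the early phase and shrink afterward,
# so extrapolating the accelerating phase alone overstates future growth.
gains = [logistic(t + 1) - logistic(t) for t in range(10)]
```

The early, accelerating portion of an S-curve is locally indistinguishable from the start of a hyperbolic blow-up, which is why the two camps can read the same historical data so differently.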
- ^ Cadwalladr, Carole (22 February 2014). "Are the robots about to rise? Google's new director of engineering thinks so…". The Guardian. Retrieved 8 May 2022.
- ^ "Collection of sources defining "singularity"". singularitysymposium.com. Archived from the original on 17 April 2019. Retrieved 17 April 2019.
- ^ Eden, Amnon H.; Moor, James H.; Søraker, Johnny H.; Steinhart, Eric, eds. (2012). Singularity Hypotheses: A Scientific and Philosophical Assessment. The Frontiers Collection. Dordrecht: Springer. pp. 1–2. doi:10.1007/978-3-642-32560-1. ISBN 9783642325601.
- ^ Vinge, Vernor. "The Coming Technological Singularity: How to Survive in the Post-Human Era". Archived 2018-04-10 at the Wayback Machine, in Vision-21: Interdisciplinary Science and Engineering in the Era of Cyberspace, G. A. Landis, ed., NASA Publication CP-10129, pp. 11–22, 1993. "There may be developed computers that are "awake" and superhumanly intelligent. (To date, there has been much controversy as to whether we can create human equivalence in a machine. But if the answer is 'yes, we can', then there is little doubt that beings more intelligent can be constructed shortly thereafter.)"
- ^ Sparkes, Matthew (13 January 2015). "Top scientists call for caution over artificial intelligence". The Telegraph (UK). Archived from the original on 7 April 2015. Retrieved 24 April 2015.
- ^ "Hawking: AI could end human race". BBC. 2 December 2014. Archived from the original on 30 October 2015. Retrieved 11 November 2017.
- ^ Cite error: the named reference Allen2011 was invoked but never defined.
- ^ a b c Cite error: the named reference ieee-lumi was invoked but never defined.
- ^ Cite error: the named reference modis2012 was invoked but never defined.
- ^ Penrose, Roger (1999). The Emperor's New Mind: Concerning Computers, Minds and the Laws of Physics. Oxford: Oxford University Press. ISBN 978-0-19-286198-6.
- ^ Russell, Stuart J.; Norvig, Peter (2021). Artificial Intelligence: A Modern Approach (4th ed.). Hoboken: Pearson. p. 1005. ISBN 978-0-1346-1099-3. LCCN 20190474.