Speaker: Eliezer Yudkowsky

Eliezer Yudkowsky, a researcher attempting to create “Friendly” AI, spoke about the three major schools of thought regarding the Technological Singularity:

  • Accelerating Change – technology trends follow exponential growth
  • Event Horizon – the creation of minds smarter than human, whether through brain enhancement, brain-computer interfaces, or AGI
  • Intelligence Explosion – intelligence improving itself through technology in a positive feedback cycle

These schools of thought can at times contradict each other, even suggesting different outcomes. For example, if the Event Horizon theory is correct, then predicting the future may be impossible. Accelerating Change proponents, such as Ray Kurzweil, suggest that the future might actually be predictable, because we can plot progress on an exponential graph with time as one of the axes.

However, these schools of thought can also support each other's core or bold claims. Today, they seem to have merged into a single idea of the Technological Singularity.

Published by Richard Leis

Richard Leis is a poet and writer living in Tucson, Arizona. His poetry has been published in or is forthcoming from Impossible Archetype and The Laurel Review. A work of flash fiction is forthcoming from Cold Creek Review. His essays about fairy tales and technology have been published online at Tiny Donkey and Fairy Tale Review's Fairy-Tale Files. Richard is also Downlink Lead for HiRISE at the University of Arizona.

One Comment

  1. A service to readers: Singularity Summit II…

    So far I see two blogs liveblogging the 2nd Singularity Summit from San Francisco:
    Michael Anissimov
    Frontier Channel
    Anyone already familiar with the singularity hypothesis is probably also comfortable with English. The rest could…
