Eliezer Yudkowsky

SIAI Co-Founder & Research Fellow

Introducing the "Singularity": Three Major Schools of Thought
33 minutes, 15.1 MB, recorded 2007-09-08

Eliezer Yudkowsky, of the Singularity Institute for Artificial Intelligence, humorously examines the three leading schools of thought concerning the singularity: accelerating change, the event horizon, and the intelligence explosion.

The meaning of the term "singularity" has shifted over time, even since the Singularity Institute's founding. One school of thought, accelerating change, holds that human intuition about future technology is linear, but because technology feeds on technology, technological development actually accelerates. On this view it should be possible to predict, with some accuracy, when major breakthroughs will occur.

The event horizon school, by contrast, holds that technology will eventually produce minds smarter than human beings. Yudkowsky points out one problem with this view: to predict what such a mind would do, humans would have to be at least as smart as the thing they are trying to second-guess.

The third school, the intelligence explosion, holds that intelligent minds will go on to design still more intelligent minds, setting off a positive feedback loop. Given all this, Yudkowsky asks whether we should really worry about any of it.


Eliezer Yudkowsky is an American artificial intelligence researcher concerned with the Singularity, and an advocate of Friendly Artificial Intelligence. Yudkowsky is a co-founder and research fellow of the Singularity Institute for Artificial Intelligence (SIAI). He is also the author of the SIAI publications "Creating Friendly AI" and "Levels of Organization in General Intelligence". His most recent academic contributions include two chapters in Oxford philosopher Nick Bostrom's forthcoming edited volume Global Catastrophic Risks. Yudkowsky is an autodidact with no formal education in artificial intelligence.


This free podcast is from our Singularity Summit series.
