TheSingularity


If, as seems true, technological progress has been following an exponential curve for a very long time, and if progress itself speeds up the rate of further progress, then there will come a point at which technological progress accelerates essentially to infinity. This is known as TheSingularity.
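To see why the feedback premise matters, compare two toy growth laws (a sketch only; x stands for "level of technology" and k, x_0 are illustrative constants, not taken from any source):

 \frac{dx}{dt} = k x   \quad\Rightarrow\quad x(t) = x_0 e^{k t}                 % grows forever, but is finite at every t
 \frac{dx}{dt} = k x^2 \quad\Rightarrow\quad x(t) = \frac{x_0}{1 - k x_0 t}     % diverges at the finite time t^* = 1/(k x_0)

Pure exponential growth never actually reaches infinity at any finite time; only the self-reinforcing case, where progress speeds up progress itself, produces a true singularity at a finite date.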

There are many different specific definitions of the term, but all seem to have a common theme, perhaps summed up best by VernorVinge? in his famous 1993 essay The Coming Technological Singularity:

"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly thereafter, the human era will be ended."

The basic idea of TheSingularity can be thought of as a natural extrapolation of MooresLaw?. If computing power per dollar continues to double every 18 to 24 months, then it is only a matter of time (estimates range from 2020 to 2050) until inexpensive computer hardware (a $1000 machine, say) is as powerful as the human brain. When that happens, it seems inevitable that artificial intelligence equal to our own can be created.
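To make the timetable concrete, here is a back-of-the-envelope sketch in Python. The numbers are rough assumptions for illustration only: about 10^14 operations per second for the brain (a figure often attributed to HansMoravec?) and about 10^9 operations per second for a $1000 machine circa 2001.

 # Back-of-the-envelope extrapolation of the argument above.
 # All constants are illustrative assumptions, not settled facts.
 import math

 BRAIN_OPS = 1e14      # assumed ops/sec of the human brain (rough Moravec-style figure)
 PC_OPS_2001 = 1e9     # assumed ops/sec of a $1000 machine circa 2001

 doublings = math.log2(BRAIN_OPS / PC_OPS_2001)   # ~16.6 doublings needed
 for months in (18, 24):
     years = doublings * months / 12
     print(f"{months}-month doubling time: parity around {2001 + years:.0f}")

With these assumptions, an 18-month doubling time gives parity around 2026 and a 24-month doubling around 2034, both inside the 2020-2050 range quoted above.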

But due to such factors as "the AI advantage" (i.e., that a machine mind can be copied, backed up, and run on new hardware at will), it seems likely that such AIs would quickly be able to create even more intelligent machines. If we can create a single $1000 Einstein, you might say, we can easily create a billion Einsteins. Working together, they can surely generate enough technological progress to build machines so smart that they boggle the human mind.

It's a really interesting idea, no?

I would like to see a list here of prominent theorists/speculators on the subject, including:

VernorVinge?
BillJoy?
HansMoravec?

As well as a discussion of some of the key concepts:

AiAdvantage?

/Talk

