Singularity


This term has multiple meanings, depending on context.

Astrophysics Term

A singularity is a point in space within a black hole that is predicted by general relativity to have infinite density and infinite gravitational force. Predicting what actually occurs at a singularity will have to await the development of a theory of quantum gravity.

One question that has been asked is whether naked singularities (singularities not hidden behind an event horizon) are possible. Some theoretical and numerical work suggests the answer is yes, although the question, tied to the cosmic censorship hypothesis, is not fully settled.

The German astrophysicist [Kurt Schwarzschild]? used Albert Einstein's field equations to determine the critical radius at which matter of any given mass would collapse into a singularity.
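For illustration, the critical radius Schwarzschild derived (now called the Schwarzschild radius) is r = 2GM/c^2. The formula is standard, though the worked example below (for the Sun) is supplied here rather than taken from this page. A minimal sketch in Python:

 # Schwarzschild radius r = 2GM/c^2, evaluated for the Sun.
 G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
 c = 2.998e8        # speed of light, m/s
 M_sun = 1.989e30   # mass of the Sun, kg

 r_s = 2 * G * M_sun / c**2
 print(f"Schwarzschild radius of the Sun: {r_s / 1000:.1f} km")  # about 3 km

Any mass compressed inside its Schwarzschild radius becomes a black hole; for the Sun, that radius is about 3 km.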

See also: Event horizon

Information Theory Term

If, as seems true, technological progress has been following an accelerating curve for a very long time, then there may come a point at which progress becomes so rapid that, from a human perspective, it is essentially infinite. This is known as "the singularity."
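To see why the shape of the growth curve matters, consider a toy model (an illustration added here, not part of the original argument; every constant is arbitrary). Plain exponential growth, dI/dt = k*I, never actually reaches infinity, but if each gain in capability accelerates the next, dI/dt = k*I^2, then capability diverges at the finite time t = 1/(k*I0), a mathematical singularity. A minimal sketch in Python:

 # Toy "intelligence explosion" model; every number is an arbitrary assumption.
 # The solution of dI/dt = k*I^2 is I(t) = I0 / (1 - k*I0*t), which blows up
 # at the finite time t = 1/(k*I0), unlike plain exponential growth.
 k, I0, dt = 0.1, 1.0, 0.001
 I, t = I0, 0.0
 while I < 1e9:
     I += k * I * I * dt  # Euler step for dI/dt = k * I^2
     t += dt
 print(f"capability passes 1e9 near t = {t:.2f} (analytic blow-up at t = {1 / (k * I0):.0f})")

With these constants the analytic blow-up time is t = 10; the numerical estimate lands just past it.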

There are many different specific definitions of the term, but all seem to share a common theme, perhaps summed up best by Vernor Vinge in his famous 1993 essay "The Coming Technological Singularity":

"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly thereafter, the human era will be ended."

(full article is at: http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html)

The basic idea of "the singularity" can be thought of as a natural extrapolation of Moore's Law. If computing power per dollar continues to double every 18 to 24 months, then it is only a matter of time (some estimates range from 2020 to 2050) until inexpensive computer hardware ($1000, say) is as powerful as the [human brain]?. When that happens, it seems inevitable that artificial intelligence equal to our own can be created.
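A rough back-of-the-envelope version of that extrapolation (the brain and hardware figures are assumptions supplied for illustration, in the spirit of estimates by Moravec and Kurzweil, not figures from this page):

 # Moore's Law extrapolation with assumed, illustrative figures.
 import math

 BRAIN_OPS = 1e16        # assumed operations/second of a human brain
 OPS_PER_1000_USD = 1e9  # assumed operations/second that $1000 buys in 2001
 BASE_YEAR = 2001
 DOUBLING_TIME = 1.5     # years per doubling (18 months)

 doublings = math.log2(BRAIN_OPS / OPS_PER_1000_USD)
 year = BASE_YEAR + doublings * DOUBLING_TIME
 print(f"~{doublings:.0f} doublings -> brain-equivalent $1000 hardware around {year:.0f}")

Under these assumptions the date lands around 2036, inside the 2020 to 2050 range quoted above; different starting estimates shift it by a decade or two in either direction.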

But due to such factors as "the [AI advantage]?" (i.e., consciousness that can be downloaded and replicated), it seems likely that such AIs would quickly be able to create even more intelligent machines. If we can create a single $1000 Einstein, you might say, we can easily create a billion Einsteins. Working together, they could surely generate enough technological progress to build machines so smart that they boggle the human mind.

It's a really interesting idea, no?

Prominent theorists/speculators on the subject include:

Vernor Vinge
Bill Joy
[Hans Moravec]?
FM-2030
Eliezer S. Yudkowsky
Ray Kurzweil

Some key concepts are:

[AI advantage]?
Transhumanism


