"Today's problems cannot be solved by thinking the way we thought when we created them" - Albert Einstein

Friday, July 29, 2005

Singularity

People who embrace futuristic thinking well beyond their time run the risk of being labeled a 'nut'. Suggesting the Earth revolved around the Sun could have landed a person in jail as a heretic in the 17th century. Claiming trans-Atlantic travel would someday be measured in hours, rather than months, would have been absurd in the 18th century. Predicting in the 19th century that man would walk on the moon would have been a sure sign of insanity. With this post, I hereby embrace that risk.

Technological Singularity refers to the point in time at which "technological progress accelerates beyond the ability of present-day humans to fully comprehend or predict." (Effectively synonymous with the development of artificial intelligence [AI].) When this occurs, Singularitarians believe AI will be able to use its intelligence to advance technology millions to billions of times faster than humans can. Such an intelligence could easily double the current body of human knowledge, something that has taken tens of thousands of years to accumulate, literally in a matter of seconds. A good description comes from I. J. Good:
"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make."

The potential for good and bad here is obvious. Cures for incurable diseases, the undiscovered secrets of our universe, solutions to virtually any human problem, all become knowable in a very short period of time. Some also think AI will enable us to merge computational intelligence with biological intelligence, e.g., digitally replicating a person's brain and consciousness, or developing nanotechnology capable of fighting cancer cells or enhancing intelligence and memory, to the extent of enabling immortality. It sounds crazy, but some of the world's leading scientists and futurists fully embrace this (see Ray Kurzweil's book "Fantastic Voyage: Live Long Enough to Live Forever").

Similarly, technologies capable of destroying life, or of other unimaginable horrors, also become possible.

The (somewhat scary) fact is that humans will no longer be the highest form of intelligence in our world. We are simply incapable of anticipating what AI will be capable of accomplishing, any more than a dog or cat can anticipate what the next miracle drug will be. Even scarier is the fact that, by definition, disruptive technologies are developed at the fringe of society, and consequently, society is never prepared to deal with their implications. The Wright brothers' seemingly innocuous first flight ended up changing the nature of modern warfare, making all territories, military and civilian, part of the battlefield. The development of nuclear and rocket technologies ushered in the era of Mutually Assured Destruction. Genetic engineering now offers terrorists what could be their most fearsome and devastating weapon.

Not only is it not too soon to begin a serious dialogue on how to prepare for the Singularity, it is probably well past due. Good places to learn more are the Singularity Institute's website and Ray Kurzweil's "The Singularity is Near: When Humans Transcend Biology" or his "The Age of Spiritual Machines."
