...the "hard" science-fiction writers are the ones who try to write specific stories about all that technology may do for us. More and more, these writers felt an opaque wall across the future. Once, they could put such fantasies millions of years in the future. Now they saw that their most diligent extrapolations resulted in the unknowable . . . soon. - Vernor Vinge, NASA's Vision 21 Symposium 1993
The world is changing quickly, and the rate of change itself is accelerating. At some point that acceleration must either slow down, or change will start happening faster than we can keep up with.
I stumbled onto the Singularity five years ago when reading Vernor Vinge's Marooned in Realtime. The basic idea is easy to grasp. We've all seen technological change happen in front of us, in some cases very much faster than anyone expected. And with unforeseen effects: the rise of Globalization and the rise of the Internet are entwined events.
The basis of the vision is Moore's Law - every 24 months, the number of transistors on a microchip doubles. This isn't a natural law, but an empirical observation first made by Gordon Moore in 1965 (and revised to the two-year figure in 1975). It has held up for over forty years and, since future chip designs are well known years in advance, we can see that it will continue for at least another ten.
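As a back-of-the-envelope illustration (my own sketch, not anything from Moore's paper), the doubling rule is just an exponential. The starting figures below are illustrative assumptions, chosen to be roughly in the ballpark of the Intel 4004:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Projected transistor count per chip in a given year.

    base_count=2300 is roughly the Intel 4004 (1971); the 24-month
    doubling period is the commonly quoted form of Moore's Law.
    Both are assumptions for illustration, not measured data.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2007):
    print(y, f"{transistors(y):,.0f}")
```

Run forward 36 years from 1971, the rule predicts chips in the hundreds of millions of transistors - which is in fact about where high-end processors are today.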
If we postulate a doubling of computing power per chip every 24 months, at some point computers reach the hardware capacity of a human brain. It may well be that computers reach the hardware threshold long before the software is ready, but it seems inevitable that both will be reached. The advantage to 1) a military/espionage power, 2) a computer manufacturer, 3) a plane/car/spaceship manufacturer, 4) a pharmaceutical company or 5) an underarm deodorant manufacturer seems obvious. Given how competitive all of these fields are, someone somewhere will create a human-equivalent machine-based intelligence at some point.
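To make the extrapolation concrete, here is a rough sketch under loudly flagged assumptions: suppose the brain performs on the order of 10^16 operations per second (one commonly cited order-of-magnitude figure; serious estimates span several orders of magnitude), and suppose affordable hardware delivers around 10^11 ops/sec today and doubles every two years. Every number here is an assumption, not a measurement:

```python
import math

# All figures below are illustrative assumptions, not measurements.
BRAIN_OPS = 1e16        # assumed ops/sec of a human brain (estimates vary widely)
START_OPS = 1e11        # assumed ops/sec of affordable hardware in 2007
DOUBLING_YEARS = 2.0    # Moore's Law doubling period

# Years needed for START_OPS, doubling every DOUBLING_YEARS,
# to catch up to BRAIN_OPS:
years_needed = DOUBLING_YEARS * math.log2(BRAIN_OPS / START_OPS)
print(f"crossover around {2007 + years_needed:.0f}")
```

With these particular guesses the crossover lands a few decades out - but shifting the brain estimate up or down by a factor of 100 only moves the date by about thirteen years either way, which is why the doubling argument feels robust even when the inputs are fuzzy.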
When? Predictions run the gamut from 2015 (Vernor Vinge) to 2059 (Ray Kurzweil). How much "hardware" is in the human brain? What about the software? How long will it take to develop? This is not an exact science, by any means. Indeed, an intelligent computer might not look or think like a human brain at all. ("The question of whether a computer can think is no more interesting than the question of whether a submarine can swim." - Edsger Dijkstra.)
A natural chain of events can be imagined. A weakly intelligent computer system might be created, one which is demonstrably intelligent but either slow or not particularly bright. The creators could simply throw more resources at it, building up its speed or connection capacity to improve its intelligence, or harnessing it to create a superior artificial intelligence. At the very least, they will likely make numerous copies of the weak intelligence, the superior one, or both.
From that point on, where do we humans fit in the scheme of things? If machines suddenly become the fountain of intelligence, to the exclusion of humans, where will we be?
I'm going to leave further discussion to others who do it for a living. Instead, here are some links:
Best articles on the Web:
Why the Future Doesn't Need Us, by Bill Joy (a negative view)
David Brin's Essay: Singularities and Nightmares
Economic Growth Given Machine Intelligence (PDF), by Robin Hanson (economist at George Mason Univ.)
Books:
Marooned in Realtime, by Vernor Vinge. Since there is effectively no math behind the visions of the Singularity, this early work of science fiction on the topic is one of the best possible descriptions.
Ray Kurzweil's The Singularity is Near - Kurzweil is not an entirely trustworthy author: he is clearly in love with the idea that a Singularity event will push biological research far enough to let him (and anyone else who is a Baby Boomer or younger) live forever. That doesn't mean he isn't smart or that he doesn't have good arguments, but he clearly has an agenda.
All that was written above is preamble. My personal position is that a Technological Singularity will likely happen. I think it is inevitable, given the nature of economic and political competition. Early on, it will give a few individuals/companies/countries a strong strategic advantage, but that advantage will quickly (within a few years to a decade at most) spread until individual families have access to the same level of power.
As an entrepreneur in 2007, what should I be doing with these beliefs? Investing my time and money into technology? Buying real estate (one thing which will not change or substantially lose value with a Singularity)? Buying real estate on Second Life? Creating a vast party of level 70 Warcraft characters? Applying to work on near-future mega projects like Lift Port (space elevator concept with big financial problems currently)?