I volunteered at the Singularity Summit a few weeks ago, and it was quite a weekend. What is the singularity? Well, Eliezer Yudkowsky's introductory presentation outlined this conception of it:
Sometime in the future, technology will advance to the point of creating minds that are smarter than human, whether through brain-computer interfaces, purely biological neurohackery, or the construction of true artificial intelligence.
Vernor Vinge was a professor of mathematics who also wrote science fiction. He realised he was having trouble writing stories set in a future beyond the point where technology creates smarter-than-human minds, because he would have had to write characters smarter than he was, and at that point his crystal ball cracked down the centre.
That is why Vernor Vinge originally called [this] the singularity, after the centre of a black hole, where 1970s models of the laws of physics break down. Note that it is the model that breaks down, not necessarily the future itself. If I am ignorant about a phenomenon, that is a fact about my mind, not a fact about the phenomenon.
Stripped to its barest essentials, the core thesis of the event horizon is that smarter-than-human minds imply a far weirder future than flying cars and amazing gadgets with lots of blinkenlights.
The core of Vinge's event horizon is about intelligence. Improving the brain is a very serious business: it tampers with the roots of the technology tree, going back to the cause of all technology, and that makes the future far more uncertain.
Now the best news: all the sessions were recorded, and the audio from the whole weekend is now online, so you can listen to everything. The recordings can be a tad confusing, since the talks were given with accompanying slides and this is audio only, but I expect video will follow in a while.
And here is the rest of the conference: Singularity Summit 2007 | The Singularity Institute for Artificial Intelligence