I've created a new website for my ebook Facing the Intelligence Explosion:
Sometime this century, machines will surpass human levels of intelligence and ability, and the human era will be over. This will be the most important event in Earth’s history, and navigating it wisely may be the most important thing we can ever do.
Luminaries from Alan Turing and Jack Good to Bill Joy and Stephen Hawking have warned us about this. Why do I think they’re right, and what can we do about it?
Facing the Intelligence Explosion is my attempt to answer those questions.
This is the dedicated discussion page for Facing the Intelligence Explosion.
If you'd like to comment on a particular chapter, please give the chapter name at the top of your comment so that others can more easily follow the context. For example:
Re: From Skepticism to Technical Rationality
Here, Luke neglects to mention that...
[A separate issue from my previous comment] There are two reasons I can give to rationalize my doubts about the probability of an imminent Singularity.

One is that if humans are less than 100 years away from it, then in a universe as big and old as ours I would expect that a Singularity-type intelligence would already have been developed somewhere else. In that case, I would expect that either we would be able to detect it or we would be living inside it. Since we can't detect an alien Singularity, and since the problem of evil suggests we are probably not living inside a friendly AI, I doubt the pursuit of friendly AI is going to be very fruitful.

The second reason is that while we will probably design computers superior to our general intellectual abilities, I judge it extremely unlikely that we will design robots as physically versatile as the life that 4 billion years of evolution has produced.