I've created a new website for my ebook Facing the Intelligence Explosion:
Sometime this century, machines will surpass human levels of intelligence and ability, and the human era will be over. This will be the most important event in Earth’s history, and navigating it wisely may be the most important thing we can ever do.
Luminaries from Alan Turing and Jack Good to Bill Joy and Stephen Hawking have warned us about this. Why do I think they’re right, and what can we do about it?
Facing the Intelligence Explosion is my attempt to answer those questions.
This is the dedicated discussion page for Facing the Intelligence Explosion.
If you'd like to comment on a particular chapter, please give the chapter name at the top of your comment so that others can more easily follow the discussion. For example:
Re: From Skepticism to Technical Rationality
Here, Luke neglects to mention that...
No doubt evolution is a simplified rule set, but in empirical tests, as well as in historical interpretations of data, it has many failings that, as Luke has pointed out for certain creationists, evolutionary believers shy away from, hiding in self-deception in order to keep their beliefs safe.
But this is not a post about creation/evolution. My point was that his use of creationists was a poor choice because (a) creationism is believed by a majority of Americans, who will thus be turned off from his main point, and (b) the claim that the matter is settled scientifically is dubious, since origins science is more interpretation than demonstrable fact, and both sides of that debate have strong ideological reasons to believe and scientific reasons to doubt, which they ignore.
Can people who believe in a God that benevolently created us and watches over us even come to consider the possibility of existential dangers or a human-steered Singularity? Frankly, if they are creationists, I think they are largely irrelevant to a Singularity discussion until they shed such beliefs.