I've created a new website for my ebook Facing the Intelligence Explosion:
Sometime this century, machines will surpass human levels of intelligence and ability, and the human era will be over. This will be the most important event in Earth’s history, and navigating it wisely may be the most important thing we can ever do.
Luminaries from Alan Turing and Jack Good to Bill Joy and Stephen Hawking have warned us about this. Why do I think they’re right, and what can we do about it?
Facing the Intelligence Explosion is my attempt to answer those questions.
This page is the dedicated discussion page for Facing the Intelligence Explosion.
If you'd like to comment on a particular chapter, please give the chapter name at the top of your comment so that others can more easily follow the discussion. For example:
Re: From Skepticism to Technical Rationality
Here, Luke neglects to mention that...
From Scope Insensitivity:
Now, I haven't read the paper, but this implies there is only one charity doing the asking. First they ask how much you would give to save 2,000 birds. You say, "$100." Then they ask you the same thing again, just changing the number. You still say, "$100. It's all I have." So what's wrong with that?
Agreed: if I assume there's a hard upper limit externally imposed on those answers (e.g., that I only have $80, $78, and $88 to spend in the first place, and that even the least valuable of the three choices is worth more to me than everything I have to spend), then those answers don't demonstrate interesting scope insensitivity.
There's nothing wrong with that conclusion, given those assumptions.