I've created a new website for my ebook Facing the Intelligence Explosion:
Sometime this century, machines will surpass human levels of intelligence and ability, and the human era will be over. This will be the most important event in Earth’s history, and navigating it wisely may be the most important thing we can ever do.
Luminaries from Alan Turing and Jack Good to Bill Joy and Stephen Hawking have warned us about this. Why do I think they’re right, and what can we do about it?
Facing the Intelligence Explosion is my attempt to answer those questions.
This page is the dedicated discussion page for Facing the Intelligence Explosion.
If you'd like to comment on a particular chapter, please give the chapter name at the top of your comment so that others can more easily follow it. For example:
Re: From Skepticism to Technical Rationality
Here, Luke neglects to mention that...
This comment confuses me.
The point of the excerpt you quote has nothing to do with income at all; the point is that (for example) if I have $100 budgeted for charity work, and I'm willing to spend $50 of that to save 2,000 birds, then I ought to be willing to spend $75 of that to save 10,000 birds, because the second deal saves more birds per dollar (10,000/75 > 2,000/50). But in fact many people are not.
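The arithmetic behind that comparison can be sketched in a few lines; the helper name here is my own, and the numbers are the ones from the comment above:

```python
# Scope-sensitivity check: compare birds saved per dollar for two offers.

def birds_per_dollar(birds: int, cost: float) -> float:
    """Return how many birds each dollar of the offer saves."""
    return birds / cost

small_offer = birds_per_dollar(2_000, 50)    # 40 birds per dollar
large_offer = birds_per_dollar(10_000, 75)   # ~133 birds per dollar

# If $50 for 2,000 birds was worth taking, the larger offer is a strictly
# better deal, so a consistent donor should take it too.
print(large_offer > small_offer)  # True
```

The point is only about consistency: whatever value you place on a saved bird, the $75 offer delivers more of it per dollar than the $50 offer you already accepted.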
Of course, the original point depends on the assumption that the value of N birds scales at least somewhat linearly. If I've concluded that 2000 is an optimal breeding population and I'm building an arcology to save animals from an impending environmental collapse, I might well be willing to spend a lot to save 2,000 birds and not much more to save 20,000 for entirely sound reasons.
If I budgeted $100 for charity work and decided saving birds was the best use of my money, then I would just give the whole hundred. If I later hear that more birds need saving, I will feel bad. But I won't give more.