Wei_Dai comments on Exterminating life is rational - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (272)
1: Yes. 2: Yes. 3: No. 4: I see a number of reasons not to do straight multiplication:
This agrees with my intuitive judgment and also seems to have relatively few philosophical problems, compared to valuing everyone equally without any kind of discounting.
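The contrast between valuing everyone equally and proximity-style discounting can be sketched numerically. The discount function, decay rate, and distance values below are illustrative assumptions, not anything specified in the thread:

```python
# Hypothetical sketch: straight aggregation vs. proximity-discounted
# aggregation of welfare. All numbers are made-up for illustration.

def straight_sum(utilities):
    """Value everyone equally: just add up individual utilities."""
    return sum(utilities)

def proximity_discounted(utilities, distances, decay=0.5):
    """Weight each person's utility by decay**distance, so closer
    people (smaller distance) count for more."""
    return sum(u * decay**d for u, d in zip(utilities, distances))

utilities = [1.0, 1.0, 1.0]
distances = [0, 1, 2]  # e.g. self, close relative, distant stranger

print(straight_sum(utilities))                     # 3.0
print(proximity_discounted(utilities, distances))  # 1.0 + 0.5 + 0.25 = 1.75
```

Under straight summation a distant stranger counts exactly as much as a close relative; the discounted version shrinks that weight geometrically with distance.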
My last bullet above already answered this, but I'll repeat for clarification: it's both.
This should be clear from my answers above as well, but yes.
Oh, 'ello. Glad to see somebody still remembers the proximity argument. But it's adapted to our world where you generally cannot kill a million distant people to make one close relative happy. If we move to a world where Omegas regularly ask people difficult questions, a lot of people adopting proximity reasoning will cause a huge tragedy of the commons.
About Eliezer's question, I'd exchange my life for a reliable 0.001 chance of healing reality, because I can't imagine living meaningfully after being offered such a wager and refusing it. Can't imagine how I'd look other LW users in the eye, that's for sure.
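The wager above is an expected-utility comparison. A minimal sketch, with utility magnitudes that are purely assumed for illustration (the thread gives no numbers beyond the 0.001 probability):

```python
# Expected-utility sketch of the wager: trade one's life for a
# reliable 0.001 chance of "healing reality". The utility values
# are assumptions chosen only to make the comparison concrete.

p_heal = 0.001            # stated probability of success
u_healed = 10_000_000.0   # assumed utility of a healed reality
u_own_life = 1_000.0      # assumed utility of keeping one's own life

ev_accept = p_heal * u_healed  # life is forfeit either way if accepted
ev_refuse = u_own_life

print(ev_accept, ev_refuse)       # 10000.0 1000.0
print(ev_accept > ev_refuse)      # True under these assumed numbers
```

The conclusion flips entirely with the assumed ratio of the two utilities, which is the substantive disagreement in the thread, not the arithmetic.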
I publicly rejected the offer, and I don't feel like a pariah here. I wonder what the actual degree of altruism among LW users is. Should we set up a poll and gather some evidence?
Cooperation is a different consideration from preference. You can prefer only to keep your own "body" in certain dynamics, no matter what happens to the rest of the world, and still benefit the most from, roughly speaking, helping other agents. Which can include occasional self-sacrifice à la counterfactual mugging.
I'd be interested to know what you think of Critical-Level Utilitarianism and Population-Relative Betterness as ways of avoiding the repugnant conclusion and other problems.
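Critical-Level Utilitarianism scores a population as the sum of each person's utility minus a fixed positive critical level c, so adding lives with welfare below c lowers total value. A minimal sketch (the utility values and c are illustrative assumptions):

```python
# Hedged sketch of Critical-Level Utilitarianism (CLU): value is
# sum(u_i - c) over the population, with critical level c > 0.
# Lives barely worth living (0 < u < c) subtract from total value,
# which blocks the repugnant conclusion's huge-marginal-population world.

def clu_value(utilities, c=1.0):
    return sum(u - c for u in utilities)

small_happy = [10.0] * 5       # few people, high welfare
huge_marginal = [0.5] * 1000   # many people, lives barely worth living

print(clu_value(small_happy))    # (10 - 1) * 5 = 45.0
print(clu_value(huge_marginal))  # (0.5 - 1) * 1000 = -500.0
```

Under plain total utilitarianism (c = 0) the huge marginal population would win at 500 vs. 50; with a positive critical level it loses, which is exactly the move CLU makes against the repugnant conclusion.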