Ghatanathoah comments on Welcome to Heaven - Less Wrong
Comments (242)
This is one of the most horrifying things I have ever read. Most of the commenters have done a good job of poking holes in it, but I thought I'd add my take on a few things.
Some good and detailed explanations are here, here, here, here, and here.
No, the correct thing for an FAI to do is to use some resources to increase the number of beings and some to increase the utility of existing beings. You are assuming that creating new beings does not have diminishing returns. I find this highly unlikely. Most activities generate less value the more we do them. I don't see why this would change for creating new beings.
Having new creatures that enjoy life is certainly a good thing. But so is enhancing the life satisfaction of existing creatures. I don't think one of these things is categorically more valuable than the other. I think they are both incrementally valuable.
In other words, as I've said before, the question is not, "Should we maximize total utility or average utility?" It's "How many resources should be devoted to increasing total utility, and how many to increasing average utility?"
Wouldn't it be even more efficient to just create creatures that feel nothing except a vague preference to keep on existing, which is always satisfied?
Or maybe we shouldn't try to min-max morality. Maybe we should understand that phrases like "maximize pleasure" and "maximize preference satisfaction" are just rules of thumb that reflect a deeper and more complex set of moral values.
Again, you're assuming all enjoyments are equivalent and don't generate diminishing returns. Pleasure is valuable, but it has diminishing returns; you get more overall value by increasing lots of different kinds of positive things, not just pleasure.
If you're right, this is just proof that Christians are really bad at constructing Heaven. But I don't think you are; most Christians I know think Heaven is far more complex than just sitting around feeling good.
The alternative you suggest is a very good alternative. Creating all those blissful creatures would be a waste of valuable resources that could be used to enhance the preferences of already existing creatures. Again, creating new creatures is often a good thing, but it has diminishing returns.
Now for some rebuttals to your statements in the comments section:
Again, complex values and diminishing returns. Autonomy is good, but if an FAI can help us obtain some other values it might be good to cede a little of our autonomy to it.
It's immoral and illegal to force people to medicate for a reason. That being said, depression isn't a disease that changes what your desires are. It's a disease that makes it harder to achieve your desires. If you cured it you'd be better at achieving your desires, which would be a good thing. If a cure radically changed what your desires were it would be a bad thing.
That being said, I wouldn't necessarily object to rewiring humans so that we feel pleasure more easily, as long as it fulfilled two conditions: 1. The pleasure must have a referent. You have to do something to trigger the reward center in order to feel it; stimulating the brain directly would be bad. 2. The increase must be proportional. I should still enjoy a good movie more than a bad movie, even if I enjoy them both a lot more.
I don't think that it's ethical, or even possible, to take into account the hypothetical preferences of nonexistent creatures. That's not even a logically coherent concept. If a creature doesn't exist, then it doesn't have preferences. I don't think it's logically possible to prefer to exist if you don't already. Besides, as I said before, it would be even more efficient to create a creature that can't feel pleasure, that just has a vague preference to keep on existing, which would always be satisfied as long as it existed. But I doubt you would want to do that.
Besides, for every hypothetical creature that wants to exist and feel pleasure, there's another hypothetical creature that wants that creature to not exist, or to feel pain. Why are we ignoring those creatures' preferences?
No, it isn't. Selfishness is when you severely thwart someone's preferences to mildly enhance your own. It's not selfish to thwart nonexistent preferences, because they don't exist. That's like saying it's gluttonous to eat nonexistent food, or vain to wear nonexistent costume jewelry.
The reason some people find plausible the idea that you have to respect the preferences of all potential creatures is that they believe (correctly) that they have an obligation to make sure people who exist in the future will have satisfied preferences. But that isn't because nonexistent people's preferences have weight. It's because it's good for whoever exists at the moment to have highly satisfied preferences, so as soon as a creature comes into existence you have a duty to make sure it is satisfied. And the reason those people's preferences are highly satisfied should be that they are strong, powerful, and have lots of friends, not because they were genetically modified to have really, really unambitious preferences.
I want a universally friendly AI, but since nonexistent creatures don't exist in this universe, not creating them isn't universally unfriendly.
Also, I find it highly suspect, to say the least, that you start by arguing for "Heaven" because you think that all human desires can be reduced to the desire to feel certain emotions, but then, when the commenters have poked holes in that idea, you suddenly change and use a completely different justification (the logically incoherent idea that we have to respect the nonexistent preferences of nonexistent people) to defend it.
I find it helpful to think of having a copy as a form of life extension, except done serially instead of linearly. An exact duplicate of you who lives for 70 years is similar to your living an extra 70 years. So torturing everyone because they have duplicates would be equivalent to torturing someone for half their lifespan and then saying that it's okay because they still have half a lifespan left over.
Again, if these creatures exist somewhere else, then if you create them you aren't really creating them, you're extending their lifespan. Now, having a long lifespan is one way of having a high quality of life, but it isn't the only way, and it does have diminishing returns, especially when it's serial instead of linear, and you don't share your copy's memories. So it seems logical that, in addition to focusing on making people live longer, we should increase their quality of life in other ways, such as devoting resources to making them richer and more satisfied.