
RichardKennaway comments on Eudaimonic Utilitarianism - Less Wrong Discussion

Post author: Darklight, 04 September 2013 07:43PM




Comment author: RichardKennaway, 10 September 2013 06:45:02AM

any level of brain damage seems equally fine to me as long as the conscious experience stays the same.

What does that even mean? The lower castes in Brave New World are brain-damaged precisely so that their conscious experience will not be the same. A Delta has just enough mental capacity to be an elevator attendant.

However, I'm not a classical utilitarian, I don't believe it is important to fill the universe with intense happiness. I care primarily about reducing suffering

That is exactly what BNW does: blunting sensibility through surgery, conditioning, and drugs, to replace all suffering with bland contentment.

and wireheading would be one (very weird) way to do that. Another way would be Pearcean paradise engineering

My reading of that maze of links is that Pearcean paradise engineering is wireheading. It makes a nod here and there to "fulfilling our second-order desires for who and what we want to become", but who and what Pearce wants us to become turns out to be just creatures living in permanent bliss by means of fantasy technologies. What these people will actually be doing with their lives is not discussed.

I didn't explore the whole thing, but I didn't notice any evidence of anyone doing anything in the present day to achieve this empty vision other than talk about it. I guess I'm safe from the wireheading police for now.

and a third way would be through preventing new consciousness moments from coming into existence.

Kill every living creature, in other words.

The paradise engineering one seems to be the best starting point for compromising with people who have different values, but intrinsically, I don't have a preference for it.

But presumably, you do have a preference for those options collectively? Stunt everyone into contentment, wirehead them into bliss, or kill them all? But in another comment you say:

My terminal value is about doing something that is coherent/meaningful/altruistic

There doesn't seem to be any scope for that in the Pearcean scenario, unless your idea of what would be coherent/meaningful/altruistic to do is just to bring it about. But after Paradise, what?

Any opinion on this alternative?