John_Maxwell_IV comments on Survey Results - Less Wrong

48 points · Post author: Yvain · 12 May 2009 10:09PM


Comment author: John_Maxwell_IV 13 May 2009 05:21:23AM 14 points

Can I make a pro-babyeater argument?

Here is a dialogue between an imaginary six-year-old child named Dennis and myself.

Me: Hi Dennis, do you like broccoli?

Dennis: No, I hate it!

Me: But it's good for you, right?

Dennis: I don't care! It tastes awful!

Me: Would you like to like broccoli?

Dennis: No, I can't stand broccoli! That stuff is gross!

Me: What if I told you some magic words that, if you said them, would make every piece of broccoli you ever ate taste just like chocolate? Would you say the magic words?

Dennis: Well...

Me: You like chocolate, don't you?

Dennis: Yes, but...

Me: What?

Dennis: Your questions are too hard.


I think everyone has conflicts between their different wants. I want to do well in my classes, but I don't want to study. And yet I can't think of any conflicts between my metawants: If I could choose to like studying just as much as I like my favorite computer game, I would make that choice. The wants offered to the humans in the babyeaters story seem fairly sensible from a utilitarian perspective. They promote peace throughout the galaxy and mean lots of fun for everyone. What's not to like?

Comment author: dclayh 14 May 2009 11:25:26PM 3 points

I wish someone would do a post on metawants. Personally I view them with deep suspicion.

Comment author: Alicorn 14 May 2009 11:52:12PM 1 point

What about metawants (a.k.a. second-order desires) do you want to see a post on?

Comment author: dclayh 15 May 2009 12:06:12AM 1 point

Well, their ontological, epistemological, and ethical statuses, for three. Specifically, how it's possible to want X and simultaneously want to not want X (while remaining more or less sane/rational). Whether metawants have any special status when making utilitarian ethical calculations. That sort of thing. Even the history of thought on the subject (e.g. Buddhism, where the stated (and only?) metawant is to eliminate all first-order wants).

Comment author: Alicorn 15 May 2009 12:34:15AM 1 point

I'll see what I can do. There was a fair bit about second-order desire in my self-knowledge class, and if people would be interested in a distillation/summary of it, I'll provide one.

Comment author: andrewc 14 May 2009 12:41:15AM 0 points

I get the argument, but I assign a high value to self-determination. Like Arthur Dent, I don't want my brain replaced (unless by choice), even if the new brain is programmed to be OK with being replaced. Which ending did you pick in Deus Ex 2? I felt guilty gunning down JC and his brother, but it seemed the least wrong (according to my preferences) thing to do.

Comment author: dclayh 14 May 2009 11:30:44PM 0 points

> I don't want my brain replaced (unless by choice)

A rather vacuous statement, no?

> I felt guilty gunning down JC and his brother, but it seemed the least wrong (according to my preferences) thing to do.

Isn't it a funny* quirk of human nature that we have qualms about behaving immorally in a sufficiently realistic simulation, yet can hear cold numbers about enormous real disutility (genocides, natural disasters, etc.) and feel nothing? That's speaking for myself, incidentally, not casting aspersions on you.

*(where by "funny" I mean "designed by a blind idiot god", naturally)

Comment author: John_Maxwell_IV 14 May 2009 10:42:35PM 0 points

> Like Arthur Dent, I don't want my brain replaced (unless by choice), even if the new brain is programmed to be ok with being replaced.

I don't think you're being very fair to your new brain. Do you?

I haven't played Deus Ex 2, sorry.