shokwave comments on Secrets of the eliminati - Less Wrong

93 points. Post author: Yvain, 20 July 2011 10:15AM


Comment author: Vaniver 18 August 2011 04:47:19PM 0 points

> For example, if it develops some diet drug that lets you safely enjoy eating and still stay skinny and beautiful, that might be a better result than you could provide for yourself, and it doesn't need any special understanding of you to make that happen.

It might not need special knowledge of my psychology, but it certainly needs special knowledge of my physiology.

But notice that the original point was about human preferences. Even if it provides new technologies that dissolve internal conflicts, the question of whether or not to use the technology becomes a conflict itself. Remember, we live in a world where some people have strong ethical objections to vaccines. An old psychological finding is that giving people more options often makes them worse off. If the AI notices that one of my modules enjoys sensory pleasure, offers to wirehead me, and I reject it on philosophical grounds, I could easily become consumed by regret or struggles with temptation, and wish that I had never been offered wireheading in the first place.

> Putting an inferior argument first is good if you want to try to get the last word, but it's not a useful part of problem solving. You should try to find the clearest problem where solving that problem solves all the other ones.

I put the argument of internal conflicts first because it was the clearest example, and you'll note it obliquely refers to the argument about status. Did you really think that, if a drug were available to make everyone have perfectly sculpted bodies, one would get the same social satisfaction from that variety of beauty?

> If it can do a reasonable job of comparing utilities across people, then maximizing average utility seems to do the right thing here.

I doubt it can measure utilities, as I argued two posts ago; and simple average utilitarianism is so wracked with problems that I'm not even sure where to begin.

> Comparing utilities between arbitrary rational agents doesn't work, but comparing utilities between humans seems to -- there's an approximate universal maximum (getting everything you want) and an approximate universal minimum (you and all your friends and relatives getting tortured to death).

A common tactic in human interaction is to care about everything more than the other person does, and explode (or become depressed) when they don't get their way. How should such real-life utility monsters be dealt with?

> Status conflicts are not one of the interesting use cases.

Why do you find status uninteresting?

Comment author: NancyLebovitz 18 August 2011 05:37:11PM 2 points

I haven't heard of people having strong ethical objections to vaccines. They have strong practical (if ill-founded) objections: they believe vaccines have dangers so extreme as to make the benefits not worth it. Or they have strong heuristic objections: I think they believe health is an innate property of an undisturbed body, or that anyone who makes money from selling a drug can't be trusted to tell the truth about its risks.

To my mind, an ethical objection would be a belief that people should tolerate the effects of infectious diseases for some reason such as that suffering is good in itself or that it's better for selection to enable people to develop innate immunities.

Comment author: soreff 18 August 2011 06:01:42PM 4 points

> To my mind, an ethical objection would be a belief that people should tolerate the effects of infectious diseases for some reason such as that suffering is good in itself

That wasn't precisely the objection of Christian conservatives to the HPV vaccine (perhaps it was more nearly that they wanted sex to lead to suffering?), but it is fairly close.

Comment author: Vaniver 18 August 2011 07:02:40PM 1 point

I am counting religious objections as ethical objections, and there are several groups out there that refuse all medical treatment.