In response to I'd take it
Comment author: Laura__ABJ 02 July 2008 04:45:33PM 0 points

Ha Ha Ha...

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 05:37:36PM 0 points

I realize that just because I am fairly confident I wouldn't suffer terribly from killing the baby if my knowledge were fairly complete, I can't say that for all people. People's utility functions differ, as do their biological and learned aversions to certain types of violence. The cognitive dissonance created by being presented with such a situation might be too great for some, causing them to break down psychologically and rationalize their way out of the decision any way they could. What if we upped the stakes and took it from some anonymous baby being painlessly snuffed out to your own adult child being tortured?

Look at our dear friend C-. I was not thinking about him when I wrote my last post, but for those of you who know the situation, he seems to be the embodiment of this dilemma. What becomes of the man who, knowing the gravity of the situation and the most likely outcome, still decides NOT to kill the baby???

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 03:18:00PM 0 points

Hmm... This whole baby-killing example is making me think...

Knecht: "Even if I thought it probably would substantially increase the future happiness of humanity, I still wouldn't do it without a complete explanation. Not because I think there is a moral fabric to the universe that says killing babies is wrong, but because I am hardwired to have an extremely strong aversion to killing babies."

This does seem like what a true amoralist might say... yet, what if the *idea* of having forgone the opportunity to substantially increase the future happiness of humanity would haunt you for the rest of your existence, which will be quite long... Then the amoralist might indeed decide that the comparative pain of killing the baby was less than suffering this protracted agony.

Andy: "It's all well and good to speak of utility, but next time, it could be you! How does it come to be that each individual has forfeited control over her/his own destiny? Is it just part of 'the contract'?"

From how I feel about the world and the people in it now, I would hope I would have the strength to accept my fate and die, if die I must... However, since I really don't believe there is anything 'after,' all utility would drop to 0 if I were to die. On the other hand, I think I might very well be tortured for the rest of my existence by the knowledge that my existence was the source of torture to so many. This would be negative utility. I can conceive of not wanting to live anymore. I honestly can't say what I would do if asked to make this sacrifice. What would *you* do, if it were *your* life the AI asked you to end?

Laura: "I would need to be pretty fucking sure it really was both friendly and able to perform such calculations before I would kill anyone at its command."

I know I wrote this, but I've been thinking about it. Generally this is true, but we mustn't rationalize inaction by insufficiency of data when, probabilistically, we have very good reason to believe in the correctness of a conclusion. Be a man, or an adult rather, and take responsibility for the possibility that you may be wrong.

Maybe this is what it is to be a Man/Woman. This is why I was so very impressed with Leonidas and his wife- their ability to make very difficult, unsavory decisions with very good reason to believe they were correct, but still in the face of uncertainty... Leonidas flouted fate... his wife, society. Which was more difficult?

OTOH- We can think of King Agamemnon and his ultimate sacrifice of his daughter Iphigenia, demanded by the gods in order to get the ships to set sail. While he clearly *HAD* to do this under the premise that he *should* go to war with Troy, Greek literature seems to be highly critical of this decision and of whether the war should ever have been fought... If our 'super-intelligent,' 'friendly' AI were but the Greek gods unto us, I don't think I would want to be at its moral mercy... I am not a toy.

The Greeks really did get it all right. There have been so few insights into human nature since...

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 03:31:51AM 0 points

Andy- I agree with your skepticism. I was taking for granted that the AI in the scenario was correct in its calculation, since I am 'convinced that it is friendly' but yes, I would need to be pretty fucking sure it really was both friendly and able to perform such calculations before I would kill anyone at its command.

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 02:22:46AM 0 points

Andy: "I can't claim that there's an objective reason to value individual rights so highly, but it is a fact that I do."

Hal: "You argue and ask if there isn't some other way to do it, and the FAI explains that every other alternative will involve much greater human suffering."

These things seem grossly disproportionate. Do you really believe utility(individual rights of one person) >>> utility(ending great human suffering)?

Andy- A man who is on the brink of death has a key to a safe deposit box in which there is an asthma inhaler. He owns both the inhaler and the safe deposit box. His son's son is having a very serious asthma attack that might lead to death. Since said man currently hates his son, he decides not to tell him where the key is- it's his property, and he doesn't have to. "Call an ambulance and wait," he tells his son. You know where the key is. Do you steal it?

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 01:11:55AM 0 points

Hal Finney-

I probably wouldn't have argued that much with the AI... I've done things I've personally found more morally questionable, since I didn't have quite as good a reason to believe I was right about the outcome... Morally lucky, I was.

In response to The Moral Void
Comment author: Laura__ABJ 30 June 2008 01:12:20PM 0 points

Eliezer: "Go ahead. Indulge your fantasy. Would you want the stone tablet to say people should die of old age, or that people should live as long as they wanted? If you could write the stone tablet yourself, what would it say?"

Excellent way of putting it... I would certainly want the option of living as long as I liked. (Though I find it worth noting that when I was depressed, I found the idea of needing to choose when to end the program abhorrent, since I figured I could go several billion years in agony before making such a choice... Many people you talk to about the meaning of death may be longing for it *now*. Excellent Murakami story on the subject- 'Super-Frog Saves Tokyo,' though I forget which collection it's in. Many people are very dissatisfied with their lives.)

Some of my problems with the 'positive singularity' do not involve uploading proper, but the kind of parameter manipulation in which anything similar to our current identity may get completely lost... I'm also not quite past the problem of 'is your clone really you?'... or what we do with our physical selves after the upload... All seem troubling to me.

Comment author: Laura__ABJ 30 June 2008 03:36:00AM 0 points

Also Mike- the first portion of your argument was written in such a confusing manner that I had to read it twice, and I know the way you argue... I don't know if anyone who didn't already know what you were talking about would have kept reading.

Comment author: Laura__ABJ 30 June 2008 03:31:00AM 0 points

Michael- I have repeatedly failed to understand why this upsets you so much, though it clearly does. It's hard for me to see why I should care if the AI does a pretty fireworks display for 10 seconds or 10,000 years. Perhaps you need to find more intuitive ways of explaining it. A better analogy? At some points you just seem like a mystic to me...

Comment author: Laura__ABJ 29 June 2008 07:33:00PM 1 point

Wow- far too much self-realization going on here... Just to provide a data point: when I was in high school, I convinced an awkward, naive, young Catholic boy who had a crush on me of just this point... He attempted suicide that day.

....

For follow up, he has been in a very happy exclusive homosexual relationship for the past three years.

Maybe I didn't do such a bad thing...
