In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 03:18:00PM 0 points [-]

Hmm... This whole baby-killing example is making me think...

Knecht: "Even if I thought it probably would substantially increase the future happiness of humanity, I still wouldn't do it without a complete explanation. Not because I think there is a moral fabric to the universe that says killing babies is wrong, but because I am hardwired to have an extremely strong aversion to things like killing babies."

This does seem like what a true amoralist might say... yet, what if the *idea* of having forgone the opportunity to substantially increase the future happiness of humanity would haunt you for the rest of your existence, which will be quite long... Then the amoralist might decide indeed that the comparative pain of killing the baby was less than suffering this protracted agony.

Andy: "It's all well and good to speak of utility, but next time, it could be you! How does it come to be that each individual has forfeited control over her/his own destiny? Is it just part of 'the contract'?"

From how I feel about the world and the people in it now, I would hope I would have the strength to accept my fate and die, if die I must... However, since I really don't believe there is anything 'after,' all my utility would drop to 0 if I were to die. And I think I might very well be tortured for the rest of my existence by the knowledge *that* my existence was the source of torture to so many. That would be negative utility. I can conceive of not wanting to live anymore. I honestly can't say what I would do if asked to make this sacrifice. What would *you* do, if it were *your* life the AI asked you to end?

Laura: "I would need to be pretty fucking sure it really was both friendly and able to perform such calculations before I would kill anyone at its command."

I know I wrote this, but I've been thinking about it. Generally this is true, but we mustn't rationalize inaction by insufficiency of data when probabilistically we have very good reason to believe in the correctness of a conclusion. Be a man, or an adult rather, and take responsibility for the possibility that you may be wrong.

Maybe this is what it is to be a Man/Woman. This is why I was so very impressed with Leonidas and his wife: their ability to make very difficult, unsavory decisions with very good reasons to believe they were correct, but still in the face of uncertainty... Leonidas flouted fate; his wife, society. Which was more difficult?

OTOH- We can think of King Agamemnon and his ultimate sacrifice of his daughter Iphigenia, demanded by the gods in order to get the ships to set sail. While he clearly *HAD* to do this under the premise that he *should* go to war with Troy, Greek literature seems highly critical of this decision and of whether the war should ever have been fought... If our 'super-intelligent,' 'friendly' AI were but the Greek gods unto us, I don't think I would want to be at its moral mercy... I am not a toy.

The Greeks really did get it all right. There have been so few insights into human nature since...

In response to The Opposite Sex
Comment author: Laura__ABJ 01 July 2008 02:05:00PM 0 points [-]

Michael-

I was the top student in my geometry class, thank you very much! I was discussing non-Euclidean geometry and mathematics in alternate bases in 4th grade. And I think I'm a fairly competent sketch artist... That you would try to explain *my* non-desire to run in front of traffic as a function of *my* gender was just... appalling.

In response to The Opposite Sex
Comment author: Laura__ABJ 01 July 2008 04:42:00AM 2 points [-]

Coward: 'If I'd been a woman, I would probably have won a few $100,000 contracts that year, and would now be wealthy. In the SBIR program, grant applications from women-owned companies go to the head of the line, and receive extremely favorable treatment.'

Eliezer! He has it! YOU NEED BABES! Yes! Lots of babes to get you money! To save the world! Why haven't you done this yet??? And you call yourself a rationalist...

I'm joking. Though, in all honesty, there is some truth to this, risky business though it is. I did go to a party for a friend to flirt his investors (who were technical and liked smart women) into being more comfortable... I don't know if I got him any more money that night, but it probably made him look good.

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 03:31:51AM 0 points [-]

Andy- I agree with your skepticism. I was taking for granted that the AI in the scenario was correct in its calculation, since I am 'convinced that it is friendly' but yes, I would need to be pretty fucking sure it really was both friendly and able to perform such calculations before I would kill anyone at its command.

In response to The Opposite Sex
Comment author: Laura__ABJ 01 July 2008 02:47:00AM 3 points [-]

Angel: "The political is the personal. When somebody raises the ugly head of sex stereotypes, my logic and my reason are offended, but the rest of me is flinching back from the endless, historical and ongoing carnival of ugly, cruel things that that sort of thinking is intimately linked with in women's experience."

This I sympathize with. A couple of things most men might not be in on-

What percentage of women that you know have been offered the option of trading sexual favors for career advancement?
My conservative estimate: 20%

What percentage of women that you know have performed sexual favors for money, power, or other material gain?
My conservative estimate: 15%

What percentage of women that you know have been sexually violated?
My conservative estimate: 25%

Question- Would you personally ever consider dating a woman who had sold sexual favors?

The information that you, generally good, guys may not be privy to is just HOW BAD women really have it in this realm... They won't necessarily tell you...

Also- What we're dealing with:
A good friend of mine and I were crossing the street in a foreign country. He had previously characterized me as one of the smartest, if not the smartest, women he had ever met. He plowed ahead in front of the traffic; I waited for the light to turn green. When I got to the other side, he shook his head and said, 'women and their decreased spatial abilities...'
I yelled at him, and he said he was just kidding... Was he? This is the name of the game. This is what we're up against. This is why so many women feel they are at war...

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 02:22:46AM 0 points [-]

Andy: "I can't claim that there's an objective reason to value individual rights so highly, but it is a fact that I do."

Hal: "You argue and ask if there isn't some other way to do it, and the FAI explains that every other alternative will involve much greater human suffering."

These things seem grossly disproportionate. Do you really believe utility(individual rights of one person) >>> utility(ending great human suffering)?

Andy- A man who is on the brink of death has a key to a safe deposit box in which there is an asthma inhaler. He owns both the inhaler and the safe deposit box. His son's son is having a very serious asthma attack that might lead to death. Since said man currently hates his son, he decides not to tell him where the key is, since it's his property and he doesn't have to. "Call an ambulance and wait," he tells his son. You know where the key is. Do you steal it?

In response to The Moral Void
Comment author: Laura__ABJ 01 July 2008 01:11:55AM 0 points [-]

Hal Finney-

I probably wouldn't have argued that much with the AI... I've done things I've personally found more morally questionable, since I didn't have quite as good a reason to believe I was right about the outcome... Morally lucky, I was.

In response to The Moral Void
Comment author: Laura__ABJ 30 June 2008 01:12:20PM 0 points [-]

Eliezer: "Go ahead. Indulge your fantasy. Would you want the stone tablet to say people should die of old age, or that people should live as long as they wanted? If you could write the stone tablet yourself, what would it say?"

Excellent way of putting it... I would certainly want the option of living as long as I liked. (Though I find it worth noting that when I was depressed, I found the idea of needing to choose when to end the program abhorrent, since I figured I could go several billion years in agony before making such a choice... Many people you talk to about the meaning of death may be longing for it *now*. There's an excellent Murakami story on the subject- in the collection 'Super Frog Saves Tokyo,' but I forgot the title. Many people are very dissatisfied with their lives.)

Some of my problems with the 'positive singularity' do not involve uploading proper, but the parameter manipulation in which anything resembling our current identity may get completely lost... I'm also not quite past the problem of 'is your clone really you?'... or of what we do with our physical selves after the upload... All of these seem troubling to me.

Comment author: Laura__ABJ 30 June 2008 03:36:00AM 0 points [-]

Also, Mike- the first portion of your argument was written in such a confusing manner that I had to read it twice, and I know the way you argue... I don't know if anyone who didn't already know what you were talking about would have kept reading.

Comment author: Laura__ABJ 30 June 2008 03:31:00AM 0 points [-]

Michael- I have repeatedly failed to understand why this upsets you so much, though it clearly does. It's hard for me to see why I should care if the AI does a pretty fireworks display for 10 seconds or 10,000 years. Perhaps you need to find more intuitive ways of explaining it. A better analogy? At some points you just seem like a mystic to me...
