Comment author: pleeppleep 16 February 2013 05:17:32AM 4 points [-]

If my inhibitions regarding a certain course of action seem entirely internal, I should go through with it, because I'm probably limiting my options for no good reason.

Comment author: Douglas_Knight 04 February 2013 09:22:36PM 5 points [-]

In the end, I think you interpreted pp's question correctly, but the pronoun "you" and the disclaimer "if you don't mind my asking" suggest that the question is how big your donation budget is. One answer to that is in the original post: $20k available for this offer. Another answer is here.

(The disclaimer seems to me to require a strong update, but my prior is even higher.)

Comment author: pleeppleep 05 February 2013 01:37:03AM 0 points [-]

You would be correct. Thanks for the link.

Comment author: pleeppleep 04 February 2013 05:52:39PM 1 point [-]

Day = Made

Comment author: pleeppleep 04 February 2013 05:37:31PM 4 points [-]

How much money do you have to donate, if you don't mind my asking?

Comment author: pleeppleep 31 January 2013 02:27:40PM 0 points [-]

Kinda awkward to say aloud. I think Institute for the Research of Machine Intelligence would sound better. Minor nitpick.

Comment author: pleeppleep 31 January 2013 01:28:37AM -2 points [-]

No

Comment author: pleeppleep 27 January 2013 12:14:39AM *  2 points [-]

This is a question about utilitarianism, not AI, but can anyone explain (or link to an explanation of) why reducing the total suffering in the world is considered so important? I thought we pretty much agreed that morality is based on moral intuitions, and it seems pretty counterintuitive to value the states of mind of people too numerous to sympathize with as highly as people here do.

It seems to me that reducing suffering as a numbers game is the kind of thing you would claim as your goal because it makes you sound like a good person, rather than something your conscience actually motivates you to do. But people here are usually pretty averse to conscious signaling, so I'm not sure that works as an explanation. I'm certain this has been covered elsewhere, but I haven't seen it.

Comment author: Qiaochu_Yuan 22 January 2013 01:49:23AM 3 points [-]

For all but the last one it seems like you'd need an in-depth knowledge of the gatekeeper's psyche and personal life.

Of course. How else would you know which horrible, horrible things to say? (I also have in mind things designed to get a more visceral reaction from the gatekeeper, e.g. graphic descriptions of violence. Please don't ask me to be more specific about this because I really, really don't want to.)

Comment author: pleeppleep 22 January 2013 01:51:40AM 1 point [-]

You don't have to be specific, but how would grossing out the gatekeeper bring you closer to escape?

Comment author: Qiaochu_Yuan 22 January 2013 01:29:58AM 4 points [-]

Finding such a movie clip sounds extremely unpleasant and I would need more of an incentive to start trying. (Playing the AI in an AI box experiment also sounds extremely unpleasant for the same reason.)

I know it sounds like I'm avoiding having to justify my assertion here, and... that's because I totally am. I suspect on general principles that most successful strategies for getting out of the box involve saying horrible, horrible things, and I don't want to get much more specific than those general principles because I don't want to get too close to horrible, horrible things.

Comment author: pleeppleep 22 January 2013 01:44:04AM 0 points [-]

Like when you say "horrible, horrible things". What do you mean?

Driving a wedge between the gatekeeper and his or her loved ones? Threats? Exploiting any guilt or self-loathing the gatekeeper feels? Appealing to the gatekeeper's sense of obligation by twisting his or her interpretation of authority figures, objects of admiration, and internalized sense of honor? Asserting cynicism and general apathy towards the fate of mankind?

For all but the last one it seems like you'd need an in-depth knowledge of the gatekeeper's psyche and personal life.

Comment author: Qiaochu_Yuan 22 January 2013 01:14:59AM *  9 points [-]

I'd prefer not to. If I successfully made my point, then I'd have posted exactly the kind of thing I said I wouldn't want to be known as being capable of posting.

Comment author: pleeppleep 22 January 2013 01:27:19AM 0 points [-]

But you wouldn't actually be posting it; you would be posting the fact that you consider it possible for someone to post it, which you've clearly already done.
