PlaidX comments on What bothers you about Less Wrong? - Less Wrong

Post author: Will_Newsome 19 May 2011 10:23AM


Comment author: PlaidX 19 May 2011 06:16:22PM 13 points

Creepily heavy reliance on torture-based what-if scenarios.

Comment author: cousin_it 19 May 2011 06:30:39PM 15 points

If you try to do moral philosophy, you inevitably end up thinking a lot about people getting run over by trolleys and such. Also if you want to design good chairs, you need to understand people's butts really well. Though of course you're allowed to say it's a creepy job but still enjoy the results of that job :-)

Comment author: PlaidX 19 May 2011 09:52:03PM 4 points

I haven't read TOO much mainstream philosophy, but in what I have, I don't recall even a single instance of torture being used to illustrate a point.

Maybe that's what's holding them back from being truly rational?

Comment author: Dreaded_Anomaly 19 May 2011 07:18:49PM 4 points

One of the major goals of Less Wrong is to analyze our cognitive algorithms. When analyzing algorithms, it's very important to consider corner cases. Torture is an example of extreme disutility, so it naturally comes up as a test case for moral algorithms.

Comment author: PlaidX 19 May 2011 09:29:15PM 12 points

I've heard that before, and I grant that there's some validity to it, but that's not all that's going on here. 90% of the time, torture isn't even relevant to the question the what-if is designed to answer.

The use of torture in these hypotheticals generally seems to have less to do with ANALYZING cognitive algorithms, and more to do with "getting tough" on cognitive algorithms — grinding an axe, or just wallowing in self-destructive paranoia.

If the point you're making really only applies to torture, fine. But otherwise, it tends to read like "Maybe people will understand my point better if I CRANK MY RHETORIC UP TO 11 AND UNCOIL THE FIREHOSE AND HALHLTRRLGEBFBLE"

There are a number of things that make me not want to self-identify as a Less Wrong user, or to bring up Less Wrong with people who might otherwise be interested in it, and this is one of the big ones.

Comment author: Bongo 27 May 2011 11:21:22PM 0 points

"Maybe people will understand my point better if I CRANK MY RHETORIC UP TO 11 AND UNCOIL THE FIREHOSE AND HALHLTRRLGEBFBLE"

That's not necessarily even wrong. The higher the stakes, the more people will care about getting a winning outcome instead of merely being reasonable. It's a legitimate way to cut through the crap to real instrumental rationality. Eliezer uses it in his TDT paper (page 51):

... imagine a Newcomb's Problem in which a black hole is hurtling toward Earth, to wipe out you and everything you love. Box B is either empty or contains a black hole deflection device. Box A as ever transparently contains $1000. Are you tempted to do something irrational? Are you tempted to change algorithms so that you are no longer a causal decision agent, saying, perhaps, that though you treasure your rationality, you treasure Earth's life more?

Comment author: TimFreeman 20 May 2011 04:47:16PM 4 points

Creepily heavy reliance on torture-based what-if scenarios.

I agree. I wrote the article you're citing. I was hoping that by mocking it properly, it would go away.