Roko comments on A Suite of Pragmatic Considerations in Favor of Niceness - Less Wrong

82 points · Post author: Alicorn 05 January 2010 09:32PM


Comments (183)


Comment deleted 06 January 2010 02:36:06PM
Comment author: Corey_Newsome 06 January 2010 03:52:48PM 7 points

Not for us 'average accuratarians'.

Comment author: RichardKennaway 06 January 2010 10:11:03PM 8 points

Why are you assuming that being nice must decrease accuracy? Several of the points that Alicorn mentioned increase accuracy.

Comment author: ciphergoth 06 January 2010 10:52:28PM 21 points

Strongly agreed. It's been observed elsewhere on LW that sarcasm can hide gaping logical flaws in an argument that a straightforward statement would immediately reveal. More than once I have found myself forced to be more intellectually rigorous because I was unable to cover the gaps in my argument with dripping scorn.

Consider P Z Myers' scorn for transhumanists - I'd love to hear his "being nice" argument against us, and I think he'd have a much harder time sounding convincing.

Comment deleted 07 January 2010 10:16:35AM
Comment author: ciphergoth 07 January 2010 10:42:32AM 3 points

It's possible that this is a disagreement on the meaning of "effortful niceness". As I've said, I find that it is often more effort to express a point straightforwardly than to express it meanly, because I'm forced to think through the argument more carefully. I'm guessing that doesn't fall under the banner of "effortful niceness" for you, even though it involves effort and makes the comment nicer? Could you give an example of counterproductive effortful niceness?

Comment author: Eliezer_Yudkowsky 06 January 2010 06:32:12PM 2 points

If it's literally 99%, then maybe maybe. If it's actually more like 95%, then hell no.

Comment author: ciphergoth 06 January 2010 08:02:06PM 11 points

I can't extract any meaning from these percentages. Well over 99% of an ordinary person's beliefs are true, because they are about prosaic, uncontroversial things like "I have fingernails".

Comment author: MichaelVassar 06 January 2010 11:42:44PM 1 point

I can't either, but my basic reaction is simply that in practice purity is critical here. If, in order to act correctly, a person needs to do more than 70 cognitive things correctly, then their expected value falls by roughly half for every percentage point of per-step error rate.
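The arithmetic behind this claim can be checked directly. If an outcome requires getting each of roughly 70 independent steps right, the chance of getting all of them right is p^70, and each additional 1% of per-step error rate roughly halves that chance. A minimal sketch (the independence assumption and the figure of 70 steps are taken from the comment, not established facts):

```python
def all_correct(p: float, n: int = 70) -> float:
    """Probability of completing all n independent steps when each
    step succeeds with probability p."""
    return p ** n

# Each extra 1% per-step error rate roughly halves the result:
print(all_correct(0.99))  # ~0.495
print(all_correct(0.98))  # ~0.243
print(all_correct(0.95))  # ~0.028
```

This is why 99% vs. 95% per-step accuracy can look like a qualitative rather than a quantitative difference when many steps compound: 0.99^70 is about half, while 0.95^70 is under 3%.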

Comment author: dansmith 07 January 2010 12:19:13AM 0 points

Assuming any action anywhere short of optimal results in zero value, sure. In practice?

Comment author: MichaelVassar 07 January 2010 01:05:19AM 0 points

In practice, if you are only talking about the 70 most important steps that people are prone to messing up, that could easily be correct. Not to mention the probability of doing harm. Certainly there are a lot more than 10 steps that people are prone to messing up which reduce value by more than 80% in practice.

Comment author: dansmith 07 January 2010 01:27:11AM 1 point

I suppose it depends what kinds of decisions you're talking about making (e.g., keeping AIs from destroying humanity). I was thinking along the lines of day-to-day decision making, in which people generally manage to survive for decades in spite of ridiculously flawed beliefs, so it seems there are lots of situations where performance doesn't degrade nearly so sharply.

At any rate, I guess I'm with ciphergoth: the more interesting question is why 99% accurate is "maybe maybe" okay, but 95% is "hell no". Where do those numbers come from?

Comment author: Eliezer_Yudkowsky 07 January 2010 02:09:34AM 1 point

Someone who gets it 99% right is useful to me, someone who gets it 95% right is so much work to deal with that I usually don't bother.

Comment author: PhilGoetz 07 January 2010 02:51:50AM 3 points

No one gets it 99% right. (Modulo my expectation that we are speaking only of questions of a minimal difficulty; say, at least as difficult as the simplest questions that the person has never considered before.)

When I was a cryptographer, an information source with a .000001% bulge (information content above randomness) would break a code wide open for me. Lack of bias was much more important than % right.
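As a rough statistical illustration of why so tiny a bulge is exploitable (a back-of-envelope sketch, not PhilGoetz's actual method): distinguishing a source whose bits are biased by a fraction ε away from fair requires on the order of 1/ε² samples, since the standard deviation of the sample mean shrinks as 1/√n. The detection threshold of 3 standard deviations below is an arbitrary assumption for illustration:

```python
def samples_to_detect(bias: float, z: float = 3.0) -> float:
    """Rough number of bits needed to detect a fractional bias away
    from 0.5 at about z standard deviations of confidence.
    Std of the sample mean of fair bits is 0.5/sqrt(n); we need
    z * 0.5/sqrt(n) <= bias, i.e. n >= (z / (2*bias))**2."""
    return (z / (2 * bias)) ** 2

# A .000001% (1e-8) bulge is detectable in principle, but only
# with an enormous corpus:
print(f"{samples_to_detect(1e-8):.3g}")  # → 2.25e+16
```

The unbiasedness point follows from the same picture: a systematic bias shifts the mean itself, and no amount of extra data averages it away, whereas mere noise does wash out as n grows.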

Comment author: Tyrrell_McAllister 07 January 2010 03:30:14AM 1 point

When I was a cryptographer, an information source with a .000001% bulge (information content above randomness) would break a code wide open for me.

In that case, a second information source of that quality wouldn't have been that much use to you.

The first person who gets it 95% right would be very valuable. But there are diminishing returns.

Comment author: Cyan 07 January 2010 03:04:47AM 1 point

From a curious non-cryptographer: what size of corpus are you talking about here?

Comment author: bentarm 06 January 2010 11:54:19PM 0 points

I was thinking exactly the same thing. I have literally no idea what 'percentage' of the things I believe are true, and certainly wouldn't be willing to put a figure on what percentage is acceptable.

Comment author: Bongo 06 January 2010 04:41:32PM 0 points

Repugnant conclusion: 3^^^3 people with 0.00001% accurate beliefs are worth more than 100000 people with 100% accurate beliefs?