
nerzhin comments on Strategic ignorance and plausible deniability - Less Wrong

Post author: Kaj_Sotala | 10 August 2011 09:30AM | 36 points




Comment author: nerzhin 10 August 2011 04:13:04PM 4 points

There are three things you could want:

  1. You could want the extra dollar. ($6 instead of $5)

  2. You could want to feel like someone who cares about others.

  3. You could genuinely care about others.

The point of the research in the post, if I understand it, is that (many) people want 1 and 2, and often the best way to get both those things is to be ignorant of the actual effects of your behavior. In my view a rationalist should decide either that they want 1 (throwing 2 and 3 out the window) or that they want 3 (forgetting 1). Either way you can know the truth and still win.

Comment author: atucker 10 August 2011 04:23:12PM 2 points

The problem with strategic ignorance is if the situation is something like 6/1 vs. 5/1000.

Most people care more about themselves than about others, but I think that at that ratio most people would choose to lose a dollar so the other person gains 999 more.
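The point can be made concrete with a toy model. Suppose (purely as an illustrative assumption, not anything from the original research) that a chooser maximizes their own payoff plus some small weight on the other person's payoff. Then a tiny weight on others is enough to flip the choice once the stakes for the other person get large:

```python
def preferred(option_a, option_b, w_other=0.01):
    """Return the option with higher toy utility.

    Each option is a (self_payoff, other_payoff) pair. The utility is
    self_payoff + w_other * other_payoff; the weight w_other is an
    illustrative assumption, not a measured quantity.
    """
    def utility(option):
        self_payoff, other_payoff = option
        return self_payoff + w_other * other_payoff
    return option_a if utility(option_a) >= utility(option_b) else option_b

# 6/1 vs 5/5: with only a 1% weight on others, the extra dollar wins.
print(preferred((6, 1), (5, 5), w_other=0.01))     # (6, 1)

# 6/1 vs 5/1000: the same 1% weight now flips the choice.
print(preferred((6, 1), (5, 1000), w_other=0.01))  # (5, 1000)
```

This is why strategic ignorance is fragile: whether the selfish option wins depends on the magnitude of the other player's payoff, which is exactly the number the strategically ignorant chooser has declined to learn.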

If you choose not to learn something, then you don't know what you're causing to happen, even when knowing would entirely change what you wanted to do.

Comment author: JackEmpty 10 August 2011 05:09:01PM 2 points

So it's not only strategic ignorance but selective ignorance too: the strategy only works when applied highly selectively.

If you know enough about the situation to know the payoffs are either 6/1 and 5/5, or 5/1 and 6/5, that's already a pretty clear distinction. You have quite a bit of knowledge, enough to narrow it down to just two possibilities.

But as you raised, it could be 6/1 & 5/5, or 6/1 & 5/1000 or 6/(.0001% increase of global existential risk) & 5/(.0001% increase of the singularity within your lifetime).

The implication of your point: if you don't know what's at stake, it's better to learn what's at stake.

Comment author: atucker 10 August 2011 05:11:10PM 0 points

Yeah, pretty much.