sfb comments on Tallinn-Evans $125,000 Singularity Challenge - Less Wrong

Post author: Kaj_Sotala 26 December 2010 11:21AM 27 points

Comment author: Quirinus_Quirrell 01 January 2011 11:12:03PM 10 points

I'm beginning to think that LW needs some better mechanism for dealing with the phenomenon of commenters who are polite, repetitive, immune to all correction, and consistently wrong about everything.

The problem is quite simple. Tim, and the rest of the class of commenters to which you refer, simply haven't learned how to lose. This can be fixed by making it clear that this community's respect is contingent on retracting any inaccurate positions. Posts in which people announce that they have changed their mind are usually upvoted (in contrast to other communities), but some people don't seem to have noticed.

Therefore, I propose adding a "plonk" button on each comment. Pressing it would hide all posts from that user for a fixed duration, and also send them an anonymous message (red envelope) telling them that someone plonked them, which post they were plonked for, and a form letter reminder that self-consistency is not a virtue and a short guide to losing gracefully.
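
As a rough illustration of the mechanism being proposed, here is a minimal sketch. The class and function names, the two-week duration, and the message-sending hook are all hypothetical; nothing here describes an existing LW feature.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical fixed hiding period; the proposal only says "a fixed duration".
PLONK_DURATION = timedelta(days=14)

FORM_LETTER = ("Reminder: self-consistency is not a virtue. "
               "See the short guide to losing gracefully.")

@dataclass
class PlonkService:
    # (reader, author) -> time at which the author's posts reappear for that reader
    plonked_until: dict = field(default_factory=dict)

    def plonk(self, reader: str, author: str, comment_id: int, send_message) -> None:
        """Hide `author`'s posts from `reader` and notify `author` anonymously."""
        self.plonked_until[(reader, author)] = datetime.utcnow() + PLONK_DURATION
        # Anonymous "red envelope": names the offending comment, not the plonker.
        send_message(to=author,
                     body=f"You were plonked for comment {comment_id}.\n{FORM_LETTER}")

    def is_hidden(self, reader: str, author: str) -> bool:
        """True while a plonk by `reader` against `author` is still in effect."""
        until = self.plonked_until.get((reader, author))
        return until is not None and datetime.utcnow() < until
```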

Comment author: sfb 20 January 2011 05:17:43PM * 2 points

and a form letter reminder that self-consistency is not a virtue [..] making it clear that this community's respect is contingent on [..]

Is changing one's professed beliefs without understanding or agreeing with the new position, merely to gain community respect, a virtue?

Tim, and the rest of the class of commenters to which you refer, simply haven't learned how to lose.

Or perhaps he still isn't convinced that he is wrong by the time you have exhausted your patience for explaining, so you give up and decide he must be broken. Your proposed 'solution' is a hack that lets you stop trying to convince him while still having him act convinced for the sake of appearances. Maybe you are simply assuming far shorter inferential distances than actually exist?