Comment author: Vladimir_Golovin 09 March 2009 08:31:52PM *  1 point [-]

Just installed it, works like a charm -- I'll keep it this way. If only it could hide the sidebar as well.

Also, thanks to this post, I'm now aware that there are posts other than those listed on the homepage! I have no prior experience with Reddit, so I assumed the front page was completely community-driven, but it turns out it shows 'PROMOTED' posts only!

I spotted the word 'anti-kibitzer' in the sidebar, but was unable to find the post on the front page -- that's how I discovered the 'NEW' and 'POPULAR' links under the site banner.

Comment author: Marcello 09 March 2009 09:04:16PM 2 points [-]

I've upgraded the LW anti-kibitzer so that it hides the taglines in the recent comments sidebar as well. (This is imperfect, because it also hides which post each comment was about, but doing better will have to wait until the server starts enclosing all the kibitzing pieces of information in nice tags.) No such hack was possible for the recent posts sidebar.

LessWrong anti-kibitzer (hides comment authors and vote counts)

59 Marcello 09 March 2009 07:18PM

Related to Information Cascades

The Information Cascades post implied that people's votes are being biased by the number of votes already cast.  Similarly, some commenters express a perception that higher-status posters are being upvoted too much.

If, like me, you suspect that you might be prone to these biases, you can correct for them by installing the LessWrong anti-kibitzer, which I hacked together yesterday morning.  You will need Firefox with the Greasemonkey extension installed.  Once you have Greasemonkey installed, clicking on the link to the script will pop up a dialog box asking if you want to enable the script.  Once you enable it, a button which you can use to toggle the visibility of author and point-count information should appear in the upper right corner of any page on LessWrong.  (On any page you load, the authors and point counts are automatically hidden until you show them.)  Let me know if it doesn't work for any of you.
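The mechanism can be sketched roughly like this. This is a hypothetical illustration, not Marcello's actual script: the class names `.author` and `.score` are assumptions for the sake of example, and the real LessWrong markup will differ.

```javascript
// Hypothetical sketch of an "anti-kibitzer" userscript: inject a
// stylesheet that hides author names and vote counts, and flip it on
// and off from a toggle button. The selectors ".author" and ".score"
// are illustrative assumptions, not LessWrong's real markup.

// CSS that blanks out the kibitzing information when hiding is enabled.
function antiKibitzerCss(hidden) {
  return hidden ? ".author, .score { visibility: hidden; }" : "";
}

// Flip the hidden flag and return the stylesheet text to (re)inject.
function toggleKibitzing(state) {
  const hidden = !state.hidden;
  return { hidden: hidden, css: antiKibitzerCss(hidden) };
}

// In a browser userscript, this logic would be wired up roughly so:
if (typeof document !== "undefined") {
  const style = document.createElement("style");
  document.head.appendChild(style);
  let state = { hidden: false };

  const button = document.createElement("button");
  button.textContent = "Toggle kibitzing";
  button.addEventListener("click", function () {
    state = toggleKibitzing(state);
    style.textContent = state.css;
  });
  document.body.appendChild(button);

  // Hide by default on page load, as the post describes.
  state = toggleKibitzing(state);
  style.textContent = state.css;
}
```

Using `visibility: hidden` rather than `display: none` keeps the page layout stable when toggling, so comments don't jump around as authors appear and disappear.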

Already, I've had some interesting experiences.  There were a few comments that I thought were written by Eliezer that turned out not to be (though perhaps people are copying his writing style).  There were also comments that I thought contained good arguments which were written by people I was apparently too quick to dismiss as trolls.  What are your experiences?

Comment author: theotetia 09 March 2009 06:36:08PM 3 points [-]

I'm a lifelong atheist trying to see my way clear to [one of the more serious branches of] Christianity. Eliezer's posts, especially, have helped me draw my map of the strengths and limitations of rationalism. I'm not here to troll, I'm here to learn.

Comment author: Marcello 09 March 2009 06:45:11PM 3 points [-]

A phrase like "trying to see my way clear to" should be a giant red flag. If you're trying to accept something, then you must have some sort of motivation. If you have the motivation to accept something because you actually believe it is true, then you've already accepted it. If you have that motivation for some other reason, then you're deceiving yourself.

Comment author: Marcello 07 March 2009 05:22:46PM *  14 points [-]

It strikes me that it's not necessarily a bad thing if people are, right now, posting articles faster than they could sustainably produce in the long term. One thing you could do is not necessarily promote things immediately after they're written. Stuff on LW should still be relevant a week after it's written.

If there's a buffer of good posts waiting to be promoted, then we could make the front page a consistent stream of good articles, as opposed to having to promote slightly lower quality posts on bad days, and missing out on a few excellent posts on fast days.

EDIT: Another reason to wait before promoting things is that the goodness of some kinds of posts might really depend on the quality of the discussion that starts to form around them.

Comment author: Marcello 06 March 2009 07:29:10AM 1 point [-]

The Wikipedia link is broken.

Comment author: Roko 05 March 2009 06:30:37PM 6 points [-]

"Disclaimer: I don't actually expect this to work with high confidence, because this sort of person might not actually be able to do a sincere inquiry."

  • well exactly... If the person were thinking rationally enough to contemplate that argument, they really wouldn't need it.

I have never successfully converted a religious person to atheism, but my ex-girlfriend did. I am a more rational person than she is, I know more philosophy, I have earnestly tried many times, she just did this once, etc. How did she do it? The person in question was male and his religion forbade him from sex outside marriage. Most people are mostly ruled by their emotions.

Comment author: Marcello 06 March 2009 12:47:39AM *  4 points [-]

"well exactly... If the person were thinking rationally enough to contemplate that argument, they really wouldn't need it."

My working model of this person was that the person has rehearsed emotional and argumentative defenses to protect their belief, or belief in belief, and that the person had the ability to be reasonably rational in other domains where they weren't trying to be irrational. It therefore seemed to me that one strategy (while still dicey) to attempt to unconvince such a person would be to come up with an argument which is both:

  • Solid (Fooling or manipulating them into thinking the truth is bad cognitive citizenship, and won't work anyway, because their defenses will find the weakness in the argument.)

  • Not the same shape as the argument their defenses are expecting.

Roko: How is your working model of the person different from mine?

Comment author: Eliezer_Yudkowsky 05 March 2009 06:54:33PM 5 points [-]

To be clear, she never did say, "I am deceiving myself" or "I falsely believe that there is a God".

Comment author: Marcello 06 March 2009 12:16:49AM 1 point [-]

I stand corrected. I hereby strike the first two sentences.

Comment author: Marcello 05 March 2009 06:23:36PM *  12 points [-]

If I had been talking to the person you were talking to, I might have said something like this:

<del>Why are you deceiving yourself into believing Orthodox Judaism as opposed to something else? If you, in fact, are deriving a benefit from deceiving yourself, while at the same time being aware that you are deceiving yourself, then why haven't you optimized your deceptions into something other than an off-the-shelf religion by now?</del> Have you ever really asked yourself the question: "What is the set of things that I would derive the most benefit from falsely believing?" Now if you really think you can make your life better by deceiving yourself, and you haven't really thought carefully about what the exact set of things about which you would be better off deceiving yourself is, then it would seem unlikely that you've actually got the optimal set of self-deceptions in your brain. In particular, this means that it's probably a bad idea to deceive yourself into thinking that your present set of self-deceptions is optimal, so please don't do that.

OK, now do you agree that finding the optimal set of self-deceptions is a good idea? Good, but I have to give you one very important warning. If you actually want to have the optimal set of self-deceptions, you'd better not deceive yourself at all while you are constructing it, or you'll probably get it wrong: if, for example, you are currently sub-optimally deceiving yourself into believing that it is good to believe X, then you may end up deceiving yourself into actually believing X, even if that's a bad idea. So don't self-deceive while you're trying to figure out what to deceive yourself about.

Therefore, to the extent that you are in control of your self-deceptions (which you do seem to be), the first step toward getting the best set of self-deceptions is to disable them all and begin a process of sincere inquiry as to what beliefs it is a good idea to have.


And hopefully, at the end of the process of sincere inquiry, they discover that the best set of self-deceptions happens to be empty. And if they don't, if they actually thought it through with the highest epistemic standards, and even considered epistemic arguments such as honesty being one's last defense, slashed tires, and all that.... Well, I'd be pretty surprised, but if I were actually shown that argument, and it actually did conform to the highest epistemic standards.... Maybe, provided it's more likely that the argument was actually that good, as opposed to my just being deceived, I'd even concede.

Disclaimer: I don't actually expect this to work with high confidence, because this sort of person might not actually be able to do a sincere inquiry. Regardless, if this sort of thought got stuck in their head, it could at least increase their cognitive dissonance, which might be a step on the road to recovery.

Comment author: Marcello 01 March 2009 08:11:07PM 8 points [-]

The idea that you shouldn't internally argue for or against things or propose solutions too soon is probably the most frequently useful thing. I sometimes catch myself arguing for or against something and then I think "No, I should really just ask the question."

Comment author: Marcello 27 February 2009 05:46:06AM *  10 points [-]

I think I began as a rationalist when I read this story. (This was before I had run across anything Eliezer wrote.) I had rationalist tendencies before that, but I wasn't really trying very hard to be rational. Back then my "pet causes" (as I call them now) included things like trying to make all the software transparent and free. These were pet causes simply because I was interested in computers. But here, I had found something that was sufficiently terrible and sufficiently potentially preventable that it utterly dwarfed my pet causes.

I learned a simple lesson: If you really want the things you really want, then you need to think carefully about what those things are and how to accomplish them.
