
Will_Newsome comments on Intellectual insularity and productivity - Less Wrong Discussion

53 [deleted] 11 June 2012 03:10PM


Comments (169)


Comment author: Will_Newsome 12 June 2012 06:52:47AM * 6 points

Also, over the past two years, while we do link to new ideas, they don't seem to propagate and are hardly ever referenced a year later. Sequence posts, however, are.

E.g., despite two to five posts on the matter and many comments, there seems to be a huge disconnect between how folk like Wei_Dai, cousin_it, Vladimir_Nesov, &c. interpret Solomonoff induction, and how average LW commenters interpret Solomonoff induction, with the latter group echoing a naive, broken interpretation of the math and thus giving newer people mistaken ideas. It's frustrating because probability theory is one of the few externally credible things that sets LW's epistemology apart, and yet a substantial fraction of LW folk who bring up algorithmic probability do so for bad reasons and in completely inappropriate contexts. Furthermore, because they think they understand the math, they also think they have special insight into why the person they disagree with is wrong.

Comment author: wedrifid 12 June 2012 07:10:46AM 2 points

E.g., despite two to five posts on the matter and many comments, there seems to be a huge disconnect between how folk like Wei_Dai, cousin_it, Vladimir_Nesov, &c. interpret Solomonoff induction, and how average LW commenters interpret Solomonoff induction, with the latter group echoing a naive, broken interpretation of the math and thus giving newer people mistaken ideas.

For example? (I don't recall average users mentioning the subject all that much, right or wrong.)

Furthermore because they think they understand the math they also think they have special insight into why the person they disagree with is wrong.

I haven't seen this as applied to Solomonoff induction.

Comment author: Will_Newsome 12 June 2012 06:20:49PM 2 points

(I don't recall average users mentioning the subject all that much, right or wrong.)

I suppose I meant "relatively average". Anyway, I don't know where to find examples off the top of my head, sorry.

I haven't seen this as applied to Solomonoff induction.

IIRC, I've seen it two to five times, so this specifically is not a big deal in any case.

I've seen more general errors pertaining to algorithmic probability much more often than that, sometimes committed by high-status folk like lukeprog, who wrote a post (sequence?) allegedly explaining Solomonoff induction.

Comment author: wedrifid 13 June 2012 02:00:55AM 2 points

Thank you. While I don't recall the examples myself, I believe your testimony regarding the two to five examples you've noticed. I expect I am much more likely to notice such comments in the future, given the prompting, and so will take more care when parsing them.

I've seen more general errors pertaining to algorithmic probability much more often than that, sometimes committed by high-status folk like lukeprog, who wrote a post (sequence?) allegedly explaining Solomonoff induction.

I can see why that would be disconcerting.