eli_sennesh comments on Beyond Statistics 101 - Less Wrong

Post author: JonahSinick 26 June 2015 10:24AM




Comment author: JonahSinick 26 June 2015 10:04:21PM 3 points

Thanks so much for sharing. I'm astonished by how much more fruitful my relationships have become since I've started asking.

I think that a lot of what you're seeing is a cultural clash: different communities have different blindspots and norms for communication, and a lot of times the combination of (i) blindspots of the communities that one is familiar with and (ii) respects in which a new community actually is unsound can give one the impression "these people are beyond the pale!" when the actual situation is that they're no less rational than members of one's own communities.

I had a very similar experience to your own coming from academia, and wrote a post titled The Importance of Self-Doubt in which I raised the concern that Less Wrong was functioning as a cult. But since then I've realized that a lot of the apparently weird beliefs of LWers are in fact also believed by very credible people: for example, Bill Gates recently expressed serious concern about AI risk.

If you're new to the community, you're probably unfamiliar with my own credentials which should reassure you somewhat:

  • I did a PhD in pure math under the direction of Nathan Dunfield, who coauthored papers with Bill Thurston, who formulated the geometrization conjecture that Perelman proved, thereby solving one of the Clay Millennium Prize Problems.

  • I've been deeply involved with math education for highly gifted children for many years. I worked with the person who won the American Math Society prize for best undergraduate research when he was 12.

  • I worked at GiveWell, which partners with Good Ventures, Dustin Moskovitz's foundation.

  • I've done fullstack web development, making an asynchronous clone of StackOverflow (link).

  • I've done machine learning, rediscovering logistic regression, collaborative filtering, hierarchical modeling, the use of principal component analysis to deal with multicollinearity, and cross-validation. (I found the expositions so poor that it was faster for me to work things out on my own than to learn from them, though I eventually learned the official versions.) You can read some details of things that I found here. I did a project implementing Bayesian adjustment of Yelp restaurant star ratings using their public dataset here.
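The comment doesn't include the adjustment code itself, but a minimal sketch of the kind of Bayesian shrinkage such a rating-adjustment project typically uses might look like this (the function name, prior mean, and prior strength below are illustrative assumptions, not details from the original project):

```python
def adjusted_rating(ratings, prior_mean=3.5, prior_strength=10.0):
    """Posterior-mean star rating under a simple shrinkage model.

    prior_strength acts as a pseudo-count of "phantom reviews" at
    prior_mean: a restaurant with few reviews is pulled strongly
    toward the prior, while one with many reviews keeps roughly
    its raw average. (Values here are illustrative assumptions.)
    """
    n = len(ratings)
    if n == 0:
        return prior_mean
    sample_mean = sum(ratings) / n
    # Weighted average of prior mean and sample mean.
    return (prior_strength * prior_mean + n * sample_mean) / (prior_strength + n)
```

For example, two 5-star reviews yield an adjusted rating of 3.75 (still close to the prior), while a hundred 5-star reviews yield about 4.86: the data overwhelm the prior as the review count grows.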

So I imagine that I'm credible by your standards. There are other people involved in the community who you might find even more credible. For example: (a) Paul Christiano who was an international math olympiad medalist, wrote a 50 page paper on quantum computational complexity with Scott Aaronson as an undergraduate at MIT, and is a theoretical CS grad student at Berkeley. (b) Jacob Steinhardt, a Hertz graduate fellow who does machine learning research under Percy Liang at Stanford.

So you're not actually in some sort of twilight zone. I share some of your concerns with the community, but the groupthink here is no stronger than the groupthink present in academia. I'd be happy to share my impressions of the relative soundness of the various LW community practices and beliefs.

Comment author: [deleted] 26 June 2015 11:24:53PM 2 points

There are other people involved in the community who you might find even more credible. For example: (a) Paul Christiano who was an international math olympiad medalist, wrote a 50 page paper on quantum computational complexity with Scott Aaronson as an undergraduate at MIT, and is a theoretical CS grad student at Berkeley. (b) Jacob Steinhardt, a Hertz graduate fellow who does machine learning research under Percy Liang at Stanford.

Of course, Christiano tends to issue disclaimers with his MIRI-branded AGI safety work, explicitly stating that he does not believe in alarmist UFAI scenarios. Which is fine, in itself, but it does show how people expect someone associated with these communities to sound.

And Jacob Steinhardt hasn't exactly endorsed any "Twilight Zone" community norms or propaganda views. Errr, is there a term for "things everyone in a group thinks everyone else believes, whether or not they actually do"?

Comment author: JonahSinick 27 June 2015 01:50:52AM 3 points

I'm not claiming otherwise: I'm merely saying that Paul and Jacob don't dismiss LWers out of hand as obviously crazy, and have in fact found the community to be worthwhile enough to have participated substantially.

Comment author: [deleted] 28 June 2015 07:10:24PM 3 points

I think in this case we have to taboo the term "LWers" ;-). This community has many pieces in it, and two large parts of the original core are "techno-libertarian Overcoming Bias readers with many very non-mainstream beliefs that they claim are much more rational than anyone else's beliefs" and "the SL4 mailing list wearing suits and trying to act professional enough that they might actually accomplish their Shock Level Four dreams."

On the other hand, in the process of the site's growth, it has eventually come to encompass those two demographics plus, to some limited extent, almost everyone who's willing to assent that science, statistical reasoning, and the neuro/cognitive sciences actually really work and should be taken seriously. With special emphasis on statistical reasoning and cognitive sciences.

So the core demographic consists of Very Unusual People, but the periphery demographics, who now make up most of the community, consist of only Mildly Unusual People.

Comment author: JonahSinick 29 June 2015 07:18:41AM 0 points

Yes, this seems like a fair assessment of the situation. Thanks for disentangling the issues. I'll be more precise in the future.