XiXiDu comments on Goals for which Less Wrong does (and doesn't) help - Less Wrong
My impression is that XiXiDu is curious and that what you're frustrated by has more to do with his difficulty expressing himself than with closed-mindedness on his part. Note that he compiled a highly upvoted list of references and resources for Less Wrong - I read this as evidence that he's interested in Less Wrong's mission and think that his comments should be read more charitably.
I'll try to recast what I think he's trying to say in clearer terms sometime over the next few days.
Though there are many brilliant people within academia, there is also shortsightedness and groupthink there, which could have led the academic establishment to ignore important issues concerning the safety of advanced future technologies.
I've seen very little (if anything) in the way of careful rebuttals of SIAI's views from the academic establishment. As such, I don't think that there's strong evidence against SIAI's claims. At the same time, I have the impression that SIAI has not done enough to solicit feedback from the academic establishment.
John Baez will be posting an interview with Eliezer sometime soon. It should be informative to see the back and forth between the two of them.
Concerning the apparent groupthink on Less Wrong: something relevant that I've learned over the past few months is that some of the vocal SIAI supporters on LW express views that are quite unrepresentative of those of the SIAI staff. I initially misjudged SIAI because I was unaware of this point.
I believe that if you're going to express doubts and/or criticism about LW and/or SIAI, you should take the time and energy to express them carefully and diplomatically. Doubts and criticism that are unclear or inflammatory are likely to be rejected out of hand. I agree with Anna's comment here.
Wow, that's cool! They read my mind :-)
Even Eliezer Yudkowsky doesn't believe he's the smartest person alive. He's the founder of the site and set its tone early, but that's not the same thing.
Finding people smarter than oneself is essential to making oneself more effective and stretching one's abilities and goals.
For an example I'm closely familiar with: I think one of Jimmy Wales' great personal achievements with Wikipedia is that, despite being an impressively smart fellow himself, he discovered an extremely efficient mechanism for gathering around him people who made him feel really dumb by comparison. He'd be the first to admit that a lot of those he's gathered around him outshine him.
Getting smarter people than yourself to sign up for your goals is, I suspect, one marker of success in selecting a good goal.
I agree; the average quality of your comments and posts has been increasing with time and I commend you for this.
This statement carries the connotation that I'm very important. At present I don't think there's solid evidence in that direction. In any case, there's no need to feel self-conscious about taking my time; I'm happy to make your acquaintance and engage with you.