
Mirzhan_Irkegulov comments on Leaving LessWrong for a more rational life - Less Wrong Discussion

33 [deleted] 21 May 2015 07:24PM




Comment author: Mirzhan_Irkegulov 21 May 2015 11:32:55PM 13 points [-]

Thank you for your response, that's really important to me.

I've never seen actually helping people disparaged on LW. Can you point to examples? Can you argue that it is a tendency? You say that there is lots of outright hostility to any work against x-risks and human misery unless it's MIRI's. I wouldn't even imagine anyone saying that of LW, but maybe I'm blind, so I'll be grateful if you prove me wrong. Yudkowsky is definitely pro-immortality and has supported donating to SENS.

I don't even think MIRI and MIRI-leaning LWers are against ongoing AI research. I've never heard anything like “please stop doing any AI until we figure out friendliness”, only “hey, can you please put more effort into friendliness too? It's very important.” And even if you think that MIRI's focus on friendliness is an order of magnitude misplaced, that's just a mistake of prioritization, not a fundamental philosophical blunder. Again, if you can expand on this topic, I will only say thank you.

Maybe “reform” isn't the right word. The Sequences aren't going anywhere, so of course LW will be FAI-centric for a long time, but within LW there is already a substantial number of people (that's my impression; I never actually counted) who are not simply contrarian, but actually assign different priorities to what should be done about the world. More in line with your thoughts than with Yudkowsky's. Maybe you can still stay and steer this substantial minority in the right direction, instead of splitting off uselessly.

I bet most people on LW are not high-karma prolific writers; they are less knowledgeable, less confident, but also more open to contrary views, such as yours. Just writing one big article about how you think LW's focus is misplaced could be of extreme help to such people. Which, BTW, includes me, because I've never posted anything.

I'd actually love to see you write articles on all your theses here, on LW. LW-critical articles have been promoted a few times already, including Yvain's article, so it's not like LW is criticism-intolerant.

If you actually do that, and provide lots of examples and evidence, it would be a breath of fresh air for all those people who will continue to be attracted to LW. You don't have to put titanic effort into “reform”, just erect a pole.

Comment author: [deleted] 23 May 2015 02:08:06PM 2 points [-]

You say that there is lots of outright hostility to anything against x-risks and human misery, except if it's MIRI.

I was actually making a specific allusion to the hostility towards practical, near-term artificial general intelligence work. I have at times in the past advocated for working on AGI technology now, not later, and been given robotic responses that I'm offering reckless and dangerous proposals, and helpfully directed to go read the sequences. I once joined #lesswrong on IRC and introduced myself as someone interested in making progress in AGI in the near-term, and received two separate death threats (no joke). Maybe that's just IRC—but I left and haven't gone back.

Comment author: TheAncientGeek 22 May 2015 10:50:39AM 2 points [-]

actually would love to see you writing articles on all your theses here, on LW. LW-critical articles were already promoted a few times, including Yvain's article, so it's not like LW is criticism-intolerant.

Things have changed, believe me.

Comment author: Mirzhan_Irkegulov 22 May 2015 01:10:35PM 4 points [-]

Can you point to some examples? Yvain's article was recently on the Main page under Featured articles, for example.

Comment author: Nornagest 22 May 2015 06:19:33PM 1 point [-]

I don't know exactly what process generates the featured articles, but I don't think it has much to do with the community's current preoccupations.

Comment author: Mirzhan_Irkegulov 22 May 2015 06:30:24PM *  4 points [-]

I don't know the exact process either, but I always thought somebody deliberately chooses them each week, because they are often on the same topic. So somebody thought it was a good idea to encourage everybody to read an LW-critical article.

My point is, I don't believe the LW community suddenly became intolerant of criticism. Or incapable of dialog on whether FAI is a good thing. Or fanatically devoted to FAI and Yudkowsky's ideas. Oh, and I'm happy to be proven otherwise!

Seriously, look at the top contributors for the last 30 days:

  • Lumifer (629)
  • JonahSinick (626)
  • Vaniver (355)
  • ChristianKl (329)
  • DeVliegendeHollander (251)
  • Richard_Loosemore (242)
  • NancyLebovitz (232)
  • Viliam (209)
  • gjm (184)
  • So8res (180)
  • VoiceOfRa (178)
  • IlyaShpitser (166)
  • Error (162)
  • Mark_Friedenbach (146)
  • JohnMaxwellIV (135)

Only So8res is associated with MIRI, AFAIK. My impression from the comments of the people above is that they are perfectly capable of dialog and not at all fanatical about FAI.

Meaning that in Mark's map the LW community is something different from what it is in the territory. He thinks he is leaving a crazy cult producing a memetic hazard. I think he is leaving a community of fairly independent-thinking people, who could easily counter MIRI's memes.

That is, even if Mark is completely correct about MIRI, his leaving is irrelevant: not a net improvement, but some strange unrelated act with negative utility.

Comment author: TheAncientGeek 23 May 2015 11:34:19AM 1 point [-]

My point is, I don't believe LW community suddenly became intolerant to criticism.

My point was that it has become a lot more tolerant.

Comment author: [deleted] 23 May 2015 02:10:43PM 1 point [-]

Maybe, but the core beliefs and cultural biases haven't changed, in the years that I've been here.

Comment author: TheAncientGeek 23 May 2015 02:35:08PM 4 points [-]

But you didn't get karmassinated or called an idiot.

Comment author: [deleted] 23 May 2015 02:55:12PM 3 points [-]

This is true. I did not expect the overwhelmingly positive response I got...

Comment author: TheAncientGeek 26 May 2015 07:22:29PM 0 points [-]