LessWrong would have to somehow distance itself from MIRI and Eliezer Yudkowsky.
And become just another procrastination website.
Okay, there is still CFAR here. Oh wait, they also have Eliezer on the team! And they believe they can teach the rest of the world to become more rational. How profoundly un-humble, or, may I say, cultish? Scratch CFAR, too.
While we are at it, let's remove the articles "Tsuyoku Naritai!", "Tsuyoku vs. the Egalitarian Instinct", and "A Sense That More Is Possible". They contain the same arrogant ideas and encourage readers to think likewise. We don't want more people trying to become awesome, or even worse, succeeding at it.
Actually, we should remove the Sequences entirely. I mean, how could we credibly distance ourselves from Eliezer while leaving hundreds of his articles as the core of this website? No one reads the Sequences anyway. Hell, these days no one even dares to say "Read the Sequences" anymore. Which is good, because telling people to read the Sequences has been criticized as cultish.
There are also some poisonous memes that make people think badly of us, so we should remove them from the website. Specifically: humans could be smarter than Einstein, machines could be smarter than humans, many worlds in quantum physics, atheism... I probably forgot some.
Oh, don't forget to remove the "bragging threads"! Seriously, how immature.
What I meant by distancing LessWrong from Eliezer Yudkowsky is becoming more focused on actually getting things done rather than rehashing Yudkowsky's cached thoughts.
LessWrong should finally start focusing on solving concrete, specific technical problems collaboratively. Not unlike what the Polymath Project is doing.
To do so, LessWrong has to squelch all the noise: stop caring about attracting more members, and start strictly moderating non-technical off-topic posts.
I am not talking about censorship here. I am talking about something unpr...
WARNING: Memetic hazard.
http://www.slate.com/articles/technology/bitwise/2014/07/roko_s_basilisk_the_most_terrifying_thought_experiment_of_all_time.html?wpisrc=obnetwork
Is there anything we should do?