A useful device here might be the word "about". LessWrong is framed to be about rationality, so everyone who thinks they know anything about rationality thinks they can participate. In practice, however, it is about a specific type of rationality (whether it happens to be the only type worth the name is for the moment irrelevant), one that requires having read the Sequences. From an outside view one might even argue that LessWrong is "about" the Sequences rather than about rationality.
I was recently thinking about the possibility that someone with a lot of influence might at some point try to damage LessWrong and the SIAI, and about what preemptive measures one could take to counter such an attack.
If you believe that the SIAI does the most important work in the universe, and that LessWrong serves to educate people to become more rational and subsequently understand the importance of mitigating risks from AI, then you should care about public relations: you should try to communicate your honesty and well-intentioned motives as effectively as possible.
Public relations matter because a good reputation is necessary to do the following:
An attack scenario
First, one has to identify characteristics that could potentially be used to cast a damaging light on this community. Here the most obvious possibility seems to be to portray the SIAI, together with LessWrong, as a cult.
After some superficial examination an outsider might conclude the following about this community:
Most of this might sound wrong to the well-read LessWrong reader. But how would those points be received by mediocre rationalists who don't know what you know, especially if they were eloquently summarized by a famous and respected person?
Preemptive measures
How one might counter such conclusions:
So what do you think needs improvement and what would you do about it?