Alicorn comments on Poll - Is endless September a threat to LW and what should be done? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (259)
It's the Center for Applied Rationality, not Modern Rationality.
No, actually, there is a "Center for Modern Rationality" which Eliezer started this year:
http://lesswrong.com/lw/bpi/center_for_modern_rationality_currently_hiring/
Here is where they selected the name:
http://lesswrong.com/lw/9lx/help_name_suggestions_needed_for_rationalityinst/5wb8
The reason I selected it for the poll is that they are talking about creating online training materials. It would be more effective to send someone from a website to materials online than to send them somewhere in person, as only half of us are in the same country.
No. You're wrong. They changed it, which you would know if you clicked my link.
I don't see how clicking the link you posted would have actually demonstrated her wrong.
Just as it didn't occur to her that the organization could have changed its name, it didn't occur to me that she could seriously think there were two of them.
We have both acknowledged our oversights now. Thank you.
I thought there were two centers for rationality, one being the "Center for Modern Rationality" and the other being the "Center for Applied Rationality". Adding a link to one of them didn't rule out the possibility of there being a second one.
So, you assigned a higher probability to there being two organizations from the same people on the same subject at around the same time with extremely similar names and my correction being mistaken in spite of my immersion in the community in real life... than to you having out-of-date information about the organization's name?
The possibility that the organization had changed its name did not occur to me. I wish you had just said "It changed its name."
As for why I did not assume you knew better than me: The fact that the article was right there talking about the "Center for Modern Rationality" contradicted your information.
I have never met an infallible person, so in the event that I have information that contradicts yours, I will probably think that you're wrong.
It's nice when all the possibilities for why my information contradicts others' occur to me, so that I can do something like search for whether the name of an organization was changed, but that doesn't always happen.
If you knew that it used to be called "Center for Modern Rationality" and changed its name to "Center for Applied Rationality", why did you not say "It changed its name"?
I've noticed a pattern with you: Your responses are often missing some contextual information such that I respond in a way that contradicts you. I think you would find me less frustrating if you provided more context.
I think LessWrong as a whole would find you less frustrating if you assumed most comments from established users on domain-specific concepts or facts were more likely to be correct than your own thoughts and updated accordingly.
Established users can be wrong about many things, including domain-specific concepts or facts.
A more general heuristic that I do endorse, from Cromwell: "I beseech you, in the bowels of Christ, think it possible that you may be mistaken."
Agreed. That's easier. However, sometimes the easier way is not the correct way.
In a world where the authoritative "facts" can be wrong more often than they're right, where scientists often take a roughly superstitious approach to science, and where the educational system isn't even optimized for the purpose of educating, what reason do I have to believe that any authority figure or expert or established user is more likely to be correct?
I wish I could trust others' information. I have wished that my entire life. It is frequently exhausting and damn hard to question this much of what people say. But I want to be correct, not merely pleasant, and that's life.
Eliezer intended for us to question authority. I'd have done it anyway because I started doing that ages ago. But he said in no uncertain terms that this is what he wants:
In Two More Things to Unlearn from School he warns his readers that "It may be dangerous to present people with a giant mass of authoritative knowledge, especially if it is actually true. It may damage their skepticism."
In Cached Thoughts he tells you to question what HE says. "Now that you've read this blog post, the next time you hear someone unhesitatingly repeating a meme you think is silly or false, you'll think, "Cached thoughts." My belief is now there in your mind, waiting to complete the pattern. But is it true? Don't let your mind complete the pattern! Think!"
Perhaps there is a way to be more pleasant while still questioning everything. If you can think of something, I will consider it.
I'm not saying that a hypothetical vague "you" shouldn't question things. I'm saying that you specifically, User: Epiphany, seem to not be very well-calibrated in this respect and should update towards questioning things less until you have a better feel for LessWrong discussion norms and epistemic standards.
You might think about the reasons people have for saying the things they say. Why do people make false statements? The most common reasons probably fall under intentional deception ("lying"), indifference toward telling the truth ("bullshitting"), having been deceived by another, motivated cognition, confabulation, or simple mistake.

As you've noticed, scientists and educators can face situations where complete integrity and honesty come into conflict with their own career objectives, but there's no apparent incentive for anyone to distort the truth about the name of the Center for Applied Rationality. There's also no apparent motivation for Alicorn to bullshit or confabulate; if she isn't quite sure she remembers the name, she has nothing to lose by simply moving on without commenting, nor does she have much to gain by getting away with posting the wrong name. That leaves the possibility that she has the wrong name by an unintended mistake.

But different people's chances of making a mistake are not necessarily equal. By being more directly involved with the organization, Alicorn has had many more opportunities to be corrected about the name than you have. That makes it much more likely that you are the one making the mistake, as turned out to be the case.
You could phrase your questions as questions rather than statements. You could also take extra care to confirm your facts before you preface a statement with "no, actually".
This seems like a risky heuristic to apply generally, given the volume of domain-specific contrarianism floating around here. My own version is more along the lines of "trust, but verify".
It's a specific problem Epiphany has that she assumes her own internal monologue of what's true is far more reliable than any evidence or statements to the contrary.