There's an excellent definition of how "obvious" should be used in mathematics: something is obvious if and only if "a proof immediately springs to mind".
The problem remains that it's not particularly helpful to know that a proof immediately springs to the lecturer's or textbook author's mind. And so, operating under the assumption that people are trying to communicate only relevant information, whenever I see 'obvious' or 'easily seen' in a mathematical text, I can't help but read it as an obnoxious 'you should know this already -- unless you're dumb or something'. I think that the best norm for using the word 'obvious' and its variations would be to not use it at all.
This might be a defensive mechanism. B...
Related to: Generalizing from One Example, Connecting Your Beliefs (a call for help), Beware the Unsurprised
The idea behind this article is something I've talked about a couple of times in comments. It seems to deserve more attention.
As a general rule, what is obvious to some people may not be obvious to others. Is this obvious to you? Maybe it was. Maybe it wasn't, and you thought it was because of hindsight bias.
Imagine a substantive Less Wrong comment. It's insightful, polite, easy to understand, and otherwise good. Ideally, you upvote this comment. Now imagine the same comment, only with "obviously" in front. This shouldn't change much, but it does. This word seems to change the comment in multifarious bad ways that I'd rather not try to list.
Uncharitably, I might reduce this whole phenomenon to an instance of the mind projection fallacy. The implicit deduction goes like this: "I found <concept> obvious. Thus, <concept> is inherently obvious." The problem is that obviousness, like probability, is in the mind.
The stigma attached to "obvious" ideas has a further cost: it can prevent things from being said at all. I don't know how common this is, but I've actually been afraid of saying things that I thought were obvious, even though ignoring this fear and just posting has yet to result in a poorly-received comment. (That fear is, in fact, why I'm writing this.)
Even tautologies, which are always obvious in retrospect, can be hard to spot. How many of us would have explicitly realized the weak anthropic principle without Nick Bostrom's help?
And what about implications of beliefs you already hold? These should be obvious, and sometimes are, but our brains are notoriously bad at putting two and two together. Luke's example was not realizing that an intelligence explosion was a serious possibility until he read the I.J. Good paragraph. I'm glad he provided that example, as it has saved me the trouble of inventing one.
This is not (to paraphrase Eliezer) a thunderbolt of insight. I bring it up because I propose a few community norms based on the idea:
I'm not sure whether these are good ideas, but I think implementing them would shrink the space of thoughts we cannot think and things we can't say.