It should be obvious that people need to learn what they know, but a tremendous amount of effort goes into blaming them for not knowing whatever would have been useful if only they'd known it.

A lot of this appears in the form of "should have known better". Sometimes (especially after a low-probability disaster), it will be people tormenting themselves about ordinary behavior. If only they had left a little earlier or later, they wouldn't have been in the car crash. As far as I can tell, people are apt to expect themselves to be clairvoyant, and they expect other people to be telepathic.

The latter comes into play when they're shocked and infuriated that other people don't have the same views they do, or don't understand what is vividly clear to the telepathy expecter. This plays out both politically and personally.

One thing I've learned from arguing about torture is that moral intuitions don't transfer very well. To some people, it's completely obvious that torture is a bad thing and doesn't work, and to others, it's completely obvious that getting important information for one's own side is an emergency, people can't lie if they're suffering enough, and people on the other side don't deserve careful treatment. (The overlap between what people believe is moral and what they believe is effective is probably worth another essay.) When you're in the grasp of a moral intuition, it can be very hard to believe that people who don't share it aren't trying to hide bad motivations.

On the personal level, it's probably more complex. You do want to be around people who are reasonably clueful about how you want to be treated, but on the other hand, I've seen and done a certain amount of ranting about how people should just know how to behave. How are they to know it? They just should.

I suggest, as a matter of epistemic and emotional hygiene, downplaying "should have known better". As far as I can tell, if taken literally, it invokes an unavailable counterfactual and just leads to fury. I think I have better sense when I don't deploy it, but it's hard work to inhibit the reaction, and it's tempting to think that other people should know better than to invoke "should have known better".

It's possible that "should have known better" is one of those normal people things that geeks need to figure out. I can interpret it as "I am lowering your status because I want you to never make that mistake again". I think it can be effective for specific cases, but at a cost of cranking up anxiety and despair because it isn't actually possible to know what one could be blamed for in the future.

This post is a rough draft. It could probably use more examples, and I'm not sure whether the material has already been covered. I think the idea of knowledge needing a source has been, but the emotional and cultural effects of not having a gut-level understanding of that fact haven't been.

A philosophy professor recently told me that one of the few things philosophers agree on is that there can't be a moral obligation to do the impossible-- ought implies can. On the other hand, there hasn't been significant work done on figuring out what actually is possible.

On the epistemic side, I've been distracted by whether there's explanation required for how people recognize sound arguments, or if it's enough to say that they just do it some of the time.

Comments (23)

An obvious implication of this post is that if someone tells you that you "should have known better", then rather than getting upset and instantly trying to defend yourself, it might be a better idea to calmly ask the person, "How should I have known better?"

Possible answers include:

1) "using this simple and/or obvious method that I recommend as a general strategy" 2) "using this not simple and/or not obvious method that I didn't think of until just now" 3) "I don't know" 4) "how dare you ask that!"

The first two of those answers are useful information about how to do things, and thus valuable. You can then perform a quick cost/benefit analysis to check if the cost of implementing the suggested strategy outweighs the cost of risking another instance of whatever mistake you just made.

The third is generally the result of a successful use of the technique. The speaker now realizes that maybe you didn't have any way to know better, and so maybe it would be inappropriate to blame you for whatever went wrong.

The fourth is a sign that the person you're talking to is probably someone you would be better off not interacting with if you can help it (and thus is useful information). There are ways of dealing with that kind of person, but they involve social skills that not everyone has.


Another obvious implication of this post is that if you're about to tell someone else that they "should have known better", it might be a good idea to take a moment to think about how they should have known better.

The same four possible answers apply here.

Again, in cases 1 and 2, you might want to take a moment to perform a quick cost/benefit analysis to check if the cost of implementing the suggested strategy outweighs the cost of risking another instance of whatever mistake the person just made. If your proposed solution makes sense as a general strategy, then you can tell the person so, and recommend that they implement it. If your proposed solution doesn't make sense as a general strategy, then you can admit this, and admit that you don't really blame the person for whatever went wrong. Or you can let the other person do this analysis themselves.

In case 3, you can admit that you don't know, and admit that you don't blame the person for whatever went wrong. Or you can just not tell the person that you think they "should have known better", skipping this whole conversation.

In case 4, you obviously need to take a moment to calm down until you can give one of the other answers.

I am new here, so I do not know how much I can contribute to the growing discussion.

Perhaps it may be useful to understand where something comes from in order to better handle it. A well-known evolutionary argument, popularized by biologists such as Dawkins, suggests that there is an asymmetry in the evolutionary payoff between making false positives and false negatives. A Pleistocene hominid, as the argument goes, might at most waste some energy running away from a noise in the bushes that turns out to be nothing, but may waste its life if it does not run away when there is a predator in the bushes.

I do not know of any case where someone has said that they "should have known better" after making a false positive, say, "I knew I shouldn't have used the seat belt on the bus; we did not crash after all." I'm sure that more exclamations of this sort come after a bus crashes and a person did not wear a seat belt, provided of course that the person survived.

I do not know how helpful or relevant this comment will be to the discussion, though.

sfb:

I do not know of any case where someone has said that they "should have known better" after making a false positive, say, "I knew I shouldn't have used the seat belt on the bus; we did not crash after all."

Possibly the phrase "I needn't have bothered [..] after all"? e.g.:

Your place is delightfully homely and very tastefully decorated with a kitchen that was so well equipped with quality cooking implements that I needn’t have bothered bringing my own!

It's not directly saying they should have known, but it is saying they judged so inaccurately that reality took them by surprise so much it was worth commenting on.

Also the phrase "I don't know what we were worried about" fits a similar template - not a scolding for not knowing, like "should have known" is, but yes an admission of feeling overprepared for something which didn't happen and now questioning the reasons they had earlier.

I think you're right-- "you should have known better" seems to come into play after something goes wrong, rather than for taking possibly excessive precautions if such precautions are merely a waste of effort.

A philosophy professor recently told me that one of the few things philosophers agree on is that there can't be a moral obligation to do the impossible-- ought implies can

Some dispute this; there is a concept of (bad) "moral luck".

To some people, it's completely obvious that torture is a bad thing and doesn't work, and to others, it's completely obvious that getting important information for one's own side is an emergency, people can't lie if they're suffering enough, and people on the other side don't deserve careful treatment.

This example seems a bit odd, as half of it is about a factual matter...

I generally unpack "X should have known better than to believe Y" as a way of burying the assertions that X believed Y and that Y is obviously false into the "tail" (that is, the de-emphasized part) of a claim, with a status-challenge tacked onto the front as bait.

This is a common rhetorical technique when people want to direct skeptical attention away from an assertion; the hope is that others will respond to the bait (in this case, the status challenge) and let the assertion go by unchallenged. For example, the hope is that a discussion will ensue about whether or not it's really true that X should have known better, which implicitly concedes that X believed Y and Y is false.

It tends to be a warning sign for me... when someone uses a construction like that, I try to remember to pay particular attention to whatever the bait is distracting me from. Did X really believe Y? Is Y really false? Has the speaker actually given me any reason to believe those things?

This is similar to the urban self-defense principle that when something really dramatic happens down the block, that's a very good time to take a step back and look carefully around you for potential threats.

A philosophy professor recently told me that one of the few things philosophers agree on is that there can't be a moral obligation to do the impossible-- ought implies is.

I think you mean "ought implies can".

I've corrected it.

Related post on consequentialist analysis of blame-assignment:

I think that "X should have known better" can be legitimately true, but the implications and level of accusation/nurture needs to be calibrated in radically different ways when X is, for example: a child, an adult idiot, a college professor, or the leader of a large organization.

The idea of a teachable moment seems to be largely about noticing when someone has a problem that was caused by a more abstract process failure and intervening in the moment when they care deeply about the object level issue to help them solve that and also to notice the process failure. This happens all the time with children.

I think part of why the issue gets complicated is that competence of this sort seems like one of the obvious criteria that should be used to select a leader. Even though humans are not automatically strategic, a competent leader in their 40s or 50s should, by that age, have already cultivated strategic planning abilities that include proactively seeking out information related to the domain for which they have official responsibility, figuring it out, and applying it as appropriate.

If an organization experiences a major failure, people who don't really understand all the details of what it would have taken to prevent it can still infer that the existing leadership didn't have the ability to prevent it for some reason, and to forestall a demand for regime change the leadership needs to explain why no alternative leaders could have handled the issue any better. This could be a legitimate point: the best poker player in the world will still lose some hands.

In our daily life, if someone says to us that "tragedy T happened because you didn't check S first", that can be an act of helpful teaching or a social power grab or both. If someone is predisposed to see that sort of thing as a power grab, or is inordinately fond of autonomy, they might end up getting a bit defensive when someone tries to teach them something. This might be adaptive, or not, depending on other factors.

A philosophy professor recently told me that one of the few things philosophers agree on is that there can't be a moral obligation to do the impossible

Oh dear, Eliezer won't like this...

You make some good points, but I still say this guy should have known better. (Warning: Violent video.)

This seems to me like yet another case where a label is sometimes valid and sometimes invalid, and we have to use our judgements in applying it, and our judgements aren't very good.

The overlap between what people believe is moral and what they believe is effective is probably worth another essay

Yes, this occurs for so many people over all sorts of beliefs, when there's no good reason for there to be much of a correlation at all. (I suppose some form of divine command theory theist could argue that God has made the world so that correct morals also give practically the best results.)

Upvoted for content. Though it strikes me as slightly redundant (gut feeling), I think that it covers some important points. Organizationally, I would change your thesis so that it better reflected what the rest of the post was going to be about -- the post struck me as rambling a little too much.

Thoughts:

Since I cannot prove my value system on an absolute scale, I sometimes restrict use of the word 'should' to technical applications where there's no ambiguity about the objective. For moral arguments, I like to use 'I think that you would want to' as a replacement for 'You should' -- it seems more intellectually honest, less opaque, and less provocative.

I interpret 'should have known better' as a data-sparse, unnecessarily antagonistic comment -- it doesn't tell the listener much about how they should revise their updating system and risks causing negative emotions toward the speaker. I also think that it causes future action to be more pain motivated because 'avoiding harsh correction' may become one of the listener's motivations for improving.

That reminds me of a quote from Babylon 5:

"Narns, Humans, Centauri… we all do what we do for the same reason: because it seems like a good idea at the time." - G’Kar

We are doing our best. Often our best turns out to be irrational or not what we really wanted to do. How to handle it? I think this problem is adequately dissolved by one of Yvain's posts, as previously mentioned by Vladimir Nesov. And that you can only do your best is self-evident, even in deliberately trying to do your worst.

Approximate quote from Being Wrong: "How does being wrong feel? Exactly like being right."

I think this is a slight exaggeration-- sometimes I can feel it when I haven't been thinking clearly. This doesn't necessarily stop me, though.

Do you ever have the feeling you haven't been thinking clearly when you're right?

This is interesting in that I have a present application: someone else's terrible, awful, no good, very bad document I have to destroy, and frankly their status deserves a penalty. Quite a lot of this will involve the phrase "knew or should have known" in my response.

Saying this without seeming to expect telepathy or clairvoyance will be important to watch for, e.g. noting in every case the "should have known" vector and describing bad consequences of actions without ascribing personal incompetence or malice to them. (I suspect incompetence combined with self-righteousness rather than malice per se. Would demonstrating the self-righteousness count as ad hominem?)

The tricky bit is the fine balance between "you catch more flies with honey than vinegar" and "kill it with fire for your own self-defence." I'm inclined to the latter by sheer annoyance, which means watch myself.

sfb:

and frankly their status deserves a penalty

?

I suspect incompetence combined with self-righteousness

Lack of ability combined with wanting to be right? Two things normal people never exhibit which deserve punishment?

Would demonstrating the self-righteousness count as ad hominem?

Would it count as useful? Would it help?

I'm told balsamic vinegar works better than honey, actually. I have never run the experiment myself, though, nor seen peer-reviewed published results.

"Honey catches more flies than vinegar, but hurled bucketloads of shit work pretty well too. Also, overextending analogies gang aft agley."

I tend to frame things as "for future reference" rather than "you should have known better".

I have no idea why your comment's been voted down.