Followup to: The Most Important Thing You Learned

What's the most frequently useful thing you've learned on OB - not the most memorable or most valuable, but the thing you use most often?  What influences your behavior, factors in more than one decision?  Please give a concrete example if you can.  This isn't limited to archetypally "mundane" activities: if your daily life involves difficult research or arguing with philosophers, go ahead and describe that too.


The most frequently useful thing I've gotten out of Overcoming Bias is not a technique or lesson so much as it is an attitude. It's the most ridiculously simple thing of all: to be in the habit of actually, seriously asking: is (this idea) really actually true? You can ask anyone if they think their beliefs are true, and they'll say yes, but it's another thing to know on a gut level that you could just be wrong, and for this to scare you, not in the sense of "O terror!--if my cherished belief were false, then I could not live!" but rather the sense of "O terror!--my cherished belief could be false, and if I'm not absurdly careful, I could live my whole life and not even know!"

Expecting Short Inferential Distances

One of many posts that gave me a distinct concept for something I previously had been only vaguely aware of, and this one kept coming back to me all the time. By now, I don’t think it’s an extreme exaggeration to say that I make use of this insight every time I communicate with someone, and of all the insights I picked up from OB, this might be the one I most frequently try to explain to others. It doesn’t seem like the most important thing, but for some reason, it immediately struck me as the most frequently useful one.

I think I'll second that, though the attitude that there's an exact right amount to update on every piece of information is really important too.

For me it's between inferential distance and cached thoughts, at least for ones I explain to other people. For ones I use myself, Line of Retreat is probably the one I actively pursue most frequently.

Though I end up using Absence of Evidence pretty often as well.

Inferential distance is the most frequently useful thing I learned at OB, followed by leaving a line of retreat. However, I use other insights I had previously encountered elsewhere more frequently.

The Bottom Line.

Taking care, every time I think, to make sure I'm following out actual uncertainty or curiosity and am actually gathering evidence on what conclusion to write in, rather than rehearsing evidence or enjoying the sound of my own thoughts.

[-]Roko

Yep. I'll second this.

Really, if you understand "the bottom line", you could reliably reinvent the whole of rationality for yourself. And once you have read and internalized it, you walk around the world noticing people justifying things to themselves all the time.

[-]badger

On the previous thread I mentioned the Mind Projection Fallacy and "the opposite of stupidity != intelligence" as being most frequently referenced, but on reflection, I think a reminder of the strictness of rationality makes the biggest difference in practice.

This passage from Technical Explanation sums it up: "But the deeper truth of Bayesianity is this: you cannot game the system. You cannot give a humble answer, nor a confident one. You must figure out exactly how much you anticipate the Sun rising tomorrow, and say that number. [...] You cannot do better except by guessing better and anticipating more precisely."

At this stage in my life, I find it easy to avoid dogmatism. False modesty, uncertainty, vagueness, and skepticism are all much more seductive. I work as a policy analyst in state government, and am frequently asked to provide forecasts. The dangers of narrow prediction intervals are well known, but I am tempted to be cautious and not focus my estimate. The field I work in is notoriously uncertain, but I can't do better just by being vague. Confidence and uncertainty have to be precisely balanced.
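A concrete way to see the "cannot game the system" point is a proper scoring rule. Here is a minimal Python sketch (the function and the numbers are invented for illustration, not taken from the post): under a logarithmic score, your expected score peaks exactly when you report what you actually anticipate, so neither a falsely humble nor a falsely confident answer can do better.

```python
import math

def expected_log_score(reported, true_p):
    """Expected log score when you report `reported` but actually
    anticipate the event with probability `true_p`."""
    return true_p * math.log(reported) + (1 - true_p) * math.log(1 - reported)

true_p = 0.7  # what you actually anticipate
for reported in (0.5, 0.6, 0.7, 0.8, 0.9):
    print(reported, round(expected_log_score(reported, true_p), 4))
# The expected score peaks at reported == 0.7: honesty is the only optimum.
```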

[-]Mary

...the doctrine of non-reductionism is a confusion, rather than a way that things could be, but aren't. -EY, Excluding the Supernatural

Turned me into an atheist. Damn you.

The idea that you shouldn't internally argue for or against things or propose solutions too soon is probably the most frequently useful thing. I sometimes catch myself arguing for or against something and then I think "No, I should really just ask the question."

Eliezer, I suspect you might find the answers to these questions less useful than you expect. The most useful things we've learned from you are probably going to be those things that we've already forgotten you wrote, because they've become a part of us -- because they've become background in how we live, how we think, and thus are completely invisible to us at any given time.

I think the answers will be useful, even if they don't exactly represent the set of "most frequently useful things from OB" but instead the set of "most frequently useful among the very memorable and surprising OB posts".

Maybe Eliezer asked about the first set fully expecting to get answers from the (still useful) second set.

Having particular names which may not be in common usage makes it easier for me to identify the things that I've picked up from OB that are now a part of me. Cached Thoughts, Inferential Distance, Mind Projection Fallacy - those are all terms I use now when referring to things that are a part of me, but not many other people use them often, which makes them somewhat easier to trace.

Yes -- and easier to invoke the principles in social contexts. I suspect Eliezer's OB posts gain a significant fraction of their usefulness from the names and from the chunk-by-chunk usability of the named principles/methods.

I agree. I find it funny that you lead your list of examples with "cached thoughts", because that's exactly what these are. Not that that's a bad thing.

If that's the case, though, maybe we need to be proactive in preventing them from becoming cached thoughts of the bad kind. Eliezer's posts serve as a good introduction, but I don't think they are the ideal reference. Maybe a rationalist dictionary would do the trick. I envision something like Urban Dictionary, where multiple definitions/explanations can be submitted and voted on.


Most frequently useful: that my interest in being unbiased can become a sort of bias of its own. When I hear arguments from others, I can easily spot the biases, and I've worked hard to recognize that I have built-in biases as well that I can't discount.

A few ideas:

  • the difference between Nobly Confessing One's Limitations and actually preparing to be wrong. I was pretty guilty of the former in the past. I think I'm probably still pretty guilty of it, but I am on active watch for it.

  • the idea that one should update on every piece of evidence, however slightly. This is something that I "knew" without really understanding its implications. In particular, I used to habitually leave debates more sure of my position than when I went in---yet this can't possibly be right, unless my opposition were so inept as to argue against their own position. So there's one bad habit I've thrown out. I've gone from being "open-minded" enough to debate my position, to being actually capable of changing my position.

  • That I should go with the majority opinion unless I have some actual reason to think I can do better. To be fair, on the matters where I actually had a vested interest, I followed this advice before receiving it; so perhaps this shouldn't be under 'useful' per se, although I've improved my predictions drastically by just parroting InTrade. (I don't bet on InTrade because I've never believed I could do better.)

  • Sticking your neck out and making a prediction, so that you have the opportunity to say "OOPS" as soon as possible.

I used to habitually leave debates more sure of my position than when I went in---yet this can't possibly be right, unless my opposition were so inept as to argue against their own position.

This isn't quite right - for example, the more I search and find only bad arguments against cryonics, the more evidence I have that the good arguments just aren't out there.

This isn't quite right - for example, the more I search and find only bad arguments against cryonics, the more evidence I have that the good arguments just aren't out there.

If all you did was argue with stupid people, you would become erroneously self-confident. Also, two people who argued without either converting the other would both walk away feeling better about their own positions. Something seems wrong here. What am I missing? Doesn't this only make sense if there was some sort of weight attached to the argument your opponent used that was detached during the argument?

Apply Bayes' theorem. P(you don't find a good argument among stupid people | there is a good argument) is high. P(you don't find a good argument when you've made a true effort to scour high and low | there is one) is lower. Obviously the existence or otherwise of a good argument is only indirect information about the truth, but it still helps.
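The same update in code, as a toy sketch (the prior and the miss probabilities are invented purely for illustration): the identical null result barely moves you after a lazy search, but moves you substantially after a thorough one.

```python
def p_exists_given_not_found(prior, p_miss):
    """P(a good argument exists | I found none), via Bayes' theorem.
    p_miss = P(I find none | one exists): high if I only argued with
    stupid people, low after a true effort to scour high and low."""
    return p_miss * prior / (p_miss * prior + 1.0 * (1 - prior))

prior = 0.5  # credence before searching
print(p_exists_given_not_found(prior, p_miss=0.9))  # lazy search     -> ~0.47
print(p_exists_given_not_found(prior, p_miss=0.1))  # thorough search -> ~0.09
```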

That there seem to be no two people who can disagree without both coming away with the strong feeling that their argument was clearly the stronger must of course be borne in mind when weighing this evidence, but it's evidence all the same.

You're right that it's not as simple as that - if you set out to talk to idiots, you may well find that you can demolish all of them - but if you search for the strongest arguments from the best qualified and most intelligent proponents, and they're rubbish? But they still persistently get cited as the best arguments, in the face of all criticism? That's fairly strong evidence that the field might be bogus.

(Obvious example: intelligent design creationism. It's weaksauce religion and incompetent science.)

(Obvious example: intelligent design creationism. It's weaksauce religion and incompetent science.)

But why does dealing with intelligent design increase your probability in the alternative? Why were you assigning weight to intelligent design?

This isn't meant to be nitpicky. I suppose the question behind the question is this: When dividing up probability mass for X, how do you allot P(~X)? Do you try divvying it up amongst competing theories or do you simply assign it to ~X?

For some reason I thought that divvying it up amongst competing theories was Wrong. Was this foolish of me?

But why does dealing with intelligent design increase your probability in the alternative? Why were you assigning weight to intelligent design?

Not much - it increased slightly when I saw it proposed, and decreased precipitously when I saw it refuted.

This isn't meant to be nitpicky. I suppose the question behind the question is this: When dividing up probability mass for X, how do you allot P(~X)? Do you try divvying it up amongst competing theories or do you simply assign it to ~X?

Well, it has to be divvied up. It's just that there are so many theories encompassed in ~X that it is not easy to calculate the contribution to any specific theory except when the network is pretty clear already.

Well, it has to be divvied up.

Not to be a chore, but can you explain why?

The sum of your probabilities must add to 1. If you reduce the probability assigned to one theory, the freed probability mass must flow into other theories to preserve the sum.
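A minimal sketch of that bookkeeping (hypothetical hypotheses and numbers, assuming the theories are mutually exclusive and exhaustive):

```python
priors = {"X": 0.40, "Y": 0.35, "other": 0.25}
likelihoods = {"X": 0.1, "Y": 0.8, "other": 0.8}  # new data disfavor X

# Bayes' rule: multiply priors by likelihoods, then renormalize to sum to 1.
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
z = sum(unnormalized.values())
posteriors = {h: p / z for h, p in unnormalized.items()}
print(posteriors)  # X falls to ~0.08; its freed mass flows to Y and "other"
```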

But why are we assigning probability across a spectrum of competing theories? I thought we were supposed to be assigning probability to the theories themselves.

In other words, P(X) is my best guess at X being true. P(Y) is my best guess at Y being true. In the case of two complex theories trying to explain a particular phenomenon, why does P(X) + P(Y) + P(other theories) need to equal 1?

Or am I thinking of theories that are too complex? Are you thinking of X and Y as irreducible and mutually exclusive objects?

Or am I thinking of theories that are too complex? Are you thinking of X and Y as irreducible and mutually exclusive objects?

...yes? It's not a matter of complexity, though; the problem you might be alluding to is that the groups of theories we describe when we enunciate our thoughts can overlap.

I think the most frequently useful thing I've learned is not in the content, but in the approach: the way of breaking things down, pulling together the themes, and looking at things in a way that makes the problems that seem to be there disappear. I guess that's what's been most frequently useful: when you get stuck, trying to see if you're looking at a problem in the wrong way, since often, even if you're not, it gets you unstuck.

The knowledge that communication with another brain through words and/or body language is hard. It's very lossy and almost always the source of error when I think someone has just said something absurd or incomprehensible. I may be ignorant of their train of thought, but that does not mean it's inherently random.

I use this constantly to quickly identify non-surface-level differing usages of terms, or to tell when I'm interpreting a phrase someone said differently than they mean me to. Latest concrete example: a couple hours ago, when I suggested that a D&D 4e melee was not well supported by the rules at all, and specifically that "what would the DM do?" summed up my objections. My roommate replied that he would do exactly what he always did, which didn't jibe with what I (thought I) was saying, and I immediately knew we were interpreting "melee" in different ways.

Either that's availability bias, or it comes up very frequently, since the most recent event was mere hours ago. :)

Taboo Your Words and Wrong Questions together. There are numerous debates over definitions that I'd previously have happily engaged in, but now I just take a glance at them and shrug them off as trivially solvable by tabooing. I've applied this to just about everything, from philosophy and Searle's Chinese Room to politics and "is online piracy theft". I feel it has considerably clarified my thinking (and it was my second candidate for the "most useful thing" thread).

Shut up and do the impossible, or lesser levels of "don't run away from the problem".

In particular, the AI box experiment applied to persuasion in general. I make a lot more progress in argument when I focus on changing the person's mind, rather than picking apart their flawed arguments or defending mine.

This one comes up in philosophical discussions of ethics all the time: Sorting Pebbles into Correct Heaps

I end up referring to right vs right' a lot. Especially w.r.t. ethical naturalism. Though now that I think of it, the actual content there might have been in a follow-up post.

Most frequent would have to go to my avoidance of settling with cached thoughts. I notice, revise, and completely discard conclusions much more regularly and effectively when I recognize the conclusion was generated as soon as a question was asked.

reversed stupidity

followed by

dissolving the question and mind-projection fallacy.

Asking myself “Couldn't I get Y without X?” whenever I'm considering doing expensive/time-consuming/non-fun action X in order to get benefit Y.

...is also the most important thing, though that wasn't necessarily the case.

I find that over the last year or so, whenever a debate ceases to be about something in the world and becomes about pure semantics, a little bulb flashes up in my head - Standard Debate! "Yes, I know it's only tiny, but the question I'm asking is, is Pluto a planet?"

[-][anonymous]

Inferential distances. Most frequently when calibrating the best way to explain something. More drastically on the few occasions I lacked the cached thoughts to absorb what I was hearing.

[-]Roko

Oh yes, this is important for talking to other people. True.

[-]Alan

The most frequently useful thing I have learned from OB is to update assumptions based on new information on an ongoing basis. I think this idea ties in nicely with that of standing against maturity, if maturity is taken to mean a certain rigidity, an inflexibility of purpose and outlook.

The most frequently useful thing I've learned is to be much more distrustful of my motives, my reasoning, my judgments, and my capacity for self-understanding and being aware of what is going on in my head (and why).

The series on free will, probably, because it short-circuits a lot of fruitless contemplation of the meaning of life and such.


How to take Joy in the Merely Real.

The most frequently useful is map and territory. It is something I deploy daily to remind myself of the value of never believing that I know everything, and to leave myself constantly open to the possibility of making mistakes and updating my beliefs. It is also one of the most basic things I teach at Intentional Insights.

The informal use of Bayesian probability to analyze the plausibility of various statements...

Updating my beliefs based on new information. This has helped me to move away from emotional defensiveness and toward intellectual curiosity when I'm confronted with new information that is contrary to my beliefs or desires.

Reminding myself that "I want to become stronger" emboldens me to make rational choices that require more courage than usual. This has caused me to take a class on entrepreneurship, for example, when otherwise I may have felt too shy to do so.

[-][anonymous]

Noticing subtle confusion.

Applying this skill has consistently led me to surprising and useful information. So far the gains seem to have grown with the degree of subtlety I can recognize.

Mostly things that I've observed. For instance:

  • If you try to anticipate someone else's misinterpretations of what you say, you are likely to match their reply to your expectations using regular-expression-like matching and reject their response, without fully parsing it.

  • Often, it's better to say one insightful thing than to say one insightful thing and two less-insightful things, because the less-insightful things are easier to respond to.

  • Developing expertise in overcoming bias, and "trying really hard" to overcome bias, doesn't overcome bias. It can make bias worse, by becoming an excuse not to update in response to the ideas of others.

I had already been a Bayesian and fan of Kahneman-Tversky for 15 years before I started reading OB. So I didn't learn the basics there.

Given that, I'd have to say the most important thing I learned was to exercise unflagging discipline when thinking about values. For me, this means (a) remember to keep track of terminal versus instrumental values in any problem framing and (b) realize that most people's terminal personal values are complex and != "maximize pleasure".

I tend to agree with MBlume - the most frequently used principles are probably assimilated too well. But let's see... the Bayesian worldview in general made me much more interested in probability, making me take the most "mathy" probability course in Uni early on and to plan on reading Jaynes and Pearl within the next half a year. Maybe it was The Dilemma: Science or Bayes that clinched the deal?

Skimming the list - Mind Projection Fallacy, Nobody Knows What Science Doesn't Know and Science as Attire often come to mind in contexts of what other people do wrong (I quoted the first and second principles in a few discussions). Making Beliefs Pay Rent, Mysterious Answers to Mysterious Questions, Making History Available, Cached Thoughts, Bind Yourself to Reality - I try to apply to myself on a regular basis. In my professional capacity, I try to apply (not very successfully at the moment) The Planning Fallacy and Hold Off On Proposing Solutions.

The Robbers Cave Experiment was the post that got me hooked on OB in the first place. I have cited it many times. Finally, the posts on morality are frequently used in the sense that I refer myself to them every time moral discussions crop up.

I also think you should include other writings. The list you gave in the previous post does not include Robin's articles (obviously), but he certainly did leave a mark as well. (To mind come his posts on medical spending and Politics is not about Policy, but undoubtedly there are many more.)

But this is already a large list. Perhaps a series is in order, with the first book being "A Gentle Introduction".

[-][anonymous]

For me, Guessing the Teacher's Password was a real 'oh wow' moment that changed the way I went about learning and approaching my classes. It awoke an awareness that I, myself, was simply guessing the password most of the time, and it illuminated that this was the way the lecture system in my classes primarily functioned. As a result, I feel I mentally 'engage' more with the material presented and think critically not only about the material itself but on the meta-level of how and why the material is being presented the way it is.

This was most apparent in an abnormal psychology class where we were presented with case studies containing the story of an individual and then meant to diagnose that person according to DSM-IV criteria. You might read about an eccentric individual who dresses in eccentric fashions, has occasional muscle tremors, and rambles incoherently, jumping from topic to topic. "Aha! Schizophrenia, disorganized type!" And that was that.

There was no follow-up on exactly what that meant. There was simply a bucket of symptoms with a label on it, and lo and behold, this person had a cluster of symptoms from that bucket.

Though schizophrenia may be a poor example due to science's weak understanding of it, this process was repeated across the board regardless of the mental ailment. This may simply have been due to the broad nature of the class and time constraints, but I remember coming away with nothing but a bunch of labels and little understanding of what was actually happening.

link: http://www.overcomingbias.com/2007/08/guessing-the-te.html

[-][anonymous]

In "teaching" situations with adults I often try to impart the modesty argument. I want my audience to become less sure of themselves, less sure of their judgements, less sure of their brains, less ready to disagree and thus more open to suggestions.This is a theme that repeats itself many times. Progress is rather uncertain so maybe this isn't very useful! Darn! Perhaps I should change course and go onto signalling and status issues instead, so we can arrive at the truth about ourselves.

[-][anonymous]

On second thoughts maybe the most useful and frequently used thing is the daily discipline and the daily pleasure of reading OB - it primes the mind and in a sense gives membership to an international club of people who want to think about things - and reach many different answers.

Generalising from one example.

"I notice that I am confused." That phrase pops into my head unbidden at least once a week now.