We have many built-in heuristics, and most of them are trouble. The absurdity heuristic makes us reject reasonable things out of hand, so we should take the time to fully understand things that seem absurd at first. Some of our beliefs are not reasoned, but inherited; we should sniff those out and discard them. We repeat cached thoughts, so we should clear and rethink them. The affect heuristic is a tricky one; to work around it, we have to take the outside view. Everything we see and do primes us, so for really important decisions, we should never leave our rooms. We fail to attribute agency to things which should have it, like opinions, so if less drastic means don't work, we should modify English to make ourselves do so.

All of these articles bear the same message, the same message that can be easily found in the subtext of every book, treatise and example of rationality. Think more. Look for the third alternative. Challenge your deeply held beliefs. Drive through semantic stop signs. Prepare a line of retreat. If you don't understand, you should make an extraordinary effort. When you do find cause to change your beliefs, complete a checklist, run a script and follow a ritual. Recheck your answers, because thinking helps; more thought is always better.

The problem is, there's only a limited amount of time in each day. To spend more time thinking about something, we must spend less time on something else. The more we think about each topic, the fewer topics we have time to think about at all. Rationalism gives us a long list of extra things to think about, and angles to think about them from, without guidance on where or how much to apply them. This can make us overthink some things and disastrously underthink others. Our worst mistakes are not those where our thoughts went astray, but those we failed to think about at all. The time between when we learn rationality techniques and when we learn where to apply them is the valley.

Reason, like time and money, is a resource. There are many complex definitions of reason, but I will use a simple one: reason is time spent thinking. We mainly use our reason to make decisions and answer questions; if we do it right, the more reason we spend, the more likely our answer will be correct. We might question this analogy on the basis that we can't directly control our thoughts, but then, many people can't directly control their monetary spending, either; they impulse buy. In both cases, we can control our spending directly, using willpower (which is also a limited resource), or indirectly by finding ways to adjust our routine.

This model is convenient enough to be suspicious, so we should apply some sanity checks to make sure it all adds up to normality. The utility we get from thinking about a decision is the cost of deciding incorrectly, times the difference between the probability that we'll change our mind from incorrect to correct and the probability that we'll change our mind from correct to incorrect. From this, we get the highly normal statements that thinking has higher expected utility when you're likely to change your mind, and that thinking has higher expected utility when the subject is important. With a resource model of reason, we should also expect simple representations for surpluses and shortages. A surplus of reason manifests as boredom; we are bored when we have nothing to do but think, and nothing interesting to think about. A shortage of reason manifests as stress; we're stressed when we have too much to think about.
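
To make that arithmetic concrete, here is a minimal sketch of the calculation under the model above; the function name and all the numbers are invented for illustration:

```python
def value_of_thinking(cost_of_error, p_fix, p_break):
    """Expected utility of thinking more about a decision: the cost of
    deciding incorrectly, times the net probability that more thought
    flips our answer from wrong to right.

    cost_of_error -- how much deciding incorrectly would cost
    p_fix   -- probability more thought changes a wrong answer to right
    p_break -- probability more thought changes a right answer to wrong
    """
    return cost_of_error * (p_fix - p_break)

# Important subject, likely to change our mind: thinking pays well.
print(value_of_thinking(cost_of_error=1000, p_fix=0.30, p_break=0.05))  # 250.0
# Trivial subject, unlikely to change our mind: thinking barely pays.
print(value_of_thinking(cost_of_error=5, p_fix=0.02, p_break=0.01))     # ~0.05
```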


When we consider costs as well as benefits, it becomes possible to reason about which techniques are worthwhile. It is not enough to show that a technique will sometimes illuminate truth; to justify its cost, it must be more likely to illuminate truth, at the margin, than the next best technique. On easy questions of little consequence, a single cached thought or a simple heuristic will suffice. On hard problems, most techniques will fail to produce any insight, so we need to try more of them.

Our mind is built on heuristics because they're efficient. "Heuristic" is not a dirty word; a heuristic is simply a way of answering questions cheaply. You shouldn't base core beliefs or important choices on heuristics alone, but for minor decisions a few simple heuristics may be all you can afford. Core beliefs and important choices, on the other hand, spawn a tree of sub-questions, the leaves of which are answered by heuristics or cached thoughts.

The Overcoming Bias articles on heuristics treat them like villains that sabotage our thoughts. The standard way to prove that a heuristic exists is to present an example where it leads us astray. That teaches readers not to avoid using heuristics where they're inappropriate, but to avoid using them entirely. Fortunately, the architecture of our minds won't let us do that, since eliminating a heuristic entirely would make us much stupider. Instead, we should focus on learning and teaching what heuristics feel like from the inside, with examples where they lead us astray and examples where they work properly.


In general, the expected return on investment for thinking about a topic starts high, as initial thoughts cut through confusion and affect our decision greatly, then drops as the most productive lines of reasoning are depleted. Once the expected return drops below some threshold, we should stop thinking about it.
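
As a toy model of that curve, consider the sketch below; the exponential decay and the specific numbers are assumptions made for the example, not claims from the article:

```python
def think_until_unprofitable(initial_return, decay, threshold):
    """Spend units of thought while the expected return of the next
    unit stays above the threshold; stop as soon as it drops below.

    Returns (total value gained, units of thought spent)."""
    total, spent, expected_return = 0.0, 0, initial_return
    while expected_return >= threshold:
        total += expected_return
        spent += 1
        expected_return *= decay  # productive lines of reasoning deplete
    return total, spent

# An important, confusing question justifies many rounds of thought...
print(think_until_unprofitable(initial_return=100, decay=0.7, threshold=1))
# ...while a restaurant order justifies only a couple.
print(think_until_unprofitable(initial_return=2, decay=0.5, threshold=1))
```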

Normally, the process for allocating reason works automatically and works well. However, sometimes it breaks. Sometimes we run into questions that we simply can't resolve with the information we have available. If the question is important, we run through our entire repertoire of techniques before giving up, perhaps guessing, and moving on. If it's less important, we try only the techniques we think are likely to work before we give up. If you teach someone more techniques, you increase the amount of time they can spend on a topic before running out of angles and being forced to move on. If those techniques fail to produce insight, then they have made that person stupider: they will spend more time on questions for little benefit, and neglect more questions entirely. Some people are completely unable to budget their reason, like the man who spends ten minutes deciding what to order in a restaurant, knowing full well that he would be happier spending those ten minutes focused on conversation instead. If you teach him enough statistics, he might be foolish enough to try to calculate the probability of various dishes making him happy. He'll fail, of course, because statistics can't answer that question with the data he has, but he'll waste even more time trying.
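
One way to picture that budgeting process is the sketch below; the tuple format, the budget parameter, and the idea of ranking techniques by estimated chance of working are illustrative assumptions, not a recipe from the article:

```python
def try_to_answer(question, techniques, budget):
    """Try techniques in descending order of how likely each seems to
    work, until one produces an answer or the budget runs out.

    techniques -- list of (estimated_chance_of_working, technique_fn)
    budget     -- how many techniques we can afford to try; an important
                  question gets the whole repertoire, a trivial one gets
                  one or two cheap heuristics.
    """
    ranked = sorted(techniques, key=lambda t: t[0], reverse=True)
    for chance, technique in ranked[:budget]:
        answer = technique(question)
        if answer is not None:
            return answer
    return None  # out of angles: give up, guess, and move on
```

Note that in this picture, teaching someone more techniques only helps if the new techniques sometimes return answers; if they rarely do, they just burn more of the budget, which is exactly the failure mode described above.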

It would be nice to have more reason, but evidence points to cognitive capacity being a fixed quantity. We can, however, allocate the reason we do have more efficiently. We can set cutoff points to limit the time spent on silly things like restaurant orders. Some things are known to be wastes of reason; politics is the mind-killer because it can consume an unlimited amount of mental energy without producing the slightest bit of utility. We can identify the thoughts that are least valuable to us by observing what our mind goes to when we're bored: mostly, we daydream and retread old lines of thought. That means that when there are worthwhile topics to think about, daydreaming and retreading should be the first things to go. This conclusion shouldn't surprise anyone, but it's good to have theoretical justification.


Take a moment to think about what you spend time thinking about, and where your cutoff point is. Do you keep thinking about the same topic well past the point where insights stop coming? Do you get distracted and stop too early? If you decide unconsciously, would your conscious choice be the same or different?

26 comments

Serious question: how do people on LW experience thinking?

I find that the heavy lifting of my thinking is nonverbal and fast. Verbalizing (and often re-re-verbalizing) takes time, running explicit math takes (a lot of) time, and worrying at a verbalized idea without getting anywhere can waste indefinite time, but actual intelligent thought feels like a burst of focused attention/effort and a gestalt answer that arrives in an instant.

I get a vague strained feeling to which I don't have full conscious access, which transforms into a form that makes sense to me only after I've verbalized it.

I find that my answers arrive as gestalts, without any experience of effort at that moment. I may have to do a lot of reading and failing and sometimes writing to get to that point.

Hmm, there's effort and effort.

The kind of effort of straining at a word that's "on the tip of your tongue" and other sorts of overt "trying" feel different, and I've never had productive results out of them.

The feeling of effort that does produce something is more like a "pull" on my energy reserves, and the comedown feels like tiredness. I notice it particularly when I'm trying to make progress in learning something. I often feel temporarily sleepy after that (and not the bored kind of sleepy but the worn-down kind).

My hypothesis here is that priming the caches of thought may be slow, but running them is very fast, and the above article may be barking up the wrong tree as a consequence. Thought done right ought to be able to produce its results in negligible time. The kinds of thought that take time seem to be mostly useless. Budgeting chunks of time to "thought" risks filling up the budgeted span with noise and not advancing on the problem.

Most of the time I work with spatial/visual intuition, building analogies to the structure of the problem under study as "mechanical systems", similar to what you do to non-analytically solve stereometric problems. When the models get sufficiently developed, they usually show the answer directly, or allow me to build or recall relevant concepts. It's very unreliable, but given enough work on getting to know a model and starting to intuitively feel it, I can read off the conclusions.

This is often the difference between an intuitive understanding and basic knowledge. One doesn't imply the other: learning something explicit doesn't give the intuitive understanding, which leads to missing the conclusions obvious to an expert; conversely, getting the gist of some idea or developing an intuitive solution lets you see many sides of the problem, but has a fair chance of being factually wrong. A solid grip on a concept requires getting both sides of the coin: learning an intuitive model, and enforcing correctness using a more formal and certain foundation.

I don't know how to do anything constructive with verbalization; I mainly use the verbal buffer to store little pieces of data, supporting visual problem-solving.

I find some questions have gestalt answers, but for others I really do need to run through the full search tree.

I feel that I start out by thinking sort of in a free-associative manner - a lot of things related to the problem pass through my mind. Then my answers kind of connect together out of that stuff and begin to arrive in a very general sense, like, "I should try making soup", and then get more and more fleshed out with details, and sometimes survive all the way to a full plan, and usually there's more than one answer getting fleshed out. It's usually auditory/verbal or visual or both, or sort of like a movie. I might have more than one of these that I'm playing with. I usually get a feeling looking at my own thoughts, when I'm thinking them through / checking them for consistency, and arrive at something where there's a problem with it, like, "this is a bad one". Not that I am by any means always logical, or catch every error, or anything like that; it's just that the catching of my own errors feels a lot like hearing someone sing off key in a song, or noticing a fruit is unripe, or something - I'm not sure I can put it in words, but there's a definite feeling to it, like there's a cleft where the thoughts don't connect together like they should.

The weird thing is that now that it's been several hours since I wrote this, I'm not even sure if this is how I actually think about things. There is definitely this feeling of visualising the situation and making changes to it, and working from the general, kind of like mission statements, to specific plans.

Mainly things just come, but I sometimes verbally brainstorm and free associate in my mind (mainly solving cryptic crosswords) and this can shake loose things that weren't going to come.

If I'm designing a plan of action or a system, I verbally pick components of the whole, then break them down, think of a possibility for each component, and see if I think they would work in a gestalt fashion.

The top level of my thoughts is verbalized. On most occasions I have enough time to have an internal debate or conversation, polishing and ordering my words like beads on a string.

The lower layer, which comes just before that, is what I suppose you call gestalt, yes?

Putting a word on a concept, or elaborating an idea using a sentence, takes time, and isn't the point of origin of my thought. It runs deeper. Usually, anything I'll say or do comes from such a ... well, the "vague strained feeling" Yvain talks of is what seems closest to those points of origin as I feel them. Except that the feeling isn't so much strained as it is a compressed burst: quick, slippery, elusive, small, and simple, yet pretty much as complete as the verbalization that is going to be based upon it a split second later.

I've tried, as an experiment, to deny myself verbalizing, to think only using strings of such "feelings", pre-thoughts. When I try, it feels like my deliberative thinking processes are running faster, but I can't easily connect them together into a coherent whole.

In such a situation my main thread of thought also forks every few seconds, and after a moment the number of different, unique threads - each of which would deserve a few sentences in itself, each vying for a bit of attention - becomes too great to hold, and I randomly dump some of them, forgetting about them, following new ideas as they pop into my mind. There isn't much coherence or continuity, though each new idea is always related to the ideas I had just before it, one association calling up the next pretty naturally.

But still I don't find my verbalized ideas or concepts to be different, better, or more complex than those non-verbal thoughts. They're only more coherent, structured, and permanent, yes, but the real source is down there.

After having described this to some people, one told me that this looked a lot like how people on amphetamine actually describe their thoughts and ideas too.

I think there are multiple types of thinking. For example, formal thinking that uses math and logic is applicable to a large set of problems, but incredibly inefficient for others. For problems with a near-infinite search space, or problems that require "creativity", I can spend a lot of time thinking, and get good results thereby, but cannot explain my thought processes. If possible, these can be more efficient to think about over a long period rather than "intensely". Other thought processes are near-instant, and also mostly subconscious.

You can think of these as fully conscious, both conscious and subconscious, and fully subconscious.

[anonymous]:

Agreed.

"worrying at a verbalized idea without getting anywhere can waste indefinite time" especially rings true: semantics is a really huge problem for me. We think in symbols and then have to convert these symbols to the set of words that have the same properties.

In science writing, there are big returns on spending the time -- translating the symbols to specific words that are accurate forces you to square mental impressions with empirical fact, and greatly enhances your understanding of the science.

I have been wondering (and worrying) about whether there will be similar returns from posting on LW. Can anyone comment on whether improving your ability to post on LW has had tangible benefits?

What about the benefit of carefully writing posts versus shooting them out? Perhaps a better route would be to practice sending "5 minute" posts (with a strict rule on the timing) and get better at doing it fast, first.

gjm:

This is a subject that I always thought was underrepresented on OB. I'm glad to see it coming up on LW.

The need to budget limited mental resources provides another way in which rational people can hold beliefs that seem obviously irrational: a religious or political opinion may be deeply ingrained (therefore, hard to change without a lot of effort even if it's very wrong) and have few consequences in everyday life (therefore, not pay off much for rethinking it even if it's very wrong), so in principle at least someone could be rational in not reconsidering such beliefs even in the face of much troubling counter-evidence. I wonder how much that actually happens; most people think, or at least think they think, or at least say they think, that their religious and political opinions are important and consequential.

Deciding how much thinking to allocate to any given question is, itself, something that takes up time and mental effort. Perhaps we should be budgeting for that too. (There would be an infinite regress here, but in practice we're generally happy to truncate it rather early and just do what feels best. This is a Good Thing, though we should probably watch ourselves for a while every now and then to see whether we're choosing sensible truncation points.)

Nebu:

a religious or political opinion may be deeply ingrained (therefore, hard to change without a lot of effort even if it's very wrong) and have few consequences in everyday life (therefore, not pay off much for rethinking it even if it's very wrong)

I think this is a very insightful point. It illustrates a situation in which, in order to be rational (win more, spend fewer resources), you need to be irrational (your map does not reflect the territory).

I wish I could "save" comments in the same way I can save posts, because I'll want to re-read yours many times over the next few months.

[I]n principle at least someone could be rational in not reconsidering [irrational] beliefs even in the face of much troubling counter-evidence. I wonder how much that actually happens; most people think, or at least think they think, or at least say they think, that their religious and political opinions are important and consequential.

The other side of this would be people who hold beliefs they fear may be irrational but really do not want to think about it and convince themselves that it does not matter; they would probably act the same anyway.

Since I tried to argue something similar, but didn't argue it nearly so well, I hope you don't mind that I linked your comment in my post here.

Almost any real-world problem is much too complex to arrive at an answer using explicit logic. Logic is great for checking an answer, but not so good for generating answers. Almost all books on problem solving present heuristics for solving problems, and show how to generate heuristics for different types of problems; see, for example, Polya's "How to Solve It" and Wickelgren's "How to Solve Mathematical Problems".

We can identify the thoughts that are least valuable to us by observing what our mind goes to when we're bored: mostly, we daydream and retread old lines of thought. That means that when there are worthwhile topics to think about, daydreaming and retreading should be the first things to go.

I have a minor quibble with your connotations. Do note the limits of introspection: we do not have conscious access to everything that goes on while daydreaming. Since daydreaming has itself been linked to problem-solving - not to mention that it's often relaxing and helps you gather your strength before resuming conscious problem-solving - I'm not nearly as certain as you that daydreaming is as useless as you make it sound.

The metaphor of resource for both reason and willpower, while having some valuable implications, also seems to leave out an important element. With willpower, particularly, I find it more useful to think in terms of a muscle, rather than a limited resource. It's limited at any point in time, certainly, but the more you exercise it, the more is available later. I think the same is true, though possibly to a lesser extent, with reason.

If you think of both of them as depletable or renewable resources, your incentive is always to reduce your use to just those times that they're necessary. When you think of them as muscles, you realize that you need to use them regularly, and that finding new ways to exercise them makes them more valuable. Also like exercise, using them makes you tired, but if done correctly, it also makes you stronger.

The issue is not so much that reason and willpower need to be exercised, as that reason (time spent thinking) is lost if it isn't spent immediately (the time passes with no insights). Since thinking produces insights that feed into future insights, spending time not thinking about anything is like... keeping money in an account that doesn't generate interest.

I'm not getting the metaphor of "reason ... is lost if it isn't spent immediately." Do you mean something like "wasted" or "unproductive"? Can you say the same thing again in more words?

I think the confusion stems from a conflict between my definition in the article (reason is time spent thinking) and some of the more common, but less precise definitions (reason is the output of thinking time, or reason is the efficiency or effectiveness of thinking time).

Under the definition that reason is time spent thinking, if you don't spend it by thinking about something, then that time wasn't reason at all, so it's 'lost'. This is different than spending time thinking but failing to produce any output.

I have a fairly high degree of conscious control over how much time and effort I put into thinking, which especially helps when I'm trying to solve multiple problems at once or free up serious effort for long-term (multi-day or even multi-year) cognitive problems. The exception is when I'm emotionally charged about something, and haven't had a chance to utilize cognitive and emotional meditation techniques well enough.

Patience is a supremely useful thing; it's vital not to get frustrated with lack of progress, and not to label a problem as insolvable or impossible too early. Thinking laterally and routing around emotional or cognitive distress can be vital, as can periodically doubting assumptions of varying levels of severity (which as a whole subsumes most of the techniques discussed here and at OB), and a willingness to revise your goals as you learn more about what is possible and think more deeply about the situation.

The critical resource is often time spent with the dimensions of the problem simultaneously available for mutual consideration. This feels like cognitive expansion, as if the problem is being fitted into a too-small thinking space. It has some resemblance to multi-tasking in a half-dozen directions, but on the same task. It's not unpleasant, but it has a definite willpower cost and becomes much more taxing when I suffer a migraine (which can be as often as half the time).

Beyond that, I can work with my subconscious to handle some less immediate computations more precisely, but these will always require conscious mediation before they become a course of action.

These modes of thinking are essential, because for dropping a standard heuristic to be fully beneficial to the decision's outcome, you must take into account the dimension the heuristic was replacing. The alternative, which is not nearly as helpful but is cheaper, is to replace the heuristic with a conditional, but better, heuristic.

Or to put it in my language, when you question an assumption, and find it lacking, you should not simply replace it with another assumption. Instead you need to consider the added complexity that it brings to the problem.

So, in short, my economy of willpower, thinking, and health are not a simple management problem on their own.

I agree that "think more" is a bare bones summary of a bunch of OB/LW posts.

We may have finite time to think in a day, but most people aren't constantly thinking about things, and especially aren't thinking about whether the way they're thinking works.

I've found that the main difference that I've noticed since reading OB is that I think more and question my thinking constantly.

thinking has higher expected utility when you're likely to change your mind and thinking has higher expected utility when the subject is important.

Conditioning on you changing your mind from incorrect to correct.