Comment author: NancyLebovitz 26 August 2012 10:02:33PM 1 point [-]

Any thoughts about protecting intuition? Some types of useful intuition come from experience, but some people (the outrage industry, advertising) try to hijack other people's intuition by supplying large quantities of emotionally intense simulated experience.

Comment author: wattsd 26 August 2012 10:59:50PM *  0 points [-]

Something like this was discussed by Kelly McGonigal in "The Willpower Instinct". A couple things that might help:

Avoidance - Complete avoidance is probably impossible, but you might try limiting your exposure to such things, particularly when you are vulnerable to making poor decisions. The old advice "don't go to the store when you're hungry" might be related to low glucose levels (which affect decision-making).

Controlled exposure w/ reflection - I remember wanting toys when I was younger based on what was shown in commercials. After a couple of disappointments, I got a little better at resisting the ads. That said, I could probably use some recalibration...

All in all, mindfulness and an information diet. I've seen this particular field (ads, store layouts, etc.) referred to as choice architecture; perhaps you could do some choice architecture of your own, to guard against the times when your defenses are down. Essentially, develop good routines, make good choices ahead of time, and stick to them.

Comment author: buybuydandavis 26 August 2012 08:04:43PM 0 points [-]

For probably a couple of decades now, I've wanted some planning software where I input goals, utilities, activities, and results, and the software plans my day, makes suggestions, tracks progress on those goals, and charts overall utility.

What's out there like this?
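The core bookkeeping, at least, is simple enough to sketch. Here's a toy version in Python (the goal names and utility weights are invented purely for illustration, not from any real tool):

```python
# Toy planner: each goal has a utility weight; logging an activity
# credits its goal, and a day's "overall utility" is the weighted sum.
from collections import defaultdict

goals = {"write": 3.0, "exercise": 2.0, "reading": 1.0}  # utility per hour
log = defaultdict(float)                                 # goal -> hours logged

def record(goal, hours):
    log[goal] += hours

def overall_utility():
    return sum(goals[g] * h for g, h in log.items())

def suggest():
    """Suggest the goal with the least weighted progress so far."""
    return min(goals, key=lambda g: goals[g] * log[g])

record("write", 2)        # 2 hours writing -> 6 utility
record("exercise", 1)     # 1 hour exercise -> 2 utility
print(overall_utility())  # → 8.0
print(suggest())          # → reading (no hours logged yet)
```

A real version would obviously need scheduling and charting on top, but even this much, kept honestly, would be a form of process monitoring.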

Comment author: wattsd 26 August 2012 08:23:49PM 1 point [-]

tracks progress on those goals, and charts overall utility.

I don't think it works very well for what you are envisioning, but something like spaced repetition software might help.

With SRS, the idea is that the software tries to figure out when you are going to forget something and prompts you at that time, when the reminder will be most effective.
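The scheduling rule itself is simple. Here's a sketch along the lines of the SM-2 algorithm that many SRS programs descend from (the exact constants vary by program; these are illustrative):

```python
# Simplified SM-2-style scheduler: after each review you grade your
# recall 0-5; good recall stretches the next interval, failure resets it.
def next_interval(interval_days, ease, quality):
    """Return (new_interval_days, new_ease) after one review.

    quality: 0 (complete blank) .. 5 (perfect recall).
    """
    if quality < 3:  # failed recall: start the card over
        return 1, ease
    # The ease factor drifts up or down with how easy the card felt,
    # floored at 1.3 so intervals never shrink on success.
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    if interval_days == 0:
        return 1, ease
    if interval_days == 1:
        return 6, ease
    return round(interval_days * ease), ease

interval, ease = 0, 2.5
for q in [5, 4, 4]:  # three successful reviews in a row
    interval, ease = next_interval(interval, ease, q)
print(interval)  # → 16 (the gaps grow: 1 day, 6 days, ~16 days)
```

The point is just that successful recalls push the next prompt exponentially further out, which is what lets the software target the moment right before you'd forget.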

Comment author: buybuydandavis 26 August 2012 08:03:31PM 0 points [-]

I'd focus more on getting effective than on reducing or managing biases. The latter often serves the former, but I think the answers come fairly directly if you start there.

I see most of the methods you list as general process improvement methods whether or not they actually improve intuition.

An alternative to improving your intuition and removing your biases would be to find other and better processes and tools to rely on. And then actually use them.

That last part is probably the main failing. We all have a boatload of good ideas that would make us a zillion times more effective if we actually used them.

How often do you plan? How often do you monitor your plans? Measure the results? Provide summary statistics? Do you avail yourself of any tools to do this?

For probably a couple of decades now, I've wanted some planning software where I input goals, utilities, activities, and results, and the software plans my day, makes suggestions, tracks progress on those goals, and charts overall utility.

I bet that pencil and paper process monitoring would be a huge advance. Yet I don't do it. I don't need a lot of fancy research about process monitoring to improve, I need to do it.

I would think this is true for most everyone here. We indulge our enjoyment of thinking, and feel justified in doing so, when it really isn't much different than watching porn. Mental masturbation. Wank wank wank, instead of getting things done. It's actually a bit worse than porn, because when we mentally wank, we feel we're accomplishing something respectable, and for the most part, society agrees.

Comment author: wattsd 26 August 2012 08:13:04PM 0 points [-]

An alternative to improving your intuition and removing your biases would be to find other and better processes and tools to rely on. And then actually use them.

I think that is part of what I was attempting to get at, though I probably didn't do a very good job. In a sense we are biased to use certain processes or tools. The only way to change those "default settings" is to deliberately practice something better, so that when the time comes, you'll be ready.

Comment author: Oscar_Cunningham 26 August 2012 06:59:52PM *  5 points [-]

Nice post! You didn't explicitly ask for criticism, but I'm going to give some anyway:

I think the standard font-size on LessWrong is smaller. Most people would prefer it if you used that.

I think there's definitely interest on LessWrong for improving intuition, but I would frame it as "Training intuition to make its judgements more rational" rather than (as your post leans towards) "Forget rationality and harness our natural biases!". This is mostly just a terminological difference.

The System 1/System 2 distinction is really between System 1 being (fast, intuitive, subconscious) and System 2 being (slow, deliberative, conscious). Around these parts, the word "rationality" tends to be used to mean something like "succeeding by using any and all means". Under this definition, rationality can use both System 2 and System 1 type thinking. Thus I believe your post could be improved by taking the sentences where intuition is being contrasted with "rationality" and replacing the word "rationality" with something like "deliberate thought" or "System 2".

As I say above, this is really just a terminological difference, but I think that making it will clarify some of the ideas in the post. In particular, I think that the main content of the post (the seven ways of improving our heuristic judgements), is really useful instrumental rationality, but that the introduction and conclusion hide it in poorly backed up statements about how reducing bias is less important than using good heuristics. I find it strange that these things are even presented in contrast; the techniques you give for improving intuition are techniques for reducing bias. The intuitive judgements become more accurate (i.e. less biased) than they were before.

Some bits of the two "Concluding thoughts" paragraphs seem especially wishy-washy. A general sentiment of "System 1 should work in harmony with System 2" sounds nice, but without any data to back it up it could just be complete bollocks. Maybe we should all be using System 1 all the time. Or maybe there are some activities where System 1 wins and some where System 2 wins. If so, which activities are which? Are firefighters actually successful decision makers?

One final thought: Do the seven methods really focus on System 1 only? Many of them seem like general purpose techniques, and in particular I think that 4, 5, 6, and 7 are actually more System 2.

Comment author: wattsd 26 August 2012 08:05:50PM *  2 points [-]

Thanks for the comments, criticism is welcomed.

I think the standard font-size on LessWrong is smaller. Most people would prefer it if you used that.

Apologies for the font size, I was editing in Google Docs rather than the post editor...

As I say above, this is really just a terminological difference, but I think that making it will clarify some of the ideas in the post. In particular, I think that the main content of the post (the seven ways of improving our heuristic judgements), is really useful instrumental rationality, but that the introduction and conclusion hide it in poorly backed up statements about how reducing bias is less important than using good heuristics. I find it strange that these things are even presented in contrast; the techniques you give for improving intuition are techniques for reducing bias. The intuitive judgements become more accurate (i.e. less biased) than they were before.

I admit, terminology is an issue. I perhaps bit off a bit more than I can chew for a first post. I'll try to fix that.

One final thought: Do the seven methods really focus on System 1 only? Many of them seem like general purpose techniques, and in particular I think that 4, 5, 6, and 7 are actually more System 2.

From the way Klein describes them, they are meant to accelerate expertise. If my interpretation is correct, they use System 2 to develop System 1 for the next scenario. I think part of the problem with how I'm describing this is that experience, which is instrumental in developing expertise, develops intuition. Intuition can either help or hurt. Sometimes we won't know which until after a decision has been made; other times we might be able to prevent mistakes by running through a checklist of cognitive biases. In the former case, the methods should help next time. In the latter case, you need something (from System 1, for example) to prompt you to run through the checklist. The checklist on its own isn't very useful.

Again, thanks for the feedback.

Comment author: Jonathan_Graehl 03 August 2012 05:57:58AM 0 points [-]

Thanks for the links, but I didn't care for Simon's paper at all. I recall Hamming's inspiring me for a few hours at least - perhaps just making up the time spent reading it :)

Comment author: wattsd 11 August 2012 03:32:03AM 1 point [-]

Simon's writing style seems a little strange to me, for what it's worth...

There are a few others who have worked with him and described their impressions of how he worked. Those might be more readable, but Hamming's lecture/paper is hard to beat in my opinion.

http://web.cs.dal.ca/~eem/gradResources/HerbertSimon.pdf http://www.isle.org/~langley/papers/has.essay.pdf

I attempted to summarize the three papers and incorporate a few other things a while ago, inspired in part by a post by Cal Newport of StudyHacks on the methods of Feynman and a few others. Incidentally, Cal has collaborated in the past with the author of the Holistic Learning ebook in the OP.

Cal's post: http://calnewport.com/blog/2012/06/18/impact-algorithms-strategies-remarkable-people-use-to-accomplish-remarkable-things/ My summary of Simon's Methods: https://sites.google.com/site/wattsd/simplesimon

The summary is still rough and incomplete, so the sources might be more interesting/useful.

Comment author: wattsd 03 August 2012 04:53:16AM 6 points [-]

Hamming's "You and Your Research" and Herbert Simon's "The Scientist as Problem Solver" are good "How I do research" papers. Hamming's paper was described in the other comments. Simon won both a Turing award and a Nobel prize.

Simon's paper is here: http://repository.cmu.edu/cgi/viewcontent.cgi?article=1425&context=psychology Hamming's: http://www.cs.virginia.edu/~robins/YouAndYourResearch.html

Comment author: [deleted] 10 April 2012 08:07:53PM 1 point [-]

Comparing pain to chess and music was intriguing. Intuitively, it seems that attention to pain is qualitatively different. Pain impinges on our attention, while the other two activities are objects of attention. On the other hand, it is certainly possible to focus on pain or to distract oneself from it. The thesis of the article suggests that directing attention towards pain makes it worse, while directing attention away from pain can reduce it. This seems to be a testable hypothesis. Is there any study about this?

Comment author: wattsd 10 April 2012 08:24:35PM *  1 point [-]

I'm not sure there is a study about directing attention to pain, but there is a video game being used to reduce pain, presumably by directing attention away from it.

http://www.hitl.washington.edu/research/vrpain/

Edit: From the page:

Patients often report re-living their original burn experience during wound care, SnowWorld was designed to help put out the fire. Our logic for why VR will reduce pain is as follows. Pain perception has a strong psychological component. The same incoming pain signal can be interpreted as painful or not, depending on what the patient is thinking. Pain requires conscious attention. The essence of VR is the illusion users have of going inside the computer-generated environment. Being drawn into another world drains a lot of attentional resources, leaving less attention available to process pain signals. Conscious attention is like a spotlight. Usually it is focused on the pain and woundcare. We are luring that spotlight into the virtual world. Rather than having pain as the focus of their attention, for many patients in VR, the wound care becomes more of an annoyance, distracting them from their primary goal of exploring the virtual world.

Comment author: Vladimir_Golovin 10 April 2012 06:53:34PM 28 points [-]

A possible caveat:

The main premise of the article is that directing one's attention to a sensory input can make one better at processing this input (where "better" may mean "higher resolution and/or sensitivity") by "growing" the associated area of the cortex.

However, the article does not give a clear reason for the assumption that the same principle should apply to higher-level mental behaviors not directly related to sensory inputs -- e.g. playing chess.

(I'm not familiar with the relevant science, so I'm just voicing my doubt.)

Comment author: wattsd 10 April 2012 07:59:31PM 9 points [-]

The concept being described in the article sounds very similar to deliberate practice, which I think might be described as keeping what you are trying to practice at a conscious level instead of going on autopilot.

Many of those studies are actually based on chess, so if this describes how deliberate practice changes the brain, it should also map to higher level activities.

Of course, I'm not terribly familiar with all of the relevant science either.

View more: Prev