The way I had grokked the planning fallacy was "while planning, people don't simulate the scenario in enough detail and so don't see potential difficulties,"* so this is new to me. Or rather, what you say is in some sense part of how I thought about it, except I didn't simulate it in enough detail to realise that I should understand it in a probabilistic sense as well, so it's new to me when it shouldn't be.
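The probabilistic sense can be sketched with made-up numbers: even when each step of a plan is individually very likely to go fine, the probability that *every* step goes fine is the product of the individual probabilities, which shrinks quickly. (The ten steps and the 95% figure below are purely illustrative assumptions, not anything from the discussion above.)

```python
def p_everything_fine(step_probs):
    """Probability that all steps succeed, assuming they are independent."""
    result = 1.0
    for p in step_probs:
        result *= p
    return result

# Hypothetical plan: ten independent steps, each 95% likely to go smoothly.
steps = [0.95] * 10
print(round(p_everything_fine(steps), 3))  # → 0.599
```

So a plan made of individually reliable steps can still have "everything is fine" as the minority outcome, which is one way to read why unadjusted plans run late.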
*In fact, right now I'm procrastinating going and telling my prof that an expansion I told him I'd do is infinite.
I like your example, but there is additional evidence that could be gathered to refine your premise. You can check the traffic situation along your route and make estimates about travel time, so with additional tools there is a chance of raising the odds until "everything is fine" becomes the more likely scenario rather than the less likely one. I think this is especially true for those of us who drive cars. If you and I decide to go to the Denver Art Museum, and you are coming from a hotel in downtown Denver while I'm driving from my house out of town, whether I'm going to be on time or not depends on all the factors you mentioned. However, I can mitigate some of those factors by adding data, and I can do the same for you by handing you a map or pointing you to a tool like Google Maps to get you from your hotel to the museum more efficiently.

I think when you live someplace for a while and make a trip regularly, you get used to certain ideas about your journey, which is why "everything is fine" is usually what people pick. Trying to compensate for every eventuality is mind-numbing. However, I think making proper use of tools to make things as efficient as possible is also a good idea.
However, I am very much in favor of this line of thinking.
Making sure I understood you: you are saying that people sometimes pick "everything is fine" because:
1) they are confident that if anything goes wrong, they will be able to fix it, so everything is fine once again;
2) they are so confident in this that they don't make specific plans, believing they can fix everything on the spur of the moment.
Is that right?
Looks plausible, but something must be wrong there, because the planning fallacy:
a) exists (so people aren't evaluating their abilities well);
b) exists even when people aren't familiar with the situation they are predicting (so they have no grounds for an "ah, I can fix anything anyway" effect);
c) exists even in people with low confidence (though maybe the effect is weaker there; that's an interesting theory to test).
I blame overconfidence and similar self-serving biases.
Off-topic: you seem to be one of the organizers of the Houston meetup. I'll be in town the week of Nov 16, feel free to let me know if there is anything scheduled.
Hi shminux. Sorry, just saw your comment. We don't seem to have a date set for November yet, but let me check with the others. Typically we meet on Saturdays, are you still around on the 22nd? Or we could try Sunday the 16th. Let me know.
I'm leaving on Thu very early, so Sunday is better. However, I might be occupied with some family stuff instead, so please do not change your plans because of me. I'll check the Google group messages and contact you if I can make it. Thanks!
There are two insights from Bayesianism that occurred to me and that I hadn't seen anywhere else before.
I like the lists in the two posts linked above, so for the sake of completeness I'm adding my two cents to the public domain. The second penny is here.