beoShaffer comments on Skill: The Map is Not the Territory - Less Wrong

Post author: Eliezer_Yudkowsky 06 October 2012 09:59AM


Comment author: beoShaffer 04 October 2012 03:24:53AM 3 points

When I was trying to solve the koan I focused on a few interrelated subproblems of skill one. It seems like this sort of thinking is particularly useful for reminding yourself to consider the outside view and/or the difference between confidence levels inside and outside an argument.
Also, I think the koan left out something pretty important.
Under what circumstances, if any, is it harmful to consciously think of the distinction between the map and the territory - to visualize your thought bubble containing a belief, and a reality outside it, rather than just using your map to think about reality directly? How exactly does it hurt, on what sort of problem?

.

.

.

.

.

It looks pretty solid for describing unbounded epistemic rationality. It's slightly iffier from a bounded instrumental perspective, in that it probably imposes some mental cost to apply, and there are many circumstances where it's not noticeably helpful. There's also the matter of political situations and the like, where it's arguably good to be generally overconfident.

Comment author: Morendil 04 October 2012 09:43:00AM 5 points

How exactly does it hurt, on what sort of problem?

Beliefs are part of reality too. The image "thought bubble containing a belief, and a reality outside it" is a good map, but it's not itself the territory.

In particular, the mantra "Reality is that which, when we stop believing in it, doesn't go away" can be harmful in areas such as psychology and sociology, and in domains which have a large component of these, such as finance, politics or software engineering. In these domains you must account for phenomena such as self-fulfilling or self-cancelling prophecies. Concrete example: stock market crashes.

Comment author: [deleted] 04 October 2012 01:20:31PM 1 point

So you're saying if we stop believing in stock market crashes, they go away?

I think what you mean is that if you intervened to change everyone's beliefs away from "oh shit, sell!", then stock market crashes would not happen. That is a different matter than talking about just my or your belief.

Comment author: Morendil 04 October 2012 02:31:57PM 5 points

So you're saying if we stop believing in stock market crashes, they go away?

More often it works the other way around: the fact that someone stops believing in an overinflated stock market (i.e. claims a "bubble" is about to burst) acts as a self-fulfilling prophecy, causing others to also stop believing, which, if the information cascade propagates far enough, will cause a crash, thereby bringing reality in line with the original belief.

But information cascades can also cause booms; as I understand it, this is more likely with individual stocks.

The "someone" above is underspecified: it can be one particularly influential person - Nate Silver recounts how Amazon stock surged 25% after Henry Blodget hyped it up in 1998. But it can also be a larger group, who, looking at small fluctuations in the market, panic and start a stampede.
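The cascade dynamic described above can be sketched as a toy threshold model (the numbers and the `simulate_cascade` function are hypothetical illustrations, not a claim about real markets): each trader sells once the fraction of traders already selling exceeds their personal panic threshold, so a few initial pessimists can tip the whole market.

```python
import random

def simulate_cascade(n_traders=100, seed_sellers=1, rng_seed=0):
    """Toy threshold model of an information cascade: each trader sells
    once the fraction of traders already selling exceeds their personal
    threshold. A handful of initial pessimists may tip the whole market."""
    rng = random.Random(rng_seed)
    # Each trader tolerates a different fraction of sellers before panicking.
    thresholds = [rng.random() for _ in range(n_traders)]
    selling = [i < seed_sellers for i in range(n_traders)]
    changed = True
    while changed:
        changed = False
        frac = sum(selling) / n_traders
        for i in range(n_traders):
            if not selling[i] and thresholds[i] < frac:
                selling[i] = True
                changed = True
    return sum(selling)  # how many traders end up selling
```

Depending on how the thresholds happen to fall, a single seed seller either fizzles out or snowballs into a full crash, which is the "underspecified someone" point: the same mechanism covers one influential pessimist and a panicky crowd.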

My point is that "thought bubbles" in general are part of reality. Your believing in things has causal influence on reality (another concrete example: romantic relationships - the concept "love", which can be cashed out in terms of blood levels of various hormones, is one of those things that go away because people stop believing in it). It is generally bad epistemic practice to overstate this influence, but it can also be bad to understate it.

Comment author: [deleted] 04 October 2012 02:56:36PM 1 point

Agreed.

My point was that your examples were a part of reality in a way that the ideal belief-of-observer used in the "reality is that which..." mantra isn't.

Comment author: [deleted] 04 October 2012 01:17:02PM 2 points

There's also the matter of political situations and the like, where it's arguably good to be generally overconfident.

No. It may be good to talk shit like you're overconfident. Actually being overconfident is just unnecessarily shooting yourself in the foot.

Comment author: RichardKennaway 04 October 2012 10:21:06AM 2 points

Under what circumstances, if any, is it harmful to consciously think of the distinction between the map and the territory

If you can ever gain by being ignorant, you can gain more by better knowledge still.

Cf. E.T. Jaynes: "It appears to be a quite general principle that, whenever there is a randomized way of doing something, then there is a nonrandomized way that delivers better performance but requires more thought", quoted here.