iarwain1 comments on Rationality Reading Group: Introduction and A: Predictably Wrong - Less Wrong

11 [deleted] 17 April 2015 01:40AM

Comment author: iarwain1 19 April 2015 03:46:29PM 6 points

Rereading "Illusion of transparency" and "Expecting short inferential distances", it finally dawned on me that the concept cuts both ways. When in the role of explainers, we need to be careful to explain ourselves fully and to account for the inferential gaps between ourselves and those to whom we are talking. On the flip side (and this is the part that I didn't fully appreciate before), when in the role of listeners we sometimes read or hear something from an expert and think, "wait, there are a few gaps in that argument", or even "that's just ridiculous". Thoughts like these should raise an "inferential distance!" flag in our minds. It's much more likely that the gaps are due to inferential distance than to any actual flaw in the argument.

Comment author: Mirzhan_Irkegulov 05 May 2015 03:19:09PM 1 point

You're right, and inferential distance is a massively underestimated problem in communication and teaching. If a professional physicist explains to you how quantum physics is really correct, but you don't understand a thing, you might just conclude that they are not a good teacher, when actually the inferential distance is too big.

But if your friend tells you that libertarianism, Marxism, radical feminism, Bitcoin, or even Bayesianism is the best idea ever, yet can't explain why and gets mad at you, most likely they don't understand the idea themselves. It's very possible to have only a vague understanding of the position you espouse, and it's easy to deceive yourself into thinking that you really do understand it (especially when your understanding is superficial; see the Dunning–Kruger effect). It's even easier if you support an idea for tribal or political reasons, or if you don't have a good epistemology.

I don't know about you, but for me, 99% of the time when I can't understand the reasoning behind somebody's claim, it's because they're bullshitting me. Almost always it's not intentional; they've deceived themselves too. They think they know their ideas and know what they're talking about, but they actually don't.

The point of the vast majority of RAZ articles is to explain that, most of the time, people's beliefs are not even inaccurate: they're meaningless. They don't anticipate anything, they use words incorrectly, but from the inside it feels like the beliefs make sense. I'd guess the logical positivists hammered this point even harder than Yudkowsky does, because they constantly repeated that most things people talk about are meaningless and not worth talking about to begin with, because they cannot ultimately be reduced to sense-impressions.

That's what most people's beliefs are: meaningless sequences of symbols or sounds, without any connection to observable phenomena. When your average Trotskyist says something like "According to dialectics, communism is achievable only through proletarian revolution", they have no idea what dialectics, communism, or proletarian revolution mean at all. But from the inside it feels like they know, at least in a general sense. Maybe some Marxist knows exactly what communism or proletarian revolution is, but most of them don't. Mind you, this extends to almost everyone. People slip into bad epistemology by default, because correct epistemology is counter-intuitive; otherwise we wouldn't need LessWrong.

But maybe you're right. Every time somebody tells you something you don't get, ask them to cross the inferential distance: to step back as far as necessary and carefully explain their concepts until you can see for yourself that the idea is correct. If they can't do that, then you may be fairly certain they're bullshitting.