Will_Newsome comments on Expecting Short Inferential Distances - Less Wrong

Post author: Eliezer_Yudkowsky 22 October 2007 11:42PM


Comment author: Arandur 31 July 2011 09:41:27PM 4 points

I think our disagreement stems from a fuzzy definition of the word "best". I believe that it is better to believe something for good (read: valid) reasons than to believe it for bad reasons, regardless of the truth value of the thing being believed. So yes, your suggestion may lead more Christians to toss their Christianity, but mine makes them more rational thinkers, which (under the assumption that their Christian beliefs are wrong, an assumption to which I decline to assign a truth value in this post) leads them to atheism as a side benefit.

Essentially, this is the question posed: Which is the greater sin, if Christianity is wrong? Christianity, or irrationality?

Comment author: wedrifid 31 July 2011 09:51:10PM 7 points

So yes, your suggestion may lead more Christians to toss their Christianity, but mine makes them more rational thinkers

The same influences that make people toss Christianity are the ones that will make them more rational. Leading people to lesswrong on average makes them scoff and then add things to their stereotype cache.

Which is the greater sin, if Christianity is wrong?

If Christianity is wrong then I'd say neither. ;)

Comment author: Will_Newsome 31 July 2011 11:39:34PM 0 points

Leading people to lesswrong on average makes them scoff and then add things to their stereotype cache.

You often say things with a certain simple realism that jibes with me. I've definitely learned to appreciate the style more since I joined LW, and ten times more so since really absorbing a few subskills of a few SingInst folk. How much social psychology-like stuff have you studied? I get a weak impression that it's not much more than the average LW regular's, but that unlike the average LW regular you have the good habit of regularly and explicitly talking about (and thus assuredly explicitly thinking about) certain simple but oft-ignored phenomena of standard social epistemology, or perhaps they'd generally be better described as signalling games/competitions with an epistemic flavor.

The closely related skill of "being constantly up a meta level" is really the only prerequisite for building the master skill: being able to automatically and immediately generate decent models of any real or imagined social-epistemic scenario, or, with some effort, thorough and complex ones. You strike me as one of the people on LW who could build up this skill and make it a very sharp weapon, one that would be generally useful to any community or organization in the coming years that is trying to raise its sanity waterline. (Vladimir_M also obviously has some kind of related skillset.)

I could link you to a concrete example or two in LW comments if you don't quite follow what skill it is I'm getting at or how it's cool.

Comment author: wedrifid 01 August 2011 09:22:06AM 1 point

How much social psychology-like stuff have you studied?

Quite a lot, but it is not specialised (into PUA, etc.). I've also probably forgotten a lot, since my interest peaked a few years back.