andreas comments on Open Thread, September, 2010-- part 2 - Less Wrong

3 Post author: NancyLebovitz 17 September 2010 01:44AM




Comment author: [deleted] 18 September 2010 11:20:47PM 6 points [-]

Something interesting I've noticed about myself. Recently I've been worrying that my being an atheist, and my mindset often being something akin to "science as a way to see the world, not just a discipline to be studied," is less because I've found good reason to accept the former as fact and the latter as a good mindset, and more because of a socialization effect from being around Less Wrong. Meaning, even as something of a lurker with 48 karma total who's made no comment above 9 karma (as of this one), I wonder whether my thoughts are due less to my own reasoning abilities and more to a cached self created by being in a certain atmosphere (namely, here).

So my question is this: Is there a way I could test whether the socialization of being around a certain atmosphere changes my views more or less than my acceptance of reasons for those views? And is this possibly a part of understanding my understanding or am I misapplying that idea?

Comment author: andreas 19 September 2010 03:12:52AM 4 points [-]

Ask yourself: If the LW consensus on some question was wrong, how would you notice? How do you distinguish good arguments from bad arguments? Do your criteria for good arguments depend on social context in the sense that they might change if your social context changes?

Next, consider what you believe and why you think you believe it, applying the methods you just named. According to your criteria, are the arguments in favor of your beliefs strong, and the arguments against weak? Or do your criteria not discriminate between them? Do you have difficulty explaining why you hold the positions you hold?

These two sets of questions correspond to two related problems that you could worry about and that imply different solutions. The former, more fundamental problem is broken epistemology. The latter problem is knowledge that is not truly part of you, knowledge disconnected from your epistemic machinery.

I don't see an easy way out: no simple test you could apply, only the hard work of answering the fundamental questions of rationality.