Vladimir_Nesov comments on The uniquely awful example of theism - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (169)
For what it's worth, I've recently started reading this site and am an Orthodox Jew. I have no particular plans to stop reading the site for the time being, because it's often rather interesting.
It may be worth considering that while rationalists may feel they don't need religion, almost all religious people would acknowledge the need for rationality of some kind. If rationality is about achieving your goals as effectively as possible (as some here think), then does it suddenly not work if your goals are "obey the Bible"? No -- your actions will be different from someone with different goals (utilitarianism, etc.), but most of the thought-process is the same.
Suppose you have an extremely high prior probability that God sends doubters to Hell, for whatever reason. Presumably the utility of going to Hell is very, very low. Then, as a rational Bayesian, you should avoid any evidence that would tend to cause you to doubt God, shouldn't you? I certainly don't know much about Bayesian probability, but I can't see any flaw in that logic.
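The expected-utility comparison behind this argument can be sketched numerically. Every probability and utility below is a made-up illustration, not a claim about anyone's actual priors:

```python
# A minimal sketch of the argument above, under assumed (hypothetical) numbers.
P_PUNISHING_GOD = 0.95      # prior that doubters are sent to Hell
P_DOUBT_IF_EXAMINED = 0.30  # chance that examining the evidence produces doubt
U_HELL = -1_000_000         # utility of going to Hell (very, very low)
U_OTHERWISE = 0             # baseline utility

def expected_utility(p_doubt: float) -> float:
    """Expected utility given the probability of ending up in a doubt-state."""
    p_hell = P_PUNISHING_GOD * p_doubt
    return p_hell * U_HELL + (1 - p_hell) * U_OTHERWISE

eu_avoid = expected_utility(0.0)                    # never examine the evidence
eu_examine = expected_utility(P_DOUBT_IF_EXAMINED)  # examine it

print(eu_avoid > eu_examine)  # avoiding the evidence maximizes expected utility
```

With these (arbitrary) numbers, avoiding the evidence dominates whenever Hell's disutility is large enough to swamp any informational benefit of examining it, which is the point of the argument.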
The question seems rather similar to that of Omega. The winners are those who can convince themselves, by any means, that a particular belief is right. In that sense, God could be said to reward irrationality, just like Omega. The only real difference is that in Omega's case, nobody doubts the fact that Omega exists and is doing the judging in the first place. I don't think that's essential to the nature of the problem, although it makes it harder for most rationalists to dismiss.
Of course, "rationalism" as used on this site often implies acceptance of empiricism, Occam's razor, falsifiability, and things like that, not just pure Bayesian logic with arbitrary priors. But of course, I almost completely accept all those things, and am tolerant of those who accept them more thoroughly than I. It should therefore not be very surprising that I'd see value in this site, along with other religious people with similar attitudes (however few there may be).
I do think that at least being polite toward religion (which doesn't always happen here) is more likely to advance the goals of this site than otherwise. It doesn't help anyone's goals to drive people away before you can deconvert them; and even if you can't deconvert them, you still gain by helping them think more logically (by your definitions) in other areas.
This is a preference over rituals of cognition, choosing not just decisions, but the algorithms with which you arrive at those decisions. It is usually assumed that only the decisions matter, not the thought process behind them. If you did live in such a world, I agree that you should avoid getting into a doubt-state, although you might benefit from building an external reasoning device that would resolve the problem for you, since it would not be hindered by the limitations on allowed cognitive algorithms.
Also, I guess that an altruistic person should still undergo a conversion to rationality, on the chance that the evidence shows the inborn priors to be incorrect, thus sparing his fellow people from living under such limitations on thought.
Well, if you're altruistic in the sense you describe, you don't have the utility function I gave in my scenario, so your result will vary. If, comparatively, you don't really mind going to Hell, then the argument doesn't work well.
Of course.