Selfreferencing
Selfreferencing has not written any posts yet.

You'll find the whole thing pretty interesting. It concerns decision theory more than the rationality of belief, though these are deeply connected (the connection is an interesting topic for speculation in itself). Here's a brief summary of the book. I'm pretty partial to it.
Thinking about Acting: Logical Foundations for Rational Decision Making (Oxford University Press, 2006).
The objective of this book is to produce a theory of rational decision making for realistically resource-bounded agents. My interest is not in "What should I do if I were an ideal agent?", but rather, "What should I do given that I am who I am, with all my actual cognitive limitations?"
The book has three parts.... (read more)
There's also an assumption that ideal rationality is coherent (and even rational) for bounded agents like ourselves. Probability theorist and epistemologist John Pollock has launched a serious challenge to this model of decision making in his recent 2006 book Thinking About Acting.
Eliezer said: "If Overcoming Bias has any religious readers left, I say to you: it may be that you will someday lose your faith: and on that day, you will not lose all sense of moral direction."
He's addressing all religious people here, right? I responded to this comment as a theistic philosopher.
Further, specifically to Eliezer, I consider myself a religious fundamentalist (many Christian philosophers do), so I took him to be addressing me on that score as well. I guess I don't know what you mean by the term. Plantinga suggests that most people who use it mean something like, "Sum'bitch." I take it you mean something more.
I think that most theists... (read more)
Overcoming Bias DOES have a religious reader left. Me. I'm a philosopher with strong interests in political philosophy and philosophy of religion. I had several problems with the post:
You say that if we lose our belief in God, we won't lose our moral compass altogether. But that isn't the only issue for the theist; the issue is also whether the moral compass will point in the right direction all the time. If I become an atheist, I might still believe that murder is wrong, but I won't believe that respect for the sacred is particularly important, and I'll probably start to reject, say, traditional teachings about sexual morality. Theists might well be... (read more)
Eliezer,
You say: "if you can invent an equally persuasive explanation for any outcome, you have zero knowledge."
You'll want to read Quine on this. Quine thought that for nearly any sufficiently large data set there are infinitely many theories that could accurately explain it. Now, granted, some theories are better than others, but many theories are hard to compare with one another. Here are some examples:
Suppose you have three theoretical values: simplicity, coherence, and accommodation of the data. Different parts of a given scientific community may have distinct value rankings; they may consider some values more important than others. As a result, they end up gravitating towards different classes of theories.
Further, different... (read more)
Unless, of course, you've already plowed through these matters. If so, then link me and I'll shut up. A cursory check yielded little.
Eliezer,
You say: "If you genuinely subject your conclusion to a criticism that can potentially de-conclude it - if the criticism genuinely has that power - then that does modify "the real algorithm behind" your conclusion."
Why do you think it's an epistemic duty to subject your views to criticisms that can potentially de-conclude them? Or do you even think it is a duty? If you do, do you think the duty is restricted, or is it universal?
If you say that it's not a duty, then fine. But you seem to think it is. If you think that it's universal, you're going to undermine your normative beliefs, I think, including your beliefs about the normativity... (read more)
Eliezer, Thomas Scanlon discusses this issue in the 'Aggregation' section of Chapter 5 of his What We Owe to Each Other. Philosophers have been on it for a while.
I once spoke with David Schmidtz, a philosopher at the University of Arizona, about Schwartz's work. All he shows is that more choices make people anxious and confused. But Dave told me that he got Schwartz to admit that being anxious and confused isn't the same thing as having a net utility decrease. It's not even close.