"A Confucian has stolen my hairbrush! Down with Confucianism!"
- G.K. Chesterton (on ad hominems)
"If it ever turns out that Bayes fails - receives systematically lower rewards on some problem, relative to a superior alternative, in virtue of its mere decisions - then Bayes has to go out the window."
This is such an important concept.
Yes, but like falsifiability, it's dangerous. The same goes for 'rationalists win'.
'We' (Bayesians) face the Duhem-Quine thesis with a vengeance: we have often found situations where Bayes failed. And then we rescued it (we think), either by coming up with novel theses (TDT), or by carefully analyzing the problem, or a related problem, and declaring that to be the real answer, so that Bayes works after all (Jaynes, again and again). Have we corrected ourselves, or just added epicycles and special pleading? Should we have tossed Bayes out the window at that point, except in the limited areas where we had already proved it to be optimal or useful?
This can't really be answered.
I liked the quote not because of any notion that Bayes will or should "go out the window," but because, coming from a devout (can I use that word?) Bayesian, it's akin to a mathematician saying that if 2+2 ceases to be 4, that equation goes out the window. I just like what this says about one's epistemology -- we don't claim to know with dogmatic certainty, but in varying degrees of certainty, which, to bring things full circle, is what Bayes seems to be all about (at least to me, a novice).
More concisely, I like the quote because it draws a line. We can rail against the crazy strict Empiricism that denies rationality, but we won't hold to a rationality so devoutly that it becomes faith.
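Those "varying degrees of certainty" can be made concrete with Bayes' theorem itself, P(H|E) = P(E|H)P(H) / P(E). A minimal sketch (the numbers are purely illustrative, not from any comment above):

```python
def bayes_update(prior, likelihood, likelihood_given_not):
    """Posterior P(H|E) for a binary hypothesis H, via Bayes' theorem.

    prior               -- P(H) before seeing evidence E
    likelihood          -- P(E|H)
    likelihood_given_not -- P(E|not-H)
    """
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# A weak prior (0.1) combined with evidence that is 9x likelier
# under H than under not-H yields a middling posterior (0.5):
# belief comes in degrees, never dogmatic certainty.
posterior = bayes_update(prior=0.1, likelihood=0.9, likelihood_given_not=0.1)
print(round(posterior, 3))  # 0.5
```

The point of the toy numbers: even strong evidence doesn't push belief to 0 or 1, only along the spectrum in between.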
I will say this declaratively: The correct choice is to take only box two. If you disagree, check your premises.
"But it is agreed even among causal decision theorists that if you have the power to precommit yourself to take one box, in Newcomb's Problem, then you should do so. If you can precommit yourself before Omega examines you; then you are directly causing box B to be filled."
Is this your objection? The problem is, you don't know whether the superintelligent alien is basing anything on precommitment. Maybe the alien has some technology or understanding that lets him actually see the end result of your future deliberation. Maybe he's solved time travel and has already seen what you pick.
Unless you understand not only the alien's mode of operation but also his method, you really are just guessing at how he'll decide what to put in box two. And your record on guesses is not as good as his.
There's nothing mystical about it. You do it because it works. Not because you know how it works.
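"It works" can be put in numbers. Under the standard Newcomb payoffs ($1,000 in box A, $1,000,000 or nothing in box B) and assuming only that the predictor is right with some high probability, a straightforward expected-value comparison favors one-boxing; this sketch uses an illustrative 99% accuracy, not a figure from the discussion above:

```python
def expected_value(one_box: bool, accuracy: float) -> float:
    """Expected payoff under the standard Newcomb payoffs,
    given a predictor that is correct with probability `accuracy`."""
    BOX_A, BOX_B = 1_000, 1_000_000
    if one_box:
        # Box B is full iff the predictor foresaw one-boxing.
        return accuracy * BOX_B
    # Two-boxers always get box A; box B is full only when the
    # predictor mistakenly expected them to one-box.
    return BOX_A + (1 - accuracy) * BOX_B

acc = 0.99
print(expected_value(True, acc))   # 990000.0
print(expected_value(False, acc))  # ~11000
```

One-boxing dominates here for any accuracy above roughly 50.05%, which is why a merely good predictor, let alone a superintelligent one, is enough to make the pragmatic case.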
Hello All. I came across Less Wrong via Common Sense Atheism a few weeks ago. I have enjoyed it so far, but I have yet to put in the time to get up to speed on the sequences. Plan to, though.
I'm a Financial Accountant in Birmingham, AL. I'm not sure I would (yet) identify myself as a rationalist, but as for what I value, I value truth above all. And if I'm not mistaken, valuing truth seems a big step toward becoming a rationalist. I also value life, liberty, happiness, fun, music, pizza, and many other things.
Here's a little more about me:
Height: 6'0"
Shoe Size: 12
Favorite Sport: Basketball
Favorite Philosophers: Calvin & Hobbes
Greatest Weakness: Distinguishing between reality and fantasy
Greatest Strength: I'm Batman
For people who use pseudonyms, would you care to explain why you chose yours? I don't necessarily mean why you're using a pseudonym at all, I'm more interested in why you chose the particular one you've got.
"A man's gotta know his limitations." - Dirty Harry