cousin_it comments on Causation as Bias (sort of) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (88)
From your first comment on my post, you were really aggressive. Arguments are fine, but why always the personal attacks? I'll tell you what might be going on here: you saw the post, couldn't make sense of it after a quick glance, and decided it was junk and an easy way to gain reputation and boost your ego by bashing. And you are not alone. There are lots of haters, and nobody who just said, "OK, I don't believe it, but let's discuss it," and stopped hitting the guy over the head.
The theory is highly counterintuitive, I said as much, but it is worth at least a few minutes of discussion, and I have discussed it with quite a few eminent philosophers already. None was convinced (which is hardly surprising), but they found the discussion interesting and the theory consistent. So something has gone wrong here. Maybe all this talk of "winning" and "Bayesian conspiracy" and whatever really does not serve the principal goal of the site, which is to be as unbiased as possible.
Your theory says you can't cause our beliefs to change and you shouldn't be surprised about it. It also implies that you defend it by accident, not because it's true.
The good news is that you have an obvious upgrade right ahead. Not all of us are so lucky.
Why does everybody assume I'm a die-hard believer in this theory?
No such assumption required. For example, if you have 10% credence in your theory, the same 10% says you're defending it by accident. Viewed another way, we have no reason to listen to you if your theory is false and no reason to listen if it's true either. Please apply this logic to your beliefs and update.
Seems to me you're conflating two different concepts: "being the reason for" and "being the cause of".
Compare what an enemy of determinism could say: "we have no reason to listen to you if your theory is false and no reason to listen if it's true either". Now what?
Let's drop abstract truth-seeking for a moment and talk about instrumental values instead.
Believing in causality is useful in a causal world and neutral in an acausal one. Disbelieving in causality is harmful in a causal world and likewise neutral in an acausal one. So, if you assign nonzero credence to the existence of causality (as you implied in a comment above: "why does everybody assume I'm a die-hard believer?"), you'd do better by increasing this credence to 100%, because doing so has positive utility in the causal world (to which you have assigned nonzero credence) and doesn't matter in the acausal one.
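The argument above is a simple dominance argument over a two-by-two payoff matrix. A minimal sketch, with purely illustrative payoff values chosen to match the comment's assumptions (believing helps in a causal world, and nothing matters in an acausal one):

```python
# Hypothetical payoffs: rows are actions, columns are possible worlds.
# The exact numbers are made up; only their signs track the argument.
PAYOFFS = {
    ("believe", "causal"): 1.0,      # believing is useful in a causal world
    ("believe", "acausal"): 0.0,     # and neutral in an acausal one
    ("disbelieve", "causal"): -1.0,  # disbelieving is harmful in a causal world
    ("disbelieve", "acausal"): 0.0,  # and likewise neutral in an acausal one
}

def expected_utility(action, p_causal):
    """Expected utility of an action given credence p_causal in causality."""
    return (p_causal * PAYOFFS[(action, "causal")]
            + (1 - p_causal) * PAYOFFS[(action, "acausal")])

# For any nonzero credence in causality, believing dominates.
for p in (0.1, 0.5, 0.9):
    assert expected_utility("believe", p) > expected_utility("disbelieve", p)
```

Under these assumed payoffs, "believe" weakly dominates for every nonzero credence in a causal world, which is the structure of the argument; whether 100% credence (rather than merely acting as if) follows is the point the replies below contest.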
Well, if you stipulate that "abstract truth-seeking" has nothing whatsoever to do with my getting along in the world, then you're right I guess.
I would say "increasing this credence toward 100%": without a mathematical proof that the familiar sort of causation is the only feasible such scheme, absolute certainty is (slightly) risky. (Even with such a proof it is risky; proofs aren't perfect guarantees.)