Dagon comments on A rational unfalsifyable believe - Less Wrong

Post author: Arielgenesis 25 July 2016 02:15AM


Comments (46)


Comment author: Dagon 25 July 2016 03:11:36AM 1 point

This belief pays no rent. It's unfalsifiable precisely because it's irrelevant - there is no prediction that Eve can make which would give different outcomes based on Adam's past behavior. The belief just doesn't matter.

Separately, if she assigns 0.0 probability to anything, she's probably not actually as rational as she claims.

Comment author: Arielgenesis 25 July 2016 04:01:22AM 1 point

What if we were to take one step back, and Adam didn't die. Eve claims that her belief pays rent because it could be falsified if Adam changed in character. In this scenario, I suppose you would agree that Eve is still rational.

Now, I cannot formulate my arguments properly at the moment, but I think it is weird that Adam's death makes Eve's belief irrational, as per:

So I do not believe a spaceship blips out of existence when it crosses the cosmological horizon of our expanding universe, even though the spaceship's existence has no further experimental consequences for me.

http://lesswrong.com/lw/ss/no_logical_positivist_i/

Comment author: Dagon 25 July 2016 04:08:52PM 2 points

I think you're focusing too much on the label "rational", and not enough on the actual effect of beliefs.

I'll admit I'm closer to logical positivism than is Eliezer, but even if you make the argument (which you haven't) that the model of the universe is simpler (in the Kolmogorov complexity sense) by believing Adam killed Abel, it's still not important. Unless you're making predictions and taking actions based on a belief (or on beliefs influenced by that belief), it's neither rational nor irrational, it's irrelevant.

Now, a somewhat more complicated example, where Eve has to judge Cain's likelihood of murdering her, and thinks the circumstances of the locked room in the past are relevant to her future, there are definite predictions she should be making. Her confidence in Adam's innocence implies Cain's guilt, and she should be concerned.

It's still the case that she cannot possibly have enough evidence for her confidence to be 1.00.
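The point about confidence of exactly 1.00 can be made concrete with Bayes' rule. The sketch below (the helper function and the specific likelihood numbers are my own illustration, not from the discussion) shows that a merely very confident prior still moves under contrary evidence, whereas a prior of exactly 1.0 is immune to any evidence whatsoever:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H | E) via Bayes' rule.

    prior                  -- P(H), prior probability of the hypothesis
    p_evidence_given_h     -- P(E | H), likelihood of the evidence if H is true
    p_evidence_given_not_h -- P(E | not-H), likelihood if H is false
    """
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Eve at 99% confidence in Adam's innocence, confronted with evidence that
# is far more likely if he is guilty: her posterior drops to about 0.52.
print(bayes_update(0.99, 0.01, 0.90))

# Eve at exactly 1.0: the (1 - prior) term vanishes, so the same evidence
# leaves her at 1.0 no matter what the likelihoods are.
print(bayes_update(1.0, 0.01, 0.90))
```

This is the formal sense in which probability 1.0 (or 0.0) is not a belief that can be updated: no finite amount of evidence ever changes it.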

Comment author: Arielgenesis 25 July 2016 05:22:56PM 1 point

Thank you, that was a very nice extension to the story. I should have included the scenario to make her belief relevant. I agree with you: assigning 100% probability is irrational in her case. But if she is not rationally literate enough to express herself in a fuzzy, non-binary way, I think she would maintain rationality by saying "Ceteris paribus, I prefer not to be locked in the same room with Cain, because I believe he is a murderer, because I believe Adam was innocent" (ignoring ad hominem).

I was under the impression that the gold standard for rationality is falsifiability. However, I now understand that Eve is rational despite unfalsifiability, because she remained Bayesian.

Comment author: Dagon 25 July 2016 09:20:11PM 1 point

I'm still deeply troubled by the focus on labels "rational" and now "Bayesian", rather than "winning", "predicting", or "correct".

For epistemic rationality, focus on truth rather than rationality: do these beliefs map to actual contingent states of the universe? Especially for human-granularity beliefs, Bayesian reasoning is really difficult, because it's unlikely for you to know your priors in any precise way.

For instrumental rationality, focus on decisions: are the actions I'm taking based on these beliefs likely to improve my future experiences?

Comment author: Arielgenesis 27 July 2016 03:07:05AM 0 points

human-granularity

I don't understand what it means, even after a Google search, so please enlighten me.

For epistemic rationality

I think so. I think she has exhausted all possible avenues to reach the truth. So she is epistemically rational. Do you agree?

For instrumental rationality

Now this is confusing to me as well. Let us forget about the extension for the moment and focus solely on the narrative as presented in the OP. I am not familiar with how value and rationality go together, but I think there is nothing wrong if her value is "Adam's innocence" and that it is inherently valuable, an end in itself. Am I making any mistake in my train of thought?

Comment author: Dagon 27 July 2016 02:10:11PM 1 point

By human-granularity, I mean beliefs about macro states that can be analyzed and manipulated by human thought and expressed in reasonable amounts (say, less than a few hundred pages of text) of human language. As contrasted with pure analytic beliefs about the state of the universe expressed numerically.

For instrumental rationality, what goals are furthered by her knowing the truth of this fact? Presuming that if Adam is innocent, she wants to believe that Adam is innocent and if Adam is guilty, she wants to believe Adam is guilty, why does she want to be correct (beyond "I like being right")? What decision will she make based on it?

Comment author: Arielgenesis 28 July 2016 06:14:27AM 0 points

why does she want to be correct (beyond "I like being right")?

I think that's it. "I like knowing that the person I love is innocent," which implies that Adam is not lying to her, and "I like being in a healthy, fulfilling, and genuine marital relationship."

Comment author: Dagon 28 July 2016 02:05:08PM 0 points

That's a reason to want him to be innocent, not a reason to want to know the truth. What's her motivation for the necessary second part of the litany: "if Adam is guilty, I want to believe that Adam is guilty"?

Comment author: Arielgenesis 29 July 2016 03:19:33AM 0 points

genuine marital relationship

"If Adam is guilty, then the relationship was not genuine." Am I on the right track, or did I misunderstand your question?