Desrtopa comments on Practicing what you preach - Less Wrong

Post author: TwistingFingers 23 October 2011 06:12PM


Comment author: Desrtopa 25 October 2011 02:24:40PM 1 point

The fact of a person's belief is evidence weighted according to the reliability of that person's mechanisms for establishing the belief. To refuse to update on another person's belief means supposing that it is uncorrelated with reality.

Comment author: Logos01 25 October 2011 02:53:42PM 0 points

To fail to allow for others to be mistaken when weighting your own beliefs is to risk forming false beliefs yourself. Furthermore, establishing the reliability of a person's mechanisms for forming a belief about any given claim is necessary before expertise on that claim can be validated. The process of establishing that expertise then becomes the argument, rather than the bare assertion of the expert.

We use trust systems -- trusting the word of experts without investigation -- not because it is a valid practice but because it is a necessary failing of the human condition that we lack the time and energy to properly investigate every possible claim.

Comment author: Desrtopa 25 October 2011 03:22:52PM 1 point

You must of course allow for the possibility of the other person being mistaken, otherwise you would simply substitute their probability estimate for your own. But to fail to update on the fact of someone's belief prior to obtaining further information on the reliability of their mechanisms for determining the truth means defaulting to an assumption of zero reliability.

Comment author: Logos01 25 October 2011 03:25:55PM 0 points

But to fail to update on the fact of someone's belief prior to obtaining further information on the reliability of their mechanisms for determining the truth means defaulting to an assumption of zero reliability.

One should always assign zero reliability to any statement in and of itself, at which point it is the reliability of said mechanisms which is the argument, rather than the assertion of the individual himself. I believe I stated something very much like this already.

-- To rephrase this: it is not enough that Percival the Position-Holder tell me that Elias the Expert believes X. Elias the Expert must demonstrate to me that his expertise in X is valid.

Comment author: Desrtopa 25 October 2011 03:41:04PM 2 points

If you have no evidence that Elias the Expert has any legitimate expertise, then you can reasonably weight his belief no more heavily than any random person holding the same belief.

If you know that he is an expert in a legitimate field that has a track record for producing true information, and he has trustworthy accreditation as an expert, you have considerably more evidence of his expertise, so you should weight his belief more heavily, even if you do not know the mechanisms he used to establish his belief.

Suppose that a physicist tells you that black holes lose mass due to something called Hawking radiation, and you have never heard this before. Prior to hearing any explanation of the mechanism or how the conclusion was reached, you should update your probability that black holes lose mass to some form of radiation, because it is much more likely that the physicist would come to that conclusion if there were evidence in favor of it than if there were not. You know enough about physicists to know that their beliefs about the mechanics of reality are correlated with fact.
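Desrtopa's argument here is a straightforward Bayesian update on testimony. The sketch below is illustrative only; every probability in it is an assumption, not a measured value:

```python
def bayes_update(prior, p_assert_if_true, p_assert_if_false):
    """Posterior P(X | physicist asserts X) via Bayes' theorem."""
    numerator = prior * p_assert_if_true
    return numerator / (numerator + (1 - prior) * p_assert_if_false)

prior = 0.05              # assumed prior that black holes lose mass to radiation
p_assert_if_true = 0.6    # assumed chance a physicist would assert this if true
p_assert_if_false = 0.01  # assumed chance he would assert it if false

posterior = bayes_update(prior, p_assert_if_true, p_assert_if_false)
print(round(posterior, 3))  # -> 0.759
```

Because physicists assert such things far more often when they are true than when they are false, the assertion alone moves the estimate substantially, before any explanation of the mechanism is heard.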

Comment author: Logos01 25 October 2011 03:57:16PM -2 points

Suppose that a physicist tells you that black holes lose mass due to something called Hawking radiation, and you have never heard this before. Prior to hearing any explanation of the mechanism or how the conclusion was reached, you should update your probability that black holes lose mass to some form of radiation,

No. What you should do is ask for a justification of the belief. If you do not have the resources available to you to do so, you can fail-over to the trust system and simply accept the physicist's statement unexamined -- but utilization of the trust-system is an admission of failure to have justified beliefs.

You know enough about physicists to know that their beliefs about the mechanics of reality are correlated with fact.

I know enough about physicists, actually, to know that if they cannot relate a mechanism for a given phenomenon, and a justification of that phenomenon upon inquiry, then I have no reason to accept their assertions as true rather than as speculation. If I am to accept a given statement at any level higher than "I trust so" -- that is, if I am to assign the claim a high enough probability that I would myself claim it to be true -- then I cannot rely upon the trust system; I must have a justification of the belief.

Justification of belief cannot be "A person who usually is right in this field claims this is so" but can be "A person who I have reason to believe would have evidence on this matter has related to me his assessment of said evidence."

The difference here is between having a buddy who is a football buff who tells you what the Sportington Sports beat the Homeland Highlanders by last night -- even though you don't know whether he had access to a means of obtaining said information -- as opposed to the friend you know watched the game who tells you the scores.

Comment author: Desrtopa 25 October 2011 04:29:30PM 1 point

No. What you should do is ask for a justification of the belief. If you do not have the resources available to you to do so, you can fail-over to the trust system and simply accept the physicist's statement unexamined -- but utilization of the trust-system is an admission of failure to have justified beliefs.

If you want to increase the reliability of your probability estimate, you should ask for a justification. But if you do not increase your probability estimate contingent on the physicist's claim until you receive information on how he established that belief, then you are mistreating evidence. You don't treat his claim as evidence in addition to the evidence on which it was conditioned; you treat it as evidence of the evidence on which it was conditioned.

Once you know the physicist's belief, you cannot expect to raise your confidence further upon receiving information on how he came to that conclusion. You should assign weight to his statement according to how much evidence you would expect a physicist in his position to have when making such a statement, and then, when you learn what evidence he has, shift upwards or downwards depending on how it compares to your expectation. If you revised upwards on the basis of the physicist's say-so, and then revised further upwards based on his having about as much evidence as you would expect, that would be double-counting evidence; but if you do not revise upwards based on the physicist's claim in the first place, that would be assuming zero correlation of his statement with reality.
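The double-counting point can be made concrete with odds. All likelihood ratios below are assumed for illustration: the assertion is worth the evidence you expect lies behind it, so learning that the evidence merely matches expectation should not move you again:

```python
prior_odds = 1.0             # assumed even prior odds on the hypothesis
lr_expected_evidence = 20.0  # assumed strength of evidence a physicist would have
lr_actual_evidence = 20.0    # his actual evidence turns out to match expectation

# Correct: update once on the assertion (evidence of the evidence)...
odds_after_assertion = prior_odds * lr_expected_evidence
# ...and when his evidence merely matches expectation, stay put:
odds_after_explanation = prior_odds * lr_actual_evidence

# Error: stacking both updates counts the same evidence twice.
odds_double_counted = prior_odds * lr_expected_evidence * lr_actual_evidence

print(odds_after_assertion, odds_after_explanation, odds_double_counted)
```

If the actual evidence turned out stronger or weaker than the assumed expectation of 20:1, the final odds would shift up or down relative to the post-assertion estimate, which is exactly the shifting-against-expectation procedure described above.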

Justification of belief cannot be "A person who usually is right in this field claims this is so" but can be "A person who I have reason to believe would have evidence on this matter has related to me his assessment of said evidence."

You do not need the person to relate their assessment of the evidence to revise your belief upward based on their statement, you only need to believe that it is more likely that they would make the claim if it were true than if it were not.

The difference here is between having a buddy who is a football buff who tells you what the Sportington Sports beat the Homeland Highlanders by last night -- even though you don't know whether he had access to a means of obtaining said information -- as opposed to the friend you know watched the game who tells you the scores.

Anything that is more likely if a belief is true than if it is false is evidence which should increase your probability estimate of that belief. Have you read An Intuitive Explanation of Bayes' Theorem, or any of the other explanations of Bayesian reasoning on this site?

If you have a buddy who is a football buff who tells you that the Sportington Sports beat the Homeland Highlanders last night, then you should treat this as evidence that the Sportington Sports won, weighted according to your estimate of how likely his claim is to correlate with reality. If you know that he watched the game, you're justified in assuming a very high correlation with reality (although you also have to condition your estimate on information aside from whether he is likely to know, such as how likely he is to lie.) If you do not know that he watched the game last night, you will have a different estimate of the strength of his claim's correlation with reality.

Comment author: Logos01 25 October 2011 04:46:57PM -2 points

Have you read An Intuitive Explanation of Bayes' Theorem, or any of the other explanations of Bayesian reasoning on this site?

I have read them repeatedly, and explained the concepts to others on multiple occasions.

If you have a buddy who is a football buff who tells you that the Sportington Sports beat the Homeland Highlanders last night, then you should treat this as evidence that the Sportington Sports won,

Not until such time as you have a reason to believe that he has a justification for his belief beyond mere opinion. Otherwise, it is a mere assertion regardless of the source -- it cannot have a correlation to reality if there is no vehicle, other than his own imagination, through which the information he claims to have could have reached him, however accurate that imagination might be.

You do not need the person to relate their assessment of the evidence to revise your belief upward based on their statement, you only need to believe that it is more likely that they would make the claim if it were true than if it were not.

Which requires a reason to believe that to be the case. That in turn requires some means of corroborating their claim; the least sufficient of these -- in the case of experts -- being that they can relate observations that correlate with their claim.

If you want to increase the reliability of your probability estimate, you should ask for a justification.

A probability estimate without reliability is no estimate. Revising beliefs based on unreliable information is unsound. Experts' claims which cannot be corroborated are unsound information, and should have no weighting on your estimate of beliefs solely based on their source.

If an expert's claims are frequently true, then it can become habitual to trust them without examination. However, trusting individuals rather than examining statements is an example of a necessary but broken heuristic. We find the risk of being wrong in such situations acceptable because the expected utility cost of being wrong in any given situation, as an aggregate, is far less than the expected utility cost of having to actually investigate all such claims.

Further, the more such claims fall in line with our own priors -- that is, the less 'extraordinary' they appear to us -- the less likely we are to require proper evidence.

The trouble is, this is a failed system. While it might be perfectly rational -- instrumentally -- it is not a means of properly arriving at true beliefs.

I want to take this opportunity to once again note that what I'm describing in all of this is proper argumentation, not proper instrumentality. There is a difference between the two; and Eliezer's many works are, as a whole, targeted at instrumental rationality -- as is this site itself, in general. Instrumental rationality does not always concern itself with what is true as opposed to what is practically believable. It finds the above-described risk of variance between belief and truth an acceptable risk when asserting beliefs.

This is an area where "Bayesian rationality" is insufficient -- it fails to reliably distinguish between "what I believe" and "what I can confirm is true". It does this for a number of reasons, one of which is a foundational variance between what a Bayesian asserts a probability is measuring when he discusses probabilities, as opposed to what a frequentist asserts is being measured when frequentists discuss probabilities.

I do not fall totally in line with "Bayesian rationality" in this, and various other, topics, for exactly this reason.

Comment author: wedrifid 25 October 2011 05:48:48PM 4 points

There is a difference between the two; and Eliezer's many works are, as a whole, targeted at instrumental rationality

What? No they aren't. They are massively biased towards epistemic rationality. He has written a few posts on instrumental rationality but by and large they tend to be unremarkable. It's the bulk of epistemic rationality posts that he is known for.

Comment author: wedrifid 25 October 2011 05:51:49PM 1 point

Have you read An Intuitive Explanation of Bayes' Theorem, or any of the other explanations of Bayesian reasoning on this site?

I have read them repeatedly, and explained the concepts to others on multiple occasions.

Really? In that case you should hopefully be able to interact correctly with probabilities like p(Elias asserts X | X is true) and p(Elias asserts X | X is false).

It ought to prevent you from making errors like this:

Appeals to authority are always fallacious
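The probabilities wedrifid names can be put to work directly: Elias's assertion is evidence exactly when p(Elias asserts X | X is true) differs from p(Elias asserts X | X is false), which is why an appeal to authority is not automatically fallacious. The numbers below are illustrative assumptions:

```python
def posterior(prior, p_assert_if_true, p_assert_if_false):
    """P(X | Elias asserts X), by Bayes' theorem."""
    num = prior * p_assert_if_true
    return num / (num + (1 - prior) * p_assert_if_false)

prior = 0.5
print(round(posterior(prior, 0.9, 0.9), 2))  # equal likelihoods: no update (0.5)
print(round(posterior(prior, 0.9, 0.3), 2))  # asserts more often when true (0.75)
print(round(posterior(prior, 0.3, 0.9), 2))  # asserts more often when false (0.25)
```

Only in the first case, where the assertion is equally likely either way, does the claim carry zero evidential weight.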

Comment author: Desrtopa 25 October 2011 06:27:29PM 2 points

Really? In that case you should hopefully be able to interact correctly with probabilities like p(Elias asserts X | X is true) and p(Elias asserts X | X is false).

Assuming he was able to explain them correctly, which I think we have a lot of reason to doubt.

Comment author: Desrtopa 25 October 2011 06:17:14PM 0 points

Not until such time as you have a reason to believe that he has a justification for his belief beyond mere opinion. Otherwise, it is a mere assertion regardless of the source -- it cannot have a correlation to reality if there is no vehicle, other than his own imagination, through which the information he claims to have could have reached him, however accurate that imagination might be.

If you know that your friend more often makes statements such as this when they are true than when they are false, then you know that his claim is relevant evidence, so you should adjust your confidence up. If he reliably either watches the game, or finds out the result by calling a friend or checking online, and you have only known him to make declarations about which team won a game when he knows which team won, then you have reason to believe that his statement is strongly correlated with reality, even if you don't know the mechanism by which he came to decide to say that the Sportington Sports won.

If you happen to know that your friend has just gotten out of a locked room with no television, phone reception or internet access where he spent the last couple of days, then you should assume an extremely low correlation of his statement with reality. But if you do not know the mechanism, you must weight his statement according to the strength that you expect his mechanism for establishing correlation with the truth has.
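Weighting a statement by the expected strength of an unknown mechanism can be sketched as a mixture over possible mechanisms. The mechanisms and all numbers below are invented for illustration:

```python
# (p(mechanism), p(claim correct | mechanism)) -- all values assumed
mechanisms = {
    "watched the game":        (0.6, 0.99),
    "checked the score later": (0.3, 0.97),
    "guessing from team form": (0.1, 0.50),
}

# Expected reliability of the claim, not knowing which mechanism he used:
expected_reliability = sum(p * r for p, r in mechanisms.values())
print(round(expected_reliability, 3))  # -> 0.935
```

Learning the actual mechanism (as in the locked-room scenario) replaces this expectation with the reliability of that mechanism alone.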

There is a permanent object outside my window. You do not know what it is, and if you try to assign probabilities to all the things it could be, you will assign a very low probability to the correct object. You should assign pretty high confidence that I know what the object outside my window is, so if I tell you, then you can assign much higher probability to that object than before I told you, without my having to tell you why I know. You have reason to have a pretty high confidence in the belief that I am an authority on what is outside my window, and that I have reliable mechanisms for establishing it.

If I tell you what is outside my window, you will probably guess that the most likely mechanism by which I found out was by looking at it, so that will dominate your assessment of my statement's correlation with the truth (along with an adjustment for the possibility that I would lie.) If I tell you that I am blind, type with a braille keyboard, and have a voice synthesizer for reading text to me online, and I know what is outside my window because someone told me, then you should adjust your probability that my claim of what is outside my window is correct downwards, both on increased probability that I am being dishonest, and on the decreased reliability of my mechanism (I could have been lied to.) If I tell you that I am blind and psychic fairies told me what is outside my window, you should adjust your probability that my claim is correlated with reality down much further.

The "trust mechanism," as you call it, is not a device that exists separate from issues of evidence and probability. It is one of the most common ways that we reason about probabilities, basing our confidence in others' statements on what we know about their likely mechanisms and motives.

This is an area where "Bayesian rationality" is insufficient -- it fails to reliably distinguish between "what I believe" and "what I can confirm is true".

You can't confirm that anything is true with absolute certainty, you can only be more or less confident. If your belief is not conditioned on evidence, you're doing something wrong, but there is no point where a "mere belief" transitions into confirmed knowledge. Your probability estimates go up and down based on how much evidence you have, and some evidence is much stronger than others, but there is no set of evidence that "counts for actually knowing things" separate from that which doesn't.

Comment author: Logos01 25 October 2011 06:25:32PM -1 points

If you know that your friend more often makes statements such as this when they are true than when they are false, then you know that his claim is relevant evidence

This is like claiming that because a coin came up heads twenty times and tails ten times, it is twice as likely to come up heads this time. Absent some other reason to justify a correlation between your friend's past accuracy and the current instance, such beliefs are invalid.
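For reference, the standard Bayesian treatment of the coin analogy -- the treatment Logos01 is rejecting here -- does license a prediction from the track record alone, via Laplace's rule of succession (assuming a uniform prior over the coin's bias):

```python
heads, tails = 20, 10
# Rule of succession: P(next flip is heads) = (heads + 1) / (flips + 2)
p_next_heads = (heads + 1) / (heads + tails + 2)
print(round(p_next_heads, 3))  # -> 0.656
```

On this view the past record is itself evidence about the coin's bias, so "heads is more likely than tails" is a justified update, though not by the naive 2:1 ratio.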

If he reliably either watches the game, or finds out the result by calling a friend or checking online, and you have only known him to make declarations about which team won a game when he knows which team won,

Yup. I said as much.

The "trust mechanism," as you call it, is not a device that exists separate from issues of evidence and probability.

Yes, actually, it is a separate mechanism.

You can't confirm that anything is true with absolute certainty, you can only be more or less confident.

Yes, yes. That is the Bayesian standard statement. I'm not persuaded by it. It is, by the way, a foundational error to assert that absolute knowledge is the only form of knowledge. This is one of my major objections to standard Bayesian doctrine in general; the notion that there is no such thing as knowledge but only beliefs of varying confidence.

Bayesian probability assessments work very well for making predictions and modeling unknowns, but that's just not sufficient to the question of what constitutes knowledge, what is known, and/or what is true.

And with that, I'm done here. This conversation's gotten boring, to be quite frank, and I'm tired of having people essentially reiterate the same claims over and over at me from multiple angles. I've heard it before, and it's no more convincing now than it was previously.