Ursus

Thanks for the response.

However, I think you misunderstood what I was attempting to say. I see I didn't use the term "filtered evidence", and I'm wondering if my comment showed up somewhere other than the article "What Evidence Filtered Evidence" (http://lesswrong.com/lw/jt/what_evidence_filtered_evidence/), which would explain how I got a response so quickly when commenting on a five-year-old article. If so, my mistake, as my comment was then completely misleading!

When the information does not come from a filtered source, I agree with you. If I find out that there is evidence pointing in the up (or down) direction of a belief, this will modify my priors based on the degree of entanglement between the source and the matter of the belief. After seeing the evidence, my probability assessment will on average remain the same: if the evidence is weaker or stronger than expected, the assessment will end up lower or higher (or higher or lower, for downward evidence), but it will not cross my initial position from before I heard the news, unless of course the evidence turns out to point in the opposite direction.
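That balancing act can be sketched numerically. This is a minimal illustration with made-up numbers (the 0.5 prior and the likelihoods are invented for the example, not taken from the discussion):

```python
# Hypothetical numbers: a 50% prior and a test with asymmetric likelihoods.
p_h = 0.5                  # prior P(H)
p_e_given_h = 0.8          # P(E | H)
p_e_given_not_h = 0.3      # P(E | ~H)

# Marginal probability of seeing the evidence E.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior after observing E, and after observing ~E (Bayes' theorem).
post_e = p_e_given_h * p_h / p_e
post_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

# Expected posterior, weighted by how likely each outcome is.
expected_posterior = post_e * p_e + post_not_e * (1 - p_e)
print(expected_posterior)  # equals the prior, 0.5
```

Seeing E raises the probability and seeing ~E lowers it, but weighted by how likely each outcome is, the expectation lands exactly on the prior, which is conservation of expected evidence.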



What my question was about was filtered evidence.
Filtered evidence is a special case, in which the entanglement between the source and the matter of the belief is zero.
Using Eliezer_Yudkowsky's example from "The Bottom Line" (http://lesswrong.com/lw/js/the_bottom_line/):
the claim about which sort of evidence I will be presented with, "box B contains the diamond!", is entangled only with whether the owner of box A or box B bid higher (the owner of box B, apparently).

Not with the actual contents of those boxes (unless there is some relation between willingness to bid and things actually entangled with the presence or absence of a diamond).


Therefore, being told this will not alter my belief about whether box B is the one that contains the diamond.
This is the lead-in to the questions posed in my previous/first post.

Knowing that the source is filtered, every single piece of evidence he gives you will support the conclusion that the diamond is in box B. Yet you simply cannot expect every single piece of evidence to, on average, increase your belief that it is in box B.
While the arguments are actually entangled with reality, their selective presentation means P(E) = 1 and P(~E) = 0.
You can't balance the equations other than by not modifying your beliefs at all with each piece of evidence.
The probability of the diamond being in box B equals the probability of the diamond being in box B given that you are shown evidence, meaning the evidence doesn't matter.
The direction of the evidence that passes through the filter (the clever arguer) is entangled with which box-owner bid more money, not with which box actually contains the diamond.
Thus, it sounds like you simply should not modify your beliefs when faced with a clever arguer who filters evidence: there is no entanglement between the evidence's direction and the truth, by conservation of expected evidence.
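The null update described above can be made explicit in Bayes' theorem. A sketch, assuming a 0.5 prior: since the clever arguer presents a pro-B argument regardless of where the diamond is, both likelihoods are 1, and the posterior collapses onto the prior.

```python
p_b = 0.5              # prior: the diamond is in box B
# The clever arguer produces a pro-B argument no matter where the diamond is,
# so the likelihoods of "you are shown a pro-B argument" are equal:
p_arg_given_b = 1.0
p_arg_given_not_b = 1.0

# Marginal probability of being shown the argument (certain, by construction).
p_arg = p_arg_given_b * p_b + p_arg_given_not_b * (1 - p_b)

posterior = p_arg_given_b * p_b / p_arg
print(posterior)  # 0.5: identical to the prior, so the "evidence" moves nothing
```

The likelihood ratio is exactly 1, which is the formal version of "the evidence's direction is not entangled with the truth."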

My problem: not taking into account evidence entangled with reality doesn't sit well with me. It sounds as though it should ultimately be taken into account, but I can't immediately think of an effective process for doing so.

Using drugs instead of boxes, if that is an example you prefer: imagine a clever arguer hired by Merck to argue about what a great drug Rofecoxib is. The words "cardiovascular", "stroke", and "heart attack" won't ever come up. By selectively drawing from trials, a clever arguer can paint a truly wonderful picture of a drug, one that has limited bearing on reality.

Before presenting his evidence he tells you "Rofecoxib is wonderful!" This shouldn't modify your belief, as it only tells you he is on Merck's payroll. Now how do you appropriately modify your belief about the drug's quality and merits with each piece of evidence this clever arguer presents to you?

Ursus

I'm trying to incorporate this with conservation of expected evidence: http://lesswrong.com/lw/ii/conservation_of_expected_evidence/

For example: "On average, you must expect to be exactly as confident as when you started out. Equivalently, the mere expectation of encountering evidence—before you've actually seen it—should not shift your prior beliefs." (Eliezer_Yudkowsky) and "Your actual probability starts out at 0.5, rises steadily as the clever arguer talks (starting with his very first point, because that excludes the possibility he has 0 points)" (Eliezer_Yudkowsky) appear to be contradictory, given that each point = a piece of evidence (shininess of the box, presence of a blue stamp, etc.).

The cherry picking problem appears to be similar to the witch trial problem. In the latter any piece of evidence is interpreted to support the conclusion, while in the former evidence is only presented if it supports the conclusion.

You can't expect your probabilities to on average be increased before seeing/hearing the evidence.

I think this works only if you have a large background of knowledge, with a high probability that you are already aware of any given piece of evidence. But if you hear a piece of evidence you already knew, it simply shouldn't alter your probabilities, rather than lower them. I'm having a hard time coming up with a way to properly balance the equation.

The only thing I can think of is to count the entire argument as one piece of evidence, and use a strategy like the one suggested by g for updating your priors based on the entire sum.

But you don't necessarily listen to the entire argument. Knowing about hypothetical cut-off points, below which the arguer won't spend the time to present and explain evidence, means that with enough information you could still construct probabilities. If time is limited, can you update with each single piece of evidence based on its strength relative to what you expected?
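One way to make that idea concrete: even if every presented argument is pro-B, the number of arguments the arguer manages to muster is still entangled with the truth, so you can update on the count. A toy model, with all parameters invented for illustration (5 observable features, each pointing to B with probability 0.7 if the diamond really is there and 0.3 otherwise):

```python
from math import comb

def posterior_given_count(k, n=5, p_true=0.7, p_false=0.3, prior=0.5):
    """Posterior that the diamond is in B, given the arguer found k of n
    possible pro-B features. Only the count is used, since the direction of
    each presented feature carries no information (they are all pro-B)."""
    like_true = comb(n, k) * p_true**k * (1 - p_true)**(n - k)
    like_false = comb(n, k) * p_false**k * (1 - p_false)**(n - k)
    return like_true * prior / (like_true * prior + like_false * (1 - prior))

# Posterior for each possible number of pro-B arguments presented:
for k in range(6):
    print(k, round(posterior_given_count(k), 3))
```

A flood of arguments pushes the posterior up and a conspicuous trickle pushes it down, which matches the intuition that "he only found two points in B's favor" is itself evidence.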

What if you are unfamiliar with the properties of boxes and how they relate to the likelihood of a diamond being present? Any guesstimates seem like they'd be well beyond my abilities, at least.

Unless I already know a lot, I have a hard time justifying updating my priors at all based on the CA's arguments. If I do know a lot, I still can't think of a way to justifiably not expect the probability to increase, which is a problem. Help, ideas?

PS. Thankfully not everyone is a clever arguer. Ideally, scientists/teachers teaching you about evolution (for example) will not be selective in presenting evidence. The evidence will simply be lopsided because nature is lopsided in how it produces evidence (entangled with truth). I don't think one actually has to listen to a creationist, assuming it is known that the scientists/source material the teacher is drawing from use good practice.

Also, this is my first post here, so if I am ignorant please let me know and direct me to how I can improve!