If we're talking about the probabilities of X and Y, as you say here, then the evidence against them lowers those probabilities.
Not if it's just a matter of choosing X or Y. It's impossible in such a situation for a piece of evidence to lower both probabilities.
Perhaps an example will make it clearer:
Let's suppose that a victim is found dead in a pool of blood, apparently having died from a gunshot wound.
There are two possibilities: (1) He was shot from a distance with a rifle; and (2) He was shot at close range with a small caliber handgun. I favor the first hypothesis and you favor the second.
OK, now let's suppose we find a new piece of evidence: there is no bullet found inside or around the victim's body. Further, it is known that if somebody is shot from a distance with a rifle, a bullet will be found in or around the person's body 99.99% of the time.
In common parlance, one might say that such a piece of evidence contradicts or undermines the hypothesis that the person was shot from a distance with a rifle, since we have just seen something that is highly unexpected if that hypothesis is correct.
On the other hand, suppose we know that being shot at close range with a handgun carries a 99.999% chance of a bullet being found in or around the victim's body. In that case, what has been reasonably described as "contradictory evidence" actually increases the probability that the first hypothesis is correct, because the evidence is even more unexpected under the second.
Hope that makes things clear for you.
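The two-hypothesis update described above can be sketched with Bayes' rule. The likelihoods come from the 99.99% and 99.999% figures in the example; the 50/50 prior is an assumption for illustration:

```python
# Bayes' rule over a closed set of hypotheses: multiply each prior by
# the likelihood of the observed evidence, then renormalize.

def posterior(priors, likelihoods):
    """Return normalized posterior probabilities over the hypothesis set."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

priors = [0.5, 0.5]               # P(rifle at distance), P(handgun at close range)
likelihoods = [0.0001, 0.00001]   # P(no bullet | rifle), P(no bullet | handgun)

post_rifle, post_handgun = posterior(priors, likelihoods)
print(f"P(rifle | no bullet)   = {post_rifle:.3f}")    # about 0.909
print(f"P(handgun | no bullet) = {post_handgun:.3f}")  # about 0.091
```

Within the forced choice between the two hypotheses, the "contradictory" evidence shifts belief toward the rifle, exactly as claimed.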
The probability of both, in that case, plummets, and you should start looking at other explanations. Like, say, that the victim was shot with a rifle at close range, which only leaves a bullet in the body 1% of the time (or whatever).
It might be true that, between two hypotheses, one is now more likely to be true than the other, but the probability of both still dropped, and your confidence in your pet hypothesis should drop right along with its probability of being correct.
So say you have hypothesis X at 60% confidence and hypothesis Y at 40% ...
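The reply's point can be sketched by opening up the hypothesis space. Using the 60/40 split above, and giving the third explanation (a rifle fired at close range, which leaves a bullet only 1% of the time, per the earlier comment) a small assumed prior of 1%:

```python
# Once a third explanation is in the model, the absolute probabilities
# of BOTH original hypotheses can collapse, even though their relative
# order shifts. The 1% prior on the third hypothesis is an assumption.

def posterior(priors, likelihoods):
    """Return normalized posterior probabilities over the hypothesis set."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# P(rifle at distance), P(handgun at close range), P(rifle at close range)
priors = [0.59, 0.40, 0.01]
# P(no bullet | each hypothesis)
likelihoods = [0.0001, 0.00001, 0.99]

names = ["rifle (distance)", "handgun (close)", "rifle (close)"]
for name, p in zip(names, posterior(priors, likelihoods)):
    print(f"{name}: {p:.4f}")
```

Both original hypotheses fall from 59% and 40% to well under 1% apiece, while the previously ignored explanation dominates: the evidence lowered both, and confidence in the "pet hypothesis" should fall with it.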
There are few places where society values rational, objective decision making as much as it values it in judges. While there is a rather cynical discipline called legal realism that says the law is really based on quirks of individual psychology, "what the judge had for breakfast," there's a broad social belief that the decisions of judges are unbiased. And where they aren't unbiased, they're biased for Big, Important, Bad reasons, like racism or classism or politics.
It turns out that legal realism is totally wrong. It's not what the judge had for breakfast. It's how recently the judge had breakfast. A new study (media coverage) on Israeli judges shows that, when making parole decisions, they grant parole about 65% of the time right after meal breaks, with the rate falling almost all the way to 0% right before breaks and at the end of the day (i.e. as far from the last break as possible). There's a relatively linear decline between the two points.
Think about this for a moment. A tremendously important decision, determining whether a person will go free or spend years in jail, appears to be substantially determined by an arbitrary factor. Also, note that we don't know if it's the lack of food, the anticipation of a break, or some other factor that is responsible for this. More interestingly, we don't know where the optimal result occurred. It's probably not the near-0% rate at the end of each work period. But is it the post-break high of 65%? Or were judges being too lenient there? We know there was bias, but we still don't know when the bias occurred.
There are at least two lessons from this. The little, obvious one is to be aware of one's own physical limitations. Avoid making big decisions when tired or hungry - though this doesn't mean you should try to make decisions right after eating. For particularly important decisions, consider contemplating them at different times, if you can. Think about one thing Monday morning, then Wednesday afternoon, then Saturday evening, going only to the point of getting an overall feel for an answer, and not to the point of really making a solid conclusion. Take notes, and then compare them. This may not work perfectly, but it may help you realize inconsistencies, which could help. For big questions, the wisdom of crowds may be helpful - unless it's been a while since most of the crowd had breakfast.
The bigger lesson is one of humility. This provides rather stark evidence that our decisions are not under our control to the extent we believe. We can be influenced by factors we don't even suspect. Even knowing we have been biased, we may still be unable to identify what the correct answer was. While using formal rules and logic may be one of the best approaches to minimizing such errors, even formal rules can fail when applied by biased agents. The biggest, most condemnable biases - like racism - are in some ways less dangerous, because we know we need to look out for them. It's the bias you don't even suspect that can get you. The authors of the study think they basically got lucky with these results - if the effect had been to make decisions arbitrary rather than to increase rejections, this would not have shown up.
When those charged with making impartial decisions that control people's lives are subject to arbitrary forces they never suspected, it shows how important it is - and how much more we can do - to be less wrong.