Solomon's Problem and variants thereof are often cited as criticisms of Evidential decision theory.
For background, here's Solomon's Problem: King Solomon wants to sleep with another man's wife. However, he knows that uncharismatic leaders frequently sleep with other men's wives, while charismatic leaders almost never do. Furthermore, uncharismatic leaders are frequently overthrown, and charismatic leaders rarely are. On the other hand, sleeping with other men's wives does not cause leaders to be overthrown. Instead, high charisma separately decreases both the chance that a leader will sleep with another man's wife and the chance that he will be overthrown. Not getting overthrown matters more to King Solomon than getting the chance to sleep with the other man's wife.
Causal decision theory holds that King Solomon can go ahead and sleep with the other man's wife, because doing so will not directly cause him to be overthrown. Timeless decision theory holds that he can sleep with her because it will not cause his overthrow in any timeless sense either. Conventional wisdom holds that Evidential decision theory would have him refrain, because updating on the fact that he slept with her would suggest a higher probability that he will be overthrown.
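To make the contrast concrete, here's a minimal sketch with made-up numbers; the prior, the likelihoods, and the utilities are all my assumptions, chosen only so that charisma drives both behaviours and not being overthrown matters far more than the affair:

```python
# Naive EDT vs CDT on Solomon's Problem, with assumed numbers.

P_CHARISMATIC = 0.5  # assumed prior that Solomon is charismatic
# Charisma drives both behaviours separately (assumed likelihoods).
P_SLEEP_GIVEN = {'charismatic': 0.05, 'uncharismatic': 0.95}
P_OVERTHROW_GIVEN = {'charismatic': 0.10, 'uncharismatic': 0.90}
# Not being overthrown matters far more than the affair (assumed utilities).
U_SLEEP = 10
U_OVERTHROWN = -1000

prior = {'charismatic': P_CHARISMATIC, 'uncharismatic': 1 - P_CHARISMATIC}


def edt_value(action):
    """Naive EDT: condition the charisma distribution on the action itself."""
    likelihood = {c: (P_SLEEP_GIVEN[c] if action == 'sleep' else 1 - P_SLEEP_GIVEN[c])
                  for c in prior}
    evidence = sum(likelihood[c] * prior[c] for c in prior)
    posterior = {c: likelihood[c] * prior[c] / evidence for c in prior}
    p_overthrow = sum(posterior[c] * P_OVERTHROW_GIVEN[c] for c in prior)
    return (U_SLEEP if action == 'sleep' else 0) + p_overthrow * U_OVERTHROWN


def cdt_value(action):
    """CDT: the action does not change beliefs about charisma."""
    p_overthrow = sum(prior[c] * P_OVERTHROW_GIVEN[c] for c in prior)
    return (U_SLEEP if action == 'sleep' else 0) + p_overthrow * U_OVERTHROWN


for theory, value in (('naive EDT', edt_value), ('CDT', cdt_value)):
    choice = max(['sleep', 'refrain'], key=value)
    print(f"{theory}: sleep={value('sleep'):.1f}, refrain={value('refrain'):.1f} -> {choice}")
```

With these numbers, naive EDT refrains (sleeping is evidence of low charisma, hence of overthrow) while CDT sleeps, since the action has no causal effect on overthrow.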
The problem with that interpretation is that it assumes King Solomon only updates his probability distributions on information about himself that is accessible to others. By refraining in response to other disincentives, he cannot change the fact that, absent those disincentives, he would sleep with another man's wife; the fact that he faces the dilemma at all already indicates that he would. Updating on this information, he knows that he is probably uncharismatic, and thus likely to be overthrown. Updating further on his decision, after taking into account the factors guiding that decision, will not change the correct probability distribution.
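Here's a sketch of that screening-off step, under the same kind of made-up model. The conditional independence of the final action and charisma, given the desire, is an assumption of this style of argument, not something established in the original problem:

```python
# The desire (the fact that Solomon faces the dilemma at all) carries all the
# evidence about charisma; the final action adds none. All numbers assumed.

P_CHARISMATIC = 0.5
P_DESIRE_GIVEN = {'charismatic': 0.05, 'uncharismatic': 0.95}
# Given the desire, the final action depends only on Solomon's deliberation,
# not on charisma directly (this conditional independence is the assumption).
P_SLEEP_GIVEN_DESIRE = 0.7
P_OVERTHROW_GIVEN = {'charismatic': 0.10, 'uncharismatic': 0.90}

prior = {'charismatic': P_CHARISMATIC, 'uncharismatic': 1 - P_CHARISMATIC}


def posterior_charisma(action):
    """P(charisma | desire, action) via Bayes over the assumed model."""
    p_action = P_SLEEP_GIVEN_DESIRE if action == 'sleep' else 1 - P_SLEEP_GIVEN_DESIRE
    joint = {c: prior[c] * P_DESIRE_GIVEN[c] * p_action for c in prior}
    total = sum(joint.values())
    return {c: joint[c] / total for c in prior}


for action in ('sleep', 'refrain'):
    post = posterior_charisma(action)
    p_overthrow = sum(post[c] * P_OVERTHROW_GIVEN[c] for c in prior)
    print(f"{action}: P(charismatic)={post['charismatic']:.3f}, "
          f"P(overthrow)={p_overthrow:.3f}")
# Both actions yield the same posterior over charisma, so the expected
# utilities differ only by the direct payoff of the affair, and
# EDT-with-introspection agrees with CDT here.
```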
This more complete view of Evidential decision theory is isomorphic to Timeless decision theory (edit: shown to be false in comments). I'm slightly perplexed as to why I have not seen it elsewhere. Is it flawed? Has it been mentioned elsewhere and I just haven't noticed? If so, why isn't it more widely known?
I think the vanilla tickle defence still two-boxes on Newcomb's problem. The logic goes something like this:
Apply some introspection to deduce whether you are likely to be a one-boxer or a two-boxer (in the same way that Solomon introspects to see whether he is charismatic), and then use this information to deduce whether the money is in the box. Now you are facing the transparent-box version of the dilemma, in which EDT two-boxes.
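A sketch of that argument, with an assumed predictor accuracy. The key assumption is that, given your introspected disposition, the prediction (and hence the opaque box's contents) is independent of your final action, which is exactly the step questioned further down the thread:

```python
# Tickle-style EDT on Newcomb's problem, assumed numbers.

# P(opaque box is full | introspected disposition), assuming the predictor
# effectively predicts the disposition with 99% accuracy.
P_FULL_GIVEN_DISPOSITION = {'one-boxer': 0.99, 'two-boxer': 0.01}


def expected_payoff(disposition, action):
    p_full = P_FULL_GIVEN_DISPOSITION[disposition]  # unchanged by the action
    payoff = p_full * 1_000_000
    if action == 'two-box':
        payoff += 1_000  # the transparent box is always there for the taking
    return payoff


for disposition in ('one-boxer', 'two-boxer'):
    for action in ('one-box', 'two-box'):
        print(f"{disposition}, {action}: ${expected_payoff(disposition, action):,.0f}")
# For either disposition, two-boxing is worth exactly $1,000 more, so the
# agent two-boxes -- effectively facing the transparent-box version.
```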
The tickle defence works by gradually screening off all non-causal paths emerging from the 'action' node, while CDT simply ignores them. This means they make decisions based on different information, so they aren't entirely identical, although they are similar in that both always choose the dominant strategy when there is one (which incidentally shows that the tickle defence is not TDT, since TDT does not always choose a dominant strategy).
Oh, I see. I hadn't even thought about the fact that EDT fails Newcomb's problem if the prediction is revealed beforehand.
Edit: Wait a minute, I'm not sure that works. The predictor's decision depends on what your final decision will be, so noting your inclination to one-box or two-box does not completely screen off your final decision from the contents of the box that may or may not contain $1 million.
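A sketch of that worry, with the same assumed accuracy: if the prediction tracks the final action itself rather than just the introspected disposition, then conditioning on the action still moves the expected contents of the opaque box, and EDT's comparison flips back:

```python
# If the predictor effectively predicts the final action, the action is NOT
# screened off from the box contents. The 0.99 accuracy figure is assumed.

P_FULL_GIVEN_ACTION = {'one-box': 0.99, 'two-box': 0.01}


def expected_payoff(action):
    payoff = P_FULL_GIVEN_ACTION[action] * 1_000_000
    if action == 'two-box':
        payoff += 1_000
    return payoff


for action in ('one-box', 'two-box'):
    print(f"{action}: ${expected_payoff(action):,.0f}")
# one-box ~ $990,000 vs two-box ~ $11,000: the screening-off is incomplete,
# so this EDT agent one-boxes after all.
```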
The transparent Newcomb's problem is still a fatal flaw in EDT, though.