I don't understand why the Smoking Lesion is a problem for evidential decision theory. I would simply accept that in the scenario given, you shouldn't smoke. And I don't see why you assert that this doesn't lessen your chances of getting cancer, except in the same sense that two-boxing doesn't lessen your chances of getting the million.
I would just say: in the scenario given, you should not smoke, and this will improve your chances of not getting cancer.
If you doubt this, consider if the correlation were known to be 100%; every person who ever smoked up til...
I see you've moved this discussion off-site. FWIW, I commend you for trying to organize the various decision theory issues into a more accessible and organized sequence. I'd like to suggest that you take some of this and use it to improve the (almost comically sparse) decision theory articles on the LW Wiki. If that's really going to be the go-to place for LW knowledge, your efforts to summarize and present this info could really be useful there, and any redundancy with existing blog posts would be a non-issue.
I'm confused as to why you said you weren't continuing this on Less Wrong, then posted it on Less Wrong.
I've read the smoking lesion thing before, and what occurred to me is that even under EDT, the reasoning there is wrong. What I mean is that one shouldn't simply reason it out by comparing against the average population statistics, but should take into account the fact that they're using EDT itself. I.e., they should say "given that a person is using EDT, then what's the correlation between etc etc..."
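To make that point concrete, here is a toy sketch. All the numbers (the lesion prior, the conditional probabilities) are made up for illustration, not taken from anywhere; the point is only that conditioning on "I am an EDT agent" screens off the evidential link between smoking and the lesion:

```python
# Toy model of the Smoking Lesion, with entirely made-up numbers:
# a lesion raises both the urge to smoke and the cancer rate. Among the
# general population, smoking is evidence of the lesion; among agents
# whose choice is fixed by their decision theory, it is not.

P_LESION = 0.2                 # assumed prior probability of the lesion
P_CANCER_GIVEN_LESION = 0.9    # assumed
P_CANCER_NO_LESION = 0.01      # assumed

def p_cancer_given_smoke(p_smoke_lesion, p_smoke_no_lesion):
    """P(cancer | smoke) by Bayes' rule under the toy model above."""
    p_smoke = (p_smoke_lesion * P_LESION
               + p_smoke_no_lesion * (1 - P_LESION))
    p_lesion_given_smoke = p_smoke_lesion * P_LESION / p_smoke
    return (p_lesion_given_smoke * P_CANCER_GIVEN_LESION
            + (1 - p_lesion_given_smoke) * P_CANCER_NO_LESION)

# Naive EDT conditions on population statistics, where the lesion
# drives the urge to smoke: smoking looks dangerous.
naive = p_cancer_given_smoke(p_smoke_lesion=0.8, p_smoke_no_lesion=0.1)

# Conditioning on "I am an EDT agent": my choice is set by my decision
# theory, not by the lesion, so P(smoke | lesion) = P(smoke | no lesion).
edt_aware = p_cancer_given_smoke(p_smoke_lesion=0.5, p_smoke_no_lesion=0.5)

print(f"P(cancer | smoke), population stats: {naive:.3f}")      # → 0.603
print(f"P(cancer | smoke), EDT agents only:  {edt_aware:.3f}")  # → 0.188
```

With these numbers, smoking looks far more dangerous under the raw population statistics (≈0.603) than among agents whose choice is determined by their decision theory, where P(cancer | smoke) collapses to 0.188 — exactly the unconditional cancer rate, since smoking then carries no evidence about the lesion.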
Worth referencing:
The Smoking Lesion on the wiki.
Timeless Decision Theory and Meta-Circular Decision Theory, where Eliezer discusses this problem (among others)
(By the way, your blog has some interesting posts!)
You left out some steps in your argument. It appears you were going for a disjunction elimination, but if so I'm not convinced of one premise. Let me lay out more explicitly what I think your argument is supposed to be, then I'll show where I think it's gone wrong.
A = "The rational decision is to two-box"
B = "Omega has set me to one-box"
C = "The rational decision is to one-box"
D = "Omega has set me to two-box"
E = "I must not be deciding rationally"
1. (A∧B)→E
2. (C∧D)→E
3. (A∧B)∨(C∧D)
4. ∴ E
I'll grant premises 1 and 2. The argument is valid, but the dubious premise is #3. It is entirely possible that (A∧D) or that (C∧B), and in those cases E is not guaranteed.
In short, you might decide rationally in cases where you're set to one-box and it's rational to one-box.
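The disjunction-elimination structure can be checked mechanically. A quick sketch, using the labels as defined above, confirming both that the full argument is valid and that premises 1 and 2 alone don't force E:

```python
from itertools import product

# Brute-force check of the argument above, using the labels as defined.
# Premises 1 and 2 are the conditionals; premise 3 is the disjunction.
valid = True
for A, B, C, D, E in product([False, True], repeat=5):
    p1 = (not (A and B)) or E      # (A∧B)→E
    p2 = (not (C and D)) or E      # (C∧D)→E
    p3 = (A and B) or (C and D)    # (A∧B)∨(C∧D)
    if p1 and p2 and p3 and not E:
        valid = False              # a countermodel to the full argument

print("premises 1-3 entail E:", valid)  # → True: the argument is valid

# The (A∧D) case: the rational decision is to two-box AND Omega has set
# me to two-box. Premises 1 and 2 still hold, but premise 3 fails, so E
# is no longer forced -- which is why premise 3 carries the argument.
A, B, C, D, E = True, False, False, True, False
assert ((not (A and B)) or E) and ((not (C and D)) or E)  # 1 and 2 hold
assert not ((A and B) or (C and D))                       # 3 fails
print("in the (A∧D) case, E is not forced:", not E)       # → True
```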
Proposition 3 is only required to be possible, not to be true, and is supported by the existence of both paths of the scenario: the scenario requires that both A and B are possible.
It is possible that I will make the rational decision in one path of the scenario. But the scenario contains both paths. In one of the two paths I must be deciding irrationally.
Given that it was stated that I will use my normal thought processes in both paths, my normal thought processes must be irrational in order for this scenario to be possible.
This is part of a sequence titled "An introduction to decision theory". The previous post was Newcomb's Problem: A problem for Causal Decision Theories
For various reasons I've decided to finish this sequence on a separate blog. This is principally because a large number of people seemed to feel either that this sequence wasn't up to the Less Wrong standard or that it was simply covering ground that had already been covered on Less Wrong.
The decision to post it on another blog rather than simply discontinuing it came down to the fact that other people seemed to feel that the sequence had value. Those people can continue reading it at "The Smoking Lesion: A problem for evidential decision theory".
Alternatively, there is a sequence index available: Less Wrong and decision theory: sequence index