That is, if you have an option of trading Doom for UFAI, while forsaking only negligible probability of FAI, you should take it.
Fascinating! Do you still agree with what you wrote there? Are you still researching these issues, and do you plan on writing a progress report or an open problems post? Would you be willing to write a survey paper on decision-theoretic issues related to acausal trade?
My best guess about what's preferable to what is still this way, but I'm significantly less certain of its truth (there are analogies that make the answer come out differently, and the level of rigor in the above comment is not much better than that of these analogies). In any case, I don't see how we can actually use these considerations. (I'm working in a direction that should ideally make questions like this clearer in the future.)
This thread is for the discussion of Less Wrong topics that have not appeared in recent posts. If a discussion gets unwieldy, celebrate by turning it into a top-level post.