This is a thought that occurred to me on my way to classes today; I'm sharing it for feedback.
Omega appears before you and, after presenting an arbitrary proof that it is, in fact, a completely trustworthy superintelligence of the caliber needed to play these kinds of games, presents you with a choice between two boxes. These boxes do not contain money; they contain information. One box is white and contains a true fact that you do not currently know; the other is black and contains false information that you do not currently believe. Omega advises you that the true fact is not misleading in any way (i.e., not a fact that will cause you to make incorrect assumptions and lower the accuracy of your probability estimates), and that it is supported by enough evidence both to prove to you that it is true and to enable you to independently verify its truth within a month. The false information is demonstrably false, and is something you would disbelieve if presented with it outright; but if you open the box to discover it, a machine inside the box will reprogram your mind so that you believe it completely, leading you to believe other related falsehoods as you rationalize away the discrepancies.
Omega further advises that, within those constraints, the true fact is one that has been optimized to inflict upon you the maximum amount of long-term disutility for a fact in its class, should you now become aware of it, and the false information has been optimized to provide you with the maximum amount of long-term utility for a belief in its class, should you now begin to believe it over the truth. You are required to choose one of the boxes; if you refuse to do so, Omega will kill you outright and try again on another Everett branch. Which box do you choose, and why?
(This example is obviously hypothetical, but for a simple and practical case, consider the use of amnesia-inducing drugs to selectively eliminate traumatic memories. It would be more accurate to keep those memories and take the time and effort to come to terms with the trauma... but it may present much greater utility to be without them, and thus without the trauma altogether. This is obviously related to the valley of bad rationality, but since there clearly exist maximally helpful lies and maximally harmful truths, it would be useful to know which categories of facts are generally hazardous, and whether or not there are categories of lies that are generally helpful.)
Would the following be a True Fact that is supported by evidence?
You open the white box and are hit by a poison dart, which drops you into an irreversible, excruciatingly painful, minimally aware coma. By all outward appearances you look fine, you find out the world goes downhill, and you are made to live forever, all while having enough evidence to confirm that yes, the dart DID in fact contain a poison that drops you into a coma that is:
- irreversible (evidence: you never come out of the coma);
- excruciatingly painful (evidence: your nerves are still working inside your head, and you can feel the excruciating pain);
- minimally aware (evidence: while in the coma you are still vaguely aware, enough to confirm all of this and to hear bad news that makes you feel worse on a level beyond the physical pain, such as being given the old poison dart because someone thinks it's a treasured memento rather than a constant reminder that you are an idiot);
- a coma (evidence: you can't actually act upon the outer world as if you were conscious);
- one in which you look fine by all outward appearances (evidence: no one appears to be aware that you are in utter agony, to the point where you would gladly accept a mercy kill);
- accompanied by news of the world going downhill (evidence: while in your minimally aware state, you hear about UFAI, brutal torture, nuclear bombs, whatever bad things you don't want to hear about);
- and permanent, because you are made to live forever (evidence: you never, ever die).
I mean, the disutility would probably be worse than that, but... surely you never purposely pick a CERTAINTY of such an optimized maximum disutility, regardless of what random knowledge it might come with. It would be one thing if the knowledge were going to be helpful, but since it comes as part and parcel of an optimized maximum disutility, the knowledge is quite likely to be useless or worse, like "Yes, this dart really did contain a poison optimized to hit you with maximum disutility, and you are now quite sure that is true." (You would probably have been sure of that well before now, even if Omega hadn't explicitly handed it to you as a true fact!)
And Omega didn't mislead you: the dart REALLY was going to be that bad, for a fact in the class of facts about darts!
Since that (or worse) seems likely to be the White Box, I'll select the Black box as carefully as possible, while making extremely sure that I don't accidentally have a brain fart and flip the colors of the boxes in sheer panic. Anyone who would pick the White box intentionally doesn't seem to be giving enough credence to just how bad Omega can make a certainty of optimized maximum disutility, and how useless Omega can make the accompanying true fact.
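To make that comparison explicit (a minimal sketch; the symbols are my own labels, not anything Omega specifies):

$$U(\text{white}) = V_{\text{truth}} - D_{\max}, \qquad U(\text{black}) = B_{\max} - C_{\text{falsehood}}$$

where $D_{\max}$ is the optimized maximum long-term disutility of the true fact, $B_{\max}$ is the optimized maximum long-term utility of the false belief, $V_{\text{truth}}$ is whatever intrinsic value you place on knowing one more true fact, and $C_{\text{falsehood}}$ is the cost of holding one more false belief. Since Omega gets to optimize $D_{\max}$ and $B_{\max}$ over their entire classes, while $V_{\text{truth}}$ and $C_{\text{falsehood}}$ are bounded by how much a single ordinary fact or belief can matter to you, $U(\text{black}) > U(\text{white})$ under any assignment of values I find plausible.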
It does seem to me that the question of which box comes down to whether the utility you associate with knowing the truth can overcome the disutility you associate with fear of the unknown. If you are afraid enough, I don't have to torture you to break you; I only have to show you my dentist's tools and talk to you about what might be in the white box.