If you proceed anyway, a few examples (in approximately increasing order of controversy):
- Identify knowledge that may be dangerous. Forewarned is forearmed.
- Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
- Deliberately lower dangerous priors, by acknowledging the possibility that your brain is contaminating your reasoning and then overcompensating, because you know that you’re still too overconfident.
- Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
- Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
I really don't think that the OP can be called "obviously wrong". For example, your brain is imperfect, so it may be that believing some true things makes it less likely that you will come to believe other, more important true things. Then, even if your core value is to believe true things, you will want to be careful about letting the dangerous beliefs into your head.
And the circularity that WrongBot and Vladimir Nesov have pointed out rears its head here, too. Suppose that the possibility I pose above is true. Then, if you knew this, it might undermine the extent to which you hold believing true things to be a core value. That is precisely the kind of unwanted utility-function change that WrongBot is warning us about.
It's probably too pessimistic to say that you could never believe the dangerous true things. But it seems reasonably possible that some true beliefs are too dangerous unless you are very careful about the way in which you come to believe them. It may be unwise to just charge in and absorb true facts willy-nilly.
Here's another way to come at WrongBot's argument. It's obvious that we sometimes should keep secrets. Sometimes more harm than good would result if someone else knew something that we know. It's not obvious, but it is at least plausible, that the "harm" could be that the other person's utility function would change in a way that we don't want. At least, this is certainly not obviously wrong. The final step in the argument is then to acknowledge that the "other person" might be the part of yourself over which you do not have perfect control — which is, after all, most of you.
I believe some other people's reports that there are things they would prefer not to know, and I would be inclined to honor that preference if I knew such a secret, but I can't think of any examples of such secrets for myself. In almost all cases I can think of, I would want to be informed of any true information that was being withheld from me. The only possible exceptions are "pleasant surprises" that are being kept secret…