I'm pleased to announce Existential Risk and Existential Hope: Definitions, a short new FHI technical report.
We look at the strengths and weaknesses of two existing definitions of existential risk, and suggest a new definition based on expected value. This leads to a parallel concept: ‘existential hope’, the chance of something extremely good happening.
When someone is ignorant of the actual chance of a catastrophic event, even if they consider it possible, their subjective expected value (EV) will be fairly high. If they then update significantly toward that event being likely, their EV drops sharply. Under the proposed definition, that update itself seems to count as an 'existential catastrophe', since it causes the loss of a large fraction of expected value.
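To make the worry concrete, here is a toy calculation. The credences (1% before, 50% after) and the normalised values are illustrative assumptions of mine, not figures from the report:

```python
# Toy illustration: a Bayesian update alone can wipe out a large fraction
# of an agent's *subjective* expected value. All numbers are illustrative.

VALUE_IF_NO_CATASTROPHE = 1.0   # normalised value of a flourishing future
VALUE_IF_CATASTROPHE = 0.0      # the catastrophe destroys that value

def expected_value(p_catastrophe: float) -> float:
    """Subjective EV given the agent's credence in the catastrophe."""
    return ((1 - p_catastrophe) * VALUE_IF_NO_CATASTROPHE
            + p_catastrophe * VALUE_IF_CATASTROPHE)

ev_before = expected_value(0.01)  # ignorant agent: catastrophe seems remote
ev_after = expected_value(0.50)   # after a strong update toward the risk

loss_fraction = (ev_before - ev_after) / ev_before
print(f"EV before update: {ev_before:.2f}")   # 0.99
print(f"EV after update:  {ev_after:.2f}")    # 0.50
print(f"Fraction of expected value lost: {loss_fraction:.0%}")  # ~49%
```

On these toy numbers the update destroys roughly half of the agent's subjective expected value, which is why it looks like an 'existential catastrophe' under an expected-value-based definition.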
That sounds like evidential decision theory again. On that reasoning, you could keep your EV high simply by avoiding looking into existential risks.