Wei_Dai comments on You only need faith in two things - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (86)
But even one epistemic error is enough to cause an arbitrarily large loss in utility. Suppose you think that with 99% probability, unless you personally join a monastery and stop having any contact with the outside world, God will put everyone who ever existed into hell on 1/1/2050. So you do that instead of working on making a positive Singularity happen. Since you can't update away this belief until it's too late, it does seem important to have "reasonable" priors instead of just assigning a non-superexponentially-tiny probability to "induction works".
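(An illustrative sketch, not from the comment: the asymmetry here is that a tiny prior on "induction works" gets multiplied up by every confirming observation, whereas a belief that makes no observable predictions before its deadline receives no likelihood updates at all. The numbers below are hypothetical, chosen only to show the odds arithmetic.)

```python
def posterior_odds(prior_odds, likelihood_ratio, n_observations):
    """Odds after n independent observations, each carrying the
    same likelihood ratio in favour of the hypothesis."""
    return prior_odds * likelihood_ratio ** n_observations

# Hypothetical: prior odds of 1e-12 on "induction works", and each observed
# regularity is 10x more likely if induction works than if it doesn't.
# Twenty observations push the odds from 1e-12 up to about 1e8 : 1.
odds = posterior_odds(1e-12, 10.0, 20)
print(odds, odds / (1 + odds))  # posterior probability is close to 1

# The hell-on-1/1/2050 belief assigns the same likelihood to every
# pre-2050 observation, so its likelihood ratio is 1 and the 99 : 1
# odds never move, no matter how much evidence comes in.
print(posterior_odds(99.0, 1.0, 20))
```

This is just the odds form of Bayes' theorem; the point is that "non-superexponentially-tiny" priors are recoverable by evidence, while beliefs that can't be tested before their deadline are not.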
This is always true.
I'd say more that besides your one reasonable prior you also need to avoid making various sorts of specifically harmful mistakes, but this only becomes true when instrumental welfare as well as epistemic welfare is being taken into account. :)
Do you think it's useful to consider "epistemic welfare" independently of "instrumental welfare"? To me it seems that approach has led to a number of problems in the past.