Will_Newsome comments on Is Rationality Maximization of Expected Value? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
The occurrence of very low-probability events is also indicative of unaccounted-for structural uncertainty. Taking into account both where I find myself in the multiverse and thinking seriously about anthropic reasoning left me really confused (I still am, but less so). I think it was good that I became confused rather than just thinking, "Oh, according to my model, a really low-probability event just happened to me, how cool is that?" It wouldn't surprise me all that much if there were a basic evolutionary adaptation not to trust one's models after heavily unanticipated events, and this may generalize to being distrustful of small probabilities in general. (But I'm postulating an evolutionary adaptation for rationality based on almost no evidence, which is most often a byproduct of thinking "What would I do if I were evolution?", which is quite the fallacy.)