bokov comments on What makes us think _any_ of our terminal values aren't based on a misunderstanding of reality? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
So, to clarify:
We don't know what a perfectly rational agent would do if it discovered that all of its goals were epistemically irrational, but there is no instrumental value in answering this question, because if we found ourselves in such a situation we wouldn't care about the answer.
Is that a fair summary? I don't yet know if I agree or disagree, right now I'm just making sure I understand your position.
I believe that is a fair summary of my beliefs.
Side note: Before I was convinced by EY's stance on compatibilism about free will, I believed in free will for a similar reason.